2026-03-10T13:58:47.499 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-10T13:58:47.507 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T13:58:47.535 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059
branch: squid
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.0} 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/yes 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
email: null
first_in_suite: false
flavor: default
job_id: '1059'
last_in_suite: false
machine_type: vps
meta:
- desc: 'setup ceph/v18.2.0 '
name: kyr-2026-03-10_01:00:38-orch-squid-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    cluster-conf:
      mgr:
        client mount timeout: 30
        debug client: 20
        debug mgr: 20
        debug ms: 1
        mon warn on pool no app: false
    conf:
      client:
        client mount timeout: 600
        debug client: 20
        debug ms: 1
        rados mon op timeout: 900
        rados osd op timeout: 900
      global:
        mon pg warn min per osd: 0
      mds:
        debug mds: 20
        debug mds balancer: 20
        debug ms: 1
        mds debug frag: true
        mds debug scatterstat: true
        mds op complaint time: 180
        mds verify scatter: true
        osd op complaint time: 180
        rados mon op timeout: 900
        rados osd op timeout: 900
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon down mkfs grace: 300
        mon op complaint time: 120
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: bitmap
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd op complaint time: 180
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - FS_DEGRADED
    - filesystem is degraded
    - FS_INLINE_DATA_DEPRECATED
    - FS_WITH_FAILED_MDS
    - MDS_ALL_DOWN
    - filesystem is offline
    - is offline because no MDS
    - MDS_DAMAGE
    - MDS_DEGRADED
    - MDS_FAILED
    - MDS_INSUFFICIENT_STANDBY
    - MDS_UP_LESS_THAN_MAX
    - online, but wants
    - filesystem is online with fewer MDS than max_mds
    - POOL_APP_NOT_ENABLED
    - do not have an application enabled
    - overall HEALTH_
    - Replacing daemon
    - deprecated feature inline_data
    - MGR_MODULE_ERROR
    - OSD_DOWN
    - osds down
    - overall HEALTH_
    - \(OSD_DOWN\)
    - \(OSD_
    - but it is still running
    - is not responding
    - MON_DOWN
    - PG_AVAILABILITY
    - PG_DEGRADED
    - Reduced data availability
    - Degraded data redundancy
    - pg .* is stuck inactive
    - pg .* is .*degraded
    - pg .* is stuck peering
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  kclient:
    syntax: v1
  selinux:
    allowlist:
    - scontext=system_u:system_r:logrotate_t:s0
    - scontext=system_u:system_r:getty_t:s0
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-squid
    sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - host.a
  - client.0
  - osd.0
  - osd.1
  - osd.2
- - host.b
  - client.1
  - osd.3
  - osd.4
  - osd.5
seed: 8043
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
targets:
  vm03.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIIzkXBzSZ5P64Dctu6UUhEuabSLRs6ynchRAxojYKY8Mrar691lpQVCZCo/lRT8l+4XPkQSaO45uCR/uOWC0AY=
  vm04.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJaq9o8X6/5N9A6RUq5hhEepgPV1GEy2o/T0MCcj3KrVBmM5ZLnl03nZz2z9Vm7pvLEreC3aCAzm7aSR40+PD3c=
tasks:
- install:
    exclude_packages:
    - ceph-volume
    tag: v18.2.0
- print: '**** done install task...'
- cephadm:
    compiled_cephadm_branch: reef
    conf:
      osd:
        osd_class_default_list: '*'
        osd_class_load_list: '*'
    image: quay.io/ceph/ceph:v18.2.0
    roleless: true
- print: '**** done end installing v18.2.0 cephadm ...'
- cephadm.shell:
    host.a:
    - ceph config set mgr mgr/cephadm/use_repo_digest true --force
- print: '**** done cephadm.shell ceph config set mgr...'
- cephadm.shell:
    host.a:
    - ceph orch status
    - ceph orch ps
    - ceph orch ls
    - ceph orch host ls
    - ceph orch device ls
- cephadm.shell:
    host.a:
    - ceph fs volume create cephfs --placement=4
    - ceph fs dump
- cephadm.shell:
    host.a:
    - ceph fs set cephfs max_mds 2
- cephadm.shell:
    host.a:
    - ceph fs set cephfs allow_standby_replay true
- cephadm.shell:
    host.a:
    - ceph fs set cephfs inline_data false
- cephadm.shell:
    host.a:
    - ceph fs dump
    - ceph --format=json fs dump | jq -e ".filesystems | length == 1"
    - while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done
- fs.pre_upgrade_save: null
- ceph-fuse: null
- print: '**** done client'
- parallel:
  - upgrade-tasks
  - workload-tasks
- cephadm.shell:
    host.a:
    - ceph fs dump
- fs.post_upgrade_checks: null
teuthology:
  fragments_dropped:
  - /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/suites/orch/cephadm/mds_upgrade_sequence/tasks/3-upgrade-mgr-staggered.yaml
  meta: {}
  postmerge:
  - "local kernel = py_attrgetter(yaml).get('kernel')\nif kernel ~= nil then\n  local branch = py_attrgetter(kernel).get('branch')\n  if branch and not kernel.branch:find \"-all$\" then\n    log.debug(\"removing default kernel specification: %s\", kernel)\n    py_attrgetter(kernel).pop('branch', nil)\n    py_attrgetter(kernel).pop('deb', nil)\n    py_attrgetter(kernel).pop('flavor', nil)\n    py_attrgetter(kernel).pop('kdb', nil)\n    py_attrgetter(kernel).pop('koji', nil)\n    py_attrgetter(kernel).pop('koji_task', nil)\n    py_attrgetter(kernel).pop('rpm', nil)\n    py_attrgetter(kernel).pop('sha1', nil)\n    py_attrgetter(kernel).pop('tag', nil)\n  end\nend\n"
  variables:
    fail_fs: false
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-10_01:00:38
tube: vps
upgrade-tasks:
  sequential:
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mgr mgr/orchestrator/fail_fs false || true
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done
      - ceph orch ps
      - ceph orch upgrade status
      - ceph health detail
      - ceph versions
      - echo "wait for servicemap items w/ changing names to refresh"
      - sleep 60
      - ceph orch ps
      - ceph versions
      - ceph versions | jq -e '.overall | length == 1'
      - ceph versions | jq -e '.overall | keys' | grep $sha1
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
workload-tasks:
  sequential:
  - workunit:
      clients:
        all:
        - suites/fsstress.sh
2026-03-10T13:58:47.535 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa; will attempt to use it
2026-03-10T13:58:47.535 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks
2026-03-10T13:58:47.535 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-10T13:58:47.535 INFO:teuthology.task.internal:Checking packages...
2026-03-10T13:58:47.535 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-10T13:58:47.536 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-10T13:58:47.536 INFO:teuthology.packaging:ref: None
2026-03-10T13:58:47.536 INFO:teuthology.packaging:tag: None
2026-03-10T13:58:47.536 INFO:teuthology.packaging:branch: squid
2026-03-10T13:58:47.536 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T13:58:47.536 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-10T13:58:48.300 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-10T13:58:48.301 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-10T13:58:48.302 INFO:teuthology.task.internal:no buildpackages task found
2026-03-10T13:58:48.302 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-10T13:58:48.302 INFO:teuthology.task.internal:Saving configuration
2026-03-10T13:58:48.311 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-10T13:58:48.312 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-10T13:58:48.323 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm03.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 13:57:33.650732', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:03', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIIzkXBzSZ5P64Dctu6UUhEuabSLRs6ynchRAxojYKY8Mrar691lpQVCZCo/lRT8l+4XPkQSaO45uCR/uOWC0AY='}
2026-03-10T13:58:48.331 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm04.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 13:57:33.650248', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:04', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJaq9o8X6/5N9A6RUq5hhEepgPV1GEy2o/T0MCcj3KrVBmM5ZLnl03nZz2z9Vm7pvLEreC3aCAzm7aSR40+PD3c='}
2026-03-10T13:58:48.331 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-10T13:58:48.332 INFO:teuthology.task.internal:roles: ubuntu@vm03.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2']
2026-03-10T13:58:48.332 INFO:teuthology.task.internal:roles: ubuntu@vm04.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5']
2026-03-10T13:58:48.332 INFO:teuthology.run_tasks:Running task console_log...
2026-03-10T13:58:48.364 DEBUG:teuthology.task.console_log:vm03 does not support IPMI; excluding
2026-03-10T13:58:48.372 DEBUG:teuthology.task.console_log:vm04 does not support IPMI; excluding
2026-03-10T13:58:48.372 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f6e29f72170>, signals=[15])
2026-03-10T13:58:48.372 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-10T13:58:48.373 INFO:teuthology.task.internal:Opening connections...
2026-03-10T13:58:48.373 DEBUG:teuthology.task.internal:connecting to ubuntu@vm03.local
2026-03-10T13:58:48.373 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T13:58:48.437 DEBUG:teuthology.task.internal:connecting to ubuntu@vm04.local
2026-03-10T13:58:48.437 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T13:58:48.499 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-10T13:58:48.500 DEBUG:teuthology.orchestra.run.vm03:> uname -m
2026-03-10T13:58:48.559 INFO:teuthology.orchestra.run.vm03.stdout:x86_64
2026-03-10T13:58:48.560 DEBUG:teuthology.orchestra.run.vm03:> cat /etc/os-release
2026-03-10T13:58:48.618 INFO:teuthology.orchestra.run.vm03.stdout:NAME="CentOS Stream"
2026-03-10T13:58:48.618 INFO:teuthology.orchestra.run.vm03.stdout:VERSION="9"
2026-03-10T13:58:48.618 INFO:teuthology.orchestra.run.vm03.stdout:ID="centos"
2026-03-10T13:58:48.618 INFO:teuthology.orchestra.run.vm03.stdout:ID_LIKE="rhel fedora"
2026-03-10T13:58:48.618 INFO:teuthology.orchestra.run.vm03.stdout:VERSION_ID="9"
2026-03-10T13:58:48.618 INFO:teuthology.orchestra.run.vm03.stdout:PLATFORM_ID="platform:el9"
2026-03-10T13:58:48.619 INFO:teuthology.orchestra.run.vm03.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T13:58:48.619 INFO:teuthology.orchestra.run.vm03.stdout:ANSI_COLOR="0;31"
2026-03-10T13:58:48.619 INFO:teuthology.orchestra.run.vm03.stdout:LOGO="fedora-logo-icon"
2026-03-10T13:58:48.619 INFO:teuthology.orchestra.run.vm03.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T13:58:48.619 INFO:teuthology.orchestra.run.vm03.stdout:HOME_URL="https://centos.org/"
2026-03-10T13:58:48.619 INFO:teuthology.orchestra.run.vm03.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T13:58:48.619 INFO:teuthology.orchestra.run.vm03.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T13:58:48.619 INFO:teuthology.orchestra.run.vm03.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T13:58:48.619 INFO:teuthology.lock.ops:Updating vm03.local on lock server
2026-03-10T13:58:48.623 DEBUG:teuthology.orchestra.run.vm04:> uname -m
2026-03-10T13:58:48.642 INFO:teuthology.orchestra.run.vm04.stdout:x86_64
2026-03-10T13:58:48.642 DEBUG:teuthology.orchestra.run.vm04:> cat /etc/os-release
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:NAME="CentOS Stream"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:VERSION="9"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:ID="centos"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:ID_LIKE="rhel fedora"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:VERSION_ID="9"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:PLATFORM_ID="platform:el9"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:ANSI_COLOR="0;31"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:LOGO="fedora-logo-icon"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:HOME_URL="https://centos.org/"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T13:58:48.700 INFO:teuthology.orchestra.run.vm04.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T13:58:48.701 INFO:teuthology.lock.ops:Updating vm04.local on lock server
2026-03-10T13:58:48.705 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-10T13:58:48.707 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-10T13:58:48.708 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-10T13:58:48.708 DEBUG:teuthology.orchestra.run.vm03:> test '!' -e /home/ubuntu/cephtest
2026-03-10T13:58:48.710 DEBUG:teuthology.orchestra.run.vm04:> test '!' -e /home/ubuntu/cephtest
2026-03-10T13:58:48.756 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-10T13:58:48.757 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-10T13:58:48.757 DEBUG:teuthology.orchestra.run.vm03:> test -z $(ls -A /var/lib/ceph)
2026-03-10T13:58:48.765 DEBUG:teuthology.orchestra.run.vm04:> test -z $(ls -A /var/lib/ceph)
2026-03-10T13:58:48.778 INFO:teuthology.orchestra.run.vm03.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T13:58:48.812 INFO:teuthology.orchestra.run.vm04.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T13:58:48.812 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-10T13:58:48.821 DEBUG:teuthology.orchestra.run.vm03:> test -e /ceph-qa-ready
2026-03-10T13:58:48.835 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T13:58:49.037 DEBUG:teuthology.orchestra.run.vm04:> test -e /ceph-qa-ready
2026-03-10T13:58:49.055 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T13:58:49.251 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-10T13:58:49.253 INFO:teuthology.task.internal:Creating test directory...
2026-03-10T13:58:49.253 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T13:58:49.256 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T13:58:49.276 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-10T13:58:49.278 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-10T13:58:49.280 INFO:teuthology.task.internal:Creating archive directory...
2026-03-10T13:58:49.280 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T13:58:49.314 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T13:58:49.335 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-10T13:58:49.336 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-10T13:58:49.336 DEBUG:teuthology.orchestra.run.vm03:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T13:58:49.390 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T13:58:49.390 DEBUG:teuthology.orchestra.run.vm04:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T13:58:49.406 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T13:58:49.406 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T13:58:49.433 DEBUG:teuthology.orchestra.run.vm04:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T13:58:49.458 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T13:58:49.467 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T13:58:49.472 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T13:58:49.484 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T13:58:49.485 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-10T13:58:49.487 INFO:teuthology.task.internal:Configuring sudo...
2026-03-10T13:58:49.487 DEBUG:teuthology.orchestra.run.vm03:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T13:58:49.510 DEBUG:teuthology.orchestra.run.vm04:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T13:58:49.553 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-10T13:58:49.555 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-10T13:58:49.556 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T13:58:49.579 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T13:58:49.610 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T13:58:49.662 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T13:58:49.721 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-10T13:58:49.721 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T13:58:49.779 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T13:58:49.805 DEBUG:teuthology.orchestra.run.vm04:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T13:58:49.866 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T13:58:49.867 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T13:58:49.927 DEBUG:teuthology.orchestra.run.vm03:> sudo service rsyslog restart
2026-03-10T13:58:49.929 DEBUG:teuthology.orchestra.run.vm04:> sudo service rsyslog restart
2026-03-10T13:58:49.955 INFO:teuthology.orchestra.run.vm03.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T13:58:50.003 INFO:teuthology.orchestra.run.vm04.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T13:58:50.339 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-10T13:58:50.342 INFO:teuthology.task.internal:Starting timer...
2026-03-10T13:58:50.342 INFO:teuthology.run_tasks:Running task pcp...
2026-03-10T13:58:50.345 INFO:teuthology.run_tasks:Running task selinux...
2026-03-10T13:58:50.347 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']}
2026-03-10T13:58:50.348 INFO:teuthology.task.selinux:Excluding vm03: VMs are not yet supported
2026-03-10T13:58:50.348 INFO:teuthology.task.selinux:Excluding vm04: VMs are not yet supported
2026-03-10T13:58:50.348 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-10T13:58:50.348 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-10T13:58:50.348 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-10T13:58:50.348 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-10T13:58:50.349 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-10T13:58:50.349 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-10T13:58:50.351 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-10T13:58:50.959 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-10T13:58:50.988 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-10T13:58:50.988 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventoryt1j7gnb8 --limit vm03.local,vm04.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-10T14:00:53.540 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm03.local'), Remote(name='ubuntu@vm04.local')]
2026-03-10T14:00:53.541 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm03.local'
2026-03-10T14:00:53.541 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T14:00:53.611 DEBUG:teuthology.orchestra.run.vm03:> true
2026-03-10T14:00:53.696 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm03.local'
2026-03-10T14:00:53.696 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm04.local'
2026-03-10T14:00:53.697 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm04.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T14:00:53.767 DEBUG:teuthology.orchestra.run.vm04:> true
2026-03-10T14:00:53.849 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm04.local'
2026-03-10T14:00:53.849 INFO:teuthology.run_tasks:Running task clock...
2026-03-10T14:00:53.853 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-10T14:00:53.853 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T14:00:53.853 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T14:00:53.855 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T14:00:53.855 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T14:00:53.896 INFO:teuthology.orchestra.run.vm03.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T14:00:53.922 INFO:teuthology.orchestra.run.vm03.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T14:00:53.944 INFO:teuthology.orchestra.run.vm04.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T14:00:53.959 INFO:teuthology.orchestra.run.vm03.stderr:sudo: ntpd: command not found
2026-03-10T14:00:53.961 INFO:teuthology.orchestra.run.vm04.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T14:00:53.978 INFO:teuthology.orchestra.run.vm03.stdout:506 Cannot talk to daemon
2026-03-10T14:00:53.995 INFO:teuthology.orchestra.run.vm03.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T14:00:53.996 INFO:teuthology.orchestra.run.vm04.stderr:sudo: ntpd: command not found
2026-03-10T14:00:54.007 INFO:teuthology.orchestra.run.vm04.stdout:506 Cannot talk to daemon
2026-03-10T14:00:54.011 INFO:teuthology.orchestra.run.vm03.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T14:00:54.020 INFO:teuthology.orchestra.run.vm04.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T14:00:54.032 INFO:teuthology.orchestra.run.vm04.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T14:00:54.062 INFO:teuthology.orchestra.run.vm03.stderr:bash: line 1: ntpq: command not found
2026-03-10T14:00:54.083 INFO:teuthology.orchestra.run.vm04.stderr:bash: line 1: ntpq: command not found
2026-03-10T14:00:54.088 INFO:teuthology.orchestra.run.vm03.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T14:00:54.088 INFO:teuthology.orchestra.run.vm03.stdout:===============================================================================
2026-03-10T14:00:54.088 INFO:teuthology.orchestra.run.vm03.stdout:^? gw-001.oit.one 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T14:00:54.088 INFO:teuthology.orchestra.run.vm03.stdout:^? 79.133.44.143 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T14:00:54.088 INFO:teuthology.orchestra.run.vm03.stdout:^? ntp1.lwlcom.net 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T14:00:54.088 INFO:teuthology.orchestra.run.vm03.stdout:^? ntp2.lwlcom.net 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T14:00:54.089 INFO:teuthology.orchestra.run.vm04.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T14:00:54.089 INFO:teuthology.orchestra.run.vm04.stdout:===============================================================================
2026-03-10T14:00:54.089 INFO:teuthology.orchestra.run.vm04.stdout:^? 79.133.44.143 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T14:00:54.089 INFO:teuthology.orchestra.run.vm04.stdout:^? ntp1.lwlcom.net 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T14:00:54.089 INFO:teuthology.orchestra.run.vm04.stdout:^? ntp2.lwlcom.net 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T14:00:54.089 INFO:teuthology.orchestra.run.vm04.stdout:^? gw-001.oit.one 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T14:00:54.089 INFO:teuthology.run_tasks:Running task install...
2026-03-10T14:00:54.091 DEBUG:teuthology.task.install:project ceph
2026-03-10T14:00:54.091 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T14:00:54.091 DEBUG:teuthology.task.install:config {'exclude_packages': ['ceph-volume'], 'tag': 'v18.2.0', 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T14:00:54.091 INFO:teuthology.task.install:Using flavor: default
2026-03-10T14:00:54.093 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-10T14:00:54.093 INFO:teuthology.task.install:extra packages: []
2026-03-10T14:00:54.093 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.0', 'wait_for_package': False}
2026-03-10T14:00:54.093 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T14:00:54.093 INFO:teuthology.packaging:ref: None
2026-03-10T14:00:54.093 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T14:00:54.093 INFO:teuthology.packaging:branch: None
2026-03-10T14:00:54.093 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T14:00:54.897 DEBUG:teuthology.repo_utils:git ls-remote https://github.com/ceph/ceph v18.2.0^{} -> 5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T14:00:54.897 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T14:00:54.898 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.0', 'wait_for_package': False}
2026-03-10T14:00:54.898 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T14:00:54.898 INFO:teuthology.packaging:ref: None
2026-03-10T14:00:54.898 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T14:00:54.898 INFO:teuthology.packaging:branch: None
2026-03-10T14:00:54.898 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T14:00:54.898 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T14:00:55.470 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/
2026-03-10T14:00:55.470 INFO:teuthology.task.install.rpm:Package version is 18.2.0-0
2026-03-10T14:00:55.502 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/
2026-03-10T14:00:55.502 INFO:teuthology.task.install.rpm:Package version is 18.2.0-0
2026-03-10T14:00:55.823 INFO:teuthology.packaging:Writing yum repo: [ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-10T14:00:55.823 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:00:55.823 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-10T14:00:55.843 INFO:teuthology.packaging:Writing yum repo: [ceph] name=ceph packages for $basearch baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch enabled=1 gpgcheck=0 type=rpm-md [ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-10T14:00:55.843 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:00:55.843 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-10T14:00:55.861 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-10T14:00:55.862 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T14:00:55.862 
INFO:teuthology.packaging:ref: None 2026-03-10T14:00:55.862 INFO:teuthology.packaging:tag: v18.2.0 2026-03-10T14:00:55.862 INFO:teuthology.packaging:branch: None 2026-03-10T14:00:55.862 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:00:55.862 DEBUG:teuthology.orchestra.run.vm04:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.0/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-10T14:00:55.881 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-10T14:00:55.881 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T14:00:55.881 INFO:teuthology.packaging:ref: None 2026-03-10T14:00:55.881 INFO:teuthology.packaging:tag: v18.2.0 2026-03-10T14:00:55.881 INFO:teuthology.packaging:branch: None 2026-03-10T14:00:55.881 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:00:55.881 DEBUG:teuthology.orchestra.run.vm03:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.0/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-10T14:00:55.936 DEBUG:teuthology.orchestra.run.vm04:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-10T14:00:55.960 DEBUG:teuthology.orchestra.run.vm03:> sudo 
touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-10T14:00:56.020 DEBUG:teuthology.orchestra.run.vm04:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-10T14:00:56.048 INFO:teuthology.orchestra.run.vm04.stdout:check_obsoletes = 1 2026-03-10T14:00:56.050 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean all 2026-03-10T14:00:56.053 DEBUG:teuthology.orchestra.run.vm03:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-10T14:00:56.083 INFO:teuthology.orchestra.run.vm03.stdout:check_obsoletes = 1 2026-03-10T14:00:56.085 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean all 2026-03-10T14:00:56.227 INFO:teuthology.orchestra.run.vm04.stdout:41 files removed 2026-03-10T14:00:56.252 DEBUG:teuthology.orchestra.run.vm04:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-10T14:00:56.295 INFO:teuthology.orchestra.run.vm03.stdout:41 files removed 2026-03-10T14:00:56.323 DEBUG:teuthology.orchestra.run.vm03:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm 
ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-10T14:00:57.377 INFO:teuthology.orchestra.run.vm04.stdout:ceph packages for x86_64 80 kB/s | 76 kB 00:00 2026-03-10T14:00:57.449 INFO:teuthology.orchestra.run.vm03.stdout:ceph packages for x86_64 84 kB/s | 76 kB 00:00 2026-03-10T14:00:58.016 INFO:teuthology.orchestra.run.vm04.stdout:ceph noarch packages 15 kB/s | 9.3 kB 00:00 2026-03-10T14:00:58.103 INFO:teuthology.orchestra.run.vm03.stdout:ceph noarch packages 15 kB/s | 9.3 kB 00:00 2026-03-10T14:00:58.650 INFO:teuthology.orchestra.run.vm04.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00 2026-03-10T14:00:58.731 INFO:teuthology.orchestra.run.vm03.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00 2026-03-10T14:00:59.438 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - BaseOS 12 MB/s | 8.9 MB 00:00 2026-03-10T14:01:00.342 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - BaseOS 5.6 MB/s | 8.9 MB 00:01 2026-03-10T14:01:02.418 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - AppStream 12 MB/s | 27 MB 00:02 2026-03-10T14:01:03.006 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - AppStream 15 MB/s | 27 MB 00:01 2026-03-10T14:01:06.144 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - CRB 8.3 MB/s | 8.0 MB 00:00 2026-03-10T14:01:06.436 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - CRB 12 MB/s | 8.0 MB 00:00 2026-03-10T14:01:07.730 INFO:teuthology.orchestra.run.vm04.stdout:CentOS Stream 9 - Extras packages 29 kB/s | 20 kB 00:00 2026-03-10T14:01:08.032 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - Extras packages 28 kB/s | 20 kB 00:00 2026-03-10T14:01:08.676 INFO:teuthology.orchestra.run.vm03.stdout:Extra Packages for Enterprise Linux 37 MB/s | 20 MB 00:00 2026-03-10T14:01:08.759 
INFO:teuthology.orchestra.run.vm04.stdout:Extra Packages for Enterprise Linux 21 MB/s | 20 MB 00:00 2026-03-10T14:01:13.496 INFO:teuthology.orchestra.run.vm04.stdout:lab-extras 63 kB/s | 50 kB 00:00 2026-03-10T14:01:13.616 INFO:teuthology.orchestra.run.vm03.stdout:lab-extras 62 kB/s | 50 kB 00:00 2026-03-10T14:01:14.983 INFO:teuthology.orchestra.run.vm04.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T14:01:14.983 INFO:teuthology.orchestra.run.vm04.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T14:01:14.987 INFO:teuthology.orchestra.run.vm04.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-10T14:01:14.988 INFO:teuthology.orchestra.run.vm04.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-10T14:01:15.021 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout:Installing: 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph x86_64 2:18.2.0-0.el9 ceph 6.4 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base x86_64 2:18.2.0-0.el9 ceph 5.2 M 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 ceph 835 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 ceph 142 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 ceph 1.4 M 2026-03-10T14:01:15.026 
INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 ceph-noarch 127 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 ceph-noarch 1.7 M 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 ceph-noarch 7.4 M 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 ceph-noarch 47 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 ceph 7.6 M 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test x86_64 2:18.2.0-0.el9 ceph 40 M 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: cephadm noarch 2:18.2.0-0.el9 ceph-noarch 209 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 ceph 30 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 ceph 653 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel x86_64 2:18.2.0-0.el9 ceph 126 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 ceph 155 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados x86_64 2:18.2.0-0.el9 ceph 321 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd x86_64 2:18.2.0-0.el9 ceph 297 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw x86_64 2:18.2.0-0.el9 ceph 99 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 ceph 86 k 2026-03-10T14:01:15.026 
INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-10T14:01:15.026 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 ceph 169 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout:Upgrading: 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: librados2 x86_64 2:18.2.0-0.el9 ceph 3.3 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: librbd1 x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout:Installing dependencies: 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 ceph-noarch 23 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 ceph-noarch 240 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 ceph-noarch 15 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 ceph 24 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 
k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 ceph 161 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 ceph 474 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: librgw2 x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k 
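The `dnf install` transcript around this point is a whitespace-aligned table of Package / Arch / Version / Repository / Size columns. When post-processing logs like this one, rows can be recovered with a minimal whitespace-based split; the helper below is a sketch of mine (its name and the six-field assumption are not part of teuthology), not something the tooling itself provides.

```python
# Minimal sketch: turn a dnf transaction-table row from this log, e.g.
#   " ceph-mds  x86_64  2:18.2.0-0.el9  ceph  2.1 M"
# into a dict. Assumes names contain no spaces and the size is "<num> <unit>",
# which holds for the rows shown above; headers and separators yield None.
def parse_dnf_row(line: str):
    parts = line.split()
    if len(parts) != 6:
        return None  # not a package row (header, "Installing:", "====" rule, ...)
    name, arch, version, repo, size, unit = parts
    return {"name": name, "arch": arch, "version": version,
            "repo": repo, "size": f"{size} {unit}"}

row = parse_dnf_row(" ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M")
```

A real parser would also need to handle wrapped rows (long package names push the remaining columns onto the next line), which this sketch deliberately ignores.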
2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k 2026-03-10T14:01:15.027 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 ceph 45 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 ceph 119 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-10T14:01:15.028 
INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 
2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-10T14:01:15.028 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-10T14:01:15.029 
INFO:teuthology.orchestra.run.vm04.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout:Installing weak dependencies: 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout:Install 117 Packages 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout:Upgrade 2 Packages 2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout: 
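For readability, this is the `/etc/yum.repos.d/ceph.repo` payload that the `Writing yum repo` step earlier in the run streamed to the nodes via `sudo dd`, reflowed one directive per line (content taken verbatim from the log):

```ini
[ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-noarch]
name=ceph noarch packages
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md

[ceph-source]
name=ceph source packages
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
```

The follow-up `sed` shown in the log then inserts `priority=1` after each `enabled=1` line so these chacra repos win over the distro repos.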
2026-03-10T14:01:15.029 INFO:teuthology.orchestra.run.vm04.stdout:Total download size: 182 M 2026-03-10T14:01:15.030 INFO:teuthology.orchestra.run.vm04.stdout:Downloading Packages: 2026-03-10T14:01:15.173 INFO:teuthology.orchestra.run.vm03.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T14:01:15.174 INFO:teuthology.orchestra.run.vm03.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T14:01:15.179 INFO:teuthology.orchestra.run.vm03.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-10T14:01:15.179 INFO:teuthology.orchestra.run.vm03.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-10T14:01:15.210 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:01:15.217 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout:Installing: 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph x86_64 2:18.2.0-0.el9 ceph 6.4 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base x86_64 2:18.2.0-0.el9 ceph 5.2 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 ceph 835 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 ceph 142 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 ceph 1.4 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 ceph-noarch 127 k 2026-03-10T14:01:15.218 
INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 ceph-noarch 1.7 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 ceph-noarch 7.4 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 ceph-noarch 47 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 ceph 7.6 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test x86_64 2:18.2.0-0.el9 ceph 40 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: cephadm noarch 2:18.2.0-0.el9 ceph-noarch 209 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 ceph 30 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 ceph 653 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel x86_64 2:18.2.0-0.el9 ceph 126 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 ceph 155 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados x86_64 2:18.2.0-0.el9 ceph 321 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd x86_64 2:18.2.0-0.el9 ceph 297 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw x86_64 2:18.2.0-0.el9 ceph 99 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 ceph 86 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-10T14:01:15.218 
INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 ceph 169 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout:Upgrading: 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: librados2 x86_64 2:18.2.0-0.el9 ceph 3.3 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: librbd1 x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout:Installing dependencies: 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 ceph-noarch 23 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 ceph-noarch 240 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 ceph-noarch 15 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 ceph 24 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k 
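The packaging step earlier in the run queried shaman with a percent-encoded distro string (`distros=centos%2F9%2Fx86_64`). As an illustration of how that query string fits together, here is a sketch that rebuilds the exact URL the log shows; the helper function is hypothetical, only the parameter names and values come from the log.

```python
from urllib.parse import urlencode

# Assemble a shaman build-search URL like the one logged above.
# urlencode() percent-encodes '/' in the distro string as %2F.
def shaman_search_url(project, flavor, distro, sha1):
    query = urlencode({
        "status": "ready",   # only builds that finished successfully
        "project": project,
        "flavor": flavor,
        "distros": distro,
        "sha1": sha1,
    })
    return "https://shaman.ceph.com/api/search?" + query

url = shaman_search_url("ceph", "default", "centos/9/x86_64",
                        "5dd24139a1eada541a3bc16b6941c5dde975e26d")
```

Note the sha1 here is the peeled tag commit from `git ls-remote ... v18.2.0^{}`, not the `sha1` field in the job config, which is why the log resolves the tag first.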
2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 ceph 161 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k 2026-03-10T14:01:15.218 INFO:teuthology.orchestra.run.vm03.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 ceph 474 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: librgw2 x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust x86_64 2.12.0-6.el9 
appstream 292 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 ceph 45 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 ceph 119 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-10T14:01:15.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout:Installing weak dependencies:
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout:Install 117 Packages
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout:Upgrade 2 Packages
2026-03-10T14:01:15.220 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:15.221 INFO:teuthology.orchestra.run.vm03.stdout:Total download size: 182 M
2026-03-10T14:01:15.221 INFO:teuthology.orchestra.run.vm03.stdout:Downloading Packages:
2026-03-10T14:01:15.910 INFO:teuthology.orchestra.run.vm03.stdout:(1/119): ceph-18.2.0-0.el9.x86_64.rpm 19 kB/s | 6.4 kB 00:00
2026-03-10T14:01:16.884 INFO:teuthology.orchestra.run.vm04.stdout:(1/119): ceph-18.2.0-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00
2026-03-10T14:01:18.750 INFO:teuthology.orchestra.run.vm03.stdout:(2/119): ceph-fuse-18.2.0-0.el9.x86_64.rpm 294 kB/s | 835 kB 00:02
2026-03-10T14:01:19.045 INFO:teuthology.orchestra.run.vm03.stdout:(3/119): ceph-immutable-object-cache-18.2.0-0.e 483 kB/s | 142 kB 00:00
2026-03-10T14:01:19.844 INFO:teuthology.orchestra.run.vm04.stdout:(2/119): ceph-fuse-18.2.0-0.el9.x86_64.rpm 282 kB/s | 835 kB 00:02
2026-03-10T14:01:20.143 INFO:teuthology.orchestra.run.vm04.stdout:(3/119): ceph-immutable-object-cache-18.2.0-0.e 477 kB/s | 142 kB 00:00
2026-03-10T14:01:21.105 INFO:teuthology.orchestra.run.vm03.stdout:(4/119): ceph-mds-18.2.0-0.el9.x86_64.rpm 1.0 MB/s | 2.1 MB 00:02
2026-03-10T14:01:21.143 INFO:teuthology.orchestra.run.vm03.stdout:(5/119): ceph-base-18.2.0-0.el9.x86_64.rpm 955 kB/s | 5.2 MB 00:05
2026-03-10T14:01:21.896 INFO:teuthology.orchestra.run.vm03.stdout:(6/119): ceph-mgr-18.2.0-0.el9.x86_64.rpm 1.8 MB/s | 1.4 MB 00:00
2026-03-10T14:01:22.223 INFO:teuthology.orchestra.run.vm04.stdout:(4/119): ceph-mds-18.2.0-0.el9.x86_64.rpm 1.0 MB/s | 2.1 MB 00:02
2026-03-10T14:01:22.343 INFO:teuthology.orchestra.run.vm03.stdout:(7/119): ceph-mon-18.2.0-0.el9.x86_64.rpm 3.7 MB/s | 4.4 MB 00:01
2026-03-10T14:01:22.997 INFO:teuthology.orchestra.run.vm04.stdout:(5/119): ceph-base-18.2.0-0.el9.x86_64.rpm 827 kB/s | 5.2 MB 00:06
2026-03-10T14:01:23.018 INFO:teuthology.orchestra.run.vm04.stdout:(6/119): ceph-mgr-18.2.0-0.el9.x86_64.rpm 1.8 MB/s | 1.4 MB 00:00
2026-03-10T14:01:23.629 INFO:teuthology.orchestra.run.vm03.stdout:(8/119): ceph-radosgw-18.2.0-0.el9.x86_64.rpm 5.9 MB/s | 7.6 MB 00:01
2026-03-10T14:01:23.764 INFO:teuthology.orchestra.run.vm03.stdout:(9/119): ceph-selinux-18.2.0-0.el9.x86_64.rpm 182 kB/s | 24 kB 00:00
2026-03-10T14:01:24.294 INFO:teuthology.orchestra.run.vm04.stdout:(7/119): ceph-mon-18.2.0-0.el9.x86_64.rpm 3.4 MB/s | 4.4 MB 00:01
2026-03-10T14:01:24.664 INFO:teuthology.orchestra.run.vm03.stdout:(10/119): ceph-common-18.2.0-0.el9.x86_64.rpm 2.0 MB/s | 18 MB 00:09
2026-03-10T14:01:24.809 INFO:teuthology.orchestra.run.vm03.stdout:(11/119): libcephfs-devel-18.2.0-0.el9.x86_64.r 210 kB/s | 30 kB 00:00
2026-03-10T14:01:25.009 INFO:teuthology.orchestra.run.vm03.stdout:(12/119): libcephfs2-18.2.0-0.el9.x86_64.rpm 3.2 MB/s | 653 kB 00:00
2026-03-10T14:01:25.109 INFO:teuthology.orchestra.run.vm03.stdout:(13/119): libcephsqlite-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 161 kB 00:00
2026-03-10T14:01:25.210 INFO:teuthology.orchestra.run.vm03.stdout:(14/119): librados-devel-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 126 kB 00:00
2026-03-10T14:01:25.268 INFO:teuthology.orchestra.run.vm03.stdout:(15/119): ceph-osd-18.2.0-0.el9.x86_64.rpm 5.2 MB/s | 18 MB 00:03
2026-03-10T14:01:25.314 INFO:teuthology.orchestra.run.vm03.stdout:(16/119): libradosstriper1-18.2.0-0.el9.x86_64. 4.5 MB/s | 474 kB 00:00
2026-03-10T14:01:25.386 INFO:teuthology.orchestra.run.vm04.stdout:(8/119): ceph-common-18.2.0-0.el9.x86_64.rpm 2.1 MB/s | 18 MB 00:08
2026-03-10T14:01:25.414 INFO:teuthology.orchestra.run.vm03.stdout:(17/119): python3-ceph-argparse-18.2.0-0.el9.x8 451 kB/s | 45 kB 00:00
2026-03-10T14:01:25.485 INFO:teuthology.orchestra.run.vm04.stdout:(9/119): ceph-selinux-18.2.0-0.el9.x86_64.rpm 242 kB/s | 24 kB 00:00
2026-03-10T14:01:25.515 INFO:teuthology.orchestra.run.vm03.stdout:(18/119): python3-ceph-common-18.2.0-0.el9.x86_ 1.2 MB/s | 119 kB 00:00
2026-03-10T14:01:25.516 INFO:teuthology.orchestra.run.vm04.stdout:(10/119): ceph-radosgw-18.2.0-0.el9.x86_64.rpm 6.3 MB/s | 7.6 MB 00:01
2026-03-10T14:01:25.617 INFO:teuthology.orchestra.run.vm03.stdout:(19/119): python3-cephfs-18.2.0-0.el9.x86_64.rp 1.5 MB/s | 155 kB 00:00
2026-03-10T14:01:25.622 INFO:teuthology.orchestra.run.vm04.stdout:(11/119): libcephfs-devel-18.2.0-0.el9.x86_64.r 290 kB/s | 30 kB 00:00
2026-03-10T14:01:25.720 INFO:teuthology.orchestra.run.vm03.stdout:(20/119): python3-rados-18.2.0-0.el9.x86_64.rpm 3.0 MB/s | 321 kB 00:00
2026-03-10T14:01:25.823 INFO:teuthology.orchestra.run.vm03.stdout:(21/119): python3-rbd-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 297 kB 00:00
2026-03-10T14:01:25.877 INFO:teuthology.orchestra.run.vm03.stdout:(22/119): librgw2-18.2.0-0.el9.x86_64.rpm 7.3 MB/s | 4.4 MB 00:00
2026-03-10T14:01:25.924 INFO:teuthology.orchestra.run.vm03.stdout:(23/119): python3-rgw-18.2.0-0.el9.x86_64.rpm 986 kB/s | 99 kB 00:00
2026-03-10T14:01:25.977 INFO:teuthology.orchestra.run.vm03.stdout:(24/119): rbd-fuse-18.2.0-0.el9.x86_64.rpm 863 kB/s | 86 kB 00:00
2026-03-10T14:01:26.079 INFO:teuthology.orchestra.run.vm03.stdout:(25/119): rbd-nbd-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 169 kB 00:00
2026-03-10T14:01:26.178 INFO:teuthology.orchestra.run.vm03.stdout:(26/119): ceph-grafana-dashboards-18.2.0-0.el9. 233 kB/s | 23 kB 00:00
2026-03-10T14:01:26.279 INFO:teuthology.orchestra.run.vm03.stdout:(27/119): ceph-mgr-cephadm-18.2.0-0.el9.noarch. 1.2 MB/s | 127 kB 00:00
2026-03-10T14:01:26.521 INFO:teuthology.orchestra.run.vm03.stdout:(28/119): rbd-mirror-18.2.0-0.el9.x86_64.rpm 5.0 MB/s | 3.0 MB 00:00
2026-03-10T14:01:26.585 INFO:teuthology.orchestra.run.vm03.stdout:(29/119): ceph-mgr-dashboard-18.2.0-0.el9.noarc 5.6 MB/s | 1.7 MB 00:00
2026-03-10T14:01:26.656 INFO:teuthology.orchestra.run.vm04.stdout:(12/119): ceph-osd-18.2.0-0.el9.x86_64.rpm 4.8 MB/s | 18 MB 00:03
2026-03-10T14:01:26.687 INFO:teuthology.orchestra.run.vm03.stdout:(30/119): ceph-mgr-modules-core-18.2.0-0.el9.no 2.3 MB/s | 240 kB 00:00
2026-03-10T14:01:26.758 INFO:teuthology.orchestra.run.vm04.stdout:(13/119): libcephsqlite-18.2.0-0.el9.x86_64.rpm 1.5 MB/s | 161 kB 00:00
2026-03-10T14:01:26.787 INFO:teuthology.orchestra.run.vm03.stdout:(31/119): ceph-mgr-rook-18.2.0-0.el9.noarch.rpm 475 kB/s | 47 kB 00:00
2026-03-10T14:01:26.859 INFO:teuthology.orchestra.run.vm04.stdout:(14/119): librados-devel-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 126 kB 00:00
2026-03-10T14:01:26.887 INFO:teuthology.orchestra.run.vm03.stdout:(32/119): ceph-prometheus-alerts-18.2.0-0.el9.n 148 kB/s | 15 kB 00:00
2026-03-10T14:01:26.952 INFO:teuthology.orchestra.run.vm04.stdout:(15/119): libcephfs2-18.2.0-0.el9.x86_64.rpm 491 kB/s | 653 kB 00:01
2026-03-10T14:01:26.964 INFO:teuthology.orchestra.run.vm04.stdout:(16/119): libradosstriper1-18.2.0-0.el9.x86_64. 4.4 MB/s | 474 kB 00:00
2026-03-10T14:01:26.988 INFO:teuthology.orchestra.run.vm03.stdout:(33/119): cephadm-18.2.0-0.el9.noarch.rpm 2.0 MB/s | 209 kB 00:00
2026-03-10T14:01:27.067 INFO:teuthology.orchestra.run.vm04.stdout:(17/119): python3-ceph-argparse-18.2.0-0.el9.x8 438 kB/s | 45 kB 00:00
2026-03-10T14:01:27.094 INFO:teuthology.orchestra.run.vm03.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 386 kB/s | 40 kB 00:00
2026-03-10T14:01:27.169 INFO:teuthology.orchestra.run.vm04.stdout:(18/119): python3-ceph-common-18.2.0-0.el9.x86_ 1.1 MB/s | 119 kB 00:00
2026-03-10T14:01:27.170 INFO:teuthology.orchestra.run.vm03.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 938 kB/s | 72 kB 00:00
2026-03-10T14:01:27.271 INFO:teuthology.orchestra.run.vm04.stdout:(19/119): python3-cephfs-18.2.0-0.el9.x86_64.rp 1.5 MB/s | 155 kB 00:00
2026-03-10T14:01:27.283 INFO:teuthology.orchestra.run.vm03.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 6.9 MB/s | 794 kB 00:00
2026-03-10T14:01:27.328 INFO:teuthology.orchestra.run.vm03.stdout:(37/119): libquadmath-11.5.0-14.el9.x86_64.rpm 4.1 MB/s | 184 kB 00:00
2026-03-10T14:01:27.361 INFO:teuthology.orchestra.run.vm03.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 1.0 MB/s | 33 kB 00:00
2026-03-10T14:01:27.374 INFO:teuthology.orchestra.run.vm04.stdout:(20/119): python3-rados-18.2.0-0.el9.x86_64.rpm 3.1 MB/s | 321 kB 00:00
2026-03-10T14:01:27.429 INFO:teuthology.orchestra.run.vm03.stdout:(39/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 3.7 MB/s | 253 kB 00:00
2026-03-10T14:01:27.477 INFO:teuthology.orchestra.run.vm04.stdout:(21/119): python3-rbd-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 297 kB 00:00
2026-03-10T14:01:27.518 INFO:teuthology.orchestra.run.vm03.stdout:(40/119): python3-cryptography-36.0.1-5.el9.x86 14 MB/s | 1.2 MB 00:00
2026-03-10T14:01:27.572 INFO:teuthology.orchestra.run.vm03.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 1.9 MB/s | 106 kB 00:00
2026-03-10T14:01:27.579 INFO:teuthology.orchestra.run.vm04.stdout:(22/119): python3-rgw-18.2.0-0.el9.x86_64.rpm 983 kB/s | 99 kB 00:00
2026-03-10T14:01:27.631 INFO:teuthology.orchestra.run.vm03.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 2.2 MB/s | 135 kB 00:00
2026-03-10T14:01:27.664 INFO:teuthology.orchestra.run.vm04.stdout:(23/119): librgw2-18.2.0-0.el9.x86_64.rpm 6.2 MB/s | 4.4 MB 00:00
2026-03-10T14:01:27.693 INFO:teuthology.orchestra.run.vm04.stdout:(24/119): rbd-fuse-18.2.0-0.el9.x86_64.rpm 752 kB/s | 86 kB 00:00
2026-03-10T14:01:27.710 INFO:teuthology.orchestra.run.vm03.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 1.6 MB/s | 126 kB 00:00
2026-03-10T14:01:27.775 INFO:teuthology.orchestra.run.vm03.stdout:(44/119): ceph-mgr-diskprediction-local-18.2.0- 5.9 MB/s | 7.4 MB 00:01
2026-03-10T14:01:27.822 INFO:teuthology.orchestra.run.vm04.stdout:(25/119): rbd-nbd-18.2.0-0.el9.x86_64.rpm 1.3 MB/s | 169 kB 00:00
2026-03-10T14:01:27.843 INFO:teuthology.orchestra.run.vm03.stdout:(45/119): python3-urllib3-1.26.5-7.el9.noarch.r 1.6 MB/s | 218 kB 00:00
2026-03-10T14:01:27.925 INFO:teuthology.orchestra.run.vm04.stdout:(26/119): ceph-grafana-dashboards-18.2.0-0.el9. 223 kB/s | 23 kB 00:00
2026-03-10T14:01:28.115 INFO:teuthology.orchestra.run.vm03.stdout:(46/119): flexiblas-3.0.4-9.el9.x86_64.rpm 109 kB/s | 30 kB 00:00
2026-03-10T14:01:28.116 INFO:teuthology.orchestra.run.vm04.stdout:(27/119): ceph-mgr-cephadm-18.2.0-0.el9.noarch. 672 kB/s | 127 kB 00:00
2026-03-10T14:01:28.211 INFO:teuthology.orchestra.run.vm03.stdout:(47/119): boost-program-options-1.75.0-13.el9.x 239 kB/s | 104 kB 00:00
2026-03-10T14:01:28.423 INFO:teuthology.orchestra.run.vm03.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 70 kB/s | 15 kB 00:00
2026-03-10T14:01:28.423 INFO:teuthology.orchestra.run.vm04.stdout:(28/119): ceph-mgr-dashboard-18.2.0-0.el9.noarc 5.5 MB/s | 1.7 MB 00:00
2026-03-10T14:01:28.537 INFO:teuthology.orchestra.run.vm04.stdout:(29/119): rbd-mirror-18.2.0-0.el9.x86_64.rpm 3.4 MB/s | 3.0 MB 00:00
2026-03-10T14:01:28.659 INFO:teuthology.orchestra.run.vm03.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 679 kB/s | 160 kB 00:00
2026-03-10T14:01:28.825 INFO:teuthology.orchestra.run.vm03.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 273 kB/s | 45 kB 00:00
2026-03-10T14:01:28.849 INFO:teuthology.orchestra.run.vm03.stdout:(51/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 4.1 MB/s | 3.0 MB 00:00
2026-03-10T14:01:29.075 INFO:teuthology.orchestra.run.vm04.stdout:(30/119): ceph-mgr-modules-core-18.2.0-0.el9.no 446 kB/s | 240 kB 00:00
2026-03-10T14:01:29.171 INFO:teuthology.orchestra.run.vm03.stdout:(52/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 766 kB/s | 246 kB 00:00
2026-03-10T14:01:29.176 INFO:teuthology.orchestra.run.vm04.stdout:(31/119): ceph-mgr-rook-18.2.0-0.el9.noarch.rpm 474 kB/s | 47 kB 00:00
2026-03-10T14:01:29.347 INFO:teuthology.orchestra.run.vm04.stdout:(32/119): ceph-prometheus-alerts-18.2.0-0.el9.n 85 kB/s | 15 kB 00:00
2026-03-10T14:01:29.517 INFO:teuthology.orchestra.run.vm04.stdout:(33/119): cephadm-18.2.0-0.el9.noarch.rpm 1.2 MB/s | 209 kB 00:00
2026-03-10T14:01:29.617 INFO:teuthology.orchestra.run.vm04.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 409 kB/s | 40 kB 00:00
2026-03-10T14:01:29.639 INFO:teuthology.orchestra.run.vm03.stdout:(53/119): librdkafka-1.6.1-102.el9.x86_64.rpm 814 kB/s | 662 kB 00:00
2026-03-10T14:01:29.789 INFO:teuthology.orchestra.run.vm03.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 377 kB/s | 233 kB 00:00
2026-03-10T14:01:29.790 INFO:teuthology.orchestra.run.vm04.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 418 kB/s | 72 kB 00:00
2026-03-10T14:01:29.981 INFO:teuthology.orchestra.run.vm03.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 858 kB/s | 292 kB 00:00
2026-03-10T14:01:30.002 INFO:teuthology.orchestra.run.vm04.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 3.7 MB/s | 794 kB 00:00
2026-03-10T14:01:30.017 INFO:teuthology.orchestra.run.vm03.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 186 kB/s | 42 kB 00:00
2026-03-10T14:01:30.049 INFO:teuthology.orchestra.run.vm04.stdout:(37/119): ceph-mgr-diskprediction-local-18.2.0- 4.6 MB/s | 7.4 MB 00:01
2026-03-10T14:01:30.067 INFO:teuthology.orchestra.run.vm04.stdout:(38/119): libquadmath-11.5.0-14.el9.x86_64.rpm 2.8 MB/s | 184 kB 00:00
2026-03-10T14:01:30.103 INFO:teuthology.orchestra.run.vm04.stdout:(39/119): mailcap-2.1.49-5.el9.noarch.rpm 626 kB/s | 33 kB 00:00
2026-03-10T14:01:30.141 INFO:teuthology.orchestra.run.vm04.stdout:(40/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 3.4 MB/s | 253 kB 00:00
2026-03-10T14:01:30.226 INFO:teuthology.orchestra.run.vm04.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 1.2 MB/s | 106 kB 00:00
2026-03-10T14:01:30.322 INFO:teuthology.orchestra.run.vm04.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 1.4 MB/s | 135 kB 00:00
2026-03-10T14:01:30.394 INFO:teuthology.orchestra.run.vm04.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 1.8 MB/s | 126 kB 00:00
2026-03-10T14:01:30.433 INFO:teuthology.orchestra.run.vm04.stdout:(44/119): python3-cryptography-36.0.1-5.el9.x86 3.8 MB/s | 1.2 MB 00:00
2026-03-10T14:01:30.519 INFO:teuthology.orchestra.run.vm04.stdout:(45/119): python3-urllib3-1.26.5-7.el9.noarch.r 1.7 MB/s | 218 kB 00:00
2026-03-10T14:01:30.627 INFO:teuthology.orchestra.run.vm04.stdout:(46/119): flexiblas-3.0.4-9.el9.x86_64.rpm 273 kB/s | 30 kB 00:00
2026-03-10T14:01:30.793 INFO:teuthology.orchestra.run.vm04.stdout:(47/119): boost-program-options-1.75.0-13.el9.x 289 kB/s | 104 kB 00:00
2026-03-10T14:01:30.868 INFO:teuthology.orchestra.run.vm04.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 200 kB/s | 15 kB 00:00
2026-03-10T14:01:30.991 INFO:teuthology.orchestra.run.vm04.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.3 MB/s | 160 kB 00:00
2026-03-10T14:01:31.053 INFO:teuthology.orchestra.run.vm04.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 740 kB/s | 45 kB 00:00
2026-03-10T14:01:31.169 INFO:teuthology.orchestra.run.vm03.stdout:(57/119): python3-babel-2.9.1-2.el9.noarch.rpm 5.2 MB/s | 6.0 MB 00:01
2026-03-10T14:01:31.281 INFO:teuthology.orchestra.run.vm04.stdout:(51/119): ceph-test-18.2.0-0.el9.x86_64.rpm 6.8 MB/s | 40 MB 00:05
2026-03-10T14:01:31.461 INFO:teuthology.orchestra.run.vm04.stdout:(52/119): librdkafka-1.6.1-102.el9.x86_64.rpm 1.6 MB/s | 662 kB 00:00
2026-03-10T14:01:31.543 INFO:teuthology.orchestra.run.vm03.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 654 kB/s | 244 kB 00:00
2026-03-10T14:01:31.632 INFO:teuthology.orchestra.run.vm03.stdout:(59/119): openblas-openmp-0.3.29-1.el9.x86_64.r 3.2 MB/s | 5.3 MB 00:01
2026-03-10T14:01:31.833 INFO:teuthology.orchestra.run.vm03.stdout:(60/119): python3-jinja2-2.11.3-8.el9.noarch.rp 859 kB/s | 249 kB 00:00
2026-03-10T14:01:31.985 INFO:teuthology.orchestra.run.vm03.stdout:(61/119): python3-jmespath-1.0.1-1.el9.noarch.r 135 kB/s | 48 kB 00:00
2026-03-10T14:01:31.985 INFO:teuthology.orchestra.run.vm04.stdout:(53/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 350 kB/s | 246 kB 00:00
2026-03-10T14:01:32.017 INFO:teuthology.orchestra.run.vm03.stdout:(62/119): python3-libstoragemgmt-1.10.1-1.el9.x 961 kB/s | 177 kB 00:00
2026-03-10T14:01:32.094 INFO:teuthology.orchestra.run.vm04.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 368 kB/s | 233 kB 00:00
2026-03-10T14:01:32.156 INFO:teuthology.orchestra.run.vm03.stdout:(63/119): python3-mako-1.1.4-6.el9.noarch.rpm 1.0 MB/s | 172 kB 00:00
2026-03-10T14:01:32.227 INFO:teuthology.orchestra.run.vm04.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 1.2 MB/s | 292 kB 00:00
2026-03-10T14:01:32.227 INFO:teuthology.orchestra.run.vm03.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 166 kB/s | 35 kB 00:00
2026-03-10T14:01:32.283 INFO:teuthology.orchestra.run.vm04.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 223 kB/s | 42 kB 00:00
2026-03-10T14:01:32.495 INFO:teuthology.orchestra.run.vm03.stdout:(65/119): ceph-test-18.2.0-0.el9.x86_64.rpm 4.5 MB/s | 40 MB 00:08
2026-03-10T14:01:32.817 INFO:teuthology.orchestra.run.vm03.stdout:(66/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 9.3 MB/s | 6.1 MB 00:00
2026-03-10T14:01:32.971 INFO:teuthology.orchestra.run.vm03.stdout:(67/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 595 kB/s | 442 kB 00:00
2026-03-10T14:01:32.982 INFO:teuthology.orchestra.run.vm04.stdout:(57/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 1.3 MB/s | 3.0 MB 00:02
2026-03-10T14:01:33.180 INFO:teuthology.orchestra.run.vm04.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 1.2 MB/s | 244 kB 00:00
2026-03-10T14:01:33.233 INFO:teuthology.orchestra.run.vm03.stdout:(68/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 213 kB/s | 157 kB 00:00
2026-03-10T14:01:33.275 INFO:teuthology.orchestra.run.vm04.stdout:(59/119): python3-jinja2-2.11.3-8.el9.noarch.rp 2.6 MB/s | 249 kB 00:00
2026-03-10T14:01:33.286 INFO:teuthology.orchestra.run.vm03.stdout:(69/119): python3-pyasn1-modules-0.4.8-7.el9.no 591 kB/s | 277 kB 00:00
2026-03-10T14:01:33.386 INFO:teuthology.orchestra.run.vm04.stdout:(60/119): python3-jmespath-1.0.1-1.el9.noarch.r 432 kB/s | 48 kB 00:00
2026-03-10T14:01:33.420 INFO:teuthology.orchestra.run.vm03.stdout:(70/119): python3-requests-oauthlib-1.3.0-12.el 119 kB/s | 54 kB 00:00
2026-03-10T14:01:33.460 INFO:teuthology.orchestra.run.vm03.stdout:(71/119): python3-toml-0.10.2-6.el9.noarch.rpm 241 kB/s | 42 kB 00:00
2026-03-10T14:01:33.497 INFO:teuthology.orchestra.run.vm04.stdout:(61/119): python3-libstoragemgmt-1.10.1-1.el9.x 1.6 MB/s | 177 kB 00:00
2026-03-10T14:01:33.589 INFO:teuthology.orchestra.run.vm04.stdout:(62/119): python3-mako-1.1.4-6.el9.noarch.rpm 1.8 MB/s | 172 kB 00:00
2026-03-10T14:01:33.591 INFO:teuthology.orchestra.run.vm03.stdout:(72/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 485 kB/s | 64 kB 00:00
2026-03-10T14:01:33.599 INFO:teuthology.orchestra.run.vm03.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 15 MB/s | 111 kB 00:00
2026-03-10T14:01:33.607 INFO:teuthology.orchestra.run.vm03.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 40 MB/s | 308 kB 00:00
2026-03-10T14:01:33.659 INFO:teuthology.orchestra.run.vm03.stdout:(75/119): socat-1.7.4.1-8.el9.x86_64.rpm 1.2 MB/s | 303 kB 00:00
2026-03-10T14:01:33.677 INFO:teuthology.orchestra.run.vm03.stdout:(76/119): libarrow-9.0.0-15.el9.x86_64.rpm 63 MB/s | 4.4 MB 00:00
2026-03-10T14:01:33.677 INFO:teuthology.orchestra.run.vm04.stdout:(63/119): python3-markupsafe-1.1.1-12.el9.x86_6 397 kB/s | 35 kB 00:00
2026-03-10T14:01:33.678 INFO:teuthology.orchestra.run.vm03.stdout:(77/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 1.2 MB/s | 25 kB 00:00
2026-03-10T14:01:33.680 INFO:teuthology.orchestra.run.vm03.stdout:(78/119): liboath-2.6.12-1.el9.x86_64.rpm 18 MB/s | 49 kB 00:00
2026-03-10T14:01:33.686 INFO:teuthology.orchestra.run.vm03.stdout:(79/119): libunwind-1.6.2-1.el9.x86_64.rpm 8.8 MB/s | 67 kB 00:00
2026-03-10T14:01:33.696 INFO:teuthology.orchestra.run.vm03.stdout:(80/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 51 MB/s | 838 kB 00:00
2026-03-10T14:01:33.699 INFO:teuthology.orchestra.run.vm03.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 10 MB/s | 29 kB 00:00
2026-03-10T14:01:33.702 INFO:teuthology.orchestra.run.vm03.stdout:(82/119): python3-backports-tarfile-1.2.0-1.el9 22 MB/s | 60 kB 00:00
2026-03-10T14:01:33.705 INFO:teuthology.orchestra.run.vm03.stdout:(83/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 18 MB/s | 43 kB 00:00
2026-03-10T14:01:33.708 INFO:teuthology.orchestra.run.vm03.stdout:(84/119): python3-cachetools-4.2.4-1.el9.noarch 12 MB/s | 32 kB 00:00
2026-03-10T14:01:33.711 INFO:teuthology.orchestra.run.vm03.stdout:(85/119): python3-certifi-2023.05.07-4.el9.noar 4.1 MB/s | 14 kB 00:00
2026-03-10T14:01:33.714 INFO:teuthology.orchestra.run.vm03.stdout:(86/119): python3-asyncssh-2.13.2-5.el9.noarch. 19 MB/s | 548 kB 00:00
2026-03-10T14:01:33.716 INFO:teuthology.orchestra.run.vm03.stdout:(87/119): python3-cheroot-10.0.1-4.el9.noarch.r 39 MB/s | 173 kB 00:00
2026-03-10T14:01:33.721 INFO:teuthology.orchestra.run.vm03.stdout:(88/119): python3-google-auth-2.45.0-1.el9.noar 50 MB/s | 254 kB 00:00
2026-03-10T14:01:33.724 INFO:teuthology.orchestra.run.vm03.stdout:(89/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 4.3 MB/s | 11 kB 00:00
2026-03-10T14:01:33.726 INFO:teuthology.orchestra.run.vm03.stdout:(90/119): python3-jaraco-classes-3.2.1-5.el9.no 7.8 MB/s | 18 kB 00:00
2026-03-10T14:01:33.728 INFO:teuthology.orchestra.run.vm03.stdout:(91/119): python3-cherrypy-18.6.1-2.el9.noarch. 27 MB/s | 358 kB 00:00
2026-03-10T14:01:33.729 INFO:teuthology.orchestra.run.vm03.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 9.7 MB/s | 23 kB 00:00
2026-03-10T14:01:33.731 INFO:teuthology.orchestra.run.vm03.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 6.8 MB/s | 20 kB 00:00
2026-03-10T14:01:33.731 INFO:teuthology.orchestra.run.vm03.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 7.8 MB/s | 19 kB 00:00
2026-03-10T14:01:33.733 INFO:teuthology.orchestra.run.vm03.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 11 MB/s | 26 kB 00:00
2026-03-10T14:01:33.734 INFO:teuthology.orchestra.run.vm03.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 3.8 MB/s | 9.0 kB 00:00
2026-03-10T14:01:33.736 INFO:teuthology.orchestra.run.vm03.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 16 MB/s | 41 kB 00:00
2026-03-10T14:01:33.739 INFO:teuthology.orchestra.run.vm03.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 18 MB/s | 46 kB 00:00
2026-03-10T14:01:33.742 INFO:teuthology.orchestra.run.vm03.stdout:(99/119): python3-more-itertools-8.12.0-2.el9.n 24 MB/s | 79 kB 00:00
2026-03-10T14:01:33.745 INFO:teuthology.orchestra.run.vm03.stdout:(100/119): python3-natsort-7.1.1-5.el9.noarch.r 20 MB/s | 58 kB 00:00
2026-03-10T14:01:33.752 INFO:teuthology.orchestra.run.vm03.stdout:(101/119): python3-pecan-1.4.2-3.el9.noarch.rpm 42 MB/s | 272 kB 00:00
2026-03-10T14:01:33.756 INFO:teuthology.orchestra.run.vm03.stdout:(102/119): python3-portend-3.1.0-2.el9.noarch.r 4.6 MB/s | 16 kB 00:00
2026-03-10T14:01:33.760 INFO:teuthology.orchestra.run.vm03.stdout:(103/119): python3-kubernetes-26.1.0-3.el9.noar 40 MB/s | 1.0 MB 00:00
2026-03-10T14:01:33.761 INFO:teuthology.orchestra.run.vm03.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 20 MB/s | 90 kB 00:00
2026-03-10T14:01:33.762 INFO:teuthology.orchestra.run.vm03.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 13 MB/s | 31 kB 00:00
2026-03-10T14:01:33.765 INFO:teuthology.orchestra.run.vm03.stdout:(106/119): python3-routes-2.5.1-5.el9.noarch.rp 43 MB/s | 188 kB 00:00
2026-03-10T14:01:33.766 INFO:teuthology.orchestra.run.vm03.stdout:(107/119): python3-rsa-4.9-2.el9.noarch.rpm 15 MB/s | 59 kB 00:00
2026-03-10T14:01:33.769 INFO:teuthology.orchestra.run.vm03.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 11 MB/s | 36 kB 00:00
2026-03-10T14:01:33.770 INFO:teuthology.orchestra.run.vm03.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 20 MB/s | 86 kB 00:00
2026-03-10T14:01:33.775 INFO:teuthology.orchestra.run.vm03.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 39 MB/s | 230 kB 00:00
2026-03-10T14:01:33.776 INFO:teuthology.orchestra.run.vm03.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 17 MB/s | 90 kB 00:00
2026-03-10T14:01:33.782 INFO:teuthology.orchestra.run.vm03.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 4.2 MB/s | 22 kB 00:00
2026-03-10T14:01:33.784 INFO:teuthology.orchestra.run.vm03.stdout:(113/119): python3-werkzeug-2.0.3-3.el9.1.noarc 49 MB/s | 427 kB 00:00
2026-03-10T14:01:33.785 INFO:teuthology.orchestra.run.vm03.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 6.1 MB/s | 20 kB 00:00
2026-03-10T14:01:33.788 INFO:teuthology.orchestra.run.vm03.stdout:(115/119): re2-20211101-20.el9.x86_64.rpm 42 MB/s | 191 kB 00:00
2026-03-10T14:01:33.823 INFO:teuthology.orchestra.run.vm03.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 42 MB/s | 1.6 MB 00:00
2026-03-10T14:01:34.209 INFO:teuthology.orchestra.run.vm03.stdout:(117/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 20 MB/s | 19 MB 00:00
2026-03-10T14:01:35.021 INFO:teuthology.orchestra.run.vm03.stdout:(118/119): librbd1-18.2.0-0.el9.x86_64.rpm 2.5 MB/s | 3.0 MB 00:01
2026-03-10T14:01:35.559 INFO:teuthology.orchestra.run.vm04.stdout:(64/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 3.3 MB/s | 6.1 MB 00:01
2026-03-10T14:01:35.713 INFO:teuthology.orchestra.run.vm04.stdout:(65/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 2.8 MB/s | 442 kB 00:00
2026-03-10T14:01:35.804 INFO:teuthology.orchestra.run.vm04.stdout:(66/119): openblas-openmp-0.3.29-1.el9.x86_64.r 1.5 MB/s | 5.3 MB 00:03
2026-03-10T14:01:35.805 INFO:teuthology.orchestra.run.vm04.stdout:(67/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 1.7 MB/s | 157 kB 00:00
2026-03-10T14:01:35.823 INFO:teuthology.orchestra.run.vm04.stdout:(68/119): python3-babel-2.9.1-2.el9.noarch.rpm 1.7 MB/s | 6.0 MB 00:03
2026-03-10T14:01:35.917 INFO:teuthology.orchestra.run.vm04.stdout:(69/119): python3-requests-oauthlib-1.3.0-12.el 484 kB/s | 54 kB 00:00
2026-03-10T14:01:35.950 INFO:teuthology.orchestra.run.vm04.stdout:(70/119): python3-pyasn1-modules-0.4.8-7.el9.no 1.9 MB/s | 277 kB 00:00
2026-03-10T14:01:35.985 INFO:teuthology.orchestra.run.vm04.stdout:(71/119): python3-toml-0.10.2-6.el9.noarch.rpm 610 kB/s | 42 kB 00:00
2026-03-10T14:01:36.052 INFO:teuthology.orchestra.run.vm04.stdout:(72/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 963 kB/s | 64 kB 00:00
2026-03-10T14:01:36.059 INFO:teuthology.orchestra.run.vm04.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 15 MB/s | 111 kB 00:00
2026-03-10T14:01:36.066 INFO:teuthology.orchestra.run.vm04.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 45 MB/s | 308 kB 00:00
2026-03-10T14:01:36.104 INFO:teuthology.orchestra.run.vm04.stdout:(75/119): socat-1.7.4.1-8.el9.x86_64.rpm 1.9 MB/s | 303 kB 00:00
2026-03-10T14:01:36.113 INFO:teuthology.orchestra.run.vm04.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 2.8 MB/s | 25 kB 00:00
2026-03-10T14:01:36.129 INFO:teuthology.orchestra.run.vm04.stdout:(77/119): libarrow-9.0.0-15.el9.x86_64.rpm 71 MB/s | 4.4 MB 00:00
2026-03-10T14:01:36.131 INFO:teuthology.orchestra.run.vm04.stdout:(78/119): liboath-2.6.12-1.el9.x86_64.rpm 2.8 MB/s | 49 kB 00:00
2026-03-10T14:01:36.132 INFO:teuthology.orchestra.run.vm04.stdout:(79/119): libunwind-1.6.2-1.el9.x86_64.rpm 26 MB/s | 67 kB 00:00
2026-03-10T14:01:36.147 INFO:teuthology.orchestra.run.vm04.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 37 MB/s | 548 kB 00:00
2026-03-10T14:01:36.150 INFO:teuthology.orchestra.run.vm04.stdout:(81/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 42 MB/s | 838 kB 00:00
2026-03-10T14:01:36.151 INFO:teuthology.orchestra.run.vm04.stdout:(82/119): python3-autocommand-2.2.2-8.el9.noarc 7.0 MB/s | 29 kB 00:00
2026-03-10T14:01:36.153 INFO:teuthology.orchestra.run.vm04.stdout:(83/119): python3-backports-tarfile-1.2.0-1.el9 24 MB/s | 60 kB 00:00
2026-03-10T14:01:36.154 INFO:teuthology.orchestra.run.vm04.stdout:(84/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 18 MB/s | 43 kB 00:00
2026-03-10T14:01:36.156 INFO:teuthology.orchestra.run.vm04.stdout:(85/119): python3-cachetools-4.2.4-1.el9.noarch 14 MB/s | 32 kB 00:00
2026-03-10T14:01:36.156 INFO:teuthology.orchestra.run.vm04.stdout:(86/119): python3-certifi-2023.05.07-4.el9.noar 6.3 MB/s | 14 kB 00:00
2026-03-10T14:01:36.160 INFO:teuthology.orchestra.run.vm04.stdout:(87/119): python3-cheroot-10.0.1-4.el9.noarch.r 45 MB/s | 173 kB 00:00
2026-03-10T14:01:36.167 INFO:teuthology.orchestra.run.vm04.stdout:(88/119): python3-cherrypy-18.6.1-2.el9.noarch. 33 MB/s | 358 kB 00:00
2026-03-10T14:01:36.169 INFO:teuthology.orchestra.run.vm04.stdout:(89/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 5.3 MB/s | 11 kB 00:00
2026-03-10T14:01:36.171 INFO:teuthology.orchestra.run.vm04.stdout:(90/119): python3-google-auth-2.45.0-1.el9.noar 23 MB/s | 254 kB 00:00
2026-03-10T14:01:36.171 INFO:teuthology.orchestra.run.vm04.stdout:(91/119): python3-jaraco-classes-3.2.1-5.el9.no 8.5 MB/s | 18 kB 00:00
2026-03-10T14:01:36.173 INFO:teuthology.orchestra.run.vm04.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 10 MB/s | 23 kB 00:00
2026-03-10T14:01:36.174 INFO:teuthology.orchestra.run.vm04.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 9.7 MB/s | 20 kB 00:00
2026-03-10T14:01:36.175 INFO:teuthology.orchestra.run.vm04.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9.
9.0 MB/s | 19 kB 00:00 2026-03-10T14:01:36.176 INFO:teuthology.orchestra.run.vm04.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 12 MB/s | 26 kB 00:00 2026-03-10T14:01:36.178 INFO:teuthology.orchestra.run.vm04.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 3.3 MB/s | 9.0 kB 00:00 2026-03-10T14:01:36.179 INFO:teuthology.orchestra.run.vm04.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 14 MB/s | 41 kB 00:00 2026-03-10T14:01:36.217 INFO:teuthology.orchestra.run.vm04.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 1.2 MB/s | 46 kB 00:00 2026-03-10T14:01:36.260 INFO:teuthology.orchestra.run.vm04.stdout:(99/119): python3-more-itertools-8.12.0-2.el9.n 1.8 MB/s | 79 kB 00:00 2026-03-10T14:01:36.321 INFO:teuthology.orchestra.run.vm04.stdout:(100/119): python3-natsort-7.1.1-5.el9.noarch.r 954 kB/s | 58 kB 00:00 2026-03-10T14:01:36.347 INFO:teuthology.orchestra.run.vm04.stdout:(101/119): python3-kubernetes-26.1.0-3.el9.noar 6.1 MB/s | 1.0 MB 00:00 2026-03-10T14:01:36.359 INFO:teuthology.orchestra.run.vm04.stdout:(102/119): python3-pecan-1.4.2-3.el9.noarch.rpm 7.0 MB/s | 272 kB 00:00 2026-03-10T14:01:36.406 INFO:teuthology.orchestra.run.vm04.stdout:(103/119): python3-portend-3.1.0-2.el9.noarch.r 278 kB/s | 16 kB 00:00 2026-03-10T14:01:36.407 INFO:teuthology.orchestra.run.vm04.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 1.8 MB/s | 90 kB 00:00 2026-03-10T14:01:36.419 INFO:teuthology.orchestra.run.vm04.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 2.5 MB/s | 31 kB 00:00 2026-03-10T14:01:36.432 INFO:teuthology.orchestra.run.vm04.stdout:(106/119): python3-routes-2.5.1-5.el9.noarch.rp 7.7 MB/s | 188 kB 00:00 2026-03-10T14:01:36.439 INFO:teuthology.orchestra.run.vm04.stdout:(107/119): python3-rsa-4.9-2.el9.noarch.rpm 2.9 MB/s | 59 kB 00:00 2026-03-10T14:01:36.440 INFO:teuthology.orchestra.run.vm04.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 4.3 MB/s | 36 kB 00:00 2026-03-10T14:01:36.456 
INFO:teuthology.orchestra.run.vm04.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 5.2 MB/s | 86 kB 00:00 2026-03-10T14:01:36.459 INFO:teuthology.orchestra.run.vm04.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 12 MB/s | 230 kB 00:00 2026-03-10T14:01:36.460 INFO:teuthology.orchestra.run.vm04.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 22 MB/s | 90 kB 00:00 2026-03-10T14:01:36.501 INFO:teuthology.orchestra.run.vm04.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 537 kB/s | 22 kB 00:00 2026-03-10T14:01:36.509 INFO:teuthology.orchestra.run.vm04.stdout:(113/119): python3-werkzeug-2.0.3-3.el9.1.noarc 8.3 MB/s | 427 kB 00:00 2026-03-10T14:01:36.512 INFO:teuthology.orchestra.run.vm04.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 1.9 MB/s | 20 kB 00:00 2026-03-10T14:01:36.514 INFO:teuthology.orchestra.run.vm04.stdout:(115/119): re2-20211101-20.el9.x86_64.rpm 45 MB/s | 191 kB 00:00 2026-03-10T14:01:36.538 INFO:teuthology.orchestra.run.vm04.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 61 MB/s | 1.6 MB 00:00 2026-03-10T14:01:36.561 INFO:teuthology.orchestra.run.vm03.stdout:(119/119): librados2-18.2.0-0.el9.x86_64.rpm 1.2 MB/s | 3.3 MB 00:02 2026-03-10T14:01:36.564 INFO:teuthology.orchestra.run.vm03.stdout:-------------------------------------------------------------------------------- 2026-03-10T14:01:36.564 INFO:teuthology.orchestra.run.vm03.stdout:Total 8.5 MB/s | 182 MB 00:21 2026-03-10T14:01:37.179 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-10T14:01:37.232 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 
2026-03-10T14:01:37.232 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-10T14:01:37.612 INFO:teuthology.orchestra.run.vm04.stdout:(117/119): librados2-18.2.0-0.el9.x86_64.rpm 3.0 MB/s | 3.3 MB 00:01
2026-03-10T14:01:37.831 INFO:teuthology.orchestra.run.vm04.stdout:(118/119): librbd1-18.2.0-0.el9.x86_64.rpm 2.3 MB/s | 3.0 MB 00:01
2026-03-10T14:01:38.026 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-10T14:01:38.027 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-10T14:01:38.902 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-10T14:01:38.913 INFO:teuthology.orchestra.run.vm03.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-10T14:01:38.929 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-10T14:01:39.131 INFO:teuthology.orchestra.run.vm03.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-10T14:01:39.134 INFO:teuthology.orchestra.run.vm03.stdout: Upgrading : librados2-2:18.2.0-0.el9.x86_64 4/121
2026-03-10T14:01:39.182 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 4/121
2026-03-10T14:01:39.184 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephfs2-2:18.2.0-0.el9.x86_64 5/121
2026-03-10T14:01:39.218 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 5/121
2026-03-10T14:01:39.229 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rados-2:18.2.0-0.el9.x86_64 6/121
2026-03-10T14:01:39.235 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-10T14:01:39.239 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-10T14:01:39.249 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-10T14:01:39.250 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephsqlite-2:18.2.0-0.el9.x86_64 10/121
2026-03-10T14:01:39.290 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 10/121
2026-03-10T14:01:39.291 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libradosstriper1-2:18.2.0-0.el9.x86_64 11/121
2026-03-10T14:01:39.351 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 11/121
2026-03-10T14:01:39.359 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-10T14:01:39.399 INFO:teuthology.orchestra.run.vm03.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-10T14:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-10T14:01:39.414 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-10T14:01:39.445 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-10T14:01:39.466 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-10T14:01:39.471 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-10T14:01:39.479 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-10T14:01:39.481 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-10T14:01:39.487 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-10T14:01:39.498 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 22/121
2026-03-10T14:01:39.514 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cephfs-2:18.2.0-0.el9.x86_64 23/121
2026-03-10T14:01:39.548 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-10T14:01:39.617 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-10T14:01:39.640 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-10T14:01:39.651 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-10T14:01:39.663 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-10T14:01:39.667 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librados-devel-2:18.2.0-0.el9.x86_64 29/121
2026-03-10T14:01:39.707 INFO:teuthology.orchestra.run.vm03.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121
2026-03-10T14:01:39.715 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-10T14:01:39.735 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-10T14:01:39.764 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-10T14:01:39.774 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-10T14:01:39.785 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-10T14:01:39.802 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-10T14:01:39.815 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-10T14:01:39.829 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-10T14:01:39.902 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-10T14:01:39.913 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-10T14:01:39.927 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-10T14:01:39.983 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-10T14:01:40.390 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-10T14:01:40.409 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-10T14:01:40.416 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-10T14:01:40.424 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-10T14:01:40.430 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-10T14:01:40.439 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-10T14:01:40.443 INFO:teuthology.orchestra.run.vm03.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-10T14:01:40.447 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-10T14:01:40.458 INFO:teuthology.orchestra.run.vm03.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-10T14:01:40.467 INFO:teuthology.orchestra.run.vm03.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-10T14:01:40.472 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-10T14:01:40.481 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-10T14:01:40.488 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-10T14:01:40.497 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-10T14:01:40.503 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-10T14:01:40.547 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-10T14:01:40.866 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-10T14:01:40.904 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-10T14:01:40.912 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T14:01:40.981 INFO:teuthology.orchestra.run.vm03.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-10T14:01:40.984 INFO:teuthology.orchestra.run.vm03.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-10T14:01:41.012 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-10T14:01:41.451 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-10T14:01:41.549 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T14:01:41.696 INFO:teuthology.orchestra.run.vm04.stdout:(119/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 3.3 MB/s | 19 MB 00:05
2026-03-10T14:01:41.700 INFO:teuthology.orchestra.run.vm04.stdout:--------------------------------------------------------------------------------
2026-03-10T14:01:41.700 INFO:teuthology.orchestra.run.vm04.stdout:Total 6.8 MB/s | 182 MB 00:26
2026-03-10T14:01:42.253 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T14:01:42.299 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T14:01:42.300 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T14:01:42.438 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-10T14:01:42.548 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121
2026-03-10T14:01:42.570 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121
2026-03-10T14:01:42.577 INFO:teuthology.orchestra.run.vm03.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121
2026-03-10T14:01:42.756 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121
2026-03-10T14:01:42.758 INFO:teuthology.orchestra.run.vm03.stdout: Upgrading : librbd1-2:18.2.0-0.el9.x86_64 72/121
2026-03-10T14:01:42.798 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 72/121
2026-03-10T14:01:42.802 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rbd-2:18.2.0-0.el9.x86_64 73/121
2026-03-10T14:01:42.812 INFO:teuthology.orchestra.run.vm03.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121
2026-03-10T14:01:43.042 INFO:teuthology.orchestra.run.vm03.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121
2026-03-10T14:01:43.045 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librgw2-2:18.2.0-0.el9.x86_64 76/121
2026-03-10T14:01:43.059 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T14:01:43.059 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-10T14:01:43.063 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 76/121
2026-03-10T14:01:43.072 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rgw-2:18.2.0-0.el9.x86_64 77/121
2026-03-10T14:01:43.091 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121
2026-03-10T14:01:43.114 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121
2026-03-10T14:01:43.217 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121
2026-03-10T14:01:43.232 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121
2026-03-10T14:01:43.263 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121
2026-03-10T14:01:43.304 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121
2026-03-10T14:01:43.375 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121
2026-03-10T14:01:43.389 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121
2026-03-10T14:01:43.392 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121
2026-03-10T14:01:43.400 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121
2026-03-10T14:01:43.405 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121
2026-03-10T14:01:43.410 INFO:teuthology.orchestra.run.vm03.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121
2026-03-10T14:01:43.413 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121
2026-03-10T14:01:43.435 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T14:01:43.435 INFO:teuthology.orchestra.run.vm03.stdout:Creating group 'libstoragemgmt' with GID 994.
2026-03-10T14:01:43.435 INFO:teuthology.orchestra.run.vm03.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994.
2026-03-10T14:01:43.435 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:43.470 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T14:01:43.514 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T14:01:43.514 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-10T14:01:43.514 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:43.537 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121
2026-03-10T14:01:43.595 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 93/121
2026-03-10T14:01:43.599 INFO:teuthology.orchestra.run.vm03.stdout: Installing : cephadm-2:18.2.0-0.el9.noarch 93/121
2026-03-10T14:01:43.605 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 94/121
2026-03-10T14:01:43.635 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 95/121
2026-03-10T14:01:43.639 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ceph-common-2:18.2.0-0.el9.x86_64 96/121
2026-03-10T14:01:43.885 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-10T14:01:43.894 INFO:teuthology.orchestra.run.vm04.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-10T14:01:43.907 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-10T14:01:44.075 INFO:teuthology.orchestra.run.vm04.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-10T14:01:44.077 INFO:teuthology.orchestra.run.vm04.stdout: Upgrading : librados2-2:18.2.0-0.el9.x86_64 4/121
2026-03-10T14:01:44.122 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 4/121
2026-03-10T14:01:44.125 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs2-2:18.2.0-0.el9.x86_64 5/121
2026-03-10T14:01:44.155 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 5/121
2026-03-10T14:01:44.166 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rados-2:18.2.0-0.el9.x86_64 6/121
2026-03-10T14:01:44.171 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-10T14:01:44.173 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-10T14:01:44.187 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-10T14:01:44.267 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephsqlite-2:18.2.0-0.el9.x86_64 10/121
2026-03-10T14:01:44.304 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 10/121
2026-03-10T14:01:44.306 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libradosstriper1-2:18.2.0-0.el9.x86_64 11/121
2026-03-10T14:01:44.355 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 11/121
2026-03-10T14:01:44.362 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-10T14:01:44.387 INFO:teuthology.orchestra.run.vm04.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-10T14:01:44.397 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-10T14:01:44.400 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-10T14:01:44.427 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-10T14:01:44.445 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-10T14:01:44.449 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-10T14:01:44.457 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-10T14:01:44.460 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-10T14:01:44.465 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-10T14:01:44.475 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 22/121
2026-03-10T14:01:44.490 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cephfs-2:18.2.0-0.el9.x86_64 23/121
2026-03-10T14:01:44.519 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-10T14:01:44.580 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-10T14:01:44.597 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-10T14:01:44.606 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-10T14:01:44.616 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-10T14:01:44.623 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librados-devel-2:18.2.0-0.el9.x86_64 29/121
2026-03-10T14:01:44.662 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-10T14:01:44.663 INFO:teuthology.orchestra.run.vm04.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121
2026-03-10T14:01:44.669 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-10T14:01:44.759 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-10T14:01:44.773 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-10T14:01:44.799 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-10T14:01:44.806 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-10T14:01:44.813 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-10T14:01:44.828 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-10T14:01:44.840 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-10T14:01:44.854 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-10T14:01:44.928 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-10T14:01:44.938 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-10T14:01:44.948 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-10T14:01:45.004 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-10T14:01:45.085 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-10T14:01:45.093 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-base-2:18.2.0-0.el9.x86_64 98/121
2026-03-10T14:01:45.141 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 98/121
2026-03-10T14:01:45.141 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-10T14:01:45.141 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-10T14:01:45.141 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:45.147 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-selinux-2:18.2.0-0.el9.x86_64 99/121
2026-03-10T14:01:45.411 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-10T14:01:45.429 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-10T14:01:45.436 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-10T14:01:45.445 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-10T14:01:45.451 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-10T14:01:45.463 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-10T14:01:45.467 INFO:teuthology.orchestra.run.vm04.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-10T14:01:45.470 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-10T14:01:45.481 INFO:teuthology.orchestra.run.vm04.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-10T14:01:45.490 INFO:teuthology.orchestra.run.vm04.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-10T14:01:45.502 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-10T14:01:45.516 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-10T14:01:45.522 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-10T14:01:45.531 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-10T14:01:45.536 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-10T14:01:45.578 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-10T14:01:45.860 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-10T14:01:45.892 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-10T14:01:45.899 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T14:01:45.965 INFO:teuthology.orchestra.run.vm04.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-10T14:01:45.969 INFO:teuthology.orchestra.run.vm04.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-10T14:01:45.996 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-10T14:01:46.411 INFO:teuthology.orchestra.run.vm04.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-10T14:01:46.511 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T14:01:47.326 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-10T14:01:47.356 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121
2026-03-10T14:01:47.363 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121
2026-03-10T14:01:47.368 INFO:teuthology.orchestra.run.vm04.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121
2026-03-10T14:01:47.523 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121
2026-03-10T14:01:47.526 INFO:teuthology.orchestra.run.vm04.stdout: Upgrading : librbd1-2:18.2.0-0.el9.x86_64 72/121
2026-03-10T14:01:47.561 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 72/121
2026-03-10T14:01:47.566 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rbd-2:18.2.0-0.el9.x86_64 73/121
2026-03-10T14:01:47.575 INFO:teuthology.orchestra.run.vm04.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121
2026-03-10T14:01:47.788 INFO:teuthology.orchestra.run.vm04.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121
2026-03-10T14:01:47.791 INFO:teuthology.orchestra.run.vm04.stdout: Installing : librgw2-2:18.2.0-0.el9.x86_64 76/121
2026-03-10T14:01:47.811 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 76/121
2026-03-10T14:01:47.822 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-rgw-2:18.2.0-0.el9.x86_64 77/121
2026-03-10T14:01:47.846 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121
2026-03-10T14:01:47.873 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121
2026-03-10T14:01:47.973 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121
2026-03-10T14:01:47.989 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121
2026-03-10T14:01:48.021 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121
2026-03-10T14:01:48.059 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121
2026-03-10T14:01:48.122 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121
2026-03-10T14:01:48.137 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121
2026-03-10T14:01:48.141 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121
2026-03-10T14:01:48.148 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121
2026-03-10T14:01:48.154 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121
2026-03-10T14:01:48.159 INFO:teuthology.orchestra.run.vm04.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121
2026-03-10T14:01:48.162 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121
2026-03-10T14:01:48.185 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T14:01:48.185 INFO:teuthology.orchestra.run.vm04.stdout:Creating group 'libstoragemgmt' with GID 994.
2026-03-10T14:01:48.185 INFO:teuthology.orchestra.run.vm04.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994.
2026-03-10T14:01:48.185 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:01:48.200 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T14:01:48.234 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T14:01:48.234 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-10T14:01:48.234 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:01:48.253 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121
2026-03-10T14:01:48.313 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 93/121
2026-03-10T14:01:48.316 INFO:teuthology.orchestra.run.vm04.stdout: Installing : cephadm-2:18.2.0-0.el9.noarch 93/121
2026-03-10T14:01:48.321 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 94/121
2026-03-10T14:01:48.350 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 95/121
2026-03-10T14:01:48.355 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-ceph-common-2:18.2.0-0.el9.x86_64 96/121
2026-03-10T14:01:49.371 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-10T14:01:49.414 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-10T14:01:49.745 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121
2026-03-10T14:01:49.752 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-base-2:18.2.0-0.el9.x86_64 98/121
2026-03-10T14:01:49.798 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 98/121
2026-03-10T14:01:49.798 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-10T14:01:49.798 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-10T14:01:49.798 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:01:49.803 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-selinux-2:18.2.0-0.el9.x86_64 99/121
2026-03-10T14:01:52.270 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 99/121
2026-03-10T14:01:52.271 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /sys
2026-03-10T14:01:52.271 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /proc
2026-03-10T14:01:52.271 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /mnt
2026-03-10T14:01:52.271 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /var/tmp
2026-03-10T14:01:52.271 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /home
2026-03-10T14:01:52.271 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /root
2026-03-10T14:01:52.271 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /tmp
2026-03-10T14:01:52.271 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:52.313 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121
2026-03-10T14:01:52.452 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121
2026-03-10T14:01:52.458 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121
2026-03-10T14:01:53.038 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121
2026-03-10T14:01:53.040 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121
2026-03-10T14:01:53.107 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121
2026-03-10T14:01:53.185 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 103/121
2026-03-10T14:01:53.188 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-2:18.2.0-0.el9.x86_64 104/121
2026-03-10T14:01:53.210 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 104/121
2026-03-10T14:01:53.210 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:53.210 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T14:01:53.210 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T14:01:53.210 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T14:01:53.210 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:53.223 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121
2026-03-10T14:01:53.333 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121
2026-03-10T14:01:53.336 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mds-2:18.2.0-0.el9.x86_64 106/121
2026-03-10T14:01:53.362 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 106/121
2026-03-10T14:01:53.362 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:53.362 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T14:01:53.362 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T14:01:53.362 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T14:01:53.362 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:53.595 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mon-2:18.2.0-0.el9.x86_64 107/121
2026-03-10T14:01:53.620 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 107/121
2026-03-10T14:01:53.621 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:53.621 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T14:01:53.621 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T14:01:53.621 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T14:01:53.621 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:54.491 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-osd-2:18.2.0-0.el9.x86_64 108/121
2026-03-10T14:01:54.521 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 108/121
2026-03-10T14:01:54.521 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:54.521 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T14:01:54.521 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-10T14:01:54.521 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-10T14:01:54.521 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:54.923 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-2:18.2.0-0.el9.x86_64 109/121
2026-03-10T14:01:54.928 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121
2026-03-10T14:01:54.954 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121
2026-03-10T14:01:54.954 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:54.954 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-10T14:01:54.954 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-10T14:01:54.954 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-10T14:01:54.954 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:54.969 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121
2026-03-10T14:01:54.998 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121
2026-03-10T14:01:54.998 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:54.998 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T14:01:54.998 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:55.159 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-mirror-2:18.2.0-0.el9.x86_64 112/121
2026-03-10T14:01:55.186 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 112/121
2026-03-10T14:01:55.186 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:55.186 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T14:01:55.186 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-10T14:01:55.186 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-10T14:01:55.186 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 99/121
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /sys
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /proc
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /mnt
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /var/tmp
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /home
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /root
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /tmp
2026-03-10T14:01:56.561 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:01:56.593 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121
2026-03-10T14:01:56.725 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 100/121
2026-03-10T14:01:56.731 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121
2026-03-10T14:01:57.288 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 101/121
2026-03-10T14:01:57.312 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121
2026-03-10T14:01:57.323 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-test-2:18.2.0-0.el9.x86_64 113/121
2026-03-10T14:01:57.336 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-fuse-2:18.2.0-0.el9.x86_64 114/121
2026-03-10T14:01:57.341 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-nbd-2:18.2.0-0.el9.x86_64 115/121
2026-03-10T14:01:57.376 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 102/121
2026-03-10T14:01:57.383 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephfs-devel-2:18.2.0-0.el9.x86_64 116/121
2026-03-10T14:01:57.390 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-fuse-2:18.2.0-0.el9.x86_64 117/121
2026-03-10T14:01:57.401 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121
2026-03-10T14:01:57.406 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121
2026-03-10T14:01:57.406 INFO:teuthology.orchestra.run.vm03.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-10T14:01:57.423 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-10T14:01:57.423 INFO:teuthology.orchestra.run.vm03.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T14:01:57.456 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 103/121
2026-03-10T14:01:57.459 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-2:18.2.0-0.el9.x86_64 104/121
2026-03-10T14:01:57.484 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 104/121
2026-03-10T14:01:57.484 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:57.484 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T14:01:57.484 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T14:01:57.484 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T14:01:57.484 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:01:57.497 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121
2026-03-10T14:01:57.611 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 105/121
2026-03-10T14:01:57.615 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mds-2:18.2.0-0.el9.x86_64 106/121
2026-03-10T14:01:57.642 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 106/121
2026-03-10T14:01:57.642 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:57.642 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T14:01:57.642 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T14:01:57.642 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T14:01:57.642 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:01:57.894 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-mon-2:18.2.0-0.el9.x86_64 107/121
2026-03-10T14:01:57.925 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 107/121
2026-03-10T14:01:57.925 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:01:57.925 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T14:01:57.925 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T14:01:57.925 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T14:01:57.926 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 2/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 3/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 4/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 5/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 6/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 7/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 8/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 9/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 10/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 11/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 12/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 13/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 14/121
2026-03-10T14:01:58.714 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 15/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 16/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 17/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 18/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 19/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 20/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 21/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 22/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 23/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 24/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 25/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 26/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 27/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 28/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 29/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 30/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 31/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 32/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 33/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 34/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 35/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121
2026-03-10T14:01:58.715 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121
2026-03-10T14:01:58.716 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121
2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121 2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121 2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121 2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121 2026-03-10T14:01:58.717 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121 
2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 118/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121 2026-03-10T14:01:58.718 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 120/121 2026-03-10T14:01:58.828 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-osd-2:18.2.0-0.el9.x86_64 108/121 2026-03-10T14:01:58.856 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 108/121 2026-03-10T14:01:58.856 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T14:01:58.856 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-10T14:01:58.857 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T14:01:58.857 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 
2026-03-10T14:01:58.857 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout:Upgraded: 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: librados2-2:18.2.0-0.el9.x86_64 librbd1-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout:Installed: 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: 
ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-10T14:01:58.861 
INFO:teuthology.orchestra.run.vm03.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: librgw2-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-10T14:01:58.861 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T14:01:58.862 
INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 
2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T14:01:58.862 
INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T14:01:58.862 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T14:01:58.863 
INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: re2-1:20211101-20.el9.x86_64 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:01:58.863 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:01:58.954 DEBUG:teuthology.parallel:result is None 2026-03-10T14:01:59.245 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-2:18.2.0-0.el9.x86_64 109/121 2026-03-10T14:01:59.249 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121 2026-03-10T14:01:59.274 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 110/121 2026-03-10T14:01:59.274 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T14:01:59.274 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T14:01:59.274 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T14:01:59.274 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 
2026-03-10T14:01:59.274 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:01:59.285 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-10T14:01:59.308 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-10T14:01:59.308 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T14:01:59.308 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-10T14:01:59.308 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:01:59.457 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-10T14:01:59.482 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-10T14:01:59.482 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T14:01:59.482 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T14:01:59.482 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-10T14:01:59.482 INFO:teuthology.orchestra.run.vm04.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-10T14:01:59.482 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:02:01.595 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-test-2:18.2.0-0.el9.x86_64 113/121 2026-03-10T14:02:01.607 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-fuse-2:18.2.0-0.el9.x86_64 114/121 2026-03-10T14:02:01.612 INFO:teuthology.orchestra.run.vm04.stdout: Installing : rbd-nbd-2:18.2.0-0.el9.x86_64 115/121 2026-03-10T14:02:01.653 INFO:teuthology.orchestra.run.vm04.stdout: Installing : libcephfs-devel-2:18.2.0-0.el9.x86_64 116/121 2026-03-10T14:02:01.659 INFO:teuthology.orchestra.run.vm04.stdout: Installing : ceph-fuse-2:18.2.0-0.el9.x86_64 117/121 2026-03-10T14:02:01.668 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121 2026-03-10T14:02:01.673 INFO:teuthology.orchestra.run.vm04.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121 2026-03-10T14:02:01.673 INFO:teuthology.orchestra.run.vm04.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T14:02:01.690 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T14:02:01.690 INFO:teuthology.orchestra.run.vm04.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T14:02:02.892 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 2/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 3/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 
5/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 6/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 7/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 8/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 9/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 12/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 13/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 14/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 15/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 16/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 17/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 18/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 19/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 20/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 21/121 2026-03-10T14:02:02.893 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 22/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 23/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 24/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 25/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 26/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 27/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 28/121 2026-03-10T14:02:02.893 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 29/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 30/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 31/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 32/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 33/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 34/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 35/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121 2026-03-10T14:02:02.896 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121 2026-03-10T14:02:02.896 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
libstoragemgmt-1.10.1-1.el9.x86_64 54/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121 
2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121 2026-03-10T14:02:02.897 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
python3-certifi-2023.05.07-4.el9.noarch 87/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: 
Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 118/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
librados2-2:16.2.4-5.el9.x86_64 119/121 2026-03-10T14:02:02.898 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 120/121 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout:Upgraded: 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: librados2-2:18.2.0-0.el9.x86_64 librbd1-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout:Installed: 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 
2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-10T14:02:03.002 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: 
libconfig-1.7.2-9.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: librgw2-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: 
python3-babel-2.9.1-2.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T14:02:03.003 
INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T14:02:03.003 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T14:02:03.004 
INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-10T14:02:03.004 
INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: re2-1:20211101-20.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:02:03.004 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:02:03.091 DEBUG:teuthology.parallel:result is None 2026-03-10T14:02:03.091 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T14:02:03.092 INFO:teuthology.packaging:ref: None 2026-03-10T14:02:03.092 INFO:teuthology.packaging:tag: v18.2.0 2026-03-10T14:02:03.092 INFO:teuthology.packaging:branch: None 2026-03-10T14:02:03.092 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:02:03.092 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d 2026-03-10T14:02:03.680 DEBUG:teuthology.orchestra.run.vm03:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}' 2026-03-10T14:02:03.702 INFO:teuthology.orchestra.run.vm03.stdout:18.2.0-0.el9 2026-03-10T14:02:03.703 INFO:teuthology.packaging:The installed version of ceph is 18.2.0-0.el9 2026-03-10T14:02:03.703 INFO:teuthology.task.install:The correct ceph version 18.2.0-0 is installed. 
2026-03-10T14:02:03.704 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T14:02:03.704 INFO:teuthology.packaging:ref: None 2026-03-10T14:02:03.704 INFO:teuthology.packaging:tag: v18.2.0 2026-03-10T14:02:03.704 INFO:teuthology.packaging:branch: None 2026-03-10T14:02:03.704 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:02:03.704 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d 2026-03-10T14:02:04.292 DEBUG:teuthology.orchestra.run.vm04:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}' 2026-03-10T14:02:04.313 INFO:teuthology.orchestra.run.vm04.stdout:18.2.0-0.el9 2026-03-10T14:02:04.314 INFO:teuthology.packaging:The installed version of ceph is 18.2.0-0.el9 2026-03-10T14:02:04.314 INFO:teuthology.task.install:The correct ceph version 18.2.0-0 is installed. 2026-03-10T14:02:04.315 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-10T14:02:04.315 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:02:04.315 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-10T14:02:04.345 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:02:04.345 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-10T14:02:04.380 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 
2026-03-10T14:02:04.381 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:02:04.381 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/daemon-helper 2026-03-10T14:02:04.417 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-10T14:02:04.480 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:02:04.480 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/daemon-helper 2026-03-10T14:02:04.508 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-10T14:02:04.573 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'... 2026-03-10T14:02:04.573 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:02:04.573 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-10T14:02:04.598 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-10T14:02:04.663 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:02:04.664 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-10T14:02:04.693 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-10T14:02:04.762 INFO:teuthology.task.install.util:Shipping 'stdin-killer'... 2026-03-10T14:02:04.762 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:02:04.762 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/stdin-killer 2026-03-10T14:02:04.791 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-10T14:02:04.858 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:02:04.858 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/usr/bin/stdin-killer 2026-03-10T14:02:04.890 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-10T14:02:04.960 INFO:teuthology.run_tasks:Running task print... 2026-03-10T14:02:04.964 INFO:teuthology.task.print:**** done install task... 
2026-03-10T14:02:04.964 INFO:teuthology.run_tasks:Running task cephadm... 2026-03-10T14:02:05.013 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.io/ceph/ceph:v18.2.0', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 
'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'} 2026-03-10T14:02:05.013 INFO:tasks.cephadm:Cluster image is quay.io/ceph/ceph:v18.2.0 2026-03-10T14:02:05.013 INFO:tasks.cephadm:Cluster fsid is b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:02:05.013 INFO:tasks.cephadm:Choosing monitor IPs and ports... 2026-03-10T14:02:05.013 INFO:tasks.cephadm:No mon roles; fabricating mons 2026-03-10T14:02:05.013 INFO:tasks.cephadm:Monitor IPs: {'mon.vm03': '192.168.123.103', 'mon.vm04': '192.168.123.104'} 2026-03-10T14:02:05.013 INFO:tasks.cephadm:Normalizing hostnames... 
2026-03-10T14:02:05.013 DEBUG:teuthology.orchestra.run.vm03:> sudo hostname $(hostname -s) 2026-03-10T14:02:05.039 DEBUG:teuthology.orchestra.run.vm04:> sudo hostname $(hostname -s) 2026-03-10T14:02:05.069 INFO:tasks.cephadm:Downloading "compiled" cephadm from cachra for reef 2026-03-10T14:02:05.069 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:02:05.675 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}] 2026-03-10T14:02:06.375 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef 2026-03-10T14:02:06.376 INFO:tasks.cephadm:Discovered cachra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm 2026-03-10T14:02:06.376 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm 2026-03-10T14:02:06.376 
DEBUG:teuthology.orchestra.run.vm03:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm 2026-03-10T14:02:07.672 INFO:teuthology.orchestra.run.vm03.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 14:02 /home/ubuntu/cephtest/cephadm 2026-03-10T14:02:07.673 DEBUG:teuthology.orchestra.run.vm04:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm 2026-03-10T14:02:08.932 INFO:teuthology.orchestra.run.vm04.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 14:02 /home/ubuntu/cephtest/cephadm 2026-03-10T14:02:08.932 DEBUG:teuthology.orchestra.run.vm03:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm 2026-03-10T14:02:08.949 DEBUG:teuthology.orchestra.run.vm04:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm 2026-03-10T14:02:08.971 INFO:tasks.cephadm:Pulling image quay.io/ceph/ceph:v18.2.0 on all hosts... 2026-03-10T14:02:08.971 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 pull 2026-03-10T14:02:08.991 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 pull 2026-03-10T14:02:09.129 INFO:teuthology.orchestra.run.vm03.stderr:Pulling container image quay.io/ceph/ceph:v18.2.0... 2026-03-10T14:02:09.156 INFO:teuthology.orchestra.run.vm04.stderr:Pulling container image quay.io/ceph/ceph:v18.2.0... 
2026-03-10T14:02:33.670 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T14:02:33.670 INFO:teuthology.orchestra.run.vm04.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-10T14:02:33.670 INFO:teuthology.orchestra.run.vm04.stdout: "image_id": "dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946", 2026-03-10T14:02:33.670 INFO:teuthology.orchestra.run.vm04.stdout: "repo_digests": [ 2026-03-10T14:02:33.670 INFO:teuthology.orchestra.run.vm04.stdout: "quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007", 2026-03-10T14:02:33.670 INFO:teuthology.orchestra.run.vm04.stdout: "quay.io/ceph/ceph@sha256:d58cf65589d0abf9d5261cd46fb62be3bfb29098febc78c0fcdf116a15274d27" 2026-03-10T14:02:33.670 INFO:teuthology.orchestra.run.vm04.stdout: ] 2026-03-10T14:02:33.670 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T14:02:33.697 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:02:33.697 INFO:teuthology.orchestra.run.vm03.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-10T14:02:33.697 INFO:teuthology.orchestra.run.vm03.stdout: "image_id": "dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946", 2026-03-10T14:02:33.697 INFO:teuthology.orchestra.run.vm03.stdout: "repo_digests": [ 2026-03-10T14:02:33.697 INFO:teuthology.orchestra.run.vm03.stdout: "quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007", 2026-03-10T14:02:33.697 INFO:teuthology.orchestra.run.vm03.stdout: "quay.io/ceph/ceph@sha256:d58cf65589d0abf9d5261cd46fb62be3bfb29098febc78c0fcdf116a15274d27" 2026-03-10T14:02:33.697 INFO:teuthology.orchestra.run.vm03.stdout: ] 2026-03-10T14:02:33.697 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:02:33.707 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /etc/ceph 2026-03-10T14:02:33.740 DEBUG:teuthology.orchestra.run.vm04:> sudo mkdir -p 
/etc/ceph 2026-03-10T14:02:33.774 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 777 /etc/ceph 2026-03-10T14:02:33.810 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 777 /etc/ceph 2026-03-10T14:02:33.844 INFO:tasks.cephadm:Writing seed config... 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] osd_class_default_list = * 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] osd_class_load_list = * 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] bdev async discard = True 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] bdev enable discard = True 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] debug ms = 1 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] debug osd = 20 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [client] client mount timeout = 600 
2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [client] debug client = 20 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [client] debug ms = 1 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [mds] debug mds = 20 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [mds] debug ms = 1 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [mds] mds debug frag = True 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [mds] mds verify scatter = True 2026-03-10T14:02:33.845 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mgr] debug mgr = 20 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mgr] debug ms = 1 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mon] debug mon = 20 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mon] debug ms = 1 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mon] debug paxos = 20 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300 2026-03-10T14:02:33.846 INFO:tasks.cephadm: override: [mon] mon op complaint time = 120 2026-03-10T14:02:33.846 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:02:33.846 DEBUG:teuthology.orchestra.run.vm03:> dd of=/home/ubuntu/cephtest/seed.ceph.conf 
2026-03-10T14:02:33.872 DEBUG:tasks.cephadm:Final config: [global]
# make logging friendly to teuthology
log_to_file = true
log_to_stderr = false
log to journald = false
mon cluster log to file = true
mon cluster log file level = debug
mon clock drift allowed = 1.000
# replicate across OSDs, not hosts
osd crush chooseleaf type = 0
#osd pool default size = 2
osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
# enable some debugging
auth debug = true
ms die on old message = true
ms die on bug = true
debug asserts on shutdown = true
# adjust warnings
mon max pg per osd = 10000  # >= luminous
mon pg warn max object skew = 0
mon osd allow primary affinity = true
mon osd allow pg remap = true
mon warn on legacy crush tunables = false
mon warn on crush straw calc version zero = false
mon warn on no sortbitwise = false
mon warn on osd down out interval zero = false
mon warn on too few osds = false
mon_warn_on_pool_pg_num_not_power_of_two = false
# disable pg_autoscaler by default for new pools
osd_pool_default_pg_autoscale_mode = off
# tests delete pools
mon allow pool delete = true
fsid = b81bf660-1c89-11f1-b612-27d302cdb124
mon pg warn min per osd = 0
[osd]
osd scrub load threshold = 5.0
osd scrub max interval = 600
osd mclock profile = high_recovery_ops
osd recover clone overlap = true
osd recovery max chunk = 1048576
osd deep scrub update digest min age = 30
osd map max advance = 10
osd memory target autotune = true
# debugging
osd debug shutdown = true
osd debug op order = true
osd debug verify stray on activate = true
osd debug pg log writeout = true
osd debug verify cached snaps = true
osd debug verify missing on start = true
osd debug misdirected ops = true
osd op queue = debug_random
osd op queue cut off = debug_random
osd shutdown pgref assert = true
bdev debug aio = true
osd sloppy crc = true
osd_class_default_list = *
osd_class_load_list = *
bdev async discard = True
bdev enable discard = True
bluestore allocator = bitmap
bluestore block size = 96636764160
bluestore fsck on mount = True
debug bluefs = 1/20
debug bluestore = 1/20
debug ms = 1
debug osd = 20
debug rocksdb = 4/10
mon osd backfillfull_ratio = 0.85
mon osd full ratio = 0.9
mon osd nearfull ratio = 0.8
osd failsafe full ratio = 0.95
osd mclock iops capacity threshold hdd = 49000
osd objectstore = bluestore
osd op complaint time = 180
[mgr]
mon reweight min pgs per osd = 4
mon reweight min bytes per osd = 10
mgr/telemetry/nag = false
debug mgr = 20
debug ms = 1
[mon]
mon data avail warn = 5
mon mgr mkfs grace = 240
mon reweight min pgs per osd = 4
mon osd reporter subtree level = osd
mon osd prime pg temp = true
mon reweight min bytes per osd = 10
# rotate auth tickets quickly to exercise renewal paths
auth mon ticket ttl = 660  # 11m
auth service ticket ttl = 240  # 4m
# don't complain about global id reclaim
mon_warn_on_insecure_global_id_reclaim = false
mon_warn_on_insecure_global_id_reclaim_allowed = false
debug mon = 20
debug ms = 1
debug paxos = 20
mon down mkfs grace = 300
mon op complaint time = 120
[client.rgw]
rgw cache enabled = true
rgw enable ops log = true
rgw enable usage log = true
[client]
client mount timeout = 600
debug client = 20
debug ms = 1
rados mon op timeout = 900
rados osd op timeout = 900
[mds]
debug mds = 20
debug mds balancer = 20
debug ms = 1
mds debug frag = True
mds debug scatterstat = True
mds op complaint time = 180
mds verify scatter = True
osd op complaint time = 180
rados mon op timeout = 900
rados osd op timeout = 900
2026-03-10T14:02:33.872 DEBUG:teuthology.orchestra.run.vm03:mon.vm03> sudo journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03.service
2026-03-10T14:02:33.914 INFO:tasks.cephadm:Bootstrapping...
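(Aside, not part of the original log: the `override:` lines emitted while writing the seed config follow a fixed `[section] key = value` shape. A minimal Python sketch, with a hypothetical helper name `parse_overrides`, that folds such lines into a per-section config dict:)

```python
import re
from collections import defaultdict

# Matches log lines of the form:
#   "... INFO:tasks.cephadm: override: [osd] debug osd = 20"
OVERRIDE_RE = re.compile(
    r"override:\s+\[(?P<section>[^\]]+)\]\s+(?P<key>.+?)\s*=\s*(?P<value>.*)$"
)

def parse_overrides(log_lines):
    """Collect 'override:' log lines into {section: {key: value}}."""
    conf = defaultdict(dict)
    for line in log_lines:
        m = OVERRIDE_RE.search(line)
        if m:
            conf[m.group("section")][m.group("key")] = m.group("value")
    return dict(conf)
```

This is only a sketch for reading logs like the one above; the cephadm task itself assembles the seed config internally rather than re-parsing its own log output.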
2026-03-10T14:02:33.914 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 -v bootstrap --fsid b81bf660-1c89-11f1-b612-27d302cdb124 --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.103 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring
2026-03-10T14:02:34.039 INFO:teuthology.orchestra.run.vm03.stdout:--------------------------------------------------------------------------------
2026-03-10T14:02:34.040 INFO:teuthology.orchestra.run.vm03.stdout:cephadm ['--image', 'quay.io/ceph/ceph:v18.2.0', '-v', 'bootstrap', '--fsid', 'b81bf660-1c89-11f1-b612-27d302cdb124', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.103', '--skip-admin-label']
2026-03-10T14:02:34.062 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0
2026-03-10T14:02:34.062 INFO:teuthology.orchestra.run.vm03.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts.
2026-03-10T14:02:34.063 INFO:teuthology.orchestra.run.vm03.stdout:Verifying podman|docker is present...
2026-03-10T14:02:34.085 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0
2026-03-10T14:02:34.085 INFO:teuthology.orchestra.run.vm03.stdout:Verifying lvm2 is present...
2026-03-10T14:02:34.085 INFO:teuthology.orchestra.run.vm03.stdout:Verifying time synchronization is in place...
2026-03-10T14:02:34.094 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T14:02:34.094 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T14:02:34.100 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T14:02:34.100 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout inactive
2026-03-10T14:02:34.109 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout enabled
2026-03-10T14:02:34.116 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout active
2026-03-10T14:02:34.116 INFO:teuthology.orchestra.run.vm03.stdout:Unit chronyd.service is enabled and running
2026-03-10T14:02:34.116 INFO:teuthology.orchestra.run.vm03.stdout:Repeating the final host check...
2026-03-10T14:02:34.139 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0
2026-03-10T14:02:34.139 INFO:teuthology.orchestra.run.vm03.stdout:podman (/bin/podman) version 5.8.0 is present
2026-03-10T14:02:34.139 INFO:teuthology.orchestra.run.vm03.stdout:systemctl is present
2026-03-10T14:02:34.139 INFO:teuthology.orchestra.run.vm03.stdout:lvcreate is present
2026-03-10T14:02:34.146 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T14:02:34.146 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T14:02:34.153 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T14:02:34.153 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout inactive
2026-03-10T14:02:34.160 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout enabled
2026-03-10T14:02:34.167 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout active
2026-03-10T14:02:34.167 INFO:teuthology.orchestra.run.vm03.stdout:Unit chronyd.service is enabled and running
2026-03-10T14:02:34.167 INFO:teuthology.orchestra.run.vm03.stdout:Host looks OK
2026-03-10T14:02:34.167 INFO:teuthology.orchestra.run.vm03.stdout:Cluster fsid: b81bf660-1c89-11f1-b612-27d302cdb124
2026-03-10T14:02:34.167 INFO:teuthology.orchestra.run.vm03.stdout:Acquiring lock 139904218411984 on /run/cephadm/b81bf660-1c89-11f1-b612-27d302cdb124.lock
2026-03-10T14:02:34.167 INFO:teuthology.orchestra.run.vm03.stdout:Lock 139904218411984 acquired on /run/cephadm/b81bf660-1c89-11f1-b612-27d302cdb124.lock
2026-03-10T14:02:34.167 INFO:teuthology.orchestra.run.vm03.stdout:Verifying IP 192.168.123.103 port 3300 ...
2026-03-10T14:02:34.168 INFO:teuthology.orchestra.run.vm03.stdout:Verifying IP 192.168.123.103 port 6789 ...
2026-03-10T14:02:34.168 INFO:teuthology.orchestra.run.vm03.stdout:Base mon IP(s) is [192.168.123.103:3300, 192.168.123.103:6789], mon addrv is [v2:192.168.123.103:3300,v1:192.168.123.103:6789]
2026-03-10T14:02:34.172 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.103 metric 100
2026-03-10T14:02:34.172 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.103 metric 100
2026-03-10T14:02:34.174 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium
2026-03-10T14:02:34.174 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium
2026-03-10T14:02:34.177 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000
2026-03-10T14:02:34.177 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout inet6 ::1/128 scope host
2026-03-10T14:02:34.177 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T14:02:34.177 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000
2026-03-10T14:02:34.177 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:3/64 scope link noprefixroute
2026-03-10T14:02:34.177 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T14:02:34.177 INFO:teuthology.orchestra.run.vm03.stdout:Mon IP `192.168.123.103` is in CIDR network `192.168.123.0/24`
2026-03-10T14:02:34.177 INFO:teuthology.orchestra.run.vm03.stdout:Mon IP `192.168.123.103` is in CIDR network `192.168.123.0/24`
2026-03-10T14:02:34.178 INFO:teuthology.orchestra.run.vm03.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24']
2026-03-10T14:02:34.178 INFO:teuthology.orchestra.run.vm03.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network
2026-03-10T14:02:34.179 INFO:teuthology.orchestra.run.vm03.stdout:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-10T14:02:35.412 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946
2026-03-10T14:02:35.413 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Trying to pull quay.io/ceph/ceph:v18.2.0...
2026-03-10T14:02:35.413 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Getting image source signatures
2026-03-10T14:02:35.413 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying blob sha256:3bd20aeff60302f668275dc2005d10679ae56492967a3a5a54fd3dde85333aec
2026-03-10T14:02:35.413 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying blob sha256:46af8f5390d4e94fc57efb422ccb97bb53dfe5b948546bfc191b46557eb2dbd9
2026-03-10T14:02:35.413 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying config sha256:dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946
2026-03-10T14:02:35.413 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Writing manifest to image destination
2026-03-10T14:02:35.715 INFO:teuthology.orchestra.run.vm03.stdout:ceph: stdout ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)
2026-03-10T14:02:35.716 INFO:teuthology.orchestra.run.vm03.stdout:Ceph version: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)
2026-03-10T14:02:35.716 INFO:teuthology.orchestra.run.vm03.stdout:Extracting ceph user uid/gid from container image...
2026-03-10T14:02:35.918 INFO:teuthology.orchestra.run.vm03.stdout:stat: stdout 167 167
2026-03-10T14:02:35.918 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial keys...
2026-03-10T14:02:36.349 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQB8JLBpGOecDBAAklXV4EovvGXagss+IiqaVg==
2026-03-10T14:02:36.448 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQB8JLBpFy59GRAAbxUakOGorhdKmlrm7GJJ9g==
2026-03-10T14:02:36.591 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQB8JLBppjkfIBAALGfXEgRiWOlu+Y8OTd+uDQ==
2026-03-10T14:02:36.591 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial monmap...
2026-03-10T14:02:36.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to b81bf660-1c89-11f1-b612-27d302cdb124
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:monmaptool for vm03 [v2:192.168.123.103:3300,v1:192.168.123.103:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:setting min_mon_release = pacific
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: set fsid to b81bf660-1c89-11f1-b612-27d302cdb124
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:02:36.706 INFO:teuthology.orchestra.run.vm03.stdout:Creating mon...
2026-03-10T14:02:36.842 INFO:teuthology.orchestra.run.vm03.stdout:create mon.vm03 on
2026-03-10T14:02:37.025 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-10T14:02:37.153 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target.
2026-03-10T14:02:37.283 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-b81bf660-1c89-11f1-b612-27d302cdb124.target → /etc/systemd/system/ceph-b81bf660-1c89-11f1-b612-27d302cdb124.target.
2026-03-10T14:02:37.284 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-b81bf660-1c89-11f1-b612-27d302cdb124.target → /etc/systemd/system/ceph-b81bf660-1c89-11f1-b612-27d302cdb124.target.
2026-03-10T14:02:37.441 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03
2026-03-10T14:02:37.441 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to reset failed state of unit ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03.service: Unit ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03.service not loaded.
2026-03-10T14:02:37.590 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-b81bf660-1c89-11f1-b612-27d302cdb124.target.wants/ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03.service → /etc/systemd/system/ceph-b81bf660-1c89-11f1-b612-27d302cdb124@.service.
2026-03-10T14:02:37.723 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 systemd[1]: Starting Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124...
2026-03-10T14:02:37.780 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present
2026-03-10T14:02:37.780 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to enable service . firewalld.service is not available
2026-03-10T14:02:37.780 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mon to start...
2026-03-10T14:02:37.780 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mon...
2026-03-10T14:02:37.978 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 podman[49418]: 2026-03-10 14:02:37.724900782 +0000 UTC m=+0.020022760 container create 3083e61ddb8ae9862367d77dc04540ab46a129adbdbd83b7a07ef1efcb08a75f (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, io.buildah.version=1.29.1, CEPH_POINT_RELEASE=-18.2.0, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, RELEASE=HEAD, org.label-schema.schema-version=1.0, GIT_CLEAN=True, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, GIT_REPO=https://github.com/ceph/ceph-container.git)
2026-03-10T14:02:37.978 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 podman[49418]: 2026-03-10 14:02:37.767743325 +0000 UTC m=+0.062865313 container init 3083e61ddb8ae9862367d77dc04540ab46a129adbdbd83b7a07ef1efcb08a75f (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.build-date=20231212, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, RELEASE=HEAD, org.label-schema.schema-version=1.0, GIT_CLEAN=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.license=GPLv2, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=-18.2.0, maintainer=Guillaume Abrioux , ceph=True)
2026-03-10T14:02:37.978 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 podman[49418]: 2026-03-10 14:02:37.771819542 +0000 UTC m=+0.066941520 container start 3083e61ddb8ae9862367d77dc04540ab46a129adbdbd83b7a07ef1efcb08a75f (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, RELEASE=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=-18.2.0, maintainer=Guillaume Abrioux )
2026-03-10T14:02:37.978 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 bash[49418]: 3083e61ddb8ae9862367d77dc04540ab46a129adbdbd83b7a07ef1efcb08a75f
2026-03-10T14:02:37.978 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 podman[49418]: 2026-03-10 14:02:37.71734333 +0000 UTC m=+0.012465308 image pull dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946 quay.io/ceph/ceph:v18.2.0
2026-03-10T14:02:37.978 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 systemd[1]: Started Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124.
2026-03-10T14:02:37.978 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 ceph-mon[49432]: mkfs b81bf660-1c89-11f1-b612-27d302cdb124
2026-03-10T14:02:37.978 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:37 vm03 ceph-mon[49432]: mon.vm03 is new leader, mons vm03 in quorum (ranks 0)
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout cluster:
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout id: b81bf660-1c89-11f1-b612-27d302cdb124
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout health: HEALTH_OK
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout services:
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm03 (age 0.163835s)
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr: no daemons active
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout data:
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail
2026-03-10T14:02:38.011 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout pgs:
2026-03-10T14:02:38.012 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.929+0000 7fb09aa3f700 1 Processor -- start
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.930+0000 7fb09aa3f700 1 -- start start
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.930+0000 7fb09aa3f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094104620 0x7fb094106a40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.930+0000 7fb09aa3f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0940745b0 con 0x7fb094104620
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.930+0000 7fb093fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094104620 0x7fb094106a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.930+0000 7fb093fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094104620 0x7fb094106a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37678/0 (socket says 192.168.123.103:37678)
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.930+0000 7fb093fff700 1 -- 192.168.123.103:0/1980718099 learned_addr learned my addr 192.168.123.103:0/1980718099 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.931+0000 7fb093fff700 1 -- 192.168.123.103:0/1980718099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb0940746f0 con 0x7fb094104620
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.931+0000 7fb093fff700 1 --2- 192.168.123.103:0/1980718099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094104620 0x7fb094106a40 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fb07c009a90 tx=0x7fb07c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=fea517ee96fc0f6d server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.931+0000 7fb092ffd700 1 -- 192.168.123.103:0/1980718099 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb07c004030 con 0x7fb094104620
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.931+0000 7fb092ffd700 1 -- 192.168.123.103:0/1980718099 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fb07c004190 con 0x7fb094104620
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 -- 192.168.123.103:0/1980718099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094104620 msgr2=0x7fb094106a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 --2- 192.168.123.103:0/1980718099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094104620 0x7fb094106a40 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fb07c009a90 tx=0x7fb07c009da0 comp rx=0 tx=0).stop
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 -- 192.168.123.103:0/1980718099 shutdown_connections
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 --2- 192.168.123.103:0/1980718099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094104620 0x7fb094106a40 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 -- 192.168.123.103:0/1980718099 >> 192.168.123.103:0/1980718099 conn(0x7fb094100270 msgr2=0x7fb0941026a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 -- 192.168.123.103:0/1980718099 shutdown_connections
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 -- 192.168.123.103:0/1980718099 wait complete.
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 Processor -- start
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 -- start start
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.932+0000 7fb09aa3f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094071ca0 0x7fb0940720b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.933+0000 7fb09aa3f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb0940745b0 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.933+0000 7fb093fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094071ca0 0x7fb0940720b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.933+0000 7fb093fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094071ca0 0x7fb0940720b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37682/0 (socket says 192.168.123.103:37682)
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.933+0000 7fb093fff700 1 -- 192.168.123.103:0/3996935394 learned_addr learned my addr 192.168.123.103:0/3996935394 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.933+0000 7fb093fff700 1 -- 192.168.123.103:0/3996935394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb07c009740 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.933+0000 7fb093fff700 1 --2- 192.168.123.103:0/3996935394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094071ca0 0x7fb0940720b0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fb07c004000 tx=0x7fb07c004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.933+0000 7fb0917fa700 1 -- 192.168.123.103:0/3996935394 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb07c00be80 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.934+0000 7fb0917fa700 1 -- 192.168.123.103:0/3996935394 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fb07c0036a0 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.934+0000 7fb0917fa700 1 -- 192.168.123.103:0/3996935394 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb07c003830 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.934+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb0940725f0 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.934+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb0941ab440 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.935+0000 7fb0917fa700 1 -- 192.168.123.103:0/3996935394 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fb07c003c90 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.935+0000 7fb0917fa700 1 -- 192.168.123.103:0/3996935394 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fb07c011420 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.935+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb09404fa50 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.937+0000 7fb0917fa700 1 -- 192.168.123.103:0/3996935394 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7fb07c01b440 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.978+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7fb094062380 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.981+0000 7fb0917fa700 1 -- 192.168.123.103:0/3996935394 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+320 (secure 0 0 0) 0x7fb07c01b5e0 con 0x7fb094071ca0
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.983+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094071ca0 msgr2=0x7fb0940720b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.983+0000 7fb09aa3f700 1 --2- 192.168.123.103:0/3996935394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094071ca0 0x7fb0940720b0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fb07c004000 tx=0x7fb07c004750 comp rx=0 tx=0).stop
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.983+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 shutdown_connections
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.983+0000 7fb09aa3f700 1 --2- 192.168.123.103:0/3996935394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb094071ca0 0x7fb0940720b0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.983+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 >> 192.168.123.103:0/3996935394 conn(0x7fb094100270 msgr2=0x7fb094105a40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.983+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 shutdown_connections
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:37.983+0000 7fb09aa3f700 1 -- 192.168.123.103:0/3996935394 wait complete.
2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:mon is available 2026-03-10T14:02:38.013 INFO:teuthology.orchestra.run.vm03.stdout:Assimilating anything we can from ceph.conf... 2026-03-10T14:02:38.724 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:38.724 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [global] 2026-03-10T14:02:38.724 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout fsid = b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:02:38.724 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.103:3300,v1:192.168.123.103:6789] 2026-03-10T14:02:38.724 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-10T14:02:38.724 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [osd] 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 
2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.172+0000 7f7967f9d700 1 Processor -- start 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.172+0000 7f7967f9d700 1 -- start start 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.172+0000 7f7967f9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f7960106260 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.172+0000 7f7967f9d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f79601067a0 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.173+0000 7f7965d39700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f7960106260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.174+0000 7f7965d39700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f7960106260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37690/0 (socket says 192.168.123.103:37690) 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:38.174+0000 7f7965d39700 1 -- 192.168.123.103:0/1063494045 learned_addr learned my addr 192.168.123.103:0/1063494045 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.174+0000 7f7965d39700 1 -- 192.168.123.103:0/1063494045 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f79601068e0 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.174+0000 7f7965d39700 1 --2- 192.168.123.103:0/1063494045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f7960106260 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f7954009a90 tx=0x7f7954009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=44862b1ef6daa0d9 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.175+0000 7f7964d37700 1 -- 192.168.123.103:0/1063494045 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7954004030 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.175+0000 7f7964d37700 1 -- 192.168.123.103:0/1063494045 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f7954004190 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.175+0000 7f7967f9d700 1 -- 192.168.123.103:0/1063494045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 msgr2=0x7f7960106260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.175+0000 7f7967f9d700 1 --2- 192.168.123.103:0/1063494045 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f7960106260 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f7954009a90 tx=0x7f7954009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.176+0000 7f7967f9d700 1 -- 192.168.123.103:0/1063494045 shutdown_connections 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.176+0000 7f7967f9d700 1 --2- 192.168.123.103:0/1063494045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f7960106260 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.176+0000 7f7967f9d700 1 -- 192.168.123.103:0/1063494045 >> 192.168.123.103:0/1063494045 conn(0x7f7960101420 msgr2=0x7f7960103830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.176+0000 7f7967f9d700 1 -- 192.168.123.103:0/1063494045 shutdown_connections 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.176+0000 7f7967f9d700 1 -- 192.168.123.103:0/1063494045 wait complete. 
2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.177+0000 7f7967f9d700 1 Processor -- start 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.177+0000 7f7967f9d700 1 -- start start 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.177+0000 7f7967f9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f796007bc70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.177+0000 7f7965d39700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f796007bc70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.177+0000 7f7965d39700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f796007bc70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37692/0 (socket says 192.168.123.103:37692) 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.177+0000 7f7965d39700 1 -- 192.168.123.103:0/1211262476 learned_addr learned my addr 192.168.123.103:0/1211262476 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.177+0000 7f7967f9d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f79601067a0 con 0x7f7960105e50 2026-03-10T14:02:38.725 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.178+0000 7f7965d39700 1 -- 192.168.123.103:0/1211262476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7954009740 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.178+0000 7f7965d39700 1 --2- 192.168.123.103:0/1211262476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f796007bc70 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f795400b3c0 tx=0x7f7954004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.178+0000 7f7952ffd700 1 -- 192.168.123.103:0/1211262476 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f79540036a0 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.178+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f796007a2c0 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.178+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f796007a760 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.179+0000 7f7952ffd700 1 -- 192.168.123.103:0/1211262476 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f7954003800 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.179+0000 7f7952ffd700 1 -- 
192.168.123.103:0/1211262476 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7954003970 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.179+0000 7f7952ffd700 1 -- 192.168.123.103:0/1211262476 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f7954021070 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.179+0000 7f7952ffd700 1 -- 192.168.123.103:0/1211262476 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f7954024d80 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.180+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f796004f9e0 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.181+0000 7f7952ffd700 1 -- 192.168.123.103:0/1211262476 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f79601068e0 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.220+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f7960062380 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.317+0000 7f7952ffd700 1 -- 192.168.123.103:0/1211262476 <== mon.0 v2:192.168.123.103:3300/0 7 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f7954024470 con 
0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.317+0000 7f7952ffd700 1 -- 192.168.123.103:0/1211262476 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 70+0+435 (secure 0 0 0) 0x7f7954036520 con 0x7f7960105e50 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.319+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 msgr2=0x7f796007bc70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.320+0000 7f7967f9d700 1 --2- 192.168.123.103:0/1211262476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f796007bc70 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f795400b3c0 tx=0x7f7954004750 comp rx=0 tx=0).stop 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.320+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 shutdown_connections 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.320+0000 7f7967f9d700 1 --2- 192.168.123.103:0/1211262476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7960105e50 0x7f796007bc70 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.320+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 >> 192.168.123.103:0/1211262476 conn(0x7f7960101420 msgr2=0x7f796018d770 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.321+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 
shutdown_connections 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:38.321+0000 7f7967f9d700 1 -- 192.168.123.103:0/1211262476 wait complete. 2026-03-10T14:02:38.725 INFO:teuthology.orchestra.run.vm03.stdout:Generating new minimal ceph.conf... 2026-03-10T14:02:38.982 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:38 vm03 ceph-mon[49432]: mon.vm03 is new leader, mons vm03 in quorum (ranks 0) 2026-03-10T14:02:38.982 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:38 vm03 ceph-mon[49432]: monmap e1: 1 mons at {vm03=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0]} removed_ranks: {} 2026-03-10T14:02:38.982 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:38 vm03 ceph-mon[49432]: fsmap 2026-03-10T14:02:38.982 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:38 vm03 ceph-mon[49432]: osdmap e1: 0 total, 0 up, 0 in 2026-03-10T14:02:38.982 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:38 vm03 ceph-mon[49432]: mgrmap e1: no daemons active 2026-03-10T14:02:38.982 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:38 vm03 ceph-mon[49432]: from='client.? 192.168.123.103:0/3996935394' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch 2026-03-10T14:02:38.982 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:38 vm03 ceph-mon[49432]: from='client.? 192.168.123.103:0/1211262476' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-10T14:02:38.982 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:38 vm03 ceph-mon[49432]: from='client.? 
192.168.123.103:0/1211262476' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished 2026-03-10T14:02:39.165 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.052+0000 7fc8644a5700 1 Processor -- start 2026-03-10T14:02:39.165 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.052+0000 7fc8644a5700 1 -- start start 2026-03-10T14:02:39.165 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.053+0000 7fc8644a5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:39.165 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.053+0000 7fc8644a5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc85c108890 con 0x7fc85c107f40 2026-03-10T14:02:39.165 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.053+0000 7fc862241700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:39.165 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.053+0000 7fc862241700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37694/0 (socket says 192.168.123.103:37694) 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.053+0000 7fc862241700 1 -- 192.168.123.103:0/2209964194 learned_addr learned my addr 192.168.123.103:0/2209964194 
(peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.054+0000 7fc862241700 1 -- 192.168.123.103:0/2209964194 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc85c1089d0 con 0x7fc85c107f40 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.054+0000 7fc862241700 1 --2- 192.168.123.103:0/2209964194 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c108350 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc84c009a90 tx=0x7fc84c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a52d09474ac6ba24 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.054+0000 7fc86123f700 1 -- 192.168.123.103:0/2209964194 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc84c004030 con 0x7fc85c107f40 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.054+0000 7fc86123f700 1 -- 192.168.123.103:0/2209964194 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fc84c00b7e0 con 0x7fc85c107f40 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.054+0000 7fc86123f700 1 -- 192.168.123.103:0/2209964194 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc84c003970 con 0x7fc85c107f40 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.055+0000 7fc8644a5700 1 -- 192.168.123.103:0/2209964194 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 msgr2=0x7fc85c108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:39.166 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.055+0000 7fc8644a5700 1 --2- 192.168.123.103:0/2209964194 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c108350 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc84c009a90 tx=0x7fc84c009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.055+0000 7fc8644a5700 1 -- 192.168.123.103:0/2209964194 shutdown_connections 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.055+0000 7fc8644a5700 1 --2- 192.168.123.103:0/2209964194 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c108350 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.055+0000 7fc8644a5700 1 -- 192.168.123.103:0/2209964194 >> 192.168.123.103:0/2209964194 conn(0x7fc85c07b4b0 msgr2=0x7fc85c07b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.055+0000 7fc8644a5700 1 -- 192.168.123.103:0/2209964194 shutdown_connections 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.055+0000 7fc8644a5700 1 -- 192.168.123.103:0/2209964194 wait complete. 
2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.056+0000 7fc8644a5700 1 Processor -- start 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.056+0000 7fc8644a5700 1 -- start start 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.056+0000 7fc8644a5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c19b950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.056+0000 7fc8644a5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc85c19be90 con 0x7fc85c107f40 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.056+0000 7fc862241700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c19b950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.057+0000 7fc862241700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c19b950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37704/0 (socket says 192.168.123.103:37704) 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.057+0000 7fc862241700 1 -- 192.168.123.103:0/720794891 learned_addr learned my addr 192.168.123.103:0/720794891 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:39.166 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.057+0000 7fc862241700 1 -- 192.168.123.103:0/720794891 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc84c009740 con 0x7fc85c107f40 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.057+0000 7fc862241700 1 --2- 192.168.123.103:0/720794891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c19b950 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc84c00bfb0 tx=0x7fc84c003b20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.057+0000 7fc8537fe700 1 -- 192.168.123.103:0/720794891 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc84c0040d0 con 0x7fc85c107f40 2026-03-10T14:02:39.166 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.058+0000 7fc8537fe700 1 -- 192.168.123.103:0/720794891 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fc84c004230 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.058+0000 7fc8644a5700 1 -- 192.168.123.103:0/720794891 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc85c19c090 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.058+0000 7fc8537fe700 1 -- 192.168.123.103:0/720794891 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc84c011420 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.058+0000 7fc8644a5700 1 -- 
192.168.123.103:0/720794891 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc85c19c530 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.059+0000 7fc8537fe700 1 -- 192.168.123.103:0/720794891 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fc84c011940 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.059+0000 7fc8537fe700 1 -- 192.168.123.103:0/720794891 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc84c022400 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.059+0000 7fc8644a5700 1 -- 192.168.123.103:0/720794891 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc85c04f9e0 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.060+0000 7fc8537fe700 1 -- 192.168.123.103:0/720794891 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7fc84c0043a0 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.106+0000 7fc8644a5700 1 -- 192.168.123.103:0/720794891 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7fc85c062380 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.107+0000 7fc8537fe700 1 -- 192.168.123.103:0/720794891 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 ==== 
76+0+181 (secure 0 0 0) 0x7fc84c011ce0 con 0x7fc85c107f40 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.109+0000 7fc8644a5700 1 -- 192.168.123.103:0/720794891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 msgr2=0x7fc85c19b950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.109+0000 7fc8644a5700 1 --2- 192.168.123.103:0/720794891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c19b950 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc84c00bfb0 tx=0x7fc84c003b20 comp rx=0 tx=0).stop 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.110+0000 7fc8644a5700 1 -- 192.168.123.103:0/720794891 shutdown_connections 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.110+0000 7fc8644a5700 1 --2- 192.168.123.103:0/720794891 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc85c107f40 0x7fc85c19b950 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.110+0000 7fc8644a5700 1 -- 192.168.123.103:0/720794891 >> 192.168.123.103:0/720794891 conn(0x7fc85c07b4b0 msgr2=0x7fc85c1054a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.110+0000 7fc8644a5700 1 -- 192.168.123.103:0/720794891 shutdown_connections 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.110+0000 7fc8644a5700 1 -- 192.168.123.103:0/720794891 wait complete. 2026-03-10T14:02:39.167 INFO:teuthology.orchestra.run.vm03.stdout:Restarting the monitor... 
2026-03-10T14:02:39.271 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 systemd[1]: Stopping Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:02:39.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[49428]: 2026-03-10T14:02:39.271+0000 7f9659c54700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm03 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:02:39.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[49428]: 2026-03-10T14:02:39.271+0000 7f9659c54700 -1 mon.vm03@0(leader) e1 *** Got Signal Terminated *** 2026-03-10T14:02:39.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 podman[49634]: 2026-03-10 14:02:39.477273856 +0000 UTC m=+0.220187242 container died 3083e61ddb8ae9862367d77dc04540ab46a129adbdbd83b7a07ef1efcb08a75f (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.29.1, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-10T14:02:39.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 podman[49634]: 2026-03-10 14:02:39.497720739 +0000 UTC m=+0.240634125 container remove 3083e61ddb8ae9862367d77dc04540ab46a129adbdbd83b7a07ef1efcb08a75f (image=quay.io/ceph/ceph:v18.2.0, 
name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS) 2026-03-10T14:02:39.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 bash[49634]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03 2026-03-10T14:02:39.716 INFO:teuthology.orchestra.run.vm03.stdout:Setting public_network to 192.168.123.0/24 in global config section 2026-03-10T14:02:39.808 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03.service: Deactivated successfully. 2026-03-10T14:02:39.808 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 systemd[1]: Stopped Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:02:39.808 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 systemd[1]: Starting Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 podman[49704]: 2026-03-10 14:02:39.668943572 +0000 UTC m=+0.019817005 container create f59cc7d5bdfd721a1c607cf5c3921dd1e5f48688868f7fe7c9f2066755202e6e (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, ceph=True, io.buildah.version=1.29.1, RELEASE=HEAD, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git) 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 podman[49704]: 2026-03-10 14:02:39.706666784 +0000 UTC m=+0.057540217 container init f59cc7d5bdfd721a1c607cf5c3921dd1e5f48688868f7fe7c9f2066755202e6e (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.29.1, RELEASE=HEAD, org.label-schema.license=GPLv2, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_BRANCH=HEAD, CEPH_POINT_RELEASE=-18.2.0, org.label-schema.build-date=20231212, org.label-schema.schema-version=1.0, GIT_CLEAN=True) 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 podman[49704]: 2026-03-10 14:02:39.709340427 +0000 UTC m=+0.060213860 container start f59cc7d5bdfd721a1c607cf5c3921dd1e5f48688868f7fe7c9f2066755202e6e (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, 
io.buildah.version=1.29.1, CEPH_POINT_RELEASE=-18.2.0, org.label-schema.license=GPLv2, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, org.label-schema.build-date=20231212) 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 bash[49704]: f59cc7d5bdfd721a1c607cf5c3921dd1e5f48688868f7fe7c9f2066755202e6e 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 podman[49704]: 2026-03-10 14:02:39.661371201 +0000 UTC m=+0.012244645 image pull dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946 quay.io/ceph/ceph:v18.2.0 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 systemd[1]: Started Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable), process ceph-mon, pid 2 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: pidfile_write: ignore empty --pid-file 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: load: jerasure load: lrc 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: RocksDB version: 7.9.2 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Git sha 0 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Compile date 2023-08-03 19:21:13 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 
10 14:02:39 vm03 ceph-mon[49718]: rocksdb: DB SUMMARY 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: DB Session ID: 3P9KVEQC8BWIGGF5UK9Z 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: CURRENT file: CURRENT 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm03/store.db dir, Total Num: 1, files: 000008.sst 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm03/store.db: 000009.log size: 96406 ; 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.error_if_exists: 0 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.create_if_missing: 0 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.paranoid_checks: 1 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.env: 0x56341fe8b720 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.info_log: 0x563421d9f340 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T14:02:39.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.statistics: (nil) 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.use_fsync: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_log_file_size: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.allow_fallocate: 1 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T14:02:39.810 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.use_direct_reads: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.db_log_dir: 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.wal_dir: 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: 
Options.write_buffer_manager: 0x56342102e5a0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.unordered_write: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: 
Options.write_thread_slow_yield_usec: 3 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.row_cache: None 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.wal_filter: None 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.two_write_queues: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.wal_compression: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.atomic_flush: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T14:02:39.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.log_readahead_size: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 
vm03 ceph-mon[49718]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_background_jobs: 2 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_background_compactions: -1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_subcompactions: 1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: 
Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_open_files: -1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_background_flushes: -1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Compression algorithms supported: 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: kZSTD supported: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: kXpressCompression supported: 0 2026-03-10T14:02:39.811 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: kZlibCompression supported: 1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: kSnappyCompression supported: 1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: kLZ4Compression supported: 1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: kBZip2Compression supported: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000010 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.merge_operator: 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_filter: None 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: 
Options.compaction_filter_factory: None 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.sst_partitioner_factory: None 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x563421d9f440) 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks: 1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_top_level_index_and_filter: 1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_type: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_index_type: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_shortening: 1 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: checksum: 4 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: no_block_cache: 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache: 0x5634210b1350 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_name: BinnedLRUCache 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_options: 2026-03-10T14:02:39.811 
INFO:journalctl@ceph.mon.vm03.vm03.stdout: capacity : 536870912 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_shard_bits : 4 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: strict_capacity_limit : 0 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: high_pri_pool_ratio: 0.000 2026-03-10T14:02:39.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_compressed: (nil) 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: persistent_cache: (nil) 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size: 4096 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size_deviation: 10 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_restart_interval: 16 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_block_restart_interval: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: metadata_block_size: 4096 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: partition_filters: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: use_delta_encoding: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: filter_policy: bloomfilter 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: whole_key_filtering: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: verify_compression: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: read_amp_bytes_per_bit: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: format_version: 5 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_index_compression: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_align: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_auto_readahead_size: 262144 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: prepopulate_block_cache: 0 
2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: initial_auto_readahead_size: 8192 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression: NoCompression 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.num_levels: 7 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 
ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: 
Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T14:02:39.812 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T14:02:39.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 
vm03 ceph-mon[49718]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T14:02:39.813 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.table_properties_collectors: 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.inplace_update_support: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.bloom_locality: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.max_successive_merges: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.ttl: 2592000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 
ceph-mon[49718]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.enable_blob_files: false 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.min_blob_size: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 
10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 6793a155-1d93-4889-a6c4-fc7b890f7683 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151359742814, "job": 1, "event": "recovery_started", "wal_files": [9]} 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151359746172, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 91910, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 301, "table_properties": {"data_size": 89956, "index_size": 241, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 13543, "raw_average_key_size": 50, "raw_value_size": 82605, "raw_average_value_size": 308, "num_data_blocks": 10, "num_entries": 268, "num_filter_entries": 268, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", 
"column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773151359, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "6793a155-1d93-4889-a6c4-fc7b890f7683", "db_session_id": "3P9KVEQC8BWIGGF5UK9Z", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}} 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151359746230, "job": 1, "event": "recovery_finished"} 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/version_set.cc:5047] Creating manifest 15 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm03/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x56342114e000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: DB pointer 0x56342113a000 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: starting mon.vm03 rank 0 at public addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] at bind addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon_data /var/lib/ceph/mon/ceph-vm03 fsid b81bf660-1c89-11f1-b612-27d302cdb124 
2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???) e1 preinit fsid b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).mds e1 new map 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).mds e1 print_map 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout: e1 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout: legacy client fscid: -1 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-10T14:02:39.813 INFO:journalctl@ceph.mon.vm03.vm03.stdout: No filesystems configured 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 
2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).mgr e0 loading version 1 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).mgr e1 active server: (0) 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03@-1(???).mgr e1 mkfs or daemon transitioned to available, loading commands 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** DB Stats ** 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 
Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** Compaction Stats [default] ** 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: L0 2/0 91.60 KB 0.5 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 28.4 0.00 0.00 1 0.003 0 0 0.0 0.0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Sum 2/0 91.60 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 28.4 0.00 0.00 1 0.003 0 0 0.0 0.0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 28.4 0.00 0.00 1 0.003 0 0 0.0 0.0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** Compaction Stats [default] ** 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 28.4 0.00 
0.00 1 0.003 0 0 0.0 0.0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Flush(GB): cumulative 0.000, interval 0.000 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative compaction: 0.00 GB write, 4.84 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval compaction: 0.00 GB write, 4.84 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Block cache BinnedLRUCache@0x5634210b1350#2 capacity: 512.00 MB usage: 49.41 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 9e-06 secs_since: 0 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Block cache entry stats(count,size,portion): 
DataBlock(9,48.08 KB,0.00917017%) FilterBlock(2,0.89 KB,0.000169873%) IndexBlock(2,0.44 KB,8.34465e-05%) Misc(1,0.00 KB,0%) 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mon.vm03 is new leader, mons vm03 in quorum (ranks 0) 2026-03-10T14:02:39.814 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: monmap e1: 1 mons at {vm03=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0]} removed_ranks: {} 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.868+0000 7f64374ca700 1 Processor -- start 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.870+0000 7f64374ca700 1 -- start start 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.870+0000 7f64374ca700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430108c10 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.870+0000 7f64374ca700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f64300745b0 con 0x7f6430106830 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.870+0000 7f6435266700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430108c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:39.870+0000 7f6435266700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430108c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37708/0 (socket says 192.168.123.103:37708) 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.870+0000 7f6435266700 1 -- 192.168.123.103:0/514946589 learned_addr learned my addr 192.168.123.103:0/514946589 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.871+0000 7f6435266700 1 -- 192.168.123.103:0/514946589 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64300746f0 con 0x7f6430106830 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.871+0000 7f6435266700 1 --2- 192.168.123.103:0/514946589 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430108c10 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f6424009a90 tx=0x7f6424009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8da58b80a83f717b server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.871+0000 7f6423fff700 1 -- 192.168.123.103:0/514946589 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6424004030 con 0x7f6430106830 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.871+0000 7f6423fff700 1 -- 192.168.123.103:0/514946589 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f642400b7e0 con 0x7f6430106830 2026-03-10T14:02:39.970 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.872+0000 7f6423fff700 1 -- 192.168.123.103:0/514946589 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6424003a30 con 0x7f6430106830 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.872+0000 7f64374ca700 1 -- 192.168.123.103:0/514946589 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 msgr2=0x7f6430108c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.872+0000 7f64374ca700 1 --2- 192.168.123.103:0/514946589 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430108c10 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f6424009a90 tx=0x7f6424009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.872+0000 7f64374ca700 1 -- 192.168.123.103:0/514946589 shutdown_connections 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.872+0000 7f64374ca700 1 --2- 192.168.123.103:0/514946589 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430108c10 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.872+0000 7f64374ca700 1 -- 192.168.123.103:0/514946589 >> 192.168.123.103:0/514946589 conn(0x7f6430100270 msgr2=0x7f64301026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:39.970 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.873+0000 7f64374ca700 1 -- 192.168.123.103:0/514946589 shutdown_connections 2026-03-10T14:02:39.971 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.873+0000 7f64374ca700 1 -- 192.168.123.103:0/514946589 wait complete. 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.873+0000 7f64374ca700 1 Processor -- start 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.873+0000 7f64374ca700 1 -- start start 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f64374ca700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430197650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f64374ca700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6430197b90 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f6435266700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430197650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f6435266700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430197650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37712/0 (socket says 192.168.123.103:37712) 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f6435266700 1 -- 192.168.123.103:0/1157435777 learned_addr 
learned my addr 192.168.123.103:0/1157435777 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f6435266700 1 -- 192.168.123.103:0/1157435777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6424009740 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f6435266700 1 --2- 192.168.123.103:0/1157435777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430197650 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f642400bfb0 tx=0x7f6424003f70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f64227fc700 1 -- 192.168.123.103:0/1157435777 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6424004450 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.874+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6430197d90 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.875+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6430198230 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.875+0000 7f64227fc700 1 -- 192.168.123.103:0/1157435777 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f64240128d0 con 0x7f6430106830 
2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.875+0000 7f64227fc700 1 -- 192.168.123.103:0/1157435777 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f64240188d0 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.875+0000 7f64227fc700 1 -- 192.168.123.103:0/1157435777 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f642402a430 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.875+0000 7f64227fc700 1 -- 192.168.123.103:0/1157435777 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f6424021840 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.877+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6414005320 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.878+0000 7f64227fc700 1 -- 192.168.123.103:0/1157435777 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f6424012a40 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.914+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7f6414005cc0 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.917+0000 7f64227fc700 1 -- 
192.168.123.103:0/1157435777 <== mon.0 v2:192.168.123.103:3300/0 7 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f6424012e70 con 0x7f6430106830 2026-03-10T14:02:39.971 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.917+0000 7f64227fc700 1 -- 192.168.123.103:0/1157435777 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7f6424017480 con 0x7f6430106830 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.918+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 msgr2=0x7f6430197650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.918+0000 7f64374ca700 1 --2- 192.168.123.103:0/1157435777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430197650 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f642400bfb0 tx=0x7f6424003f70 comp rx=0 tx=0).stop 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.919+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 shutdown_connections 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.919+0000 7f64374ca700 1 --2- 192.168.123.103:0/1157435777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6430106830 0x7f6430197650 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.919+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 >> 192.168.123.103:0/1157435777 conn(0x7f6430100270 msgr2=0x7f6430100f20 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.919+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 shutdown_connections 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:39.919+0000 7f64374ca700 1 -- 192.168.123.103:0/1157435777 wait complete. 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:Creating mgr... 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-10T14:02:39.972 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-10T14:02:40.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: fsmap 2026-03-10T14:02:40.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: osdmap e1: 0 total, 0 up, 0 in 2026-03-10T14:02:40.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:39 vm03 ceph-mon[49718]: mgrmap e1: no daemons active 2026-03-10T14:02:40.124 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mgr.vm03.rwbbep 2026-03-10T14:02:40.124 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to reset failed state of unit ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mgr.vm03.rwbbep.service: Unit ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mgr.vm03.rwbbep.service not loaded. 
2026-03-10T14:02:40.263 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-b81bf660-1c89-11f1-b612-27d302cdb124.target.wants/ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mgr.vm03.rwbbep.service → /etc/systemd/system/ceph-b81bf660-1c89-11f1-b612-27d302cdb124@.service. 2026-03-10T14:02:40.437 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-10T14:02:40.437 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to enable service . firewalld.service is not available 2026-03-10T14:02:40.437 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-10T14:02:40.437 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-10T14:02:40.437 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr to start... 2026-03-10T14:02:40.437 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr... 2026-03-10T14:02:40.682 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "b81bf660-1c89-11f1-b612-27d302cdb124", 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stdout "quorum": [ 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 0, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T14:02:40.683 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T14:02:40.685 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 
"modules": [ 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:40.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T14:02:37.820124+0000", 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.590+0000 7ff220878700 1 Processor -- start 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.591+0000 7ff220878700 1 -- start start 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.591+0000 7ff220878700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c071410 0x7ff21c071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:40.686 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.591+0000 7ff220878700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff21c071d60 con 0x7ff21c071410 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.592+0000 7ff21b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c071410 0x7ff21c071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.592+0000 7ff21b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c071410 0x7ff21c071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37732/0 (socket says 192.168.123.103:37732) 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.592+0000 7ff21b7fe700 1 -- 192.168.123.103:0/2887228345 learned_addr learned my addr 192.168.123.103:0/2887228345 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.592+0000 7ff21b7fe700 1 -- 192.168.123.103:0/2887228345 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff21c071ea0 con 0x7ff21c071410 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.592+0000 7ff21b7fe700 1 --2- 192.168.123.103:0/2887228345 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c071410 0x7ff21c071820 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7ff20c00ab30 tx=0x7ff20c010730 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=394676b31d1ce870 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff21a7fc700 1 -- 192.168.123.103:0/2887228345 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff20c010e00 con 0x7ff21c071410 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff21a7fc700 1 -- 192.168.123.103:0/2887228345 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7ff20c0044d0 con 0x7ff21c071410 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff220878700 1 -- 192.168.123.103:0/2887228345 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c071410 msgr2=0x7ff21c071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff220878700 1 --2- 192.168.123.103:0/2887228345 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c071410 0x7ff21c071820 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7ff20c00ab30 tx=0x7ff20c010730 comp rx=0 tx=0).stop 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff220878700 1 -- 192.168.123.103:0/2887228345 shutdown_connections 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff220878700 1 --2- 192.168.123.103:0/2887228345 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c071410 0x7ff21c071820 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff220878700 1 -- 
192.168.123.103:0/2887228345 >> 192.168.123.103:0/2887228345 conn(0x7ff21c06c9d0 msgr2=0x7ff21c06ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff220878700 1 -- 192.168.123.103:0/2887228345 shutdown_connections 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.594+0000 7ff220878700 1 -- 192.168.123.103:0/2887228345 wait complete. 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.595+0000 7ff220878700 1 Processor -- start 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.595+0000 7ff220878700 1 -- start start 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.595+0000 7ff220878700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c1a86b0 0x7ff21c1a8ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.595+0000 7ff220878700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff20c01a410 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.596+0000 7ff21b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c1a86b0 0x7ff21c1a8ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.596+0000 7ff21b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c1a86b0 0x7ff21c1a8ac0 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37740/0 (socket says 192.168.123.103:37740) 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.596+0000 7ff21b7fe700 1 -- 192.168.123.103:0/683553349 learned_addr learned my addr 192.168.123.103:0/683553349 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.596+0000 7ff21b7fe700 1 -- 192.168.123.103:0/683553349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff20c00a7e0 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.596+0000 7ff21b7fe700 1 --2- 192.168.123.103:0/683553349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c1a86b0 0x7ff21c1a8ac0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7ff20c00bbd0 tx=0x7ff20c003980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.598+0000 7ff218ff9700 1 -- 192.168.123.103:0/683553349 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff20c003bd0 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.598+0000 7ff220878700 1 -- 192.168.123.103:0/683553349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff21c1a9000 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.598+0000 7ff220878700 1 -- 192.168.123.103:0/683553349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 
0x7ff21c1abd10 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.598+0000 7ff218ff9700 1 -- 192.168.123.103:0/683553349 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7ff20c00f070 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.598+0000 7ff218ff9700 1 -- 192.168.123.103:0/683553349 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff20c0229a0 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.599+0000 7ff218ff9700 1 -- 192.168.123.103:0/683553349 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7ff20c018070 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.599+0000 7ff218ff9700 1 -- 192.168.123.103:0/683553349 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7ff20c02c930 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.599+0000 7ff220878700 1 -- 192.168.123.103:0/683553349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff208005320 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.601+0000 7ff218ff9700 1 -- 192.168.123.103:0/683553349 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7ff20c027070 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.643+0000 7ff220878700 1 -- 
192.168.123.103:0/683553349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7ff208005190 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.644+0000 7ff218ff9700 1 -- 192.168.123.103:0/683553349 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7ff20c003d30 con 0x7ff21c1a86b0 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.651+0000 7ff220878700 1 -- 192.168.123.103:0/683553349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c1a86b0 msgr2=0x7ff21c1a8ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.651+0000 7ff220878700 1 --2- 192.168.123.103:0/683553349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c1a86b0 0x7ff21c1a8ac0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7ff20c00bbd0 tx=0x7ff20c003980 comp rx=0 tx=0).stop 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.652+0000 7ff220878700 1 -- 192.168.123.103:0/683553349 shutdown_connections 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.652+0000 7ff220878700 1 --2- 192.168.123.103:0/683553349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21c1a86b0 0x7ff21c1a8ac0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.652+0000 7ff220878700 1 -- 192.168.123.103:0/683553349 >> 192.168.123.103:0/683553349 conn(0x7ff21c06c9d0 msgr2=0x7ff21c06d470 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.652+0000 7ff220878700 1 -- 192.168.123.103:0/683553349 shutdown_connections 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:40.653+0000 7ff220878700 1 -- 192.168.123.103:0/683553349 wait complete. 2026-03-10T14:02:40.686 INFO:teuthology.orchestra.run.vm03.stdout:mgr not available, waiting (1/15)... 2026-03-10T14:02:41.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:40 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/1157435777' entity='client.admin' 2026-03-10T14:02:41.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:40 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/683553349' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T14:02:42.911 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "b81bf660-1c89-11f1-b612-27d302cdb124", 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 
2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 3, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T14:02:42.912 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:42.914 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T14:02:42.914 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T14:02:42.915 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T14:02:37.820124+0000", 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.829+0000 7f220b9ed700 1 Processor -- start 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.830+0000 7f220b9ed700 1 -- start start 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.830+0000 7f220b9ed700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2204071430 0x7f2204071840 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:42.830+0000 7f220b9ed700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2204071d80 con 0x7f2204071430 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.830+0000 7f2209789700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2204071430 0x7f2204071840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.830+0000 7f2209789700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2204071430 0x7f2204071840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:37742/0 (socket says 192.168.123.103:37742) 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.830+0000 7f2209789700 1 -- 192.168.123.103:0/601262692 learned_addr learned my addr 192.168.123.103:0/601262692 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.830+0000 7f2209789700 1 -- 192.168.123.103:0/601262692 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2204071ec0 con 0x7f2204071430 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.831+0000 7f2209789700 1 --2- 192.168.123.103:0/601262692 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2204071430 0x7f2204071840 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f2200008b10 tx=0x7f2200008e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=9f0049661ba719c5 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:42.915 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.831+0000 7f21fbfff700 1 -- 192.168.123.103:0/601262692 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2200004030 con 0x7f2204071430 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.831+0000 7f21fbfff700 1 -- 192.168.123.103:0/601262692 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f2200009800 con 0x7f2204071430 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.832+0000 7f220b9ed700 1 -- 192.168.123.103:0/601262692 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2204071430 msgr2=0x7f2204071840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.832+0000 7f220b9ed700 1 --2- 192.168.123.103:0/601262692 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2204071430 0x7f2204071840 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f2200008b10 tx=0x7f2200008e20 comp rx=0 tx=0).stop 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.832+0000 7f220b9ed700 1 -- 192.168.123.103:0/601262692 shutdown_connections 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.832+0000 7f220b9ed700 1 --2- 192.168.123.103:0/601262692 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2204071430 0x7f2204071840 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.832+0000 7f220b9ed700 1 -- 192.168.123.103:0/601262692 >> 192.168.123.103:0/601262692 conn(0x7f220406c970 msgr2=0x7f220406eda0 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.832+0000 7f220b9ed700 1 -- 192.168.123.103:0/601262692 shutdown_connections 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.832+0000 7f220b9ed700 1 -- 192.168.123.103:0/601262692 wait complete. 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.833+0000 7f220b9ed700 1 Processor -- start 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.833+0000 7f220b9ed700 1 -- start start 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.833+0000 7f220b9ed700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22041b0ef0 0x7f22041b1300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.833+0000 7f220b9ed700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22000039a0 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.833+0000 7f2209789700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22041b0ef0 0x7f22041b1300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.833+0000 7f2209789700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22041b0ef0 0x7f22041b1300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:37754/0 (socket says 192.168.123.103:37754) 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.833+0000 7f2209789700 1 -- 192.168.123.103:0/103447736 learned_addr learned my addr 192.168.123.103:0/103447736 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.834+0000 7f2209789700 1 -- 192.168.123.103:0/103447736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f22000087c0 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.834+0000 7f2209789700 1 --2- 192.168.123.103:0/103447736 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22041b0ef0 0x7f22041b1300 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f2200008020 tx=0x7f2200004050 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.834+0000 7f21fa7fc700 1 -- 192.168.123.103:0/103447736 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f22000036a0 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.834+0000 7f21fa7fc700 1 -- 192.168.123.103:0/103447736 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f2200004430 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.834+0000 7f21fa7fc700 1 -- 192.168.123.103:0/103447736 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f220001b420 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-10T14:02:42.835+0000 7f220b9ed700 1 -- 192.168.123.103:0/103447736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f22041b1840 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.835+0000 7f220b9ed700 1 -- 192.168.123.103:0/103447736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f22041b44d0 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.835+0000 7f220b9ed700 1 -- 192.168.123.103:0/103447736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f220404efc0 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.835+0000 7f21fa7fc700 1 -- 192.168.123.103:0/103447736 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f220000b350 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.836+0000 7f21fa7fc700 1 -- 192.168.123.103:0/103447736 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f220001ba30 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.837+0000 7f21fa7fc700 1 -- 192.168.123.103:0/103447736 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f220001f070 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.875+0000 7f220b9ed700 1 -- 192.168.123.103:0/103447736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", 
"format": "json-pretty"} v 0) v1 -- 0x7f22041b4780 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.875+0000 7f21fa7fc700 1 -- 192.168.123.103:0/103447736 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f220001b580 con 0x7f22041b0ef0 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.878+0000 7f21effff700 1 -- 192.168.123.103:0/103447736 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22041b0ef0 msgr2=0x7f22041b1300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.878+0000 7f21effff700 1 --2- 192.168.123.103:0/103447736 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22041b0ef0 0x7f22041b1300 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f2200008020 tx=0x7f2200004050 comp rx=0 tx=0).stop 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.878+0000 7f21effff700 1 -- 192.168.123.103:0/103447736 shutdown_connections 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.878+0000 7f21effff700 1 --2- 192.168.123.103:0/103447736 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22041b0ef0 0x7f22041b1300 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.878+0000 7f21effff700 1 -- 192.168.123.103:0/103447736 >> 192.168.123.103:0/103447736 conn(0x7f220406c970 msgr2=0x7f220406d400 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:42.878+0000 7f21effff700 1 -- 192.168.123.103:0/103447736 shutdown_connections 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:42.879+0000 7f21effff700 1 -- 192.168.123.103:0/103447736 wait complete. 2026-03-10T14:02:42.915 INFO:teuthology.orchestra.run.vm03.stdout:mgr not available, waiting (2/15)... 2026-03-10T14:02:43.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:42 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/103447736' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "b81bf660-1c89-11f1-b612-27d302cdb124", 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:45.144 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T14:02:45.145 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 5, 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T14:02:45.145 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T14:02:45.146 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T14:02:45.146 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stdout "num_pgs": 0, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T14:02:45.147 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T14:02:37.820124+0000", 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.060+0000 7f7249bd4700 1 Processor -- start 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.061+0000 7f7249bd4700 1 -- start start 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.061+0000 7f7249bd4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7244071430 0x7f7244071840 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.061+0000 7f7249bd4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7244071d80 con 0x7f7244071430 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.062+0000 7f72437fe700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7244071430 0x7f7244071840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.062+0000 7f72437fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7244071430 0x7f7244071840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57928/0 (socket says 192.168.123.103:57928) 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.062+0000 7f72437fe700 1 -- 192.168.123.103:0/1237089552 learned_addr learned my addr 192.168.123.103:0/1237089552 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.062+0000 7f72437fe700 1 -- 192.168.123.103:0/1237089552 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7244071ec0 con 0x7f7244071430 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.062+0000 7f72437fe700 1 --2- 192.168.123.103:0/1237089552 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7244071430 0x7f7244071840 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f723400ab30 tx=0x7f7234010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=5773a0c872b473c2 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.063+0000 7f72427fc700 1 -- 192.168.123.103:0/1237089552 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7234010e00 con 0x7f7244071430 2026-03-10T14:02:45.147 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.063+0000 7f72427fc700 1 -- 192.168.123.103:0/1237089552 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f72340044d0 con 0x7f7244071430 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.064+0000 7f7249bd4700 1 -- 192.168.123.103:0/1237089552 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7244071430 msgr2=0x7f7244071840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.064+0000 7f7249bd4700 1 --2- 192.168.123.103:0/1237089552 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7244071430 0x7f7244071840 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f723400ab30 tx=0x7f7234010730 comp rx=0 tx=0).stop 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.064+0000 7f7249bd4700 1 -- 192.168.123.103:0/1237089552 shutdown_connections 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.064+0000 7f7249bd4700 1 --2- 192.168.123.103:0/1237089552 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7244071430 0x7f7244071840 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.064+0000 7f7249bd4700 1 -- 192.168.123.103:0/1237089552 >> 192.168.123.103:0/1237089552 conn(0x7f724406c970 msgr2=0x7f724406eda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.064+0000 7f7249bd4700 1 -- 192.168.123.103:0/1237089552 shutdown_connections 2026-03-10T14:02:45.147 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.064+0000 7f7249bd4700 1 -- 192.168.123.103:0/1237089552 wait complete. 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.065+0000 7f7249bd4700 1 Processor -- start 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.065+0000 7f7249bd4700 1 -- start start 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.065+0000 7f7249bd4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72441a8700 0x7f72441a8b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.065+0000 7f7249bd4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f723401a410 con 0x7f72441a8700 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.065+0000 7f72437fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72441a8700 0x7f72441a8b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.065+0000 7f72437fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72441a8700 0x7f72441a8b10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57938/0 (socket says 192.168.123.103:57938) 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.065+0000 7f72437fe700 1 -- 192.168.123.103:0/3747450678 learned_addr 
learned my addr 192.168.123.103:0/3747450678 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.066+0000 7f72437fe700 1 -- 192.168.123.103:0/3747450678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f723400a7e0 con 0x7f72441a8700 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.066+0000 7f72437fe700 1 --2- 192.168.123.103:0/3747450678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72441a8700 0x7f72441a8b10 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f723400bbd0 tx=0x7f7234003980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.066+0000 7f7240ff9700 1 -- 192.168.123.103:0/3747450678 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7234003bd0 con 0x7f72441a8700 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.067+0000 7f7249bd4700 1 -- 192.168.123.103:0/3747450678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f72441a9050 con 0x7f72441a8700 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.067+0000 7f7249bd4700 1 -- 192.168.123.103:0/3747450678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72441abd60 con 0x7f72441a8700 2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.067+0000 7f7240ff9700 1 -- 192.168.123.103:0/3747450678 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f723400f070 con 0x7f72441a8700 
2026-03-10T14:02:45.147 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.067+0000 7f7240ff9700 1 -- 192.168.123.103:0/3747450678 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f72340229a0 con 0x7f72441a8700 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.068+0000 7f7240ff9700 1 -- 192.168.123.103:0/3747450678 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f7234018070 con 0x7f72441a8700 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.068+0000 7f7240ff9700 1 -- 192.168.123.103:0/3747450678 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f723402c930 con 0x7f72441a8700 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.069+0000 7f7249bd4700 1 -- 192.168.123.103:0/3747450678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7230005320 con 0x7f72441a8700 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.070+0000 7f7240ff9700 1 -- 192.168.123.103:0/3747450678 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f7234027070 con 0x7f72441a8700 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.109+0000 7f7249bd4700 1 -- 192.168.123.103:0/3747450678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f7230005190 con 0x7f72441a8700 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.110+0000 7f7240ff9700 1 -- 
192.168.123.103:0/3747450678 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f7234003d30 con 0x7f72441a8700 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.112+0000 7f722a7fc700 1 -- 192.168.123.103:0/3747450678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72441a8700 msgr2=0x7f72441a8b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.112+0000 7f722a7fc700 1 --2- 192.168.123.103:0/3747450678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72441a8700 0x7f72441a8b10 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f723400bbd0 tx=0x7f7234003980 comp rx=0 tx=0).stop 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.113+0000 7f722a7fc700 1 -- 192.168.123.103:0/3747450678 shutdown_connections 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.113+0000 7f722a7fc700 1 --2- 192.168.123.103:0/3747450678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f72441a8700 0x7f72441a8b10 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.113+0000 7f722a7fc700 1 -- 192.168.123.103:0/3747450678 >> 192.168.123.103:0/3747450678 conn(0x7f724406c970 msgr2=0x7f724406d440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.113+0000 7f722a7fc700 1 -- 192.168.123.103:0/3747450678 shutdown_connections 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:45.113+0000 
7f722a7fc700 1 -- 192.168.123.103:0/3747450678 wait complete. 2026-03-10T14:02:45.148 INFO:teuthology.orchestra.run.vm03.stdout:mgr not available, waiting (3/15)... 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3747450678' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: Activating manager daemon vm03.rwbbep 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: mgrmap e2: vm03.rwbbep(active, starting, since 0.00409229s) 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: Manager daemon vm03.rwbbep is now available 2026-03-10T14:02:46.108 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:02:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:45 vm03 ceph-mon[49718]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:47.515 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:47 vm03 ceph-mon[49718]: mgrmap e3: vm03.rwbbep(active, since 1.06573s) 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "b81bf660-1c89-11f1-b612-27d302cdb124", 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 
2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 7, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T14:02:47.712 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T14:02:47.714 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:47.714 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 
"available": true, 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T14:02:37.820124+0000", 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.359+0000 7f32f670d700 1 Processor -- start 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.359+0000 7f32f670d700 1 -- start start 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.359+0000 7f32f670d700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0105030 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.359+0000 7f32f670d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32f0105570 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.360+0000 7f32effff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0105030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.360+0000 7f32effff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0105030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58012/0 (socket says 192.168.123.103:58012) 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.360+0000 7f32effff700 1 -- 192.168.123.103:0/2118501121 learned_addr learned my addr 192.168.123.103:0/2118501121 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.360+0000 7f32effff700 1 -- 192.168.123.103:0/2118501121 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f32f01056b0 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.360+0000 7f32effff700 1 --2- 192.168.123.103:0/2118501121 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0105030 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f32e4009a90 tx=0x7f32e4009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=2427954534d41d54 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32eeffd700 1 -- 192.168.123.103:0/2118501121 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f32e4004030 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32eeffd700 1 -- 192.168.123.103:0/2118501121 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f32e400b7e0 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32f670d700 1 -- 192.168.123.103:0/2118501121 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 msgr2=0x7f32f0105030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32f670d700 1 --2- 192.168.123.103:0/2118501121 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0105030 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f32e4009a90 tx=0x7f32e4009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32f670d700 1 -- 192.168.123.103:0/2118501121 shutdown_connections 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32f670d700 1 --2- 192.168.123.103:0/2118501121 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0105030 unknown :-1 s=CLOSED 
pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32f670d700 1 -- 192.168.123.103:0/2118501121 >> 192.168.123.103:0/2118501121 conn(0x7f32f0100270 msgr2=0x7f32f01026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32f670d700 1 -- 192.168.123.103:0/2118501121 shutdown_connections 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.361+0000 7f32f670d700 1 -- 192.168.123.103:0/2118501121 wait complete. 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.362+0000 7f32f670d700 1 Processor -- start 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.362+0000 7f32f670d700 1 -- start start 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.362+0000 7f32f670d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0199a80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.362+0000 7f32f670d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32f0105570 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.362+0000 7f32effff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0199a80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:47.715 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32effff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0199a80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58024/0 (socket says 192.168.123.103:58024) 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32effff700 1 -- 192.168.123.103:0/3275491445 learned_addr learned my addr 192.168.123.103:0/3275491445 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32effff700 1 -- 192.168.123.103:0/3275491445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f32e4009740 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32effff700 1 --2- 192.168.123.103:0/3275491445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0199a80 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f32e4003e90 tx=0x7f32e4003f70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32ed7fa700 1 -- 192.168.123.103:0/3275491445 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f32e40043d0 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32ed7fa700 1 -- 192.168.123.103:0/3275491445 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f32e4004530 con 0x7f32f0104c20 
2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f32f0199fc0 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32ed7fa700 1 -- 192.168.123.103:0/3275491445 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f32e4011670 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.363+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f32f019a3e0 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.364+0000 7f32ed7fa700 1 -- 192.168.123.103:0/3275491445 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 3) v1 ==== 44962+0+0 (secure 0 0 0) 0x7f32e40117d0 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.365+0000 7f32ed7fa700 1 --2- 192.168.123.103:0/3275491445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f32d80382f0 0x7f32d803a7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.365+0000 7f32ed7fa700 1 -- 192.168.123.103:0/3275491445 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f32e404ce50 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.365+0000 7f32ef7fe700 1 --2- 192.168.123.103:0/3275491445 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f32d80382f0 0x7f32d803a7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.365+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f32f019a690 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.369+0000 7f32ef7fe700 1 --2- 192.168.123.103:0/3275491445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f32d80382f0 0x7f32d803a7a0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f32e0006fd0 tx=0x7f32e0006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.369+0000 7f32ed7fa700 1 -- 192.168.123.103:0/3275491445 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f32e402b950 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.518+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f32f004f9e0 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.639+0000 7f32ed7fa700 1 -- 192.168.123.103:0/3275491445 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1240 (secure 0 0 0) 0x7f32e4018b40 con 0x7f32f0104c20 
2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.644+0000 7f32ed7fa700 1 -- 192.168.123.103:0/3275491445 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7f32e402b430 con 0x7f32f0104c20 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.644+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f32d80382f0 msgr2=0x7f32d803a7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.644+0000 7f32f670d700 1 --2- 192.168.123.103:0/3275491445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f32d80382f0 0x7f32d803a7a0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f32e0006fd0 tx=0x7f32e0006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.644+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 msgr2=0x7f32f0199a80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.645+0000 7f32f670d700 1 --2- 192.168.123.103:0/3275491445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0199a80 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f32e4003e90 tx=0x7f32e4003f70 comp rx=0 tx=0).stop 2026-03-10T14:02:47.715 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.645+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 shutdown_connections 2026-03-10T14:02:47.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.645+0000 7f32f670d700 1 --2- 
192.168.123.103:0/3275491445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f32d80382f0 0x7f32d803a7a0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:47.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.645+0000 7f32f670d700 1 --2- 192.168.123.103:0/3275491445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32f0104c20 0x7f32f0199a80 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:47.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.645+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 >> 192.168.123.103:0/3275491445 conn(0x7f32f0100270 msgr2=0x7f32f00740b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:47.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.645+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 shutdown_connections 2026-03-10T14:02:47.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.645+0000 7f32f670d700 1 -- 192.168.123.103:0/3275491445 wait complete. 
2026-03-10T14:02:47.716 INFO:teuthology.orchestra.run.vm03.stdout:mgr is available 2026-03-10T14:02:48.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:48.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [global] 2026-03-10T14:02:48.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout fsid = b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:02:48.040 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [osd] 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-10T14:02:48.041 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.855+0000 7f4ad7ddc700 1 Processor -- start 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.855+0000 7f4ad7ddc700 1 -- start start 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.856+0000 7f4ad7ddc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0106260 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.856+0000 7f4ad5b78700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0106260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.856+0000 7f4ad5b78700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0106260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58028/0 (socket says 192.168.123.103:58028) 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.856+0000 7f4ad5b78700 1 -- 192.168.123.103:0/2290699772 learned_addr learned my addr 192.168.123.103:0/2290699772 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.856+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/2290699772 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4ad01067a0 con 0x7f4ad0105e50 2026-03-10T14:02:48.041 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.857+0000 7f4ad5b78700 1 -- 192.168.123.103:0/2290699772 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4ad01068e0 con 0x7f4ad0105e50 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.857+0000 7f4ad5b78700 1 --2- 192.168.123.103:0/2290699772 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0106260 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f4acc009a90 tx=0x7f4acc009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=69fdd06b8506f68a server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.857+0000 7f4ad4b76700 1 -- 192.168.123.103:0/2290699772 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4acc004030 con 0x7f4ad0105e50 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.857+0000 7f4ad4b76700 1 -- 192.168.123.103:0/2290699772 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f4acc00b7e0 con 0x7f4ad0105e50 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.857+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/2290699772 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 msgr2=0x7f4ad0106260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.857+0000 7f4ad7ddc700 1 --2- 192.168.123.103:0/2290699772 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0106260 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f4acc009a90 tx=0x7f4acc009da0 comp rx=0 tx=0).stop 
2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.858+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/2290699772 shutdown_connections 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.858+0000 7f4ad7ddc700 1 --2- 192.168.123.103:0/2290699772 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0106260 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.858+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/2290699772 >> 192.168.123.103:0/2290699772 conn(0x7f4ad0101420 msgr2=0x7f4ad0103830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.858+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/2290699772 shutdown_connections 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.858+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/2290699772 wait complete. 
2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.859+0000 7f4ad7ddc700 1 Processor -- start 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.859+0000 7f4ad7ddc700 1 -- start start 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.859+0000 7f4ad7ddc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0199a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.859+0000 7f4ad5b78700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0199a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.859+0000 7f4ad5b78700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0199a10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58036/0 (socket says 192.168.123.103:58036) 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.859+0000 7f4ad5b78700 1 -- 192.168.123.103:0/3641433571 learned_addr learned my addr 192.168.123.103:0/3641433571 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.860+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4ad01067a0 con 0x7f4ad0105e50 2026-03-10T14:02:48.041 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.860+0000 7f4ad5b78700 1 -- 192.168.123.103:0/3641433571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4acc009740 con 0x7f4ad0105e50 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.860+0000 7f4ad5b78700 1 --2- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0199a10 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4acc003ee0 tx=0x7f4acc003fc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.860+0000 7f4ac6ffd700 1 -- 192.168.123.103:0/3641433571 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4acc004460 con 0x7f4ad0105e50 2026-03-10T14:02:48.041 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.860+0000 7f4ac6ffd700 1 -- 192.168.123.103:0/3641433571 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f4acc0045c0 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.860+0000 7f4ac6ffd700 1 -- 192.168.123.103:0/3641433571 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4acc011640 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.860+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4ad0199f50 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.860+0000 7f4ad7ddc700 1 
-- 192.168.123.103:0/3641433571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4ad019a370 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.862+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4ad019a620 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.862+0000 7f4ac6ffd700 1 -- 192.168.123.103:0/3641433571 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7f4acc028020 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.863+0000 7f4ac6ffd700 1 --2- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4abc038510 0x7f4abc03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.863+0000 7f4ac6ffd700 1 -- 192.168.123.103:0/3641433571 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f4acc04bbb0 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.865+0000 7f4ad5377700 1 --2- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4abc038510 0x7f4abc03a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.865+0000 7f4ac6ffd700 1 -- 192.168.123.103:0/3641433571 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4acc019970 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.865+0000 7f4ad5377700 1 --2- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4abc038510 0x7f4abc03a9c0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f4ac0006fd0 tx=0x7f4ac0006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.967+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f4ad010a250 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.968+0000 7f4ac6ffd700 1 -- 192.168.123.103:0/3641433571 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+373 (secure 0 0 0) 0x7f4acc019740 con 0x7f4ad0105e50 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4abc038510 msgr2=0x7f4abc03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 --2- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4abc038510 0x7f4abc03a9c0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f4ac0006fd0 tx=0x7f4ac0006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:48.042 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 msgr2=0x7f4ad0199a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 --2- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0199a10 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4acc003ee0 tx=0x7f4acc003fc0 comp rx=0 tx=0).stop 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 shutdown_connections 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 --2- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4abc038510 0x7f4abc03a9c0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 --2- 192.168.123.103:0/3641433571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4ad0105e50 0x7f4ad0199a10 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 >> 192.168.123.103:0/3641433571 conn(0x7f4ad0101420 msgr2=0x7f4ad0102030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 shutdown_connections 
2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:47.971+0000 7f4ad7ddc700 1 -- 192.168.123.103:0/3641433571 wait complete. 2026-03-10T14:02:48.042 INFO:teuthology.orchestra.run.vm03.stdout:Enabling cephadm module... 2026-03-10T14:02:48.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:48 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3275491445' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T14:02:48.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:48 vm03 ceph-mon[49718]: mgrmap e4: vm03.rwbbep(active, since 2s) 2026-03-10T14:02:48.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:48 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3641433571' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-10T14:02:48.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:48 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/471715523' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-10T14:02:48.684 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.170+0000 7fe76a33a700 1 Processor -- start 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.171+0000 7fe76a33a700 1 -- start start 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.171+0000 7fe76a33a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe764108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.171+0000 7fe76a33a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe764108890 con 0x7fe764107f40 
2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.171+0000 7fe763fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe764108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.171+0000 7fe763fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe764108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58050/0 (socket says 192.168.123.103:58050) 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.172+0000 7fe763fff700 1 -- 192.168.123.103:0/2890536037 learned_addr learned my addr 192.168.123.103:0/2890536037 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.172+0000 7fe763fff700 1 -- 192.168.123.103:0/2890536037 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe7641089d0 con 0x7fe764107f40 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.172+0000 7fe763fff700 1 --2- 192.168.123.103:0/2890536037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe764108350 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fe74c009a90 tx=0x7fe74c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a521dade8ecbd664 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.172+0000 7fe762ffd700 1 -- 192.168.123.103:0/2890536037 <== mon.0 
v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe74c004030 con 0x7fe764107f40 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.172+0000 7fe762ffd700 1 -- 192.168.123.103:0/2890536037 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe74c00b7e0 con 0x7fe764107f40 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.172+0000 7fe762ffd700 1 -- 192.168.123.103:0/2890536037 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe74c003a40 con 0x7fe764107f40 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.173+0000 7fe76a33a700 1 -- 192.168.123.103:0/2890536037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 msgr2=0x7fe764108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.173+0000 7fe76a33a700 1 --2- 192.168.123.103:0/2890536037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe764108350 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fe74c009a90 tx=0x7fe74c009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.173+0000 7fe76a33a700 1 -- 192.168.123.103:0/2890536037 shutdown_connections 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.173+0000 7fe76a33a700 1 --2- 192.168.123.103:0/2890536037 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe764108350 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:48.173+0000 7fe76a33a700 1 -- 192.168.123.103:0/2890536037 >> 192.168.123.103:0/2890536037 conn(0x7fe764103770 msgr2=0x7fe764105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.173+0000 7fe76a33a700 1 -- 192.168.123.103:0/2890536037 shutdown_connections 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.173+0000 7fe76a33a700 1 -- 192.168.123.103:0/2890536037 wait complete. 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.173+0000 7fe76a33a700 1 Processor -- start 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe76a33a700 1 -- start start 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe76a33a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe76419bbe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe76a33a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe76419c120 con 0x7fe764107f40 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe763fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe76419bbe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:48.685 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe763fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 
0x7fe76419bbe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58054/0 (socket says 192.168.123.103:58054) 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe763fff700 1 -- 192.168.123.103:0/471715523 learned_addr learned my addr 192.168.123.103:0/471715523 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe763fff700 1 -- 192.168.123.103:0/471715523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe74c009740 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe763fff700 1 --2- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe76419bbe0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fe74c003710 tx=0x7fe74c003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.174+0000 7fe7617fa700 1 -- 192.168.123.103:0/471715523 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe74c004110 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.175+0000 7fe7617fa700 1 -- 192.168.123.103:0/471715523 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe74c01a430 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.175+0000 7fe7617fa700 1 -- 192.168.123.103:0/471715523 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe74c011460 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.175+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe76419c320 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.175+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe76419c7c0 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.175+0000 7fe7617fa700 1 -- 192.168.123.103:0/471715523 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7fe74c0115c0 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.176+0000 7fe7617fa700 1 --2- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe750038420 0x7fe75003a8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.176+0000 7fe7617fa700 1 -- 192.168.123.103:0/471715523 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fe74c04d170 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.176+0000 7fe7637fe700 1 --2- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe750038420 0x7fe75003a8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:48.686 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.176+0000 7fe7637fe700 1 --2- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe750038420 0x7fe75003a8d0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fe754006fd0 tx=0x7fe754006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.176+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe764062380 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.179+0000 7fe7617fa700 1 -- 192.168.123.103:0/471715523 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe74c02c430 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.308+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7fe764109ef0 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.646+0000 7fe7617fa700 1 -- 192.168.123.103:0/471715523 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fe74c02c7b0 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.647+0000 7fe7617fa700 1 -- 192.168.123.103:0/471715523 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v5) v1 ==== 86+0+0 (secure 0 0 0) 
0x7fe74c04f870 con 0x7fe764107f40 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.649+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe750038420 msgr2=0x7fe75003a8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.649+0000 7fe76a33a700 1 --2- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe750038420 0x7fe75003a8d0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fe754006fd0 tx=0x7fe754006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.649+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 msgr2=0x7fe76419bbe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.649+0000 7fe76a33a700 1 --2- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe76419bbe0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fe74c003710 tx=0x7fe74c003b40 comp rx=0 tx=0).stop 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.650+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 shutdown_connections 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.650+0000 7fe76a33a700 1 --2- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe750038420 0x7fe75003a8d0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-10T14:02:48.650+0000 7fe76a33a700 1 --2- 192.168.123.103:0/471715523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe764107f40 0x7fe76419bbe0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.650+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 >> 192.168.123.103:0/471715523 conn(0x7fe764103770 msgr2=0x7fe764105390 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.650+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 shutdown_connections 2026-03-10T14:02:48.686 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.650+0000 7fe76a33a700 1 -- 192.168.123.103:0/471715523 wait complete. 2026-03-10T14:02:49.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-10T14:02:49.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 5, 2026-03-10T14:02:49.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T14:02:49.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "active_name": "vm03.rwbbep", 2026-03-10T14:02:49.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T14:02:49.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-10T14:02:49.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.827+0000 7f574f59e700 1 Processor -- start 2026-03-10T14:02:49.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.827+0000 7f574f59e700 1 -- start start 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.827+0000 7f574f59e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f5750071410 0x7f5750071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.828+0000 7f574f59e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5750071d60 con 0x7f5750071410 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.828+0000 7f574e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f5750071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.828+0000 7f574e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f5750071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58086/0 (socket says 192.168.123.103:58086) 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.828+0000 7f574e59c700 1 -- 192.168.123.103:0/4068335023 learned_addr learned my addr 192.168.123.103:0/4068335023 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.828+0000 7f574e59c700 1 -- 192.168.123.103:0/4068335023 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5750071ea0 con 0x7f5750071410 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.829+0000 7f574e59c700 1 --2- 192.168.123.103:0/4068335023 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f5750071820 
secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f574000ab30 tx=0x7f5740010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e8069d9474904a90 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.829+0000 7f574d59a700 1 -- 192.168.123.103:0/4068335023 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5740010e00 con 0x7f5750071410 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.829+0000 7f574d59a700 1 -- 192.168.123.103:0/4068335023 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f57400044d0 con 0x7f5750071410 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.829+0000 7f574d59a700 1 -- 192.168.123.103:0/4068335023 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f574001a5b0 con 0x7f5750071410 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.830+0000 7f574f59e700 1 -- 192.168.123.103:0/4068335023 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 msgr2=0x7f5750071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.830+0000 7f574f59e700 1 --2- 192.168.123.103:0/4068335023 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f5750071820 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f574000ab30 tx=0x7f5740010730 comp rx=0 tx=0).stop 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.830+0000 7f574f59e700 1 -- 192.168.123.103:0/4068335023 shutdown_connections 2026-03-10T14:02:49.016 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.830+0000 7f574f59e700 1 --2- 192.168.123.103:0/4068335023 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f5750071820 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.830+0000 7f574f59e700 1 -- 192.168.123.103:0/4068335023 >> 192.168.123.103:0/4068335023 conn(0x7f575006c9d0 msgr2=0x7f575006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.830+0000 7f574f59e700 1 -- 192.168.123.103:0/4068335023 shutdown_connections 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.830+0000 7f574f59e700 1 -- 192.168.123.103:0/4068335023 wait complete. 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.831+0000 7f574f59e700 1 Processor -- start 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.831+0000 7f574f59e700 1 -- start start 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.831+0000 7f574f59e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f57501a86e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.831+0000 7f574f59e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57501a8c20 con 0x7f5750071410 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.831+0000 7f574e59c700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f57501a86e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.831+0000 7f574e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f57501a86e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58094/0 (socket says 192.168.123.103:58094) 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.832+0000 7f574e59c700 1 -- 192.168.123.103:0/1779902364 learned_addr learned my addr 192.168.123.103:0/1779902364 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.832+0000 7f574e59c700 1 -- 192.168.123.103:0/1779902364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f574000a7e0 con 0x7f5750071410 2026-03-10T14:02:49.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.832+0000 7f574e59c700 1 --2- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f57501a86e0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f5740003d60 tx=0x7f57400042e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.834+0000 7f573f7fe700 1 -- 192.168.123.103:0/1779902364 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f57400036a0 con 0x7f5750071410 2026-03-10T14:02:49.017 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.834+0000 7f574f59e700 1 -- 192.168.123.103:0/1779902364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f57501a8e20 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.834+0000 7f574f59e700 1 -- 192.168.123.103:0/1779902364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57501a9320 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.834+0000 7f573f7fe700 1 -- 192.168.123.103:0/1779902364 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f574000f070 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.834+0000 7f573f7fe700 1 -- 192.168.123.103:0/1779902364 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5740009790 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.835+0000 7f574f59e700 1 -- 192.168.123.103:0/1779902364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f57300052f0 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.836+0000 7f573f7fe700 1 -- 192.168.123.103:0/1779902364 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f5740018070 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.836+0000 7f573f7fe700 1 --2- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57380384b0 
0x7f573803a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.836+0000 7f574dd9b700 1 -- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57380384b0 msgr2=0x7f573803a960 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.836+0000 7f574dd9b700 1 --2- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57380384b0 0x7f573803a960 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.836+0000 7f573f7fe700 1 -- 192.168.123.103:0/1779902364 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f574004c980 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.839+0000 7f573f7fe700 1 -- 192.168.123.103:0/1779902364 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f574002c420 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.975+0000 7f574f59e700 1 -- 192.168.123.103:0/1779902364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f57300061d0 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.978+0000 7f573f7fe700 1 -- 192.168.123.103:0/1779902364 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr 
stat"}]=0 v5) v1 ==== 56+0+98 (secure 0 0 0) 0x7f5740022ba0 con 0x7f5750071410 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.981+0000 7f573d7fa700 1 -- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57380384b0 msgr2=0x7f573803a960 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.982+0000 7f573d7fa700 1 --2- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57380384b0 0x7f573803a960 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.982+0000 7f573d7fa700 1 -- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 msgr2=0x7f57501a86e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.982+0000 7f573d7fa700 1 --2- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f57501a86e0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f5740003d60 tx=0x7f57400042e0 comp rx=0 tx=0).stop 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.982+0000 7f573d7fa700 1 -- 192.168.123.103:0/1779902364 shutdown_connections 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.982+0000 7f573d7fa700 1 --2- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57380384b0 0x7f573803a960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:49.017 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.982+0000 7f573d7fa700 1 --2- 192.168.123.103:0/1779902364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5750071410 0x7f57501a86e0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.982+0000 7f573d7fa700 1 -- 192.168.123.103:0/1779902364 >> 192.168.123.103:0/1779902364 conn(0x7f575006c9d0 msgr2=0x7f575006d6b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.982+0000 7f573d7fa700 1 -- 192.168.123.103:0/1779902364 shutdown_connections 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:48.983+0000 7f573d7fa700 1 -- 192.168.123.103:0/1779902364 wait complete. 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for the mgr to restart... 2026-03-10T14:02:49.017 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr epoch 5... 2026-03-10T14:02:49.933 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:49 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/471715523' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-10T14:02:49.934 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:49 vm03 ceph-mon[49718]: mgrmap e5: vm03.rwbbep(active, since 3s) 2026-03-10T14:02:49.934 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:49 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/1779902364' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: Active manager daemon vm03.rwbbep restarted 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: Activating manager daemon vm03.rwbbep 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: osdmap e2: 0 total, 0 up, 0 in 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: mgrmap e6: vm03.rwbbep(active, starting, since 0.00449452s) 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: Manager daemon vm03.rwbbep is now available 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 
10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:02:53.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:53 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 7, 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.169+0000 7f91746cb700 1 Processor -- start 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.169+0000 7f91746cb700 1 -- start start 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.169+0000 7f91746cb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c071200 0x7f916c071610 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:54.434 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.169+0000 7f91746cb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f916c0728b0 con 0x7f916c071200 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.170+0000 7f9172467700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c071200 0x7f916c071610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.170+0000 7f9172467700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c071200 0x7f916c071610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58106/0 (socket says 192.168.123.103:58106) 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.170+0000 7f9172467700 1 -- 192.168.123.103:0/1271775710 learned_addr learned my addr 192.168.123.103:0/1271775710 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.171+0000 7f9172467700 1 -- 192.168.123.103:0/1271775710 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f916c0729f0 con 0x7f916c071200 2026-03-10T14:02:54.434 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.171+0000 7f9172467700 1 --2- 192.168.123.103:0/1271775710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c071200 0x7f916c071610 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f9168009a90 tx=0x7f9168009da0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=d6105270baeb13bb server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.171+0000 7f9171465700 1 -- 192.168.123.103:0/1271775710 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9168004030 con 0x7f916c071200 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.171+0000 7f9171465700 1 -- 192.168.123.103:0/1271775710 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f916800b7e0 con 0x7f916c071200 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.172+0000 7f91746cb700 1 -- 192.168.123.103:0/1271775710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c071200 msgr2=0x7f916c071610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.172+0000 7f91746cb700 1 --2- 192.168.123.103:0/1271775710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c071200 0x7f916c071610 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f9168009a90 tx=0x7f9168009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.172+0000 7f91746cb700 1 -- 192.168.123.103:0/1271775710 shutdown_connections 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.172+0000 7f91746cb700 1 --2- 192.168.123.103:0/1271775710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c071200 0x7f916c071610 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.172+0000 7f91746cb700 1 -- 
192.168.123.103:0/1271775710 >> 192.168.123.103:0/1271775710 conn(0x7f916c06cc30 msgr2=0x7f916c06f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.173+0000 7f91746cb700 1 -- 192.168.123.103:0/1271775710 shutdown_connections 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.173+0000 7f91746cb700 1 -- 192.168.123.103:0/1271775710 wait complete. 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.173+0000 7f91746cb700 1 Processor -- start 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.173+0000 7f91746cb700 1 -- start start 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.173+0000 7f91746cb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c11b460 0x7f916c11b870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.173+0000 7f91746cb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f916c0728b0 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.174+0000 7f9172467700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c11b460 0x7f916c11b870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.174+0000 7f9172467700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c11b460 0x7f916c11b870 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58108/0 (socket says 192.168.123.103:58108) 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.174+0000 7f9172467700 1 -- 192.168.123.103:0/3919436591 learned_addr learned my addr 192.168.123.103:0/3919436591 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.174+0000 7f9172467700 1 -- 192.168.123.103:0/3919436591 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9168009740 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.174+0000 7f9172467700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c11b460 0x7f916c11b870 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f916800bd00 tx=0x7f916800bde0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.176+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f916801a670 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.176+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f916c11ca30 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.176+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) 
v3 -- 0x7f916c11c0c0 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.176+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f916801ac70 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.177+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f91680044e0 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.177+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f916802c430 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.178+0000 7f91637fe700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.178+0000 7f9171c66700 1 -- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 msgr2=0x7f915803a990 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.178+0000 7f9171c66700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T14:02:54.435 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.178+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f915803b0a0 con 0x7f91580384e0 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.178+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f916804cb40 con 0x7f916c11b460 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.378+0000 7f9171c66700 1 -- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 msgr2=0x7f915803a990 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.378+0000 7f9171c66700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T14:02:54.435 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.779+0000 7f9171c66700 1 -- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 msgr2=0x7f915803a990 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:49.779+0000 7f9171c66700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._fault waiting 0.800000 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:50.580+0000 7f9171c66700 1 -- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 msgr2=0x7f915803a990 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:50.580+0000 7f9171c66700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:52.181+0000 7f9171c66700 1 -- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 msgr2=0x7f915803a990 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:52.181+0000 7f9171c66700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:53.373+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mgrmap(e 6) v1 ==== 44846+0+0 (secure 0 0 0) 0x7f916802c6e0 con 0x7f916c11b460 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:53.373+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7f91580384e0 msgr2=0x7f915803a990 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:53.373+0000 7f91637fe700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.376+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f9168011420 con 0x7f916c11b460 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.376+0000 7f91637fe700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.376+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f915803b0a0 con 0x7f91580384e0 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.378+0000 7f9171c66700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.379+0000 7f9171c66700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7f91580384e0 0x7f915803a990 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f915c003a10 tx=0x7f915c0092b0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.379+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f915803b0a0 con 0x7f91580384e0 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.383+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f916c11c7e0 con 0x7f91580384e0 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.383+0000 7f91637fe700 1 -- 192.168.123.103:0/3919436591 <== mgr.14120 v2:192.168.123.103:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7f916c11c7e0 con 0x7f91580384e0 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.383+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 msgr2=0x7f915803a990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.383+0000 7f91746cb700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f915c003a10 tx=0x7f915c0092b0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.383+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c11b460 msgr2=0x7f916c11b870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.383+0000 7f91746cb700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c11b460 0x7f916c11b870 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f916800bd00 tx=0x7f916800bde0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.384+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 shutdown_connections 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.384+0000 7f91746cb700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f91580384e0 0x7f915803a990 secure :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f915c003a10 tx=0x7f915c0092b0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.384+0000 7f91746cb700 1 --2- 192.168.123.103:0/3919436591 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f916c11b460 0x7f916c11b870 secure :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f916800bd00 tx=0x7f916800bde0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.384+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 >> 192.168.123.103:0/3919436591 conn(0x7f916c06cc30 msgr2=0x7f916c112570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.384+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 shutdown_connections 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:54.384+0000 7f91746cb700 1 -- 192.168.123.103:0/3919436591 wait complete. 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:mgr epoch 5 is available 2026-03-10T14:02:54.436 INFO:teuthology.orchestra.run.vm03.stdout:Setting orchestrator backend to cephadm... 2026-03-10T14:02:54.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:54 vm03 ceph-mon[49718]: Found migration_current of "None". Setting to last migration. 2026-03-10T14:02:54.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:54 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:02:54.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:54 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:02:54.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:54 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:54.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:54 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:54.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:54 vm03 ceph-mon[49718]: mgrmap e7: vm03.rwbbep(active, since 1.00788s) 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.575+0000 7f43cb152700 1 Processor -- start 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.575+0000 7f43cb152700 1 -- start start 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.575+0000 7f43cb152700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f43c4107f40 0x7f43c4108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.575+0000 7f43cb152700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f43c4108890 con 0x7f43c4107f40 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.576+0000 7f43c8eee700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c4108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.576+0000 7f43c8eee700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c4108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57738/0 (socket says 192.168.123.103:57738) 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.576+0000 7f43c8eee700 1 -- 192.168.123.103:0/3334043739 learned_addr learned my addr 192.168.123.103:0/3334043739 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.576+0000 7f43c8eee700 1 -- 192.168.123.103:0/3334043739 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f43c41089d0 con 0x7f43c4107f40 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.576+0000 7f43c8eee700 1 --2- 192.168.123.103:0/3334043739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c4108350 
secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f43b4009a90 tx=0x7f43b4009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d25e699c9f590503 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.577+0000 7f43c37fe700 1 -- 192.168.123.103:0/3334043739 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f43b4004030 con 0x7f43c4107f40 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.577+0000 7f43c37fe700 1 -- 192.168.123.103:0/3334043739 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f43b400b7e0 con 0x7f43c4107f40 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.577+0000 7f43cb152700 1 -- 192.168.123.103:0/3334043739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 msgr2=0x7f43c4108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.577+0000 7f43cb152700 1 --2- 192.168.123.103:0/3334043739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c4108350 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f43b4009a90 tx=0x7f43b4009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.577+0000 7f43cb152700 1 -- 192.168.123.103:0/3334043739 shutdown_connections 2026-03-10T14:02:54.762 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.577+0000 7f43cb152700 1 --2- 192.168.123.103:0/3334043739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c4108350 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.763 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.577+0000 7f43cb152700 1 -- 192.168.123.103:0/3334043739 >> 192.168.123.103:0/3334043739 conn(0x7f43c4103770 msgr2=0x7f43c4105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.578+0000 7f43cb152700 1 -- 192.168.123.103:0/3334043739 shutdown_connections 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.578+0000 7f43cb152700 1 -- 192.168.123.103:0/3334043739 wait complete. 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.578+0000 7f43cb152700 1 Processor -- start 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.578+0000 7f43cb152700 1 -- start start 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.579+0000 7f43cb152700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c419bcb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.579+0000 7f43cb152700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f43c4108890 con 0x7f43c4107f40 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.579+0000 7f43c8eee700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c419bcb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.579+0000 7f43c8eee700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c419bcb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57748/0 (socket says 192.168.123.103:57748) 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.579+0000 7f43c8eee700 1 -- 192.168.123.103:0/612781099 learned_addr learned my addr 192.168.123.103:0/612781099 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.579+0000 7f43c8eee700 1 -- 192.168.123.103:0/612781099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f43b4009740 con 0x7f43c4107f40 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.579+0000 7f43c8eee700 1 --2- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c419bcb0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f43b4009710 tx=0x7f43b4003f40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.580+0000 7f43c1ffb700 1 -- 192.168.123.103:0/612781099 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f43b40043a0 con 0x7f43c4107f40 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.580+0000 7f43c1ffb700 1 -- 192.168.123.103:0/612781099 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f43b4004500 con 0x7f43c4107f40 2026-03-10T14:02:54.763 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.580+0000 7f43cb152700 1 -- 
192.168.123.103:0/612781099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f43c419c1f0 con 0x7f43c4107f40 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.580+0000 7f43c1ffb700 1 -- 192.168.123.103:0/612781099 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f43b40115e0 con 0x7f43c4107f40 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.581+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f43c419c610 con 0x7f43c4107f40 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.581+0000 7f43c1ffb700 1 -- 192.168.123.103:0/612781099 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f43b40117c0 con 0x7f43c4107f40 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.581+0000 7f43c1ffb700 1 --2- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f43ac0382d0 0x7f43ac03a780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.581+0000 7f43c1ffb700 1 -- 192.168.123.103:0/612781099 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f43b401a840 con 0x7f43c4107f40 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.581+0000 7f43c3fff700 1 --2- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f43ac0382d0 0x7f43ac03a780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.582+0000 7f43c3fff700 1 --2- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f43ac0382d0 0x7f43ac03a780 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f43b8006fd0 tx=0x7f43b8006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.582+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f43c419c8b0 con 0x7f43c4107f40 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.586+0000 7f43c1ffb700 1 -- 192.168.123.103:0/612781099 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f43b4053050 con 0x7f43c4107f40 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.701+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7f43c4105c10 con 0x7f43ac0382d0 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.712+0000 7f43c1ffb700 1 -- 192.168.123.103:0/612781099 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f43c4105c10 con 0x7f43ac0382d0 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.715+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f43ac0382d0 msgr2=0x7f43ac03a780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.715+0000 7f43cb152700 1 --2- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f43ac0382d0 0x7f43ac03a780 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f43b8006fd0 tx=0x7f43b8006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.715+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 msgr2=0x7f43c419bcb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.715+0000 7f43cb152700 1 --2- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c419bcb0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f43b4009710 tx=0x7f43b4003f40 comp rx=0 tx=0).stop 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.715+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 shutdown_connections 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.715+0000 7f43cb152700 1 --2- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f43ac0382d0 0x7f43ac03a780 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.716+0000 7f43cb152700 1 --2- 192.168.123.103:0/612781099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f43c4107f40 0x7f43c419bcb0 unknown :-1 s=CLOSED 
pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.716+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 >> 192.168.123.103:0/612781099 conn(0x7f43c4103770 msgr2=0x7f43c4105500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.716+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 shutdown_connections 2026-03-10T14:02:54.764 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.716+0000 7f43cb152700 1 -- 192.168.123.103:0/612781099 wait complete. 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout value unchanged 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.893+0000 7f60680a8700 1 Processor -- start 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.893+0000 7f60680a8700 1 -- start start 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.893+0000 7f60680a8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f6060104430 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.893+0000 7f60680a8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6060104970 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.893+0000 7f6065e44700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f6060104430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.893+0000 7f6065e44700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f6060104430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57758/0 (socket says 192.168.123.103:57758) 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.893+0000 7f6065e44700 1 -- 192.168.123.103:0/2061598596 learned_addr learned my addr 192.168.123.103:0/2061598596 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.894+0000 7f6065e44700 1 -- 192.168.123.103:0/2061598596 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6060104ab0 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.894+0000 7f6065e44700 1 --2- 192.168.123.103:0/2061598596 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f6060104430 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f6050009a90 tx=0x7f6050009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f7af277039d3c838 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.894+0000 7f6064e42700 1 -- 192.168.123.103:0/2061598596 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6050004030 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.894+0000 7f6064e42700 1 -- 192.168.123.103:0/2061598596 <== mon.0 v2:192.168.123.103:3300/0 2 ==== 
config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f605000b7e0 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.895+0000 7f60680a8700 1 -- 192.168.123.103:0/2061598596 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 msgr2=0x7f6060104430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.895+0000 7f60680a8700 1 --2- 192.168.123.103:0/2061598596 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f6060104430 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f6050009a90 tx=0x7f6050009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.895+0000 7f60680a8700 1 -- 192.168.123.103:0/2061598596 shutdown_connections 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.895+0000 7f60680a8700 1 --2- 192.168.123.103:0/2061598596 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f6060104430 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.895+0000 7f60680a8700 1 -- 192.168.123.103:0/2061598596 >> 192.168.123.103:0/2061598596 conn(0x7f60600ffa60 msgr2=0x7f6060101e70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.895+0000 7f60680a8700 1 -- 192.168.123.103:0/2061598596 shutdown_connections 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.895+0000 7f60680a8700 1 -- 192.168.123.103:0/2061598596 wait complete. 
2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.896+0000 7f60680a8700 1 Processor -- start 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.896+0000 7f60680a8700 1 -- start start 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.896+0000 7f60680a8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f606019ffd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.896+0000 7f6065e44700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f606019ffd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.896+0000 7f6065e44700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f606019ffd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57772/0 (socket says 192.168.123.103:57772) 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.896+0000 7f6065e44700 1 -- 192.168.123.103:0/71572002 learned_addr learned my addr 192.168.123.103:0/71572002 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.896+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6060104970 con 0x7f6060104020 2026-03-10T14:02:55.064 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.897+0000 7f6065e44700 1 -- 192.168.123.103:0/71572002 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6050009740 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.897+0000 7f6065e44700 1 --2- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f606019ffd0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f6050003ec0 tx=0x7f6050003fa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.897+0000 7f6056ffd700 1 -- 192.168.123.103:0/71572002 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6050004400 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.897+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6060105c60 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.897+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f60601a07a0 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.898+0000 7f6056ffd700 1 -- 192.168.123.103:0/71572002 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f6050004560 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.898+0000 7f6056ffd700 1 -- 
192.168.123.103:0/71572002 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6050011620 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.898+0000 7f6056ffd700 1 -- 192.168.123.103:0/71572002 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f6050011840 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.898+0000 7f6056ffd700 1 --2- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f604c0384a0 0x7f604c03a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.898+0000 7f6056ffd700 1 -- 192.168.123.103:0/71572002 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f605004cf80 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.899+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6044005320 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.900+0000 7f6065643700 1 --2- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f604c0384a0 0x7f604c03a950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.900+0000 7f6065643700 1 --2- 192.168.123.103:0/71572002 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f604c0384a0 0x7f604c03a950 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f605c006fd0 tx=0x7f605c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:54.902+0000 7f6056ffd700 1 -- 192.168.123.103:0/71572002 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6050011af0 con 0x7f6060104020 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.009+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7f6044000bf0 con 0x7f604c0384a0 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.010+0000 7f6056ffd700 1 -- 192.168.123.103:0/71572002 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7f6044000bf0 con 0x7f604c0384a0 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.014+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f604c0384a0 msgr2=0x7f604c03a950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.014+0000 7f60680a8700 1 --2- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f604c0384a0 0x7f604c03a950 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f605c006fd0 tx=0x7f605c006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:55.064 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.014+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 msgr2=0x7f606019ffd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.014+0000 7f60680a8700 1 --2- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f606019ffd0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f6050003ec0 tx=0x7f6050003fa0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.015+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 shutdown_connections 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.015+0000 7f60680a8700 1 --2- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f604c0384a0 0x7f604c03a950 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.015+0000 7f60680a8700 1 --2- 192.168.123.103:0/71572002 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6060104020 0x7f606019ffd0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.015+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 >> 192.168.123.103:0/71572002 conn(0x7f60600ffa60 msgr2=0x7f606010a5c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.015+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 shutdown_connections 2026-03-10T14:02:55.064 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.016+0000 7f60680a8700 1 -- 192.168.123.103:0/71572002 wait complete. 2026-03-10T14:02:55.064 INFO:teuthology.orchestra.run.vm03.stdout:Generating ssh key... 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.207+0000 7f528b277700 1 Processor -- start 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.207+0000 7f528b277700 1 -- start start 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.207+0000 7f528b277700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f5284106260 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.207+0000 7f528b277700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52841067a0 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.208+0000 7f5289013700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f5284106260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.208+0000 7f5289013700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f5284106260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57788/0 (socket says 192.168.123.103:57788) 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:55.208+0000 7f5289013700 1 -- 192.168.123.103:0/3825909024 learned_addr learned my addr 192.168.123.103:0/3825909024 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.208+0000 7f5289013700 1 -- 192.168.123.103:0/3825909024 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52841068e0 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.208+0000 7f5289013700 1 --2- 192.168.123.103:0/3825909024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f5284106260 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f5280009cf0 tx=0x7f528000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=2ae740f83d185efe server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.208+0000 7f527bfff700 1 -- 192.168.123.103:0/3825909024 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5280004030 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.209+0000 7f527bfff700 1 -- 192.168.123.103:0/3825909024 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f528000b810 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.209+0000 7f528b277700 1 -- 192.168.123.103:0/3825909024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 msgr2=0x7f5284106260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.209+0000 7f528b277700 1 --2- 192.168.123.103:0/3825909024 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f5284106260 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f5280009cf0 tx=0x7f528000b0e0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.209+0000 7f528b277700 1 -- 192.168.123.103:0/3825909024 shutdown_connections 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.209+0000 7f528b277700 1 --2- 192.168.123.103:0/3825909024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f5284106260 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.209+0000 7f528b277700 1 -- 192.168.123.103:0/3825909024 >> 192.168.123.103:0/3825909024 conn(0x7f5284101420 msgr2=0x7f5284103830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.209+0000 7f528b277700 1 -- 192.168.123.103:0/3825909024 shutdown_connections 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.210+0000 7f528b277700 1 -- 192.168.123.103:0/3825909024 wait complete. 
2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.210+0000 7f528b277700 1 Processor -- start 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.210+0000 7f528b277700 1 -- start start 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.210+0000 7f528b277700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f528419bc50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.210+0000 7f528b277700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52841067a0 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.210+0000 7f5289013700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f528419bc50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.211+0000 7f5289013700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f528419bc50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57800/0 (socket says 192.168.123.103:57800) 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.211+0000 7f5289013700 1 -- 192.168.123.103:0/552320928 learned_addr learned my addr 192.168.123.103:0/552320928 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:55.481 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.211+0000 7f5289013700 1 -- 192.168.123.103:0/552320928 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5280009740 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.211+0000 7f5289013700 1 --2- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f528419bc50 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f5280003ee0 tx=0x7f5280003fc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.211+0000 7f527a7fc700 1 -- 192.168.123.103:0/552320928 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5280004320 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.211+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f528419c190 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.212+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f528419c630 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.212+0000 7f527a7fc700 1 -- 192.168.123.103:0/552320928 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f5280004480 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.212+0000 7f527a7fc700 1 -- 
192.168.123.103:0/552320928 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f528001a560 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.213+0000 7f527a7fc700 1 -- 192.168.123.103:0/552320928 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f528001a7c0 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.213+0000 7f527a7fc700 1 --2- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52700383b0 0x7f527003a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.213+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f52841068e0 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.214+0000 7f5288812700 1 --2- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52700383b0 0x7f527003a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.215+0000 7f5288812700 1 --2- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52700383b0 0x7f527003a860 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f5274006fd0 tx=0x7f5274006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.481 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.215+0000 7f527a7fc700 1 -- 192.168.123.103:0/552320928 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f528001e070 con 0x7f5284105e50 2026-03-10T14:02:55.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.216+0000 7f527a7fc700 1 -- 192.168.123.103:0/552320928 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f52841068e0 con 0x7f5284105e50 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.321+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f52841068e0 con 0x7f52700383b0 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.426+0000 7f527a7fc700 1 -- 192.168.123.103:0/552320928 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f52841068e0 con 0x7f52700383b0 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52700383b0 msgr2=0x7f527003a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 --2- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52700383b0 0x7f527003a860 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f5274006fd0 tx=0x7f5274006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:55.482 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 msgr2=0x7f528419bc50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 --2- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f528419bc50 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f5280003ee0 tx=0x7f5280003fc0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 shutdown_connections 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 --2- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52700383b0 0x7f527003a860 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 --2- 192.168.123.103:0/552320928 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5284105e50 0x7f528419bc50 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 >> 192.168.123.103:0/552320928 conn(0x7f5284101420 msgr2=0x7f5284101fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 shutdown_connections 
2026-03-10T14:02:55.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.429+0000 7f528b277700 1 -- 192.168.123.103:0/552320928 wait complete. 2026-03-10T14:02:55.598 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: [10/Mar/2026:14:02:54] ENGINE Bus STARTING 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: [10/Mar/2026:14:02:54] ENGINE Serving on http://192.168.123.103:8765 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: [10/Mar/2026:14:02:54] ENGINE Serving on https://192.168.123.103:7150 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: [10/Mar/2026:14:02:54] ENGINE Bus STARTED 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:02:55.599 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:55 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:55.776 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDJxXup9gGfn3tPtASG1Q+juHRUnudNH1ynOUCAqvqcGWEOBs3CBYAr3KDP7hI82XXy5AQFMPyrgtNX0jTpNqIpHcBEC7FlVT8vA2QItERgZ1zK3EGoE6ot8q3zVLJx0WQ4PhzS9GSg5NoKWdms/R2TiStmGf2lpiG5WLOPLikvyCdLO8JCj7n+7Hnhy/Wd+kSFcXF07O/wEu5xhu7zfVmFP0nIj0Tbdn4oFsiqrFtfTrjsKzkrxUPp49Itu06pFSxP7dqAFNZ14vXwgR8VJOejjvPWYrkZmqYFFpfrMVZREZdqp44G3AnXdrdlU5PenydGBws7AkjBrftQSwfBfq0qP7MV/35WPC9mFYenxkOPSMajMNn86HTdL6sK2yQduSXPGHVfko401HGCIITmHuZnTgbjn+idMT5lGA24KftpS6vqjA1w+U37oJepadCJTBw4jcFAJ5ZEaep72Dn+wN2waOIDX1HtNEtN+yqnEF5ASRf3st0ubGRxxZbMoYajQRE= ceph-b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.606+0000 7f77cd1e9700 1 Processor -- start 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.606+0000 7f77cd1e9700 1 -- start start 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.607+0000 7f77cd1e9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c8108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.607+0000 7f77cd1e9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77c8108890 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.607+0000 7f77c6d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f77c8107f40 0x7f77c8108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.607+0000 7f77c6d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c8108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57810/0 (socket says 192.168.123.103:57810) 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.607+0000 7f77c6d9d700 1 -- 192.168.123.103:0/568643141 learned_addr learned my addr 192.168.123.103:0/568643141 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.607+0000 7f77c6d9d700 1 -- 192.168.123.103:0/568643141 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f77c81089d0 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.607+0000 7f77c6d9d700 1 --2- 192.168.123.103:0/568643141 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c8108350 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f77b0009a90 tx=0x7f77b0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7ba4b909884f2051 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77c5d9b700 1 -- 192.168.123.103:0/568643141 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f77b0004030 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:55.608+0000 7f77c5d9b700 1 -- 192.168.123.103:0/568643141 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f77b000b7e0 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77c5d9b700 1 -- 192.168.123.103:0/568643141 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f77b0003a40 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77cd1e9700 1 -- 192.168.123.103:0/568643141 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 msgr2=0x7f77c8108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77cd1e9700 1 --2- 192.168.123.103:0/568643141 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c8108350 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f77b0009a90 tx=0x7f77b0009da0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77cd1e9700 1 -- 192.168.123.103:0/568643141 shutdown_connections 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77cd1e9700 1 --2- 192.168.123.103:0/568643141 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c8108350 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77cd1e9700 1 -- 192.168.123.103:0/568643141 >> 192.168.123.103:0/568643141 conn(0x7f77c807b4b0 msgr2=0x7f77c807b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:55.777 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77cd1e9700 1 -- 192.168.123.103:0/568643141 shutdown_connections 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.608+0000 7f77cd1e9700 1 -- 192.168.123.103:0/568643141 wait complete. 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.609+0000 7f77cd1e9700 1 Processor -- start 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.609+0000 7f77cd1e9700 1 -- start start 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.609+0000 7f77cd1e9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c819ba70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.609+0000 7f77cd1e9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77c819bfb0 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.609+0000 7f77c6d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c819ba70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.609+0000 7f77c6d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c819ba70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57812/0 (socket says 
192.168.123.103:57812) 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.609+0000 7f77c6d9d700 1 -- 192.168.123.103:0/409796153 learned_addr learned my addr 192.168.123.103:0/409796153 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.610+0000 7f77c6d9d700 1 -- 192.168.123.103:0/409796153 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f77b0009740 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.610+0000 7f77c6d9d700 1 --2- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c819ba70 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f77b000bf20 tx=0x7f77b000bf50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.610+0000 7f77bffff700 1 -- 192.168.123.103:0/409796153 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f77b00041a0 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.610+0000 7f77bffff700 1 -- 192.168.123.103:0/409796153 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f77b0004300 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.610+0000 7f77bffff700 1 -- 192.168.123.103:0/409796153 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f77b00114c0 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.610+0000 
7f77cd1e9700 1 -- 192.168.123.103:0/409796153 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f77c819c1b0 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.610+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f77c819c5d0 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.611+0000 7f77bffff700 1 -- 192.168.123.103:0/409796153 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f77b0011620 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.611+0000 7f77bffff700 1 --2- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f77b4038400 0x7f77b403a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.611+0000 7f77bffff700 1 -- 192.168.123.103:0/409796153 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f77b004cf80 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.611+0000 7f77c659c700 1 --2- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f77b4038400 0x7f77b403a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.612+0000 7f77c659c700 1 --2- 192.168.123.103:0/409796153 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f77b4038400 0x7f77b403a8b0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f77b8006fd0 tx=0x7f77b8006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.613+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f77c804f9e0 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.615+0000 7f77bffff700 1 -- 192.168.123.103:0/409796153 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f77b002b430 con 0x7f77c8107f40 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.721+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f77c8105b60 con 0x7f77b4038400 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.722+0000 7f77bffff700 1 -- 192.168.123.103:0/409796153 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+595 (secure 0 0 0) 0x7f77c8105b60 con 0x7f77b4038400 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.724+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f77b4038400 msgr2=0x7f77b403a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:55.724+0000 7f77cd1e9700 1 --2- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f77b4038400 0x7f77b403a8b0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f77b8006fd0 tx=0x7f77b8006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.724+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 msgr2=0x7f77c819ba70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.724+0000 7f77cd1e9700 1 --2- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c819ba70 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f77b000bf20 tx=0x7f77b000bf50 comp rx=0 tx=0).stop 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.724+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 shutdown_connections 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.724+0000 7f77cd1e9700 1 --2- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f77b4038400 0x7f77b403a8b0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.724+0000 7f77cd1e9700 1 --2- 192.168.123.103:0/409796153 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f77c8107f40 0x7f77c819ba70 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.725+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 >> 
192.168.123.103:0/409796153 conn(0x7f77c807b4b0 msgr2=0x7f77c8105450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.725+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 shutdown_connections 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.725+0000 7f77cd1e9700 1 -- 192.168.123.103:0/409796153 wait complete. 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:Adding key to root@localhost authorized_keys... 2026-03-10T14:02:55.777 INFO:teuthology.orchestra.run.vm03.stdout:Adding host vm03... 2026-03-10T14:02:56.659 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:56 vm03 ceph-mon[49718]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:02:56.659 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:56 vm03 ceph-mon[49718]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:02:56.659 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:56 vm03 ceph-mon[49718]: Generating ssh key... 
2026-03-10T14:02:56.659 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:56 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:56.659 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:56 vm03 ceph-mon[49718]: mgrmap e8: vm03.rwbbep(active, since 2s) 2026-03-10T14:02:57.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:57 vm03 ceph-mon[49718]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:02:57.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:57 vm03 ceph-mon[49718]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm03", "addr": "192.168.123.103", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:02:57.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:57 vm03 ceph-mon[49718]: Deploying cephadm binary to vm03 2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Added host 'vm03' with addr '192.168.123.103' 2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.909+0000 7f056b527700 1 Processor -- start 2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.910+0000 7f056b527700 1 -- start start 2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.910+0000 7f056b527700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f0564108320 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.910+0000 7f056b527700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0564108860 con 0x7f0564107ef0 2026-03-10T14:02:57.801 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.910+0000 7f05692c3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f0564108320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.910+0000 7f05692c3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f0564108320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57814/0 (socket says 192.168.123.103:57814)
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.910+0000 7f05692c3700 1 -- 192.168.123.103:0/3175155428 learned_addr learned my addr 192.168.123.103:0/3175155428 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.911+0000 7f05692c3700 1 -- 192.168.123.103:0/3175155428 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f05641089a0 con 0x7f0564107ef0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.911+0000 7f05692c3700 1 --2- 192.168.123.103:0/3175155428 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f0564108320 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f0558009a90 tx=0x7f0558009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e63a1df44a2442d2 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.911+0000 7f0557fff700 1 -- 192.168.123.103:0/3175155428 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0558004030 con 0x7f0564107ef0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.911+0000 7f0557fff700 1 -- 192.168.123.103:0/3175155428 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f055800b7e0 con 0x7f0564107ef0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.912+0000 7f0557fff700 1 -- 192.168.123.103:0/3175155428 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0558003a40 con 0x7f0564107ef0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.912+0000 7f056b527700 1 -- 192.168.123.103:0/3175155428 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 msgr2=0x7f0564108320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.912+0000 7f056b527700 1 --2- 192.168.123.103:0/3175155428 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f0564108320 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f0558009a90 tx=0x7f0558009da0 comp rx=0 tx=0).stop
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.912+0000 7f056b527700 1 -- 192.168.123.103:0/3175155428 shutdown_connections
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.912+0000 7f056b527700 1 --2- 192.168.123.103:0/3175155428 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f0564108320 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.912+0000 7f056b527700 1 -- 192.168.123.103:0/3175155428 >> 192.168.123.103:0/3175155428 conn(0x7f0564103770 msgr2=0x7f0564105b50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.913+0000 7f056b527700 1 -- 192.168.123.103:0/3175155428 shutdown_connections
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.913+0000 7f056b527700 1 -- 192.168.123.103:0/3175155428 wait complete.
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.913+0000 7f056b527700 1 Processor -- start
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.913+0000 7f056b527700 1 -- start start
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.914+0000 7f056b527700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f056419bab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.914+0000 7f056b527700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f056419bff0 con 0x7f0564107ef0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.914+0000 7f05692c3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f056419bab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.914+0000 7f05692c3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f056419bab0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57826/0 (socket says 192.168.123.103:57826)
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.914+0000 7f05692c3700 1 -- 192.168.123.103:0/3515850364 learned_addr learned my addr 192.168.123.103:0/3515850364 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.914+0000 7f05692c3700 1 -- 192.168.123.103:0/3515850364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0558009740 con 0x7f0564107ef0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.914+0000 7f05692c3700 1 --2- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f056419bab0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f055800bef0 tx=0x7f0558003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.914+0000 7f05567fc700 1 -- 192.168.123.103:0/3515850364 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0558004140 con 0x7f0564107ef0
2026-03-10T14:02:57.801 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.915+0000 7f05567fc700 1 -- 192.168.123.103:0/3515850364 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f05580042a0 con 0x7f0564107ef0
2026-03-10T14:02:57.802 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.915+0000 7f05567fc700 1 -- 192.168.123.103:0/3515850364 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0558011440 con 0x7f0564107ef0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.915+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f056419c250 con 0x7f0564107ef0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.915+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f056419f050 con 0x7f0564107ef0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.916+0000 7f05567fc700 1 -- 192.168.123.103:0/3515850364 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f05580115a0 con 0x7f0564107ef0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.916+0000 7f05567fc700 1 --2- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0550038450 0x7f055003a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.916+0000 7f05567fc700 1 -- 192.168.123.103:0/3515850364 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f055804d120 con 0x7f0564107ef0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.916+0000 7f0568ac2700 1 --2- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0550038450 0x7f055003a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.917+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f056404f9e0 con 0x7f0564107ef0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.917+0000 7f0568ac2700 1 --2- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0550038450 0x7f055003a900 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f0560006fd0 tx=0x7f0560006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:55.920+0000 7f05567fc700 1 -- 192.168.123.103:0/3515850364 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f055802b430 con 0x7f0564107ef0
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:56.025+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm03", "addr": "192.168.123.103", "target": ["mon-mgr", ""]}) v1 -- 0x7f0564105a60 con 0x7f0550038450
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.765+0000 7f05567fc700 1 -- 192.168.123.103:0/3515850364 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f0564105a60 con 0x7f0550038450
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.768+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0550038450 msgr2=0x7f055003a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.768+0000 7f056b527700 1 --2- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0550038450 0x7f055003a900 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f0560006fd0 tx=0x7f0560006e40 comp rx=0 tx=0).stop
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.769+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 msgr2=0x7f056419bab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.769+0000 7f056b527700 1 --2- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f056419bab0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f055800bef0 tx=0x7f0558003b40 comp rx=0 tx=0).stop
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.769+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 shutdown_connections
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.769+0000 7f056b527700 1 --2- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0550038450 0x7f055003a900 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.769+0000 7f056b527700 1 --2- 192.168.123.103:0/3515850364 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0564107ef0 0x7f056419bab0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.769+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 >> 192.168.123.103:0/3515850364 conn(0x7f0564103770 msgr2=0x7f0564105350 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.769+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 shutdown_connections
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.769+0000 7f056b527700 1 -- 192.168.123.103:0/3515850364 wait complete.
2026-03-10T14:02:57.804 INFO:teuthology.orchestra.run.vm03.stdout:Deploying mon service with default placement...
2026-03-10T14:02:58.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled mon update...
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.945+0000 7f1bb14c4700 1 Processor -- start
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.945+0000 7f1bb14c4700 1 -- start start
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.945+0000 7f1bb14c4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac071200 0x7f1bac071610 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.945+0000 7f1bb14c4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1bac0728b0 con 0x7f1bac071200
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.946+0000 7f1baaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac071200 0x7f1bac071610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.946+0000 7f1baaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac071200 0x7f1bac071610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57842/0 (socket says 192.168.123.103:57842)
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.946+0000 7f1baaffd700 1 -- 192.168.123.103:0/3095466135 learned_addr learned my addr 192.168.123.103:0/3095466135 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.946+0000 7f1baaffd700 1 -- 192.168.123.103:0/3095466135 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1bac0729f0 con 0x7f1bac071200
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.946+0000 7f1baaffd700 1 --2- 192.168.123.103:0/3095466135 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac071200 0x7f1bac071610 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f1ba0009cf0 tx=0x7f1ba000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=ca2f19d60bd959bd server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1ba9ffb700 1 -- 192.168.123.103:0/3095466135 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1ba0004030 con 0x7f1bac071200
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1ba9ffb700 1 -- 192.168.123.103:0/3095466135 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f1ba000b810 con 0x7f1bac071200
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1bb14c4700 1 -- 192.168.123.103:0/3095466135 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac071200 msgr2=0x7f1bac071610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1bb14c4700 1 --2- 192.168.123.103:0/3095466135 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac071200 0x7f1bac071610 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f1ba0009cf0 tx=0x7f1ba000b0e0 comp rx=0 tx=0).stop
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1bb14c4700 1 -- 192.168.123.103:0/3095466135 shutdown_connections
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1bb14c4700 1 --2- 192.168.123.103:0/3095466135 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac071200 0x7f1bac071610 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1bb14c4700 1 -- 192.168.123.103:0/3095466135 >> 192.168.123.103:0/3095466135 conn(0x7f1bac06cc30 msgr2=0x7f1bac06f060 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1bb14c4700 1 -- 192.168.123.103:0/3095466135 shutdown_connections
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.947+0000 7f1bb14c4700 1 -- 192.168.123.103:0/3095466135 wait complete.
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1bb14c4700 1 Processor -- start
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1bb14c4700 1 -- start start
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1bb14c4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac1a8980 0x7f1bac1a8d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1bb14c4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1bac0728b0 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1baaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac1a8980 0x7f1bac1a8d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1baaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac1a8980 0x7f1bac1a8d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57852/0 (socket says 192.168.123.103:57852)
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1baaffd700 1 -- 192.168.123.103:0/29363097 learned_addr learned my addr 192.168.123.103:0/29363097 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1baaffd700 1 -- 192.168.123.103:0/29363097 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1ba0009740 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.948+0000 7f1baaffd700 1 --2- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac1a8980 0x7f1bac1a8d90 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f1ba0009cc0 tx=0x7f1ba0003cb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.949+0000 7f1b93fff700 1 -- 192.168.123.103:0/29363097 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1ba0003ed0 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.949+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1bac1a92d0 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.949+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1bac1abf60 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.949+0000 7f1b93fff700 1 -- 192.168.123.103:0/29363097 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f1ba00044d0 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.949+0000 7f1b93fff700 1 -- 192.168.123.103:0/29363097 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1ba001ac60 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.949+0000 7f1b93fff700 1 -- 192.168.123.103:0/29363097 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f1ba0011420 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.950+0000 7f1b93fff700 1 --2- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1b940384b0 0x7f1b9403a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.950+0000 7f1baa7fc700 1 --2- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1b940384b0 0x7f1b9403a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.950+0000 7f1b93fff700 1 -- 192.168.123.103:0/29363097 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f1ba004c6c0 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.950+0000 7f1baa7fc700 1 --2- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1b940384b0 0x7f1b9403a960 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f1ba400ad30 tx=0x7f1ba40093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.951+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1b98005320 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:57.953+0000 7f1b93fff700 1 -- 192.168.123.103:0/29363097 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1ba001f020 con 0x7f1bac1a8980
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.071+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7f1b98000bf0 con 0x7f1b940384b0
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.078+0000 7f1b93fff700 1 -- 192.168.123.103:0/29363097 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f1b98000bf0 con 0x7f1b940384b0
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1b940384b0 msgr2=0x7f1b9403a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 --2- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1b940384b0 0x7f1b9403a960 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f1ba400ad30 tx=0x7f1ba40093f0 comp rx=0 tx=0).stop
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac1a8980 msgr2=0x7f1bac1a8d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 --2- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac1a8980 0x7f1bac1a8d90 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f1ba0009cc0 tx=0x7f1ba0003cb0 comp rx=0 tx=0).stop
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 shutdown_connections
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 --2- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1b940384b0 0x7f1b9403a960 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 --2- 192.168.123.103:0/29363097 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1bac1a8980 0x7f1bac1a8d90 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 >> 192.168.123.103:0/29363097 conn(0x7f1bac06cc30 msgr2=0x7f1bac112570 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 shutdown_connections
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.081+0000 7f1bb14c4700 1 -- 192.168.123.103:0/29363097 wait complete.
2026-03-10T14:02:58.119 INFO:teuthology.orchestra.run.vm03.stdout:Deploying mgr service with default placement...
2026-03-10T14:02:58.472 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled mgr update...
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.281+0000 7fb671c75700 1 Processor -- start
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.281+0000 7fb671c75700 1 -- start start
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.282+0000 7fb671c75700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.282+0000 7fb671c75700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb66c071d60 con 0x7fb66c071410
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.282+0000 7fb670c73700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.282+0000 7fb670c73700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57862/0 (socket says 192.168.123.103:57862)
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.282+0000 7fb670c73700 1 -- 192.168.123.103:0/3887267279 learned_addr learned my addr 192.168.123.103:0/3887267279 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.282+0000 7fb670c73700 1 -- 192.168.123.103:0/3887267279 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb66c071ea0 con 0x7fb66c071410
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.284+0000 7fb670c73700 1 --2- 192.168.123.103:0/3887267279 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c071820 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fb65c0096e0 tx=0x7fb65c00c1f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=88b99dc2016437de server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.284+0000 7fb66b7fe700 1 -- 192.168.123.103:0/3887267279 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb65c004030 con 0x7fb66c071410
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.284+0000 7fb66b7fe700 1 -- 192.168.123.103:0/3887267279 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb65c00c920 con 0x7fb66c071410
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.284+0000 7fb66b7fe700 1 -- 192.168.123.103:0/3887267279 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb65c003bc0 con 0x7fb66c071410
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.285+0000 7fb671c75700 1 -- 192.168.123.103:0/3887267279 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 msgr2=0x7fb66c071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:02:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.285+0000 7fb671c75700 1 --2- 192.168.123.103:0/3887267279 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c071820 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fb65c0096e0 tx=0x7fb65c00c1f0 comp rx=0 tx=0).stop
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.285+0000 7fb671c75700 1 -- 192.168.123.103:0/3887267279 shutdown_connections
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.285+0000 7fb671c75700 1 --2- 192.168.123.103:0/3887267279 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c071820 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.285+0000 7fb671c75700 1 -- 192.168.123.103:0/3887267279 >> 192.168.123.103:0/3887267279 conn(0x7fb66c06c9d0 msgr2=0x7fb66c06ee00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.287+0000 7fb671c75700 1 -- 192.168.123.103:0/3887267279 shutdown_connections
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.287+0000 7fb671c75700 1 -- 192.168.123.103:0/3887267279 wait complete.
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.291+0000 7fb671c75700 1 Processor -- start
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.292+0000 7fb671c75700 1 -- start start
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.292+0000 7fb671c75700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c1a86e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.292+0000 7fb671c75700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb66c1a8c20 con 0x7fb66c071410
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.293+0000 7fb670c73700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c1a86e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.293+0000 7fb670c73700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c1a86e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57872/0 (socket says 192.168.123.103:57872)
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.293+0000 7fb670c73700 1 -- 192.168.123.103:0/3307531024 learned_addr learned my addr 192.168.123.103:0/3307531024 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.293+0000 7fb670c73700 1 -- 192.168.123.103:0/3307531024 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb65c009160 con 0x7fb66c071410
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.293+0000 7fb670c73700 1 --2- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c1a86e0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb65c00c670 tx=0x7fb65c003f80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.293+0000 7fb669ffb700 1 -- 192.168.123.103:0/3307531024 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb65c004360 con 0x7fb66c071410
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.293+0000 7fb671c75700 1 -- 192.168.123.103:0/3307531024 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb66c1a8e20 con 0x7fb66c071410
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.293+0000 7fb671c75700 1 -- 192.168.123.103:0/3307531024 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb66c1a92c0 con 0x7fb66c071410
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.294+0000 7fb669ffb700 1 -- 192.168.123.103:0/3307531024 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb65c0044c0 con 0x7fb66c071410
2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.294+0000 7fb669ffb700 1
-- 192.168.123.103:0/3307531024 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb65c018620 con 0x7fb66c071410 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.295+0000 7fb671c75700 1 -- 192.168.123.103:0/3307531024 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb658005320 con 0x7fb66c071410 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.296+0000 7fb669ffb700 1 -- 192.168.123.103:0/3307531024 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fb65c018860 con 0x7fb66c071410 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.296+0000 7fb669ffb700 1 --2- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb6540384b0 0x7fb65403a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.296+0000 7fb669ffb700 1 -- 192.168.123.103:0/3307531024 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb65c04c3f0 con 0x7fb66c071410 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.300+0000 7fb66bfff700 1 --2- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb6540384b0 0x7fb65403a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.300+0000 7fb669ffb700 1 -- 192.168.123.103:0/3307531024 <== mon.0 v2:192.168.123.103:3300/0 
6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb65c01fd60 con 0x7fb66c071410 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.302+0000 7fb66bfff700 1 --2- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb6540384b0 0x7fb65403a960 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fb660009940 tx=0x7fb660006e30 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.417+0000 7fb671c75700 1 -- 192.168.123.103:0/3307531024 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7fb658000bf0 con 0x7fb6540384b0 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.422+0000 7fb669ffb700 1 -- 192.168.123.103:0/3307531024 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fb658000bf0 con 0x7fb6540384b0 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 -- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb6540384b0 msgr2=0x7fb65403a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 --2- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb6540384b0 0x7fb65403a960 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fb660009940 tx=0x7fb660006e30 comp rx=0 tx=0).stop 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 -- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 msgr2=0x7fb66c1a86e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 --2- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c1a86e0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb65c00c670 tx=0x7fb65c003f80 comp rx=0 tx=0).stop 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 -- 192.168.123.103:0/3307531024 shutdown_connections 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 --2- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb6540384b0 0x7fb65403a960 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 --2- 192.168.123.103:0/3307531024 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb66c071410 0x7fb66c1a86e0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 -- 192.168.123.103:0/3307531024 >> 192.168.123.103:0/3307531024 conn(0x7fb66c06c9d0 msgr2=0x7fb66c06d680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.426+0000 7fb6537fe700 1 -- 192.168.123.103:0/3307531024 shutdown_connections 2026-03-10T14:02:58.475 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.427+0000 7fb6537fe700 1 -- 192.168.123.103:0/3307531024 wait complete. 2026-03-10T14:02:58.475 INFO:teuthology.orchestra.run.vm03.stdout:Deploying crash service with default placement... 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled crash update... 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f20b45700 1 Processor -- start 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f20b45700 1 -- start start 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f20b45700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f140a43b0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f20b45700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f140a4980 con 0x7f6f140a3fa0 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f1b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f140a43b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f1b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f140a43b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:57882/0 (socket says 192.168.123.103:57882) 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f1b7fe700 1 -- 192.168.123.103:0/3808841451 learned_addr learned my addr 192.168.123.103:0/3808841451 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f1b7fe700 1 -- 192.168.123.103:0/3808841451 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f140a5130 con 0x7f6f140a3fa0 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.612+0000 7f6f1b7fe700 1 --2- 192.168.123.103:0/3808841451 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f140a43b0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f6f1001ab30 tx=0x7f6f1001ae40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7ca2c291d09ebc4b server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.613+0000 7f6f1a7fc700 1 -- 192.168.123.103:0/3808841451 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f10004030 con 0x7f6f140a3fa0 2026-03-10T14:02:58.777 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.613+0000 7f6f1a7fc700 1 -- 192.168.123.103:0/3808841451 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6f1001c8b0 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.613+0000 7f6f20b45700 1 -- 192.168.123.103:0/3808841451 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 msgr2=0x7f6f140a43b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:58.778 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.613+0000 7f6f20b45700 1 --2- 192.168.123.103:0/3808841451 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f140a43b0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f6f1001ab30 tx=0x7f6f1001ae40 comp rx=0 tx=0).stop 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.613+0000 7f6f20b45700 1 -- 192.168.123.103:0/3808841451 shutdown_connections 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.613+0000 7f6f20b45700 1 --2- 192.168.123.103:0/3808841451 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f140a43b0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.613+0000 7f6f20b45700 1 -- 192.168.123.103:0/3808841451 >> 192.168.123.103:0/3808841451 conn(0x7f6f1409f4b0 msgr2=0x7f6f140a1900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.614+0000 7f6f20b45700 1 -- 192.168.123.103:0/3808841451 shutdown_connections 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.614+0000 7f6f20b45700 1 -- 192.168.123.103:0/3808841451 wait complete. 
2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.615+0000 7f6f20b45700 1 Processor -- start 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.615+0000 7f6f20b45700 1 -- start start 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.615+0000 7f6f20b45700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f14137c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.615+0000 7f6f20b45700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f140a4980 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.616+0000 7f6f1b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f14137c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.616+0000 7f6f1b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f14137c30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57886/0 (socket says 192.168.123.103:57886) 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.616+0000 7f6f1b7fe700 1 -- 192.168.123.103:0/4277629349 learned_addr learned my addr 192.168.123.103:0/4277629349 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:58.778 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.616+0000 7f6f1b7fe700 1 -- 192.168.123.103:0/4277629349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f1001a7e0 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.616+0000 7f6f1b7fe700 1 --2- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f14137c30 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6f1001a7b0 tx=0x7f6f1001ced0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.616+0000 7f6f18ff9700 1 -- 192.168.123.103:0/4277629349 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f100036e0 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.616+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f14138170 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.616+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f14138670 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.617+0000 7f6f18ff9700 1 -- 192.168.123.103:0/4277629349 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6f1002c920 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.617+0000 7f6f18ff9700 1 
-- 192.168.123.103:0/4277629349 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6f10022a50 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.617+0000 7f6f18ff9700 1 -- 192.168.123.103:0/4277629349 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f6f10022c70 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.617+0000 7f6f18ff9700 1 --2- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6f0c0384b0 0x7f6f0c03a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.617+0000 7f6f18ff9700 1 -- 192.168.123.103:0/4277629349 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f6f1005e9a0 con 0x7f6f140a3fa0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.617+0000 7f6f1affd700 1 --2- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6f0c0384b0 0x7f6f0c03a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:58.778 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.617+0000 7f6f1affd700 1 --2- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6f0c0384b0 0x7f6f0c03a960 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f6f08006fd0 tx=0x7f6f08006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:58.618+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f000052f0 con 0x7f6f140a3fa0 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.621+0000 7f6f18ff9700 1 -- 192.168.123.103:0/4277629349 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6f1002a3f0 con 0x7f6f140a3fa0 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.732+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7f6f00000bc0 con 0x7f6f0c0384b0 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.738+0000 7f6f18ff9700 1 -- 192.168.123.103:0/4277629349 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7f6f00000bc0 con 0x7f6f0c0384b0 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.740+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6f0c0384b0 msgr2=0x7f6f0c03a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.740+0000 7f6f20b45700 1 --2- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6f0c0384b0 0x7f6f0c03a960 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f6f08006fd0 tx=0x7f6f08006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:58.779 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.740+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 msgr2=0x7f6f14137c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.740+0000 7f6f20b45700 1 --2- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f14137c30 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6f1001a7b0 tx=0x7f6f1001ced0 comp rx=0 tx=0).stop 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.741+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 shutdown_connections 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.741+0000 7f6f20b45700 1 --2- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6f0c0384b0 0x7f6f0c03a960 secure :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f6f08006fd0 tx=0x7f6f08006e40 comp rx=0 tx=0).stop 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.741+0000 7f6f20b45700 1 --2- 192.168.123.103:0/4277629349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f140a3fa0 0x7f6f14137c30 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.741+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 >> 192.168.123.103:0/4277629349 conn(0x7f6f1409f4b0 msgr2=0x7f6f140a0140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.741+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 
shutdown_connections 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.741+0000 7f6f20b45700 1 -- 192.168.123.103:0/4277629349 wait complete. 2026-03-10T14:02:58.779 INFO:teuthology.orchestra.run.vm03.stdout:Deploying ceph-exporter service with default placement... 2026-03-10T14:02:59.055 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: Added host vm03 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: Saving service mon spec with placement count:5 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: Saving service mgr spec with placement count:2 2026-03-10T14:02:59.056 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:59.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:02:58 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:02:59.117 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update... 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.942+0000 7f1728c48700 1 Processor -- start 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.942+0000 7f1728c48700 1 -- start start 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.942+0000 7f1728c48700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724071190 0x7f17240715a0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.942+0000 7f1728c48700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1724072c90 con 0x7f1724071190 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.943+0000 7f172259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724071190 0x7f17240715a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.943+0000 7f172259c700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724071190 0x7f17240715a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57900/0 (socket says 192.168.123.103:57900) 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.943+0000 7f172259c700 1 -- 192.168.123.103:0/4275590499 learned_addr learned my addr 192.168.123.103:0/4275590499 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.943+0000 7f172259c700 1 -- 192.168.123.103:0/4275590499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1724072dd0 con 0x7f1724071190 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.943+0000 7f172259c700 1 --2- 192.168.123.103:0/4275590499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724071190 0x7f17240715a0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f1714009cf0 tx=0x7f171400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=971219edee82af4d server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.944+0000 7f172159a700 1 -- 192.168.123.103:0/4275590499 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1714004030 con 0x7f1724071190 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.945+0000 7f172159a700 1 -- 192.168.123.103:0/4275590499 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f171400b810 con 0x7f1724071190 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.945+0000 
7f1728c48700 1 -- 192.168.123.103:0/4275590499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724071190 msgr2=0x7f17240715a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.945+0000 7f1728c48700 1 --2- 192.168.123.103:0/4275590499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724071190 0x7f17240715a0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f1714009cf0 tx=0x7f171400b0e0 comp rx=0 tx=0).stop 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.945+0000 7f1728c48700 1 -- 192.168.123.103:0/4275590499 shutdown_connections 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.945+0000 7f1728c48700 1 --2- 192.168.123.103:0/4275590499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724071190 0x7f17240715a0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.945+0000 7f1728c48700 1 -- 192.168.123.103:0/4275590499 >> 192.168.123.103:0/4275590499 conn(0x7f172406cc30 msgr2=0x7f172406f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f1728c48700 1 -- 192.168.123.103:0/4275590499 shutdown_connections 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f1728c48700 1 -- 192.168.123.103:0/4275590499 wait complete. 
2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f1728c48700 1 Processor -- start 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f1728c48700 1 -- start start 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f1728c48700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724086720 0x7f1724086b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f1728c48700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1724072c90 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f172259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724086720 0x7f1724086b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f172259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724086720 0x7f1724086b30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57912/0 (socket says 192.168.123.103:57912) 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f172259c700 1 -- 192.168.123.103:0/1999575015 learned_addr learned my addr 192.168.123.103:0/1999575015 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:02:59.118 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f172259c700 1 -- 192.168.123.103:0/1999575015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1714009740 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.946+0000 7f172259c700 1 --2- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724086720 0x7f1724086b30 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f1714009cc0 tx=0x7f171400bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.948+0000 7f17137fe700 1 -- 192.168.123.103:0/1999575015 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1714003950 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.948+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1724089d00 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.948+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17240872c0 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.949+0000 7f17137fe700 1 -- 192.168.123.103:0/1999575015 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f17140043c0 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.949+0000 7f17137fe700 1 
-- 192.168.123.103:0/1999575015 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f171401ac80 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.949+0000 7f17137fe700 1 -- 192.168.123.103:0/1999575015 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f1714011420 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.950+0000 7f17137fe700 1 --2- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f170c03a700 0x7f170c03cbb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.950+0000 7f17137fe700 1 -- 192.168.123.103:0/1999575015 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f171404c7d0 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.950+0000 7f1721d9b700 1 --2- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f170c03a700 0x7f170c03cbb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.950+0000 7f1721d9b700 1 --2- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f170c03a700 0x7f170c03cbb0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f171c00ad30 tx=0x7f171c0093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:02:58.951+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1704005320 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:58.954+0000 7f17137fe700 1 -- 192.168.123.103:0/1999575015 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f171401f020 con 0x7f1724086720 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.077+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f1704000bf0 con 0x7f170c03a700 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.084+0000 7f17137fe700 1 -- 192.168.123.103:0/1999575015 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f1704000bf0 con 0x7f170c03a700 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f170c03a700 msgr2=0x7f170c03cbb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 --2- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f170c03a700 0x7f170c03cbb0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f171c00ad30 tx=0x7f171c0093f0 comp rx=0 tx=0).stop 2026-03-10T14:02:59.118 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724086720 msgr2=0x7f1724086b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 --2- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724086720 0x7f1724086b30 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f1714009cc0 tx=0x7f171400bfa0 comp rx=0 tx=0).stop 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 shutdown_connections 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 --2- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f170c03a700 0x7f170c03cbb0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:59.118 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 --2- 192.168.123.103:0/1999575015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1724086720 0x7f1724086b30 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:02:59.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 >> 192.168.123.103:0/1999575015 conn(0x7f172406cc30 msgr2=0x7f172406e920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:02:59.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 shutdown_connections 
2026-03-10T14:02:59.119 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.088+0000 7f1728c48700 1 -- 192.168.123.103:0/1999575015 wait complete. 2026-03-10T14:02:59.119 INFO:teuthology.orchestra.run.vm03.stdout:Deploying prometheus service with default placement... 2026-03-10T14:03:00.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:00 vm03 ceph-mon[49718]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:00.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:00 vm03 ceph-mon[49718]: Saving service crash spec with placement * 2026-03-10T14:03:00.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:00 vm03 ceph-mon[49718]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:00.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:00 vm03 ceph-mon[49718]: Saving service ceph-exporter spec with placement * 2026-03-10T14:03:00.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:00 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:00.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:00 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:00.100 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:00 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:00.125 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled prometheus update... 
2026-03-10T14:03:00.126 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.927+0000 7fb7f3a07700 1 Processor -- start 2026-03-10T14:03:00.126 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.928+0000 7fb7f3a07700 1 -- start start 2026-03-10T14:03:00.126 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.929+0000 7fb7f3a07700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec071c50 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.126 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.929+0000 7fb7f3a07700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7ec072190 con 0x7fb7ec071840 2026-03-10T14:03:00.126 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.929+0000 7fb7f17a3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec071c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.126 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.929+0000 7fb7f17a3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec071c50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57930/0 (socket says 192.168.123.103:57930) 2026-03-10T14:03:00.126 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.929+0000 7fb7f17a3700 1 -- 192.168.123.103:0/194993676 learned_addr learned my addr 192.168.123.103:0/194993676 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:00.126 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.930+0000 7fb7f17a3700 1 -- 192.168.123.103:0/194993676 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7ec0722d0 con 0x7fb7ec071840 2026-03-10T14:03:00.126 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.930+0000 7fb7f17a3700 1 --2- 192.168.123.103:0/194993676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec071c50 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb7e800d180 tx=0x7fb7e800d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=dc10a92d2366a1e0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.930+0000 7fb7e3fff700 1 -- 192.168.123.103:0/194993676 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb7e8010070 con 0x7fb7ec071840 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.930+0000 7fb7e3fff700 1 -- 192.168.123.103:0/194993676 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb7e8004510 con 0x7fb7ec071840 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.931+0000 7fb7f3a07700 1 -- 192.168.123.103:0/194993676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 msgr2=0x7fb7ec071c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.931+0000 7fb7f3a07700 1 --2- 192.168.123.103:0/194993676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec071c50 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb7e800d180 tx=0x7fb7e800d490 comp rx=0 tx=0).stop 
2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.931+0000 7fb7f3a07700 1 -- 192.168.123.103:0/194993676 shutdown_connections 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.931+0000 7fb7f3a07700 1 --2- 192.168.123.103:0/194993676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec071c50 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.931+0000 7fb7f3a07700 1 -- 192.168.123.103:0/194993676 >> 192.168.123.103:0/194993676 conn(0x7fb7ec06cc30 msgr2=0x7fb7ec06f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.931+0000 7fb7f3a07700 1 -- 192.168.123.103:0/194993676 shutdown_connections 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.931+0000 7fb7f3a07700 1 -- 192.168.123.103:0/194993676 wait complete. 
2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.933+0000 7fb7f3a07700 1 Processor -- start 2026-03-10T14:03:00.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.933+0000 7fb7f3a07700 1 -- start start 2026-03-10T14:03:00.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.933+0000 7fb7f3a07700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec1a1b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.933+0000 7fb7f3a07700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7e8003c20 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.933+0000 7fb7f17a3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec1a1b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.933+0000 7fb7f17a3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec1a1b70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57936/0 (socket says 192.168.123.103:57936) 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.933+0000 7fb7f17a3700 1 -- 192.168.123.103:0/1769356234 learned_addr learned my addr 192.168.123.103:0/1769356234 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:00.130 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.933+0000 7fb7f17a3700 1 -- 192.168.123.103:0/1769356234 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7e80087c0 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.934+0000 7fb7f17a3700 1 --2- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec1a1b70 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb7e8008c40 tx=0x7fb7e8008d20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.934+0000 7fb7e27fc700 1 -- 192.168.123.103:0/1769356234 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb7e8010050 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.934+0000 7fb7f3a07700 1 -- 192.168.123.103:0/1769356234 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb7ec1a20b0 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.934+0000 7fb7f3a07700 1 -- 192.168.123.103:0/1769356234 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb7ec1a2550 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.934+0000 7fb7e27fc700 1 -- 192.168.123.103:0/1769356234 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb7e800b150 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.934+0000 7fb7e27fc700 1 
-- 192.168.123.103:0/1769356234 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb7e80164e0 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.935+0000 7fb7f3a07700 1 -- 192.168.123.103:0/1769356234 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb7d0005320 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.938+0000 7fb7e27fc700 1 -- 192.168.123.103:0/1769356234 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fb7e80041f0 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.938+0000 7fb7e27fc700 1 --2- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb7d8038120 0x7fb7d803a5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.938+0000 7fb7f0fa2700 1 --2- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb7d8038120 0x7fb7d803a5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.939+0000 7fb7f0fa2700 1 --2- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb7d8038120 0x7fb7d803a5d0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb7e400ad80 tx=0x7fb7e40093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.130 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.939+0000 7fb7e27fc700 1 -- 192.168.123.103:0/1769356234 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb7e804c550 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:02:59.939+0000 7fb7e27fc700 1 -- 192.168.123.103:0/1769356234 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb7e804c980 con 0x7fb7ec071840 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.054+0000 7fb7f3a07700 1 -- 192.168.123.103:0/1769356234 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7fb7d0000bf0 con 0x7fb7d8038120 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.058+0000 7fb7e27fc700 1 -- 192.168.123.103:0/1769356234 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7fb7d0000bf0 con 0x7fb7d8038120 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.061+0000 7fb7d7fff700 1 -- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb7d8038120 msgr2=0x7fb7d803a5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.061+0000 7fb7d7fff700 1 --2- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb7d8038120 0x7fb7d803a5d0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb7e400ad80 tx=0x7fb7e40093f0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.130 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.061+0000 7fb7d7fff700 1 -- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 msgr2=0x7fb7ec1a1b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.061+0000 7fb7d7fff700 1 --2- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec1a1b70 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb7e8008c40 tx=0x7fb7e8008d20 comp rx=0 tx=0).stop 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.062+0000 7fb7d7fff700 1 -- 192.168.123.103:0/1769356234 shutdown_connections 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.062+0000 7fb7d7fff700 1 --2- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb7d8038120 0x7fb7d803a5d0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.062+0000 7fb7d7fff700 1 --2- 192.168.123.103:0/1769356234 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb7ec071840 0x7fb7ec1a1b70 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.062+0000 7fb7d7fff700 1 -- 192.168.123.103:0/1769356234 >> 192.168.123.103:0/1769356234 conn(0x7fb7ec06cc30 msgr2=0x7fb7ec06f330 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.062+0000 7fb7d7fff700 1 -- 192.168.123.103:0/1769356234 shutdown_connections 
2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.062+0000 7fb7d7fff700 1 -- 192.168.123.103:0/1769356234 wait complete. 2026-03-10T14:03:00.130 INFO:teuthology.orchestra.run.vm03.stdout:Deploying grafana service with default placement... 2026-03-10T14:03:00.565 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled grafana update... 2026-03-10T14:03:00.565 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.386+0000 7f7fcdf86700 1 Processor -- start 2026-03-10T14:03:00.565 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.387+0000 7f7fcdf86700 1 -- start start 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.387+0000 7f7fcdf86700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc0097f60 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.387+0000 7f7fcdf86700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7fc0005660 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.388+0000 7f7fc77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc0097f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.388+0000 7f7fc77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc0097f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57950/0 (socket says 192.168.123.103:57950) 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.388+0000 7f7fc77fe700 1 -- 192.168.123.103:0/1965535742 learned_addr learned my addr 192.168.123.103:0/1965535742 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.388+0000 7f7fc77fe700 1 -- 192.168.123.103:0/1965535742 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7fc00984a0 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.388+0000 7f7fc77fe700 1 --2- 192.168.123.103:0/1965535742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc0097f60 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f7fbc009a90 tx=0x7f7fbc009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a1888b5b90cacdd5 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.388+0000 7f7fc67fc700 1 -- 192.168.123.103:0/1965535742 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7fbc004030 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.388+0000 7f7fc67fc700 1 -- 192.168.123.103:0/1965535742 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7fbc00b7e0 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.388+0000 7f7fc67fc700 1 -- 192.168.123.103:0/1965535742 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7fbc003b30 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.389+0000 7f7fcdf86700 1 -- 192.168.123.103:0/1965535742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 msgr2=0x7f7fc0097f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.389+0000 7f7fcdf86700 1 --2- 192.168.123.103:0/1965535742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc0097f60 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f7fbc009a90 tx=0x7f7fbc009da0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.391+0000 7f7fcdf86700 1 -- 192.168.123.103:0/1965535742 shutdown_connections 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.391+0000 7f7fcdf86700 1 --2- 192.168.123.103:0/1965535742 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc0097f60 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.391+0000 7f7fcdf86700 1 -- 192.168.123.103:0/1965535742 >> 192.168.123.103:0/1965535742 conn(0x7f7fc008f5a0 msgr2=0x7f7fc00919f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.392+0000 7f7fcdf86700 1 -- 192.168.123.103:0/1965535742 shutdown_connections 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.392+0000 7f7fcdf86700 1 -- 192.168.123.103:0/1965535742 wait complete. 
2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.392+0000 7f7fcdf86700 1 Processor -- start 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.392+0000 7f7fcdf86700 1 -- start start 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.392+0000 7f7fcdf86700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc00954e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.392+0000 7f7fcdf86700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7fc0095a20 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.393+0000 7f7fc77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc00954e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.393+0000 7f7fc77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc00954e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57960/0 (socket says 192.168.123.103:57960) 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.393+0000 7f7fc77fe700 1 -- 192.168.123.103:0/503979444 learned_addr learned my addr 192.168.123.103:0/503979444 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:00.566 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.393+0000 7f7fc77fe700 1 -- 192.168.123.103:0/503979444 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7fbc009740 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.393+0000 7f7fc77fe700 1 --2- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc00954e0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f7fbc000c00 tx=0x7f7fbc00bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.393+0000 7f7fc4ff9700 1 -- 192.168.123.103:0/503979444 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7fbc004030 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.393+0000 7f7fc4ff9700 1 -- 192.168.123.103:0/503979444 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7fbc01a460 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.394+0000 7f7fc4ff9700 1 -- 192.168.123.103:0/503979444 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7fbc011480 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.394+0000 7f7fcdf86700 1 -- 192.168.123.103:0/503979444 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7fc0093b30 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.395+0000 7f7fcdf86700 1 -- 
192.168.123.103:0/503979444 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7fc0093fd0 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.396+0000 7f7fc4ff9700 1 -- 192.168.123.103:0/503979444 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f7fbc0041e0 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.396+0000 7f7fb27fc700 1 -- 192.168.123.103:0/503979444 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7fa80052f0 con 0x7f7fc0095b80 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.396+0000 7f7fc4ff9700 1 --2- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7fb80380f0 0x7f7fb803a5a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.566 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.396+0000 7f7fc6ffd700 1 --2- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7fb80380f0 0x7f7fb803a5a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.397+0000 7f7fc6ffd700 1 --2- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7fb80380f0 0x7f7fb803a5a0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f7fb4006fd0 tx=0x7f7fb4006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.567 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.397+0000 7f7fc4ff9700 1 -- 192.168.123.103:0/503979444 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f7fbc028030 con 0x7f7fc0095b80 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.403+0000 7f7fc4ff9700 1 -- 192.168.123.103:0/503979444 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7fbc018ba0 con 0x7f7fc0095b80 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.527+0000 7f7fb27fc700 1 -- 192.168.123.103:0/503979444 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7f7fa8000bc0 con 0x7f7fb80380f0 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.532+0000 7f7fc4ff9700 1 -- 192.168.123.103:0/503979444 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7f7fa8000bc0 con 0x7f7fb80380f0 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.534+0000 7f7fb27fc700 1 -- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7fb80380f0 msgr2=0x7f7fb803a5a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.534+0000 7f7fb27fc700 1 --2- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7fb80380f0 0x7f7fb803a5a0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f7fb4006fd0 tx=0x7f7fb4006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:00.567 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.534+0000 7f7fb27fc700 1 -- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 msgr2=0x7f7fc00954e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.534+0000 7f7fb27fc700 1 --2- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc00954e0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f7fbc000c00 tx=0x7f7fbc00bfa0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.535+0000 7f7fb27fc700 1 -- 192.168.123.103:0/503979444 shutdown_connections 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.535+0000 7f7fb27fc700 1 --2- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7fb80380f0 0x7f7fb803a5a0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.535+0000 7f7fb27fc700 1 --2- 192.168.123.103:0/503979444 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7fc0095b80 0x7f7fc00954e0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.535+0000 7f7fb27fc700 1 -- 192.168.123.103:0/503979444 >> 192.168.123.103:0/503979444 conn(0x7f7fc008f5a0 msgr2=0x7f7fc0090110 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.535+0000 7f7fb27fc700 1 -- 192.168.123.103:0/503979444 shutdown_connections 
2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.535+0000 7f7fb27fc700 1 -- 192.168.123.103:0/503979444 wait complete. 2026-03-10T14:03:00.567 INFO:teuthology.orchestra.run.vm03.stdout:Deploying node-exporter service with default placement... 2026-03-10T14:03:00.880 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update... 2026-03-10T14:03:00.880 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.706+0000 7fa461011700 1 Processor -- start 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.706+0000 7fa461011700 1 -- start start 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.706+0000 7fa461011700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c072a40 0x7fa45c071060 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.706+0000 7fa461011700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa45c0715a0 con 0x7fa45c072a40 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.707+0000 7fa45bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c072a40 0x7fa45c071060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.707+0000 7fa45bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c072a40 0x7fa45c071060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57972/0 (socket says 192.168.123.103:57972) 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.707+0000 7fa45bfff700 1 -- 192.168.123.103:0/122734362 learned_addr learned my addr 192.168.123.103:0/122734362 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.707+0000 7fa45bfff700 1 -- 192.168.123.103:0/122734362 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa45c0716e0 con 0x7fa45c072a40 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.707+0000 7fa45bfff700 1 --2- 192.168.123.103:0/122734362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c072a40 0x7fa45c071060 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fa44c009a90 tx=0x7fa44c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=45a4d4e3fa9783f1 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.712+0000 7fa45affd700 1 -- 192.168.123.103:0/122734362 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa44c004030 con 0x7fa45c072a40 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.712+0000 7fa45affd700 1 -- 192.168.123.103:0/122734362 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa44c00b7e0 con 0x7fa45c072a40 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.712+0000 7fa461011700 1 -- 192.168.123.103:0/122734362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c072a40 msgr2=0x7fa45c071060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.712+0000 7fa461011700 1 --2- 192.168.123.103:0/122734362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c072a40 0x7fa45c071060 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fa44c009a90 tx=0x7fa44c009da0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.712+0000 7fa461011700 1 -- 192.168.123.103:0/122734362 shutdown_connections 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.712+0000 7fa461011700 1 --2- 192.168.123.103:0/122734362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c072a40 0x7fa45c071060 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.712+0000 7fa461011700 1 -- 192.168.123.103:0/122734362 >> 192.168.123.103:0/122734362 conn(0x7fa45c06c9d0 msgr2=0x7fa45c06ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.713+0000 7fa461011700 1 -- 192.168.123.103:0/122734362 shutdown_connections 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.713+0000 7fa461011700 1 -- 192.168.123.103:0/122734362 wait complete. 
2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.714+0000 7fa461011700 1 Processor -- start 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.714+0000 7fa461011700 1 -- start start 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.714+0000 7fa461011700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c1a8830 0x7fa45c1a8c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.714+0000 7fa461011700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa45c0715a0 con 0x7fa45c1a8830 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.714+0000 7fa45bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c1a8830 0x7fa45c1a8c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.714+0000 7fa45bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c1a8830 0x7fa45c1a8c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57980/0 (socket says 192.168.123.103:57980) 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.714+0000 7fa45bfff700 1 -- 192.168.123.103:0/736509468 learned_addr learned my addr 192.168.123.103:0/736509468 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:00.881 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.714+0000 7fa45bfff700 1 -- 192.168.123.103:0/736509468 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa44c009740 con 0x7fa45c1a8830 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.715+0000 7fa45bfff700 1 --2- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c1a8830 0x7fa45c1a8c40 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fa44c009130 tx=0x7fa44c00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.881 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.715+0000 7fa4597fa700 1 -- 192.168.123.103:0/736509468 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa44c003f60 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.715+0000 7fa4597fa700 1 -- 192.168.123.103:0/736509468 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa44c0045a0 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.717+0000 7fa4597fa700 1 -- 192.168.123.103:0/736509468 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa44c01ad80 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.718+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa45c1a9180 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.718+0000 7fa461011700 1 -- 
192.168.123.103:0/736509468 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa45c1abe10 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.719+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa45c04f030 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.719+0000 7fa4597fa700 1 -- 192.168.123.103:0/736509468 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fa44c01a460 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.719+0000 7fa4597fa700 1 --2- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa444038140 0x7fa44403a5f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.720+0000 7fa4597fa700 1 -- 192.168.123.103:0/736509468 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fa44c04b9d0 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.720+0000 7fa45b7fe700 1 --2- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa444038140 0x7fa44403a5f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.720+0000 7fa45b7fe700 1 --2- 192.168.123.103:0/736509468 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa444038140 0x7fa44403a5f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fa450009990 tx=0x7fa450006e30 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.724+0000 7fa4597fa700 1 -- 192.168.123.103:0/736509468 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa44c01a710 con 0x7fa45c1a8830 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.835+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7fa45c06dcb0 con 0x7fa444038140 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.840+0000 7fa4597fa700 1 -- 192.168.123.103:0/736509468 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7fa45c06dcb0 con 0x7fa444038140 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.846+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa444038140 msgr2=0x7fa44403a5f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.846+0000 7fa461011700 1 --2- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa444038140 0x7fa44403a5f0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fa450009990 tx=0x7fa450006e30 comp rx=0 tx=0).stop 2026-03-10T14:03:00.882 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.846+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c1a8830 msgr2=0x7fa45c1a8c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.846+0000 7fa461011700 1 --2- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c1a8830 0x7fa45c1a8c40 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fa44c009130 tx=0x7fa44c00be30 comp rx=0 tx=0).stop 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.848+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 shutdown_connections 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.848+0000 7fa461011700 1 --2- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa444038140 0x7fa44403a5f0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.848+0000 7fa461011700 1 --2- 192.168.123.103:0/736509468 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa45c1a8830 0x7fa45c1a8c40 secure :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fa44c009130 tx=0x7fa44c00be30 comp rx=0 tx=0).stop 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.848+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 >> 192.168.123.103:0/736509468 conn(0x7fa45c06c9d0 msgr2=0x7fa45c06d5a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.848+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 
shutdown_connections 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:00.848+0000 7fa461011700 1 -- 192.168.123.103:0/736509468 wait complete. 2026-03-10T14:03:00.882 INFO:teuthology.orchestra.run.vm03.stdout:Deploying alertmanager service with default placement... 2026-03-10T14:03:01.222 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.043+0000 7fa6c3b3b700 1 Processor -- start 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.044+0000 7fa6c3b3b700 1 -- start start 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.044+0000 7fa6c3b3b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc106140 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.044+0000 7fa6c3b3b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6bc106680 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.044+0000 7fa6c18d7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc106140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.045+0000 7fa6c18d7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc106140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57994/0 (socket says 192.168.123.103:57994) 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.045+0000 7fa6c18d7700 1 -- 192.168.123.103:0/1485681555 learned_addr learned my addr 192.168.123.103:0/1485681555 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.045+0000 7fa6c18d7700 1 -- 192.168.123.103:0/1485681555 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa6bc1067c0 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.045+0000 7fa6c18d7700 1 --2- 192.168.123.103:0/1485681555 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc106140 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fa6ac009cf0 tx=0x7fa6ac00f4f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b5c7184a6cfdecaf server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.045+0000 7fa6c08d5700 1 -- 192.168.123.103:0/1485681555 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa6ac00fc20 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.045+0000 7fa6c08d5700 1 -- 192.168.123.103:0/1485681555 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa6ac004510 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.045+0000 7fa6c08d5700 1 -- 192.168.123.103:0/1485681555 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa6ac017450 con 0x7fa6bc105d30 
2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.046+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1485681555 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 msgr2=0x7fa6bc106140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.046+0000 7fa6c3b3b700 1 --2- 192.168.123.103:0/1485681555 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc106140 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fa6ac009cf0 tx=0x7fa6ac00f4f0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.046+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1485681555 shutdown_connections 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.046+0000 7fa6c3b3b700 1 --2- 192.168.123.103:0/1485681555 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc106140 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.046+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1485681555 >> 192.168.123.103:0/1485681555 conn(0x7fa6bc1013a0 msgr2=0x7fa6bc1037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.047+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1485681555 shutdown_connections 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.047+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1485681555 wait complete. 
2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.047+0000 7fa6c3b3b700 1 Processor -- start 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.048+0000 7fa6c3b3b700 1 -- start start 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.048+0000 7fa6c3b3b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc197860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.048+0000 7fa6c3b3b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6bc197da0 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.048+0000 7fa6c18d7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc197860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.048+0000 7fa6c18d7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc197860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58010/0 (socket says 192.168.123.103:58010) 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.048+0000 7fa6c18d7700 1 -- 192.168.123.103:0/1049990145 learned_addr learned my addr 192.168.123.103:0/1049990145 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:01.223 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.048+0000 7fa6c18d7700 1 -- 192.168.123.103:0/1049990145 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa6ac009740 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.048+0000 7fa6c18d7700 1 --2- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc197860 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fa6ac01a040 tx=0x7fa6ac004140 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.049+0000 7fa6b2ffd700 1 -- 192.168.123.103:0/1049990145 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa6ac0043a0 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.049+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa6bc197fa0 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.049+0000 7fa6b2ffd700 1 -- 192.168.123.103:0/1049990145 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa6ac017bf0 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.049+0000 7fa6b2ffd700 1 -- 192.168.123.103:0/1049990145 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa6ac020d10 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.049+0000 7fa6c3b3b700 1 
-- 192.168.123.103:0/1049990145 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa6bc198440 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.050+0000 7fa6b2ffd700 1 -- 192.168.123.103:0/1049990145 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fa6ac02a460 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.051+0000 7fa6b2ffd700 1 --2- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa6a8040c60 0x7fa6a8043110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.051+0000 7fa6b2ffd700 1 -- 192.168.123.103:0/1049990145 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fa6ac04cca0 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.051+0000 7fa6c10d6700 1 --2- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa6a8040c60 0x7fa6a8043110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.052+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa6bc1916c0 con 0x7fa6bc105d30 2026-03-10T14:03:01.223 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.055+0000 7fa6c10d6700 1 --2- 192.168.123.103:0/1049990145 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa6a8040c60 0x7fa6a8043110 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fa6b8006fd0 tx=0x7fa6b8006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.055+0000 7fa6b2ffd700 1 -- 192.168.123.103:0/1049990145 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa6ac04ce60 con 0x7fa6bc105d30 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.163+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7fa6bc078ff0 con 0x7fa6a8040c60 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.170+0000 7fa6b2ffd700 1 -- 192.168.123.103:0/1049990145 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7fa6bc078ff0 con 0x7fa6a8040c60 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa6a8040c60 msgr2=0x7fa6a8043110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 --2- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa6a8040c60 0x7fa6a8043110 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fa6b8006fd0 tx=0x7fa6b8006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:01.224 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 msgr2=0x7fa6bc197860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 --2- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc197860 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fa6ac01a040 tx=0x7fa6ac004140 comp rx=0 tx=0).stop 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 shutdown_connections 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 --2- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa6a8040c60 0x7fa6a8043110 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 --2- 192.168.123.103:0/1049990145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa6bc105d30 0x7fa6bc197860 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 >> 192.168.123.103:0/1049990145 conn(0x7fa6bc1013a0 msgr2=0x7fa6bc102cb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.173+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 shutdown_connections 
2026-03-10T14:03:01.224 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.174+0000 7fa6c3b3b700 1 -- 192.168.123.103:0/1049990145 wait complete. 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.355+0000 7f18c01f9700 1 Processor -- start 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.355+0000 7f18c01f9700 1 -- start start 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.355+0000 7f18c01f9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b81047c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.355+0000 7f18c01f9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f18b8104d00 con 0x7f18b81043b0 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.355+0000 7f18bdf95700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b81047c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.355+0000 7f18bdf95700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b81047c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58026/0 (socket says 192.168.123.103:58026) 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.355+0000 7f18bdf95700 1 -- 
192.168.123.103:0/2363325505 learned_addr learned my addr 192.168.123.103:0/2363325505 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.356+0000 7f18bdf95700 1 -- 192.168.123.103:0/2363325505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f18b8104e40 con 0x7f18b81043b0 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.356+0000 7f18bdf95700 1 --2- 192.168.123.103:0/2363325505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b81047c0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f18ac01ad90 tx=0x7f18ac01c3d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a783ceebdeff0b97 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.527 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.356+0000 7f18bcf93700 1 -- 192.168.123.103:0/2363325505 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f18ac01ae80 con 0x7f18b81043b0 2026-03-10T14:03:01.528 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.356+0000 7f18bcf93700 1 -- 192.168.123.103:0/2363325505 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f18ac004030 con 0x7f18b81043b0 2026-03-10T14:03:01.528 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.357+0000 7f18c01f9700 1 -- 192.168.123.103:0/2363325505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 msgr2=0x7f18b81047c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.528 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.357+0000 7f18c01f9700 1 --2- 192.168.123.103:0/2363325505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f18b81043b0 0x7f18b81047c0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f18ac01ad90 tx=0x7f18ac01c3d0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.528 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.357+0000 7f18c01f9700 1 -- 192.168.123.103:0/2363325505 shutdown_connections 2026-03-10T14:03:01.528 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.357+0000 7f18c01f9700 1 --2- 192.168.123.103:0/2363325505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b81047c0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.528 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.357+0000 7f18c01f9700 1 -- 192.168.123.103:0/2363325505 >> 192.168.123.103:0/2363325505 conn(0x7f18b80ffa60 msgr2=0x7f18b8101e70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.357+0000 7f18c01f9700 1 -- 192.168.123.103:0/2363325505 shutdown_connections 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.357+0000 7f18c01f9700 1 -- 192.168.123.103:0/2363325505 wait complete. 
2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.358+0000 7f18c01f9700 1 Processor -- start 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.358+0000 7f18c01f9700 1 -- start start 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.358+0000 7f18c01f9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b8193260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.358+0000 7f18bdf95700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b8193260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.358+0000 7f18bdf95700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b8193260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58040/0 (socket says 192.168.123.103:58040) 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.358+0000 7f18bdf95700 1 -- 192.168.123.103:0/265499155 learned_addr learned my addr 192.168.123.103:0/265499155 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.358+0000 7f18c01f9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f18ac003b40 con 0x7f18b81043b0 2026-03-10T14:03:01.529 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.359+0000 7f18bdf95700 1 -- 192.168.123.103:0/265499155 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f18ac01a7e0 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.359+0000 7f18bdf95700 1 --2- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b8193260 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f18ac006e90 tx=0x7f18ac0286e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.360+0000 7f18aaffd700 1 -- 192.168.123.103:0/265499155 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f18ac003740 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.360+0000 7f18aaffd700 1 -- 192.168.123.103:0/265499155 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f18ac01ae80 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.360+0000 7f18aaffd700 1 -- 192.168.123.103:0/265499155 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f18ac030cb0 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.360+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f18b81937a0 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.360+0000 7f18c01f9700 1 -- 
192.168.123.103:0/265499155 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f18b8193c40 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.361+0000 7f18aaffd700 1 -- 192.168.123.103:0/265499155 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f18ac030450 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.362+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f18b8070060 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.363+0000 7f18aaffd700 1 --2- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f18a40381b0 0x7f18a403a660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.363+0000 7f18aaffd700 1 -- 192.168.123.103:0/265499155 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f18ac024070 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.365+0000 7f18bd794700 1 --2- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f18a40381b0 0x7f18a403a660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.365+0000 7f18aaffd700 1 -- 192.168.123.103:0/265499155 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f18ac0219b0 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.365+0000 7f18bd794700 1 --2- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f18a40381b0 0x7f18a403a660 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f18b4006fd0 tx=0x7f18b4006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.468+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7f18b804f9e0 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.478+0000 7f18aaffd700 1 -- 192.168.123.103:0/265499155 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7f18ac026e80 con 0x7f18b81043b0 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.481+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f18a40381b0 msgr2=0x7f18a403a660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.481+0000 7f18c01f9700 1 --2- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f18a40381b0 0x7f18a403a660 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f18b4006fd0 tx=0x7f18b4006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:01.529 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.481+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 msgr2=0x7f18b8193260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.481+0000 7f18c01f9700 1 --2- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b8193260 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f18ac006e90 tx=0x7f18ac0286e0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.481+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 shutdown_connections 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.481+0000 7f18c01f9700 1 --2- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f18a40381b0 0x7f18a403a660 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.482+0000 7f18c01f9700 1 --2- 192.168.123.103:0/265499155 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f18b81043b0 0x7f18b8193260 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.482+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 >> 192.168.123.103:0/265499155 conn(0x7f18b80ffa60 msgr2=0x7f18b806afe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.482+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 shutdown_connections 
2026-03-10T14:03:01.529 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.482+0000 7f18c01f9700 1 -- 192.168.123.103:0/265499155 wait complete. 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: Saving service prometheus spec with placement count:1 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: Saving service grafana spec with placement count:1 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:01.786 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:01 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/265499155' entity='client.admin' 2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.670+0000 7f24dcafc700 1 Processor -- start 2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.671+0000 7f24dcafc700 1 -- start start 2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.671+0000 7f24dcafc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d807ad30 0x7f24d8079230 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.671+0000 7f24dcafc700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f24d8079770 con 0x7f24d807ad30 2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.671+0000 7f24d659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d807ad30 0x7f24d8079230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.671+0000 7f24d659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d807ad30 0x7f24d8079230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58048/0 (socket says 192.168.123.103:58048) 2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.671+0000 7f24d659c700 1 -- 192.168.123.103:0/1673613172 learned_addr learned my addr 192.168.123.103:0/1673613172 (peer_addr_for_me v2:192.168.123.103:0/0) 
2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.672+0000 7f24d659c700 1 -- 192.168.123.103:0/1673613172 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24d80798b0 con 0x7f24d807ad30 2026-03-10T14:03:01.845 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.672+0000 7f24d659c700 1 --2- 192.168.123.103:0/1673613172 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d807ad30 0x7f24d8079230 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f24c0009cf0 tx=0x7f24c000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=24a2a40250f11774 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.672+0000 7f24d559a700 1 -- 192.168.123.103:0/1673613172 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f24c0004030 con 0x7f24d807ad30 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.672+0000 7f24d559a700 1 -- 192.168.123.103:0/1673613172 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f24c000b810 con 0x7f24d807ad30 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.672+0000 7f24d559a700 1 -- 192.168.123.103:0/1673613172 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f24c0003b10 con 0x7f24d807ad30 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.673+0000 7f24dcafc700 1 -- 192.168.123.103:0/1673613172 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d807ad30 msgr2=0x7f24d8079230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.847 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.673+0000 7f24dcafc700 1 --2- 192.168.123.103:0/1673613172 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d807ad30 0x7f24d8079230 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f24c0009cf0 tx=0x7f24c000b0e0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.673+0000 7f24dcafc700 1 -- 192.168.123.103:0/1673613172 shutdown_connections 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.673+0000 7f24dcafc700 1 --2- 192.168.123.103:0/1673613172 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d807ad30 0x7f24d8079230 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.673+0000 7f24dcafc700 1 -- 192.168.123.103:0/1673613172 >> 192.168.123.103:0/1673613172 conn(0x7f24d81013a0 msgr2=0x7f24d81037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.673+0000 7f24dcafc700 1 -- 192.168.123.103:0/1673613172 shutdown_connections 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.673+0000 7f24dcafc700 1 -- 192.168.123.103:0/1673613172 wait complete. 
2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.674+0000 7f24dcafc700 1 Processor -- start 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.674+0000 7f24dcafc700 1 -- start start 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.674+0000 7f24dcafc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d81a02d0 0x7f24d81a06e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.674+0000 7f24dcafc700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f24d81a0c20 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.674+0000 7f24d659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d81a02d0 0x7f24d81a06e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.674+0000 7f24d659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d81a02d0 0x7f24d81a06e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58056/0 (socket says 192.168.123.103:58056) 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.674+0000 7f24d659c700 1 -- 192.168.123.103:0/3317338013 learned_addr learned my addr 192.168.123.103:0/3317338013 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:01.847 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.675+0000 7f24d659c700 1 -- 192.168.123.103:0/3317338013 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24c0009740 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.675+0000 7f24d659c700 1 --2- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d81a02d0 0x7f24d81a06e0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f24c0009cc0 tx=0x7f24c001b800 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.675+0000 7f24cf7fe700 1 -- 192.168.123.103:0/3317338013 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f24c001ba20 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.675+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f24d81a0e20 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.675+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f24d81a3a80 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.676+0000 7f24cf7fe700 1 -- 192.168.123.103:0/3317338013 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f24c001bb80 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.676+0000 7f24cf7fe700 1 
-- 192.168.123.103:0/3317338013 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f24c001c450 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.677+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f24b8005320 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.677+0000 7f24cf7fe700 1 -- 192.168.123.103:0/3317338013 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f24c0022070 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.677+0000 7f24cf7fe700 1 --2- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f24c40384c0 0x7f24c403a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.677+0000 7f24cf7fe700 1 -- 192.168.123.103:0/3317338013 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f24c004c1e0 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.677+0000 7f24d5d9b700 1 --2- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f24c40384c0 0x7f24c403a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.678+0000 7f24d5d9b700 1 --2- 192.168.123.103:0/3317338013 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f24c40384c0 0x7f24c403a970 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f24c8006fd0 tx=0x7f24c8006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.680+0000 7f24cf7fe700 1 -- 192.168.123.103:0/3317338013 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f24c001bcf0 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.786+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7f24b8005f70 con 0x7f24d81a02d0 2026-03-10T14:03:01.847 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.794+0000 7f24cf7fe700 1 -- 192.168.123.103:0/3317338013 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7f24c002e3e0 con 0x7f24d81a02d0 2026-03-10T14:03:01.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.798+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f24c40384c0 msgr2=0x7f24c403a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.798+0000 7f24dcafc700 1 --2- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f24c40384c0 0x7f24c403a970 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f24c8006fd0 tx=0x7f24c8006e40 comp rx=0 tx=0).stop 
2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.798+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d81a02d0 msgr2=0x7f24d81a06e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.798+0000 7f24dcafc700 1 --2- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d81a02d0 0x7f24d81a06e0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f24c0009cc0 tx=0x7f24c001b800 comp rx=0 tx=0).stop 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.798+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 shutdown_connections 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.798+0000 7f24dcafc700 1 --2- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f24c40384c0 0x7f24c403a970 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.798+0000 7f24dcafc700 1 --2- 192.168.123.103:0/3317338013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f24d81a02d0 0x7f24d81a06e0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.799+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 >> 192.168.123.103:0/3317338013 conn(0x7f24d81013a0 msgr2=0x7f24d8102000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.799+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 
shutdown_connections 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.799+0000 7f24dcafc700 1 -- 192.168.123.103:0/3317338013 wait complete. 2026-03-10T14:03:01.848 INFO:teuthology.orchestra.run.vm03.stdout:Enabling the dashboard module... 2026-03-10T14:03:02.792 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:02 vm03 ceph-mon[49718]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:02.792 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:02 vm03 ceph-mon[49718]: Saving service node-exporter spec with placement * 2026-03-10T14:03:02.792 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:02 vm03 ceph-mon[49718]: from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:02.792 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:02 vm03 ceph-mon[49718]: Saving service alertmanager spec with placement count:1 2026-03-10T14:03:02.792 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:02 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3317338013' entity='client.admin' 2026-03-10T14:03:02.792 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:02 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/1422942041' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.979+0000 7f3f3ba22700 1 Processor -- start 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.980+0000 7f3f3ba22700 1 -- start start 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.980+0000 7f3f3ba22700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f34105030 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.980+0000 7f3f3ba22700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f34105570 con 0x7f3f34104c20 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.980+0000 7f3f397be700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f34105030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.980+0000 7f3f397be700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f34105030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58060/0 (socket says 192.168.123.103:58060) 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.980+0000 7f3f397be700 1 -- 192.168.123.103:0/71871339 learned_addr learned my addr 
192.168.123.103:0/71871339 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.981+0000 7f3f397be700 1 -- 192.168.123.103:0/71871339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3f341056b0 con 0x7f3f34104c20 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.981+0000 7f3f397be700 1 --2- 192.168.123.103:0/71871339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f34105030 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f3f24009a90 tx=0x7f3f24009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=21f0bacf7f2ed498 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:02.830 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.981+0000 7f3f2bfff700 1 -- 192.168.123.103:0/71871339 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3f24004030 con 0x7f3f34104c20 2026-03-10T14:03:02.831 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.981+0000 7f3f2bfff700 1 -- 192.168.123.103:0/71871339 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3f2400b7e0 con 0x7f3f34104c20 2026-03-10T14:03:02.831 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.982+0000 7f3f3ba22700 1 -- 192.168.123.103:0/71871339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 msgr2=0x7f3f34105030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:02.831 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.982+0000 7f3f3ba22700 1 --2- 192.168.123.103:0/71871339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f34105030 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 
crypto rx=0x7f3f24009a90 tx=0x7f3f24009da0 comp rx=0 tx=0).stop 2026-03-10T14:03:02.831 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.982+0000 7f3f3ba22700 1 -- 192.168.123.103:0/71871339 shutdown_connections 2026-03-10T14:03:02.831 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.982+0000 7f3f3ba22700 1 --2- 192.168.123.103:0/71871339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f34105030 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.982+0000 7f3f3ba22700 1 -- 192.168.123.103:0/71871339 >> 192.168.123.103:0/71871339 conn(0x7f3f34100270 msgr2=0x7f3f341026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.982+0000 7f3f3ba22700 1 -- 192.168.123.103:0/71871339 shutdown_connections 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.982+0000 7f3f3ba22700 1 -- 192.168.123.103:0/71871339 wait complete. 
2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.983+0000 7f3f3ba22700 1 Processor -- start 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.983+0000 7f3f3ba22700 1 -- start start 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.983+0000 7f3f3ba22700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f3419bf30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.983+0000 7f3f3ba22700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f24014070 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.983+0000 7f3f397be700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f3419bf30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.983+0000 7f3f397be700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f3419bf30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58074/0 (socket says 192.168.123.103:58074) 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.983+0000 7f3f397be700 1 -- 192.168.123.103:0/1422942041 learned_addr learned my addr 192.168.123.103:0/1422942041 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:02.832 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.984+0000 7f3f397be700 1 -- 192.168.123.103:0/1422942041 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3f24009740 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.984+0000 7f3f397be700 1 --2- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f3419bf30 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f3f24003f80 tx=0x7f3f24004060 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.984+0000 7f3f2a7fc700 1 -- 192.168.123.103:0/1422942041 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3f2400bec0 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.984+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3f3419c470 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.984+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3f3419c910 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.985+0000 7f3f2a7fc700 1 -- 192.168.123.103:0/1422942041 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3f24003900 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.985+0000 7f3f2a7fc700 1 
-- 192.168.123.103:0/1422942041 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3f24021780 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.985+0000 7f3f2a7fc700 1 -- 192.168.123.103:0/1422942041 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f3f2402b430 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.985+0000 7f3f2a7fc700 1 --2- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3f200384b0 0x7f3f2003a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.985+0000 7f3f2a7fc700 1 -- 192.168.123.103:0/1422942041 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f3f2404ce70 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.986+0000 7f3f38fbd700 1 --2- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3f200384b0 0x7f3f2003a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.986+0000 7f3f38fbd700 1 --2- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3f200384b0 0x7f3f2003a960 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f3f30006fd0 tx=0x7f3f30006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:03:01.987+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3f18005320 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:01.989+0000 7f3f2a7fc700 1 -- 192.168.123.103:0/1422942041 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3f24026030 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.133+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7f3f18005190 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.793+0000 7f3f2a7fc700 1 -- 192.168.123.103:0/1422942041 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f3f24017dc0 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.797+0000 7f3f2a7fc700 1 -- 192.168.123.103:0/1422942041 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v9) v1 ==== 88+0+0 (secure 0 0 0) 0x7f3f24018840 con 0x7f3f34104c20 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.800+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3f200384b0 msgr2=0x7f3f2003a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.800+0000 7f3f3ba22700 1 
--2- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3f200384b0 0x7f3f2003a960 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f3f30006fd0 tx=0x7f3f30006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.800+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 msgr2=0x7f3f3419bf30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.800+0000 7f3f3ba22700 1 --2- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f3419bf30 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f3f24003f80 tx=0x7f3f24004060 comp rx=0 tx=0).stop 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.800+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 shutdown_connections 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.800+0000 7f3f3ba22700 1 --2- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3f200384b0 0x7f3f2003a960 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.800+0000 7f3f3ba22700 1 --2- 192.168.123.103:0/1422942041 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f34104c20 0x7f3f3419bf30 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.800+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 >> 192.168.123.103:0/1422942041 conn(0x7f3f34100270 
msgr2=0x7f3f34192aa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.801+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 shutdown_connections 2026-03-10T14:03:02.832 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:02.801+0000 7f3f3ba22700 1 -- 192.168.123.103:0/1422942041 wait complete. 2026-03-10T14:03:03.203 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 9, 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "active_name": "vm03.rwbbep", 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.004+0000 7f02658c9700 1 Processor -- start 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.004+0000 7f02658c9700 1 -- start start 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.004+0000 7f02658c9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0260071410 0x7f0260071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.004+0000 7f02658c9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0260071d60 con 0x7f0260071410 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:03:03.004+0000 7f02648c7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0260071410 0x7f0260071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.004+0000 7f02648c7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0260071410 0x7f0260071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58098/0 (socket says 192.168.123.103:58098) 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.004+0000 7f02648c7700 1 -- 192.168.123.103:0/520538895 learned_addr learned my addr 192.168.123.103:0/520538895 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.005+0000 7f02648c7700 1 -- 192.168.123.103:0/520538895 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0260071ea0 con 0x7f0260071410 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.005+0000 7f02648c7700 1 --2- 192.168.123.103:0/520538895 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0260071410 0x7f0260071820 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f025000d180 tx=0x7f025000d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=74b1efc3c9225a9a server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:03.204 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.006+0000 7f025f7fe700 1 -- 192.168.123.103:0/520538895 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0250010070 con 
0x7f0260071410 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.006+0000 7f025f7fe700 1 -- 192.168.123.103:0/520538895 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0250004510 con 0x7f0260071410 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.006+0000 7f02658c9700 1 -- 192.168.123.103:0/520538895 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0260071410 msgr2=0x7f0260071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.006+0000 7f02658c9700 1 --2- 192.168.123.103:0/520538895 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0260071410 0x7f0260071820 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f025000d180 tx=0x7f025000d490 comp rx=0 tx=0).stop 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.007+0000 7f02658c9700 1 -- 192.168.123.103:0/520538895 shutdown_connections 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.007+0000 7f02658c9700 1 --2- 192.168.123.103:0/520538895 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0260071410 0x7f0260071820 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.007+0000 7f02658c9700 1 -- 192.168.123.103:0/520538895 >> 192.168.123.103:0/520538895 conn(0x7f026006c9d0 msgr2=0x7f026006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.007+0000 7f02658c9700 1 -- 192.168.123.103:0/520538895 shutdown_connections 2026-03-10T14:03:03.205 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.007+0000 7f02658c9700 1 -- 192.168.123.103:0/520538895 wait complete. 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.007+0000 7f02658c9700 1 Processor -- start 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.007+0000 7f02658c9700 1 -- start start 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.007+0000 7f02658c9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f026019ff40 0x7f02601a0350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.008+0000 7f02658c9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0250003c20 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.008+0000 7f02648c7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f026019ff40 0x7f02601a0350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.008+0000 7f02648c7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f026019ff40 0x7f02601a0350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58110/0 (socket says 192.168.123.103:58110) 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.008+0000 7f02648c7700 1 -- 192.168.123.103:0/2014102666 learned_addr 
learned my addr 192.168.123.103:0/2014102666 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.008+0000 7f02648c7700 1 -- 192.168.123.103:0/2014102666 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f02500087c0 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.008+0000 7f02648c7700 1 --2- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f026019ff40 0x7f02601a0350 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f0250004210 tx=0x7f02500042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.010+0000 7f025dffb700 1 -- 192.168.123.103:0/2014102666 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0250010040 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.010+0000 7f02658c9700 1 -- 192.168.123.103:0/2014102666 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f02601a0890 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.010+0000 7f02658c9700 1 -- 192.168.123.103:0/2014102666 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f02601a1510 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.011+0000 7f02658c9700 1 -- 192.168.123.103:0/2014102666 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f026004f030 con 
0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.011+0000 7f025dffb700 1 -- 192.168.123.103:0/2014102666 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f025000eec0 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.011+0000 7f025dffb700 1 -- 192.168.123.103:0/2014102666 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0250016490 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.012+0000 7f025dffb700 1 -- 192.168.123.103:0/2014102666 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f0250016610 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.012+0000 7f025dffb700 1 --2- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0248038500 0x7f024803a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.012+0000 7f025ffff700 1 -- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0248038500 msgr2=0x7f024803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.012+0000 7f025ffff700 1 --2- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0248038500 0x7f024803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T14:03:03.205 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.012+0000 7f025dffb700 1 -- 192.168.123.103:0/2014102666 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f025004cc00 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.017+0000 7f025dffb700 1 -- 192.168.123.103:0/2014102666 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f02500157b0 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.159+0000 7f02658c9700 1 -- 192.168.123.103:0/2014102666 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f02601a1da0 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.160+0000 7f025dffb700 1 -- 192.168.123.103:0/2014102666 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v9) v1 ==== 56+0+98 (secure 0 0 0) 0x7f0250024080 con 0x7f026019ff40 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.171+0000 7f02477fe700 1 -- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0248038500 msgr2=0x7f024803a9b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.171+0000 7f02477fe700 1 --2- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0248038500 0x7f024803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:03:03.171+0000 7f02477fe700 1 -- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f026019ff40 msgr2=0x7f02601a0350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.171+0000 7f02477fe700 1 --2- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f026019ff40 0x7f02601a0350 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f0250004210 tx=0x7f02500042f0 comp rx=0 tx=0).stop 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.171+0000 7f02477fe700 1 -- 192.168.123.103:0/2014102666 shutdown_connections 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.171+0000 7f02477fe700 1 --2- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0248038500 0x7f024803a9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.171+0000 7f02477fe700 1 --2- 192.168.123.103:0/2014102666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f026019ff40 0x7f02601a0350 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.171+0000 7f02477fe700 1 -- 192.168.123.103:0/2014102666 >> 192.168.123.103:0/2014102666 conn(0x7f026006c9d0 msgr2=0x7f026006d450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.171+0000 7f02477fe700 1 -- 192.168.123.103:0/2014102666 shutdown_connections 2026-03-10T14:03:03.205 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.172+0000 7f02477fe700 1 -- 192.168.123.103:0/2014102666 wait complete. 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for the mgr to restart... 2026-03-10T14:03:03.205 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr epoch 9... 2026-03-10T14:03:04.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:03 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/1422942041' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished 2026-03-10T14:03:04.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:03 vm03 ceph-mon[49718]: mgrmap e9: vm03.rwbbep(active, since 9s) 2026-03-10T14:03:04.094 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:03 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/2014102666' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: Active manager daemon vm03.rwbbep restarted 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: Activating manager daemon vm03.rwbbep 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: osdmap e3: 0 total, 0 up, 0 in 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: mgrmap e10: vm03.rwbbep(active, starting, since 0.123284s) 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": 
"vm03.rwbbep"}]: dispatch 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: Manager daemon vm03.rwbbep is now available 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:03:07.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:07 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 11, 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.352+0000 7fd3a2d5d700 1 Processor -- start 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.352+0000 7fd3a2d5d700 1 -- start start 2026-03-10T14:03:08.739 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.353+0000 7fd3a2d5d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c072b50 0x7fd39c071050 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.353+0000 7fd3a2d5d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd39c071590 con 0x7fd39c072b50 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.354+0000 7fd3a0af9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c072b50 0x7fd39c071050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.354+0000 7fd3a0af9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c072b50 0x7fd39c071050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58114/0 (socket says 192.168.123.103:58114) 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.354+0000 7fd3a0af9700 1 -- 192.168.123.103:0/3372197837 learned_addr learned my addr 192.168.123.103:0/3372197837 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.354+0000 7fd3a0af9700 1 -- 192.168.123.103:0/3372197837 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd39c0716d0 con 0x7fd39c072b50 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:03:03.354+0000 7fd3a0af9700 1 --2- 192.168.123.103:0/3372197837 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c072b50 0x7fd39c071050 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fd38c009a90 tx=0x7fd38c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=50dcc420d82eee15 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.355+0000 7fd39b7fe700 1 -- 192.168.123.103:0/3372197837 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd38c004030 con 0x7fd39c072b50 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.355+0000 7fd39b7fe700 1 -- 192.168.123.103:0/3372197837 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd38c00b7e0 con 0x7fd39c072b50 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.355+0000 7fd39b7fe700 1 -- 192.168.123.103:0/3372197837 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd38c003ae0 con 0x7fd39c072b50 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.355+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/3372197837 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c072b50 msgr2=0x7fd39c071050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.355+0000 7fd3a2d5d700 1 --2- 192.168.123.103:0/3372197837 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c072b50 0x7fd39c071050 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fd38c009a90 tx=0x7fd38c009da0 comp rx=0 tx=0).stop 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-10T14:03:03.355+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/3372197837 shutdown_connections 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.355+0000 7fd3a2d5d700 1 --2- 192.168.123.103:0/3372197837 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c072b50 0x7fd39c071050 secure :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fd38c009a90 tx=0x7fd38c009da0 comp rx=0 tx=0).stop 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.355+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/3372197837 >> 192.168.123.103:0/3372197837 conn(0x7fd39c06c970 msgr2=0x7fd39c06eda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.356+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/3372197837 shutdown_connections 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.356+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/3372197837 wait complete. 
2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.356+0000 7fd3a2d5d700 1 Processor -- start 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.356+0000 7fd3a2d5d700 1 -- start start 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.356+0000 7fd3a2d5d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c1a88e0 0x7fd39c1a8cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.356+0000 7fd3a2d5d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd39c1a9230 con 0x7fd39c1a88e0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.357+0000 7fd3a0af9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c1a88e0 0x7fd39c1a8cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.357+0000 7fd3a0af9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c1a88e0 0x7fd39c1a8cf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58116/0 (socket says 192.168.123.103:58116) 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.357+0000 7fd3a0af9700 1 -- 192.168.123.103:0/1284511358 learned_addr learned my addr 192.168.123.103:0/1284511358 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:08.739 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.357+0000 7fd3a0af9700 1 -- 192.168.123.103:0/1284511358 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd38c009740 con 0x7fd39c1a88e0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.358+0000 7fd3a0af9700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c1a88e0 0x7fd39c1a8cf0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fd38c0038b0 tx=0x7fd38c011650 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.358+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd38c0119f0 con 0x7fd39c1a88e0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.358+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd38c011b50 con 0x7fd39c1a88e0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.358+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd38c025490 con 0x7fd39c1a88e0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.358+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/1284511358 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd39c1a9430 con 0x7fd39c1a88e0 2026-03-10T14:03:08.739 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.358+0000 7fd3a2d5d700 1 
-- 192.168.123.103:0/1284511358 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd39c07b250 con 0x7fd39c1a88e0 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.361+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7fd38c011cc0 con 0x7fd39c1a88e0 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.361+0000 7fd399ffb700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.361+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7fd38403b0d0 con 0x7fd384038510 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.361+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fd38c04c1f0 con 0x7fd39c1a88e0 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.361+0000 7fd39bfff700 1 -- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 msgr2=0x7fd38403a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.362+0000 7fd39bfff700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.562+0000 7fd39bfff700 1 -- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 msgr2=0x7fd38403a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.562+0000 7fd39bfff700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.962+0000 7fd39bfff700 1 -- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 msgr2=0x7fd38403a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:03.962+0000 7fd39bfff700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:04.763+0000 7fd39bfff700 1 -- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 msgr2=0x7fd38403a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:03:08.740 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:04.763+0000 7fd39bfff700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:06.365+0000 7fd39bfff700 1 -- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 msgr2=0x7fd38403a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:06.365+0000 7fd39bfff700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:07.675+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mgrmap(e 10) v1 ==== 44859+0+0 (secure 0 0 0) 0x7fd38c01b440 con 0x7fd39c1a88e0 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:07.675+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 msgr2=0x7fd38403a9c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:07.675+0000 7fd399ffb700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.678+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7fd38c01ba20 con 0x7fd39c1a88e0 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.678+0000 7fd399ffb700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.678+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7fd38403b0d0 con 0x7fd384038510 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.680+0000 7fd39bfff700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.680+0000 7fd39bfff700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fd390003a10 tx=0x7fd3900092b0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.681+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== 
command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7fd38403b0d0 con 0x7fd384038510 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.684+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/1284511358 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7fd39c1a9c60 con 0x7fd384038510 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd399ffb700 1 -- 192.168.123.103:0/1284511358 <== mgr.14164 v2:192.168.123.103:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7fd39c1a9c60 con 0x7fd384038510 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 msgr2=0x7fd38403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd3a2d5d700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fd390003a10 tx=0x7fd3900092b0 comp rx=0 tx=0).stop 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c1a88e0 msgr2=0x7fd39c1a8cf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd3a2d5d700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c1a88e0 0x7fd39c1a8cf0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 
crypto rx=0x7fd38c0038b0 tx=0x7fd38c011650 comp rx=0 tx=0).stop 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/1284511358 shutdown_connections 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd3a2d5d700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd384038510 0x7fd38403a9c0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd3a2d5d700 1 --2- 192.168.123.103:0/1284511358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd39c1a88e0 0x7fd39c1a8cf0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.685+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/1284511358 >> 192.168.123.103:0/1284511358 conn(0x7fd39c06c970 msgr2=0x7fd39c06df70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.686+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/1284511358 shutdown_connections 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.686+0000 7fd3a2d5d700 1 -- 192.168.123.103:0/1284511358 wait complete. 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:mgr epoch 9 is available 2026-03-10T14:03:08.740 INFO:teuthology.orchestra.run.vm03.stdout:Generating a dashboard self-signed certificate... 
2026-03-10T14:03:09.006 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:08 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:03:09.006 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:08 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:03:09.006 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:08 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:09.006 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:08 vm03 ceph-mon[49718]: mgrmap e11: vm03.rwbbep(active, since 1.12682s) 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Self-signed certificate created 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.884+0000 7f32ae296700 1 Processor -- start 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.884+0000 7f32ae296700 1 -- start start 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.884+0000 7f32ae296700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a8071840 0x7f32a8071c50 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.884+0000 7f32ae296700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f32a8072190 con 0x7f32a8071840 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.884+0000 7f32a7fff700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a8071840 0x7f32a8071c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.885+0000 7f32a7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a8071840 0x7f32a8071c50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33336/0 (socket says 192.168.123.103:33336) 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.885+0000 7f32a7fff700 1 -- 192.168.123.103:0/3476731391 learned_addr learned my addr 192.168.123.103:0/3476731391 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.885+0000 7f32a7fff700 1 -- 192.168.123.103:0/3476731391 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f32a80722d0 con 0x7f32a8071840 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.886+0000 7f32a7fff700 1 --2- 192.168.123.103:0/3476731391 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a8071840 0x7f32a8071c50 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f329800d180 tx=0x7f329800d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8f02526686d8a35d server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.100 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.886+0000 7f32a6ffd700 1 -- 192.168.123.103:0/3476731391 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3298010070 con 0x7f32a8071840 2026-03-10T14:03:09.100 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.886+0000 7f32a6ffd700 1 -- 192.168.123.103:0/3476731391 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3298004510 con 0x7f32a8071840 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.887+0000 7f32ae296700 1 -- 192.168.123.103:0/3476731391 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a8071840 msgr2=0x7f32a8071c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.887+0000 7f32ae296700 1 --2- 192.168.123.103:0/3476731391 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a8071840 0x7f32a8071c50 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f329800d180 tx=0x7f329800d490 comp rx=0 tx=0).stop 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.887+0000 7f32ae296700 1 -- 192.168.123.103:0/3476731391 shutdown_connections 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.887+0000 7f32ae296700 1 --2- 192.168.123.103:0/3476731391 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a8071840 0x7f32a8071c50 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.887+0000 7f32ae296700 1 -- 192.168.123.103:0/3476731391 >> 192.168.123.103:0/3476731391 conn(0x7f32a806cc30 msgr2=0x7f32a806f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.887+0000 7f32ae296700 1 -- 192.168.123.103:0/3476731391 shutdown_connections 2026-03-10T14:03:09.102 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.887+0000 7f32ae296700 1 -- 192.168.123.103:0/3476731391 wait complete. 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.887+0000 7f32ae296700 1 Processor -- start 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.888+0000 7f32ae296700 1 -- start start 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.888+0000 7f32ae296700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a81a0130 0x7f32a81a0540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.888+0000 7f32ae296700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3298003c20 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.888+0000 7f32a7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a81a0130 0x7f32a81a0540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.888+0000 7f32a7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a81a0130 0x7f32a81a0540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33348/0 (socket says 192.168.123.103:33348) 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.889+0000 7f32a7fff700 1 -- 192.168.123.103:0/1176952498 learned_addr 
learned my addr 192.168.123.103:0/1176952498 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.889+0000 7f32a7fff700 1 -- 192.168.123.103:0/1176952498 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f32980087c0 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.889+0000 7f32a7fff700 1 --2- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a81a0130 0x7f32a81a0540 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f3298008c10 tx=0x7f3298008cf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.890+0000 7f32a57fa700 1 -- 192.168.123.103:0/1176952498 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3298010050 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.890+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f32a81a0a80 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.890+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f32a81a1700 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.891+0000 7f32a57fa700 1 -- 192.168.123.103:0/1176952498 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f329800deb0 con 0x7f32a81a0130 
2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.891+0000 7f32a57fa700 1 -- 192.168.123.103:0/1176952498 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3298016440 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.892+0000 7f32a57fa700 1 -- 192.168.123.103:0/1176952498 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f3298016620 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.892+0000 7f32a57fa700 1 --2- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3290038420 0x7f329003a8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.893+0000 7f32a77fe700 1 --2- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3290038420 0x7f329003a8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.893+0000 7f32a77fe700 1 --2- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3290038420 0x7f329003a8d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f32a000ad80 tx=0x7f32a00093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.893+0000 7f32a57fa700 1 -- 192.168.123.103:0/1176952498 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 
(secure 0 0 0) 0x7f329804c8f0 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.893+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3294005320 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:08.897+0000 7f32a57fa700 1 -- 192.168.123.103:0/1176952498 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f329801b070 con 0x7f32a81a0130 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.007+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7f3294000bf0 con 0x7f3290038420 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.067+0000 7f32a57fa700 1 -- 192.168.123.103:0/1176952498 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f3294000bf0 con 0x7f3290038420 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.069+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3290038420 msgr2=0x7f329003a8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.069+0000 7f32ae296700 1 --2- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3290038420 0x7f329003a8d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto 
rx=0x7f32a000ad80 tx=0x7f32a00093f0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.069+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a81a0130 msgr2=0x7f32a81a0540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.069+0000 7f32ae296700 1 --2- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a81a0130 0x7f32a81a0540 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f3298008c10 tx=0x7f3298008cf0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.069+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 shutdown_connections 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.069+0000 7f32ae296700 1 --2- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3290038420 0x7f329003a8d0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.069+0000 7f32ae296700 1 --2- 192.168.123.103:0/1176952498 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f32a81a0130 0x7f32a81a0540 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.069+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 >> 192.168.123.103:0/1176952498 conn(0x7f32a806cc30 msgr2=0x7f32a806e710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.070+0000 
7f32ae296700 1 -- 192.168.123.103:0/1176952498 shutdown_connections 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.070+0000 7f32ae296700 1 -- 192.168.123.103:0/1176952498 wait complete. 2026-03-10T14:03:09.102 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial admin user... 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$f7YeAyLmUPsbQ6wk/yOauOtSGSt66DFBUvGzL1fz6lS3qxCubaVVy", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773151389, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true} 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.261+0000 7fe3053ef700 1 Processor -- start 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.261+0000 7fe3053ef700 1 -- start start 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.261+0000 7fe3053ef700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe300071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.261+0000 7fe3053ef700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe300071d60 con 0x7fe300071410 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.262+0000 7fe2fffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe300071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-10T14:03:09.262+0000 7fe2fffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe300071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33356/0 (socket says 192.168.123.103:33356) 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.262+0000 7fe2fffff700 1 -- 192.168.123.103:0/1250830420 learned_addr learned my addr 192.168.123.103:0/1250830420 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.262+0000 7fe2fffff700 1 -- 192.168.123.103:0/1250830420 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe300071ea0 con 0x7fe300071410 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.262+0000 7fe2fffff700 1 --2- 192.168.123.103:0/1250830420 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe300071820 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fe2f0009cf0 tx=0x7fe2f000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=747c5db5cfe8f857 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.263+0000 7fe2ff7fe700 1 -- 192.168.123.103:0/1250830420 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe2f0004030 con 0x7fe300071410 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.263+0000 7fe2ff7fe700 1 -- 192.168.123.103:0/1250830420 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe2f000b810 con 0x7fe300071410 2026-03-10T14:03:09.627 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.263+0000 7fe3053ef700 1 -- 192.168.123.103:0/1250830420 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 msgr2=0x7fe300071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.627 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.263+0000 7fe3053ef700 1 --2- 192.168.123.103:0/1250830420 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe300071820 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fe2f0009cf0 tx=0x7fe2f000b0e0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.264+0000 7fe3053ef700 1 -- 192.168.123.103:0/1250830420 shutdown_connections 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.264+0000 7fe3053ef700 1 --2- 192.168.123.103:0/1250830420 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe300071820 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.264+0000 7fe3053ef700 1 -- 192.168.123.103:0/1250830420 >> 192.168.123.103:0/1250830420 conn(0x7fe30006c9d0 msgr2=0x7fe30006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.264+0000 7fe3053ef700 1 -- 192.168.123.103:0/1250830420 shutdown_connections 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.264+0000 7fe3053ef700 1 -- 192.168.123.103:0/1250830420 wait complete. 
2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.264+0000 7fe3053ef700 1 Processor -- start 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.264+0000 7fe3053ef700 1 -- start start 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.265+0000 7fe3053ef700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe3001a0260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.265+0000 7fe3053ef700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe300071d60 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.265+0000 7fe2fffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe3001a0260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.265+0000 7fe2fffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe3001a0260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33370/0 (socket says 192.168.123.103:33370) 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.265+0000 7fe2fffff700 1 -- 192.168.123.103:0/736575686 learned_addr learned my addr 192.168.123.103:0/736575686 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:09.628 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.265+0000 7fe2fffff700 1 -- 192.168.123.103:0/736575686 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2f0009740 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.265+0000 7fe2fffff700 1 --2- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe3001a0260 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fe2f0006e90 tx=0x7fe2f0003d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.266+0000 7fe2fdffb700 1 -- 192.168.123.103:0/736575686 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe2f0003f40 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.266+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe3001a0800 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.266+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe3001a0ca0 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.269+0000 7fe2fdffb700 1 -- 192.168.123.103:0/736575686 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe2f0004580 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.269+0000 7fe2fdffb700 1 -- 
192.168.123.103:0/736575686 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe2f001ae60 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.269+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe30004efc0 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.272+0000 7fe2fdffb700 1 -- 192.168.123.103:0/736575686 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7fe2f001a430 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.272+0000 7fe2fdffb700 1 --2- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe2e0038390 0x7fe2e003a840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.273+0000 7fe2fdffb700 1 -- 192.168.123.103:0/736575686 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fe2f004bec0 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.273+0000 7fe2fdffb700 1 -- 192.168.123.103:0/736575686 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe2f002c390 con 0x7fe300071410 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.275+0000 7fe2f7fff700 1 --2- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe2e0038390 0x7fe2e003a840 unknown 
:-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.275+0000 7fe2f7fff700 1 --2- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe2e0038390 0x7fe2e003a840 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fe2e8009990 tx=0x7fe2e8006e30 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.410+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7fe3000621a0 con 0x7fe2e0038390 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.580+0000 7fe2fdffb700 1 -- 192.168.123.103:0/736575686 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7fe3000621a0 con 0x7fe2e0038390 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.582+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe2e0038390 msgr2=0x7fe2e003a840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.582+0000 7fe3053ef700 1 --2- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe2e0038390 0x7fe2e003a840 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fe2e8009990 tx=0x7fe2e8006e30 comp rx=0 tx=0).stop 
2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.582+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 msgr2=0x7fe3001a0260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.582+0000 7fe3053ef700 1 --2- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe3001a0260 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fe2f0006e90 tx=0x7fe2f0003d30 comp rx=0 tx=0).stop 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.583+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 shutdown_connections 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.583+0000 7fe3053ef700 1 --2- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe2e0038390 0x7fe2e003a840 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.583+0000 7fe3053ef700 1 --2- 192.168.123.103:0/736575686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe300071410 0x7fe3001a0260 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.583+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 >> 192.168.123.103:0/736575686 conn(0x7fe30006c9d0 msgr2=0x7fe30010ddc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.583+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 
shutdown_connections 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.583+0000 7fe3053ef700 1 -- 192.168.123.103:0/736575686 wait complete. 2026-03-10T14:03:09.628 INFO:teuthology.orchestra.run.vm03.stdout:Fetching dashboard port number... 2026-03-10T14:03:09.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: [10/Mar/2026:14:03:08] ENGINE Bus STARTING 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: [10/Mar/2026:14:03:08] ENGINE Serving on https://192.168.123.103:7150 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: [10/Mar/2026:14:03:08] ENGINE Serving on http://192.168.123.103:8765 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: [10/Mar/2026:14:03:08] ENGINE Bus STARTED 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:09 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 
2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 8443 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.765+0000 7fbbac723700 1 Processor -- start 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.765+0000 7fbbac723700 1 -- start start 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.765+0000 7fbbac723700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba4108330 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.765+0000 7fbbac723700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbba4108870 con 0x7fbba4107f20 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.766+0000 7fbbaa4bf700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba4108330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.766+0000 7fbbaa4bf700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba4108330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33386/0 (socket says 192.168.123.103:33386) 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.766+0000 7fbbaa4bf700 1 -- 192.168.123.103:0/2273433694 learned_addr learned my addr 192.168.123.103:0/2273433694 (peer_addr_for_me 
v2:192.168.123.103:0/0) 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.766+0000 7fbbaa4bf700 1 -- 192.168.123.103:0/2273433694 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbba41089b0 con 0x7fbba4107f20 2026-03-10T14:03:09.933 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.766+0000 7fbbaa4bf700 1 --2- 192.168.123.103:0/2273433694 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba4108330 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fbba0009cf0 tx=0x7fbba000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=6f1d17ce019905c4 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.767+0000 7fbba94bd700 1 -- 192.168.123.103:0/2273433694 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbba0004030 con 0x7fbba4107f20 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.767+0000 7fbba94bd700 1 -- 192.168.123.103:0/2273433694 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbba000b810 con 0x7fbba4107f20 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.767+0000 7fbbac723700 1 -- 192.168.123.103:0/2273433694 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 msgr2=0x7fbba4108330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.767+0000 7fbbac723700 1 --2- 192.168.123.103:0/2273433694 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba4108330 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fbba0009cf0 
tx=0x7fbba000b0e0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 -- 192.168.123.103:0/2273433694 shutdown_connections 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 --2- 192.168.123.103:0/2273433694 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba4108330 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 -- 192.168.123.103:0/2273433694 >> 192.168.123.103:0/2273433694 conn(0x7fbba407b4b0 msgr2=0x7fbba407b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 -- 192.168.123.103:0/2273433694 shutdown_connections 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 -- 192.168.123.103:0/2273433694 wait complete. 
2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 Processor -- start 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 -- start start 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba419c040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.768+0000 7fbbac723700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbba4108870 con 0x7fbba4107f20 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.769+0000 7fbbaa4bf700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba419c040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.769+0000 7fbbaa4bf700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba419c040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33390/0 (socket says 192.168.123.103:33390) 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.769+0000 7fbbaa4bf700 1 -- 192.168.123.103:0/570028214 learned_addr learned my addr 192.168.123.103:0/570028214 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:09.934 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.769+0000 7fbbaa4bf700 1 -- 192.168.123.103:0/570028214 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbba0009740 con 0x7fbba4107f20 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.769+0000 7fbbaa4bf700 1 --2- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba419c040 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fbba000bf40 tx=0x7fbba0003d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.770+0000 7fbb9b7fe700 1 -- 192.168.123.103:0/570028214 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbba0003f40 con 0x7fbba4107f20 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.770+0000 7fbb9b7fe700 1 -- 192.168.123.103:0/570028214 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbba0004580 con 0x7fbba4107f20 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.770+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbba419c580 con 0x7fbba4107f20 2026-03-10T14:03:09.934 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.771+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbba419ca20 con 0x7fbba4107f20 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.771+0000 7fbb9b7fe700 1 -- 
192.168.123.103:0/570028214 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbba001ae60 con 0x7fbba4107f20 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.772+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbba4195f60 con 0x7fbba4107f20 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.775+0000 7fbb9b7fe700 1 -- 192.168.123.103:0/570028214 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7fbba0011460 con 0x7fbba4107f20 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.775+0000 7fbb9b7fe700 1 --2- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbb900380c0 0x7fbb9003a570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.775+0000 7fbb9b7fe700 1 -- 192.168.123.103:0/570028214 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fbba001a430 con 0x7fbba4107f20 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.776+0000 7fbba9cbe700 1 --2- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbb900380c0 0x7fbb9003a570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.776+0000 7fbba9cbe700 1 --2- 192.168.123.103:0/570028214 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbb900380c0 0x7fbb9003a570 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fbb94006fd0 tx=0x7fbb94006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.776+0000 7fbb9b7fe700 1 -- 192.168.123.103:0/570028214 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbba0052050 con 0x7fbba4107f20 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.880+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7fbba404f9e0 con 0x7fbba4107f20 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.881+0000 7fbb9b7fe700 1 -- 192.168.123.103:0/570028214 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7fbba0023940 con 0x7fbba4107f20 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.883+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbb900380c0 msgr2=0x7fbb9003a570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.883+0000 7fbbac723700 1 --2- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbb900380c0 0x7fbb9003a570 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fbb94006fd0 tx=0x7fbb94006e40 comp 
rx=0 tx=0).stop 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.883+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 msgr2=0x7fbba419c040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.883+0000 7fbbac723700 1 --2- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba419c040 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fbba000bf40 tx=0x7fbba0003d30 comp rx=0 tx=0).stop 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.883+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 shutdown_connections 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.883+0000 7fbbac723700 1 --2- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbb900380c0 0x7fbb9003a570 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.883+0000 7fbbac723700 1 --2- 192.168.123.103:0/570028214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbba4107f20 0x7fbba419c040 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.883+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 >> 192.168.123.103:0/570028214 conn(0x7fbba407b4b0 msgr2=0x7fbba4106da0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.884+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 
shutdown_connections 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:09.884+0000 7fbbac723700 1 -- 192.168.123.103:0/570028214 wait complete. 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-10T14:03:09.935 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to open ports <[8443]>. firewalld.service is not available 2026-03-10T14:03:09.936 INFO:teuthology.orchestra.run.vm03.stdout:Ceph Dashboard is now available at: 2026-03-10T14:03:09.936 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:09.936 INFO:teuthology.orchestra.run.vm03.stdout: URL: https://vm03.local:8443/ 2026-03-10T14:03:09.936 INFO:teuthology.orchestra.run.vm03.stdout: User: admin 2026-03-10T14:03:09.936 INFO:teuthology.orchestra.run.vm03.stdout: Password: 04onx8sgum 2026-03-10T14:03:09.936 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:09.936 INFO:teuthology.orchestra.run.vm03.stdout:Saving cluster configuration to /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config directory 2026-03-10T14:03:09.936 INFO:teuthology.orchestra.run.vm03.stdout:Enabling autotune for osd_memory_target 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.065+0000 7f7dff12b700 1 Processor -- start 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.066+0000 7f7dff12b700 1 -- start start 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.066+0000 7f7dff12b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df8108330 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.066+0000 7f7dff12b700 1 -- --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7df8108870 con 0x7f7df8107f20 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.067+0000 7f7dfcec7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df8108330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.068+0000 7f7dfcec7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df8108330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33396/0 (socket says 192.168.123.103:33396) 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.068+0000 7f7dfcec7700 1 -- 192.168.123.103:0/2830249555 learned_addr learned my addr 192.168.123.103:0/2830249555 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.071+0000 7f7dfcec7700 1 -- 192.168.123.103:0/2830249555 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7df81089b0 con 0x7f7df8107f20 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.072+0000 7f7dfcec7700 1 --2- 192.168.123.103:0/2830249555 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df8108330 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f7de8009cf0 tx=0x7f7de800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=30cb2a3e9158e6ea server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:10.248 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.072+0000 7f7df77fe700 1 -- 192.168.123.103:0/2830249555 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7de8004030 con 0x7f7df8107f20 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.072+0000 7f7df77fe700 1 -- 192.168.123.103:0/2830249555 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7de800b810 con 0x7f7df8107f20 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.073+0000 7f7dff12b700 1 -- 192.168.123.103:0/2830249555 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 msgr2=0x7f7df8108330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.073+0000 7f7dff12b700 1 --2- 192.168.123.103:0/2830249555 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df8108330 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f7de8009cf0 tx=0x7f7de800b0e0 comp rx=0 tx=0).stop 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.073+0000 7f7dff12b700 1 -- 192.168.123.103:0/2830249555 shutdown_connections 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.073+0000 7f7dff12b700 1 --2- 192.168.123.103:0/2830249555 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df8108330 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.073+0000 7f7dff12b700 1 -- 192.168.123.103:0/2830249555 >> 192.168.123.103:0/2830249555 conn(0x7f7df807b4b0 msgr2=0x7f7df807b8d0 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.073+0000 7f7dff12b700 1 -- 192.168.123.103:0/2830249555 shutdown_connections 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.073+0000 7f7dff12b700 1 -- 192.168.123.103:0/2830249555 wait complete. 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.074+0000 7f7dff12b700 1 Processor -- start 2026-03-10T14:03:10.248 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.074+0000 7f7dff12b700 1 -- start start 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.074+0000 7f7dff12b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df819c040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.074+0000 7f7dff12b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7df8108870 con 0x7f7df8107f20 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.074+0000 7f7dfcec7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df819c040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.074+0000 7f7dfcec7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df819c040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says 
I am v2:192.168.123.103:33406/0 (socket says 192.168.123.103:33406) 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.074+0000 7f7dfcec7700 1 -- 192.168.123.103:0/1259425877 learned_addr learned my addr 192.168.123.103:0/1259425877 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.075+0000 7f7dfcec7700 1 -- 192.168.123.103:0/1259425877 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7de8009740 con 0x7f7df8107f20 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.075+0000 7f7dfcec7700 1 --2- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df819c040 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f7de800bf40 tx=0x7f7de8003d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.075+0000 7f7df5ffb700 1 -- 192.168.123.103:0/1259425877 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7de8003f40 con 0x7f7df8107f20 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.075+0000 7f7df5ffb700 1 -- 192.168.123.103:0/1259425877 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7de8004580 con 0x7f7df8107f20 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.075+0000 7f7df5ffb700 1 -- 192.168.123.103:0/1259425877 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7de801ae60 con 0x7f7df8107f20 2026-03-10T14:03:10.249 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.075+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7df819c580 con 0x7f7df8107f20 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.075+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7df819ca20 con 0x7f7df8107f20 2026-03-10T14:03:10.249 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.076+0000 7f7df5ffb700 1 -- 192.168.123.103:0/1259425877 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f7de801a430 con 0x7f7df8107f20 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.077+0000 7f7df5ffb700 1 --2- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7de00384e0 0x7f7de003a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.077+0000 7f7df5ffb700 1 -- 192.168.123.103:0/1259425877 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f7de804be00 con 0x7f7df8107f20 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.077+0000 7f7df7fff700 1 --2- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7de00384e0 0x7f7de003a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.077+0000 
7f7dff12b700 1 -- 192.168.123.103:0/1259425877 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7df8195f60 con 0x7f7df8107f20 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.080+0000 7f7df7fff700 1 --2- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7de00384e0 0x7f7de003a990 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f7dec006fd0 tx=0x7f7dec006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.081+0000 7f7df5ffb700 1 -- 192.168.123.103:0/1259425877 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7de802b990 con 0x7f7df8107f20 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.180+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f7df819ccd0 con 0x7f7df8107f20 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.181+0000 7f7df5ffb700 1 -- 192.168.123.103:0/1259425877 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f7de802b340 con 0x7f7df8107f20 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7de00384e0 msgr2=0x7f7de003a990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 --2- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7de00384e0 0x7f7de003a990 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f7dec006fd0 tx=0x7f7dec006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 msgr2=0x7f7df819c040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 --2- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df819c040 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f7de800bf40 tx=0x7f7de8003d30 comp rx=0 tx=0).stop 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 shutdown_connections 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 --2- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7de00384e0 0x7f7de003a990 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 --2- 192.168.123.103:0/1259425877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7df8107f20 0x7f7df819c040 secure :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f7de800bf40 tx=0x7f7de8003d30 comp rx=0 tx=0).stop 2026-03-10T14:03:10.250 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 >> 192.168.123.103:0/1259425877 conn(0x7f7df807b4b0 msgr2=0x7f7df8106da0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 shutdown_connections 2026-03-10T14:03:10.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.184+0000 7f7dff12b700 1 -- 192.168.123.103:0/1259425877 wait complete. 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.372+0000 7f57879af700 1 Processor -- start 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.373+0000 7f57879af700 1 -- start start 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.373+0000 7f57879af700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780103cb0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.373+0000 7f57879af700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5780104280 con 0x7f57801038a0 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.373+0000 7f578574b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780103cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.374+0000 7f578574b700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780103cb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33420/0 (socket says 192.168.123.103:33420) 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.374+0000 7f578574b700 1 -- 192.168.123.103:0/117519670 learned_addr learned my addr 192.168.123.103:0/117519670 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.374+0000 7f578574b700 1 -- 192.168.123.103:0/117519670 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57801043c0 con 0x7f57801038a0 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.374+0000 7f578574b700 1 --2- 192.168.123.103:0/117519670 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780103cb0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f577c009cf0 tx=0x7f577c00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c7d3c20b1f8515b server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.375+0000 7f5777fff700 1 -- 192.168.123.103:0/117519670 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f577c004030 con 0x7f57801038a0 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.375+0000 7f5777fff700 1 -- 192.168.123.103:0/117519670 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f577c00b810 con 0x7f57801038a0 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.375+0000 
7f5777fff700 1 -- 192.168.123.103:0/117519670 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f577c003b10 con 0x7f57801038a0 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.375+0000 7f57879af700 1 -- 192.168.123.103:0/117519670 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 msgr2=0x7f5780103cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.375+0000 7f57879af700 1 --2- 192.168.123.103:0/117519670 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780103cb0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f577c009cf0 tx=0x7f577c00b0e0 comp rx=0 tx=0).stop 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.376+0000 7f57879af700 1 -- 192.168.123.103:0/117519670 shutdown_connections 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.376+0000 7f57879af700 1 --2- 192.168.123.103:0/117519670 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780103cb0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.376+0000 7f57879af700 1 -- 192.168.123.103:0/117519670 >> 192.168.123.103:0/117519670 conn(0x7f57800ff5f0 msgr2=0x7f5780101a00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.376+0000 7f57879af700 1 -- 192.168.123.103:0/117519670 shutdown_connections 2026-03-10T14:03:10.599 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.376+0000 7f57879af700 1 -- 192.168.123.103:0/117519670 
wait complete. 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.376+0000 7f57879af700 1 Processor -- start 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.377+0000 7f57879af700 1 -- start start 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.377+0000 7f57879af700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780197870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.377+0000 7f57879af700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5780197db0 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.377+0000 7f578574b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780197870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.377+0000 7f578574b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780197870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33428/0 (socket says 192.168.123.103:33428) 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.377+0000 7f578574b700 1 -- 192.168.123.103:0/3443260979 learned_addr learned my addr 192.168.123.103:0/3443260979 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:10.600 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.378+0000 7f578574b700 1 -- 192.168.123.103:0/3443260979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f577c009740 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.378+0000 7f578574b700 1 --2- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780197870 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f577c000c00 tx=0x7f577c011890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.378+0000 7f57767fc700 1 -- 192.168.123.103:0/3443260979 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f577c011bc0 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.378+0000 7f57767fc700 1 -- 192.168.123.103:0/3443260979 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f577c011d20 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.378+0000 7f57767fc700 1 -- 192.168.123.103:0/3443260979 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f577c01a590 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.378+0000 7f57879af700 1 -- 192.168.123.103:0/3443260979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5780197fb0 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.378+0000 7f57879af700 1 
-- 192.168.123.103:0/3443260979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5780198450 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.380+0000 7f57767fc700 1 -- 192.168.123.103:0/3443260979 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f577c01b440 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.380+0000 7f57879af700 1 -- 192.168.123.103:0/3443260979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5780191080 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.380+0000 7f57767fc700 1 --2- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f576c038500 0x7f576c03a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.380+0000 7f57767fc700 1 -- 192.168.123.103:0/3443260979 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f577c04bf20 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.380+0000 7f5784f4a700 1 --2- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f576c038500 0x7f576c03a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.383+0000 7f57767fc700 1 -- 192.168.123.103:0/3443260979 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f577c01a6f0 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.383+0000 7f5784f4a700 1 --2- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f576c038500 0x7f576c03a9b0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f5770006fd0 tx=0x7f5770006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.548+0000 7f57879af700 1 -- 192.168.123.103:0/3443260979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7f5780062380 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.551+0000 7f57767fc700 1 -- 192.168.123.103:0/3443260979 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7f577c01f020 con 0x7f57801038a0 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.552+0000 7f57879af700 1 -- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f576c038500 msgr2=0x7f576c03a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.552+0000 7f57879af700 1 --2- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f576c038500 0x7f576c03a9b0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto 
rx=0x7f5770006fd0 tx=0x7f5770006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.552+0000 7f57879af700 1 -- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 msgr2=0x7f5780197870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.552+0000 7f57879af700 1 --2- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780197870 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f577c000c00 tx=0x7f577c011890 comp rx=0 tx=0).stop 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.553+0000 7f57879af700 1 -- 192.168.123.103:0/3443260979 shutdown_connections 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.553+0000 7f57879af700 1 --2- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f576c038500 0x7f576c03a9b0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.553+0000 7f57879af700 1 --2- 192.168.123.103:0/3443260979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57801038a0 0x7f5780197870 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.553+0000 7f57879af700 1 -- 192.168.123.103:0/3443260979 >> 192.168.123.103:0/3443260979 conn(0x7f57800ff5f0 msgr2=0x7f578018bc60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.553+0000 
7f57879af700 1 -- 192.168.123.103:0/3443260979 shutdown_connections 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-10T14:03:10.553+0000 7f57879af700 1 -- 192.168.123.103:0/3443260979 wait complete. 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:Or, if you are only running a single cluster on this host: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: ceph telemetry on 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout:For more information see: 2026-03-10T14:03:10.600 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:10.601 INFO:teuthology.orchestra.run.vm03.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-10T14:03:10.601 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:10.601 INFO:teuthology.orchestra.run.vm03.stdout:Bootstrap 
complete. 2026-03-10T14:03:10.636 INFO:tasks.cephadm:Fetching config... 2026-03-10T14:03:10.636 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:03:10.636 DEBUG:teuthology.orchestra.run.vm03:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-10T14:03:10.655 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-10T14:03:10.655 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:03:10.655 DEBUG:teuthology.orchestra.run.vm03:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-10T14:03:10.733 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-10T14:03:10.733 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:03:10.733 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/keyring of=/dev/stdout 2026-03-10T14:03:10.801 INFO:tasks.cephadm:Fetching pub ssh key... 2026-03-10T14:03:10.801 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:03:10.801 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-10T14:03:10.863 INFO:tasks.cephadm:Installing pub ssh key for root users... 
2026-03-10T14:03:10.863 DEBUG:teuthology.orchestra.run.vm03:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDJxXup9gGfn3tPtASG1Q+juHRUnudNH1ynOUCAqvqcGWEOBs3CBYAr3KDP7hI82XXy5AQFMPyrgtNX0jTpNqIpHcBEC7FlVT8vA2QItERgZ1zK3EGoE6ot8q3zVLJx0WQ4PhzS9GSg5NoKWdms/R2TiStmGf2lpiG5WLOPLikvyCdLO8JCj7n+7Hnhy/Wd+kSFcXF07O/wEu5xhu7zfVmFP0nIj0Tbdn4oFsiqrFtfTrjsKzkrxUPp49Itu06pFSxP7dqAFNZ14vXwgR8VJOejjvPWYrkZmqYFFpfrMVZREZdqp44G3AnXdrdlU5PenydGBws7AkjBrftQSwfBfq0qP7MV/35WPC9mFYenxkOPSMajMNn86HTdL6sK2yQduSXPGHVfko401HGCIITmHuZnTgbjn+idMT5lGA24KftpS6vqjA1w+U37oJepadCJTBw4jcFAJ5ZEaep72Dn+wN2waOIDX1HtNEtN+yqnEF5ASRf3st0ubGRxxZbMoYajQRE= ceph-b81bf660-1c89-11f1-b612-27d302cdb124' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T14:03:10.967 INFO:teuthology.orchestra.run.vm03.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDJxXup9gGfn3tPtASG1Q+juHRUnudNH1ynOUCAqvqcGWEOBs3CBYAr3KDP7hI82XXy5AQFMPyrgtNX0jTpNqIpHcBEC7FlVT8vA2QItERgZ1zK3EGoE6ot8q3zVLJx0WQ4PhzS9GSg5NoKWdms/R2TiStmGf2lpiG5WLOPLikvyCdLO8JCj7n+7Hnhy/Wd+kSFcXF07O/wEu5xhu7zfVmFP0nIj0Tbdn4oFsiqrFtfTrjsKzkrxUPp49Itu06pFSxP7dqAFNZ14vXwgR8VJOejjvPWYrkZmqYFFpfrMVZREZdqp44G3AnXdrdlU5PenydGBws7AkjBrftQSwfBfq0qP7MV/35WPC9mFYenxkOPSMajMNn86HTdL6sK2yQduSXPGHVfko401HGCIITmHuZnTgbjn+idMT5lGA24KftpS6vqjA1w+U37oJepadCJTBw4jcFAJ5ZEaep72Dn+wN2waOIDX1HtNEtN+yqnEF5ASRf3st0ubGRxxZbMoYajQRE= ceph-b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:03:10.980 DEBUG:teuthology.orchestra.run.vm04:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDJxXup9gGfn3tPtASG1Q+juHRUnudNH1ynOUCAqvqcGWEOBs3CBYAr3KDP7hI82XXy5AQFMPyrgtNX0jTpNqIpHcBEC7FlVT8vA2QItERgZ1zK3EGoE6ot8q3zVLJx0WQ4PhzS9GSg5NoKWdms/R2TiStmGf2lpiG5WLOPLikvyCdLO8JCj7n+7Hnhy/Wd+kSFcXF07O/wEu5xhu7zfVmFP0nIj0Tbdn4oFsiqrFtfTrjsKzkrxUPp49Itu06pFSxP7dqAFNZ14vXwgR8VJOejjvPWYrkZmqYFFpfrMVZREZdqp44G3AnXdrdlU5PenydGBws7AkjBrftQSwfBfq0qP7MV/35WPC9mFYenxkOPSMajMNn86HTdL6sK2yQduSXPGHVfko401HGCIITmHuZnTgbjn+idMT5lGA24KftpS6vqjA1w+U37oJepadCJTBw4jcFAJ5ZEaep72Dn+wN2waOIDX1HtNEtN+yqnEF5ASRf3st0ubGRxxZbMoYajQRE= ceph-b81bf660-1c89-11f1-b612-27d302cdb124' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T14:03:11.014 INFO:teuthology.orchestra.run.vm04.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDJxXup9gGfn3tPtASG1Q+juHRUnudNH1ynOUCAqvqcGWEOBs3CBYAr3KDP7hI82XXy5AQFMPyrgtNX0jTpNqIpHcBEC7FlVT8vA2QItERgZ1zK3EGoE6ot8q3zVLJx0WQ4PhzS9GSg5NoKWdms/R2TiStmGf2lpiG5WLOPLikvyCdLO8JCj7n+7Hnhy/Wd+kSFcXF07O/wEu5xhu7zfVmFP0nIj0Tbdn4oFsiqrFtfTrjsKzkrxUPp49Itu06pFSxP7dqAFNZ14vXwgR8VJOejjvPWYrkZmqYFFpfrMVZREZdqp44G3AnXdrdlU5PenydGBws7AkjBrftQSwfBfq0qP7MV/35WPC9mFYenxkOPSMajMNn86HTdL6sK2yQduSXPGHVfko401HGCIITmHuZnTgbjn+idMT5lGA24KftpS6vqjA1w+U37oJepadCJTBw4jcFAJ5ZEaep72Dn+wN2waOIDX1HtNEtN+yqnEF5ASRf3st0ubGRxxZbMoYajQRE= ceph-b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:03:11.025 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-10T14:03:11.049 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:10 vm03 ceph-mon[49718]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:11.049 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:10 vm03 ceph-mon[49718]: from='client.14178 -' 
entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:11.049 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:10 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/570028214' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-10T14:03:11.049 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:10 vm03 ceph-mon[49718]: mgrmap e12: vm03.rwbbep(active, since 2s) 2026-03-10T14:03:11.049 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:10 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3443260979' entity='client.admin' 2026-03-10T14:03:11.175 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.465+0000 7f7671b9a700 1 -- 192.168.123.103:0/3819807332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c0fdc50 msgr2=0x7f766c0fe060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.465+0000 7f7671b9a700 1 --2- 192.168.123.103:0/3819807332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c0fdc50 0x7f766c0fe060 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f765c009b50 tx=0x7f765c009e60 comp rx=0 tx=0).stop 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.465+0000 7f7671b9a700 1 -- 192.168.123.103:0/3819807332 shutdown_connections 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.465+0000 7f7671b9a700 1 --2- 192.168.123.103:0/3819807332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c0fdc50 0x7f766c0fe060 unknown :-1 s=CLOSED 
pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.465+0000 7f7671b9a700 1 -- 192.168.123.103:0/3819807332 >> 192.168.123.103:0/3819807332 conn(0x7f766c0f9800 msgr2=0x7f766c0fbc30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.466+0000 7f7671b9a700 1 -- 192.168.123.103:0/3819807332 shutdown_connections 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.466+0000 7f7671b9a700 1 -- 192.168.123.103:0/3819807332 wait complete. 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.466+0000 7f7671b9a700 1 Processor -- start 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.467+0000 7f7671b9a700 1 -- start start 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.467+0000 7f7671b9a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c197650 0x7f766c197a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:11.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.467+0000 7f7671b9a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f766c197fa0 con 0x7f766c197650 2026-03-10T14:03:11.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.467+0000 7f766b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c197650 0x7f766c197a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:11.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.467+0000 7f766b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c197650 0x7f766c197a60 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33448/0 (socket says 192.168.123.103:33448) 2026-03-10T14:03:11.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.467+0000 7f766b7fe700 1 -- 192.168.123.103:0/3761417239 learned_addr learned my addr 192.168.123.103:0/3761417239 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:11.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.467+0000 7f766b7fe700 1 -- 192.168.123.103:0/3761417239 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f765c0097e0 con 0x7f766c197650 2026-03-10T14:03:11.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.468+0000 7f766b7fe700 1 --2- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c197650 0x7f766c197a60 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f765c000c00 tx=0x7f765c0050d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:11.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.468+0000 7f76697fa700 1 -- 192.168.123.103:0/3761417239 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f765c01c070 con 0x7f766c197650 2026-03-10T14:03:11.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.468+0000 7f76697fa700 1 -- 192.168.123.103:0/3761417239 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f765c0056d0 con 0x7f766c197650 2026-03-10T14:03:11.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.468+0000 7f76697fa700 1 -- 192.168.123.103:0/3761417239 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f765c021e60 con 0x7f766c197650 2026-03-10T14:03:11.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.468+0000 
7f7671b9a700 1 -- 192.168.123.103:0/3761417239 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f766c1981a0 con 0x7f766c197650 2026-03-10T14:03:11.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.468+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f766c19ae00 con 0x7f766c197650 2026-03-10T14:03:11.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.469+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f766c191560 con 0x7f766c197650 2026-03-10T14:03:11.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.472+0000 7f76697fa700 1 -- 192.168.123.103:0/3761417239 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f765c021800 con 0x7f766c197650 2026-03-10T14:03:11.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.472+0000 7f76697fa700 1 --2- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f76580384c0 0x7f765803a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:11.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.472+0000 7f76697fa700 1 -- 192.168.123.103:0/3761417239 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f765c04c8f0 con 0x7f766c197650 2026-03-10T14:03:11.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.472+0000 7f76697fa700 1 -- 192.168.123.103:0/3761417239 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f765c0787c0 con 0x7f766c197650 2026-03-10T14:03:11.472 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.473+0000 7f7663fff700 1 --2- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f76580384c0 0x7f765803a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:11.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.473+0000 7f7663fff700 1 --2- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f76580384c0 0x7f765803a970 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f7654006fd0 tx=0x7f7654006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:11.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.595+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7f766c02ce30 con 0x7f766c197650 2026-03-10T14:03:11.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.604+0000 7f76697fa700 1 -- 192.168.123.103:0/3761417239 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7f765c026020 con 0x7f766c197650 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.609+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f76580384c0 msgr2=0x7f765803a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.609+0000 7f7671b9a700 1 --2- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c197650 0x7f766c197a60 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto 
rx=0x7f7654006fd0 tx=0x7f7654006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.609+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c197650 msgr2=0x7f766c197a60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.609+0000 7f7671b9a700 1 --2- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c197650 0x7f766c197a60 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f765c000c00 tx=0x7f765c0050d0 comp rx=0 tx=0).stop 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.610+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 shutdown_connections 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.610+0000 7f7671b9a700 1 --2- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f76580384c0 0x7f765803a970 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.610+0000 7f7671b9a700 1 --2- 192.168.123.103:0/3761417239 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f766c197650 0x7f766c197a60 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.610+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 >> 192.168.123.103:0/3761417239 conn(0x7f766c0f9800 msgr2=0x7f766c18e6a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:11.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.610+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 shutdown_connections 2026-03-10T14:03:11.608 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:11.610+0000 7f7671b9a700 1 -- 192.168.123.103:0/3761417239 wait complete. 2026-03-10T14:03:11.684 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-10T14:03:11.684 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-10T14:03:11.883 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:03:12.169 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.169+0000 7f2d9177a700 1 -- 192.168.123.103:0/421708671 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 msgr2=0x7f2d8c1024f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:12.169 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.169+0000 7f2d89ffb700 1 -- 192.168.123.103:0/421708671 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2d74004030 con 0x7f2d8c1020e0 2026-03-10T14:03:12.169 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.169+0000 7f2d9177a700 1 --2- 192.168.123.103:0/421708671 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 0x7f2d8c1024f0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f2d74009b00 tx=0x7f2d74009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 -- 192.168.123.103:0/421708671 shutdown_connections 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 --2- 192.168.123.103:0/421708671 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 0x7f2d8c1024f0 
unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 -- 192.168.123.103:0/421708671 >> 192.168.123.103:0/421708671 conn(0x7f2d8c0fd8d0 msgr2=0x7f2d8c0ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 -- 192.168.123.103:0/421708671 shutdown_connections 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 -- 192.168.123.103:0/421708671 wait complete. 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 Processor -- start 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 -- start start 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 0x7f2d8c1972c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:12.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.171+0000 7f2d9177a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d8c197800 con 0x7f2d8c1020e0 2026-03-10T14:03:12.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.172+0000 7f2d8affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 0x7f2d8c1972c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:12.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.172+0000 7f2d8affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 0x7f2d8c1972c0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33464/0 (socket says 192.168.123.103:33464) 2026-03-10T14:03:12.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.172+0000 7f2d8affd700 1 -- 192.168.123.103:0/3580587627 learned_addr learned my addr 192.168.123.103:0/3580587627 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:12.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.172+0000 7f2d8affd700 1 -- 192.168.123.103:0/3580587627 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2d740097e0 con 0x7f2d8c1020e0 2026-03-10T14:03:12.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.172+0000 7f2d8affd700 1 --2- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 0x7f2d8c1972c0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f2d7400fa00 tx=0x7f2d7400fae0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:12.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.172+0000 7f2d83fff700 1 -- 192.168.123.103:0/3580587627 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2d7401d070 con 0x7f2d8c1020e0 2026-03-10T14:03:12.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.173+0000 7f2d83fff700 1 -- 192.168.123.103:0/3580587627 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2d74005dc0 con 0x7f2d8c1020e0 2026-03-10T14:03:12.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.173+0000 7f2d9177a700 1 -- 192.168.123.103:0/3580587627 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2d8c197a00 con 0x7f2d8c1020e0 2026-03-10T14:03:12.171 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.173+0000 7f2d9177a700 1 -- 192.168.123.103:0/3580587627 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2d8c197ea0 con 0x7f2d8c1020e0 2026-03-10T14:03:12.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.174+0000 7f2d83fff700 1 -- 192.168.123.103:0/3580587627 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2d74017700 con 0x7f2d8c1020e0 2026-03-10T14:03:12.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.174+0000 7f2d83fff700 1 -- 192.168.123.103:0/3580587627 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f2d74017920 con 0x7f2d8c1020e0 2026-03-10T14:03:12.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.174+0000 7f2d83fff700 1 --2- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2d78038180 0x7f2d7803a630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:12.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.174+0000 7f2d83fff700 1 -- 192.168.123.103:0/3580587627 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f2d7404c310 con 0x7f2d8c1020e0 2026-03-10T14:03:12.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.174+0000 7f2d8a7fc700 1 --2- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2d78038180 0x7f2d7803a630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:12.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.175+0000 7f2d8a7fc700 1 --2- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2d78038180 0x7f2d7803a630 secure :-1 
s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f2d7c006fd0 tx=0x7f2d7c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:12.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.175+0000 7f2d9177a700 1 -- 192.168.123.103:0/3580587627 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2d6c005320 con 0x7f2d8c1020e0 2026-03-10T14:03:12.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.179+0000 7f2d83fff700 1 -- 192.168.123.103:0/3580587627 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2d74027030 con 0x7f2d8c1020e0 2026-03-10T14:03:12.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.285+0000 7f2d9177a700 1 -- 192.168.123.103:0/3580587627 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7f2d6c000bf0 con 0x7f2d78038180 2026-03-10T14:03:12.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.289+0000 7f2d83fff700 1 -- 192.168.123.103:0/3580587627 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f2d6c000bf0 con 0x7f2d78038180 2026-03-10T14:03:12.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.291+0000 7f2d81ffb700 1 -- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2d78038180 msgr2=0x7f2d7803a630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:12.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.291+0000 7f2d81ffb700 1 --2- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2d78038180 0x7f2d7803a630 secure :-1 
s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f2d7c006fd0 tx=0x7f2d7c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:12.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.291+0000 7f2d81ffb700 1 -- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 msgr2=0x7f2d8c1972c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:12.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.291+0000 7f2d81ffb700 1 --2- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 0x7f2d8c1972c0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f2d7400fa00 tx=0x7f2d7400fae0 comp rx=0 tx=0).stop 2026-03-10T14:03:12.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.292+0000 7f2d81ffb700 1 -- 192.168.123.103:0/3580587627 shutdown_connections 2026-03-10T14:03:12.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.292+0000 7f2d81ffb700 1 --2- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2d78038180 0x7f2d7803a630 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:12.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.292+0000 7f2d81ffb700 1 --2- 192.168.123.103:0/3580587627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d8c1020e0 0x7f2d8c1972c0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:12.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.292+0000 7f2d81ffb700 1 -- 192.168.123.103:0/3580587627 >> 192.168.123.103:0/3580587627 conn(0x7f2d8c0fd8d0 msgr2=0x7f2d8c0ff4d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:12.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.292+0000 7f2d81ffb700 1 -- 192.168.123.103:0/3580587627 shutdown_connections 2026-03-10T14:03:12.291 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:12.292+0000 7f2d81ffb700 1 -- 192.168.123.103:0/3580587627 wait complete. 2026-03-10T14:03:12.340 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm04 2026-03-10T14:03:12.340 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:03:12.340 DEBUG:teuthology.orchestra.run.vm04:> dd of=/etc/ceph/ceph.conf 2026-03-10T14:03:12.356 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:03:12.356 DEBUG:teuthology.orchestra.run.vm04:> dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-10T14:03:12.411 INFO:tasks.cephadm:Adding host vm04 to orchestrator... 2026-03-10T14:03:12.411 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch host add vm04 2026-03-10T14:03:12.569 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:03:12.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:12 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/3761417239' entity='client.admin' 2026-03-10T14:03:12.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:12 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:12.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:12 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:12.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:12 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:12.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:12 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:03:12.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:12 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:03:12.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:12 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:03:13.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.185+0000 7fca60c05700 1 -- 192.168.123.103:0/3258467722 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 msgr2=0x7fca5c0717d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:13.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.185+0000 7fca60c05700 1 --2- 192.168.123.103:0/3258467722 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 0x7fca5c0717d0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fca4c009b50 tx=0x7fca4c009e60 comp rx=0 tx=0).stop 2026-03-10T14:03:13.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.186+0000 
7fca60c05700 1 -- 192.168.123.103:0/3258467722 shutdown_connections 2026-03-10T14:03:13.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.186+0000 7fca60c05700 1 --2- 192.168.123.103:0/3258467722 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 0x7fca5c0717d0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:13.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.186+0000 7fca60c05700 1 -- 192.168.123.103:0/3258467722 >> 192.168.123.103:0/3258467722 conn(0x7fca5c06cd30 msgr2=0x7fca5c06f180 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.186+0000 7fca60c05700 1 -- 192.168.123.103:0/3258467722 shutdown_connections 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.186+0000 7fca60c05700 1 -- 192.168.123.103:0/3258467722 wait complete. 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.187+0000 7fca60c05700 1 Processor -- start 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.187+0000 7fca60c05700 1 -- start start 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.187+0000 7fca60c05700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 0x7fca5c1a3e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.187+0000 7fca60c05700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca5c1a4370 con 0x7fca5c0713c0 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.187+0000 7fca5b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 0x7fca5c1a3e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.188+0000 7fca5b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 0x7fca5c1a3e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33474/0 (socket says 192.168.123.103:33474) 2026-03-10T14:03:13.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.188+0000 7fca5b7fe700 1 -- 192.168.123.103:0/1454082647 learned_addr learned my addr 192.168.123.103:0/1454082647 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:13.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.188+0000 7fca5b7fe700 1 -- 192.168.123.103:0/1454082647 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca4c0097e0 con 0x7fca5c0713c0 2026-03-10T14:03:13.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.188+0000 7fca5b7fe700 1 --2- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 0x7fca5c1a3e30 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fca4c005f50 tx=0x7fca4c0050d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:13.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.188+0000 7fca58ff9700 1 -- 192.168.123.103:0/1454082647 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fca4c01c070 con 0x7fca5c0713c0 2026-03-10T14:03:13.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.189+0000 7fca58ff9700 1 -- 192.168.123.103:0/1454082647 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fca4c021470 con 0x7fca5c0713c0 2026-03-10T14:03:13.188 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.189+0000 7fca58ff9700 1 -- 192.168.123.103:0/1454082647 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fca4c00f460 con 0x7fca5c0713c0 2026-03-10T14:03:13.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.189+0000 7fca60c05700 1 -- 192.168.123.103:0/1454082647 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca5c1a4570 con 0x7fca5c0713c0 2026-03-10T14:03:13.189 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.189+0000 7fca60c05700 1 -- 192.168.123.103:0/1454082647 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca5c1a4a10 con 0x7fca5c0713c0 2026-03-10T14:03:13.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.192+0000 7fca58ff9700 1 -- 192.168.123.103:0/1454082647 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7fca4c0052a0 con 0x7fca5c0713c0 2026-03-10T14:03:13.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.192+0000 7fca58ff9700 1 --2- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca440380f0 0x7fca4403a5a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:13.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.192+0000 7fca58ff9700 1 -- 192.168.123.103:0/1454082647 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fca4c04c1d0 con 0x7fca5c0713c0 2026-03-10T14:03:13.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.193+0000 7fca5affd700 1 --2- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca440380f0 0x7fca4403a5a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T14:03:13.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.193+0000 7fca5affd700 1 --2- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca440380f0 0x7fca4403a5a0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fca5400ad30 tx=0x7fca540093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:13.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.193+0000 7fca427fc700 1 -- 192.168.123.103:0/1454082647 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fca5c062380 con 0x7fca5c0713c0 2026-03-10T14:03:13.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.196+0000 7fca58ff9700 1 -- 192.168.123.103:0/1454082647 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fca4c026030 con 0x7fca5c0713c0 2026-03-10T14:03:13.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.312+0000 7fca427fc700 1 -- 192.168.123.103:0/1454082647 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm04", "target": ["mon-mgr", ""]}) v1 -- 0x7fca5c062030 con 0x7fca440380f0 2026-03-10T14:03:13.752 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:13.751+0000 7fca58ff9700 1 -- 192.168.123.103:0/1454082647 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fca4c02ace0 con 0x7fca5c0713c0 2026-03-10T14:03:14.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:14.004 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.conf 2026-03-10T14:03:14.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:03:14.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:14.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:14.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:14.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:03:14.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T14:03:14.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:13 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:03:14.755 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-10T14:03:14.755 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: 
from='client.14190 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm04", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:14.755 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:03:14.755 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: Deploying daemon ceph-exporter.vm03 on vm03 2026-03-10T14:03:14.755 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: mgrmap e13: vm03.rwbbep(active, since 6s) 2026-03-10T14:03:15.012 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: Deploying cephadm binary to vm04 2026-03-10T14:03:15.013 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:15.013 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:15.013 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:15.013 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:15.013 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T14:03:15.013 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile 
crash", "mgr", "profile crash"]}]': finished 2026-03-10T14:03:15.013 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:14 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:03:15.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.107+0000 7fca58ff9700 1 -- 192.168.123.103:0/1454082647 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7fca5c062030 con 0x7fca440380f0 2026-03-10T14:03:15.107 INFO:teuthology.orchestra.run.vm03.stdout:Added host 'vm04' with addr '192.168.123.104' 2026-03-10T14:03:15.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.109+0000 7fca427fc700 1 -- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca440380f0 msgr2=0x7fca4403a5a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:15.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.109+0000 7fca427fc700 1 --2- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca440380f0 0x7fca4403a5a0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fca5400ad30 tx=0x7fca540093f0 comp rx=0 tx=0).stop 2026-03-10T14:03:15.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.109+0000 7fca427fc700 1 -- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 msgr2=0x7fca5c1a3e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:15.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.109+0000 7fca427fc700 1 --2- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 0x7fca5c1a3e30 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fca4c005f50 tx=0x7fca4c0050d0 comp rx=0 tx=0).stop 2026-03-10T14:03:15.108 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.109+0000 7fca427fc700 1 -- 192.168.123.103:0/1454082647 shutdown_connections 2026-03-10T14:03:15.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.109+0000 7fca427fc700 1 --2- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca440380f0 0x7fca4403a5a0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:15.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.109+0000 7fca427fc700 1 --2- 192.168.123.103:0/1454082647 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca5c0713c0 0x7fca5c1a3e30 secure :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fca4c005f50 tx=0x7fca4c0050d0 comp rx=0 tx=0).stop 2026-03-10T14:03:15.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.109+0000 7fca427fc700 1 -- 192.168.123.103:0/1454082647 >> 192.168.123.103:0/1454082647 conn(0x7fca5c06cd30 msgr2=0x7fca5c06fa10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:15.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.110+0000 7fca427fc700 1 -- 192.168.123.103:0/1454082647 shutdown_connections 2026-03-10T14:03:15.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.113+0000 7fca427fc700 1 -- 192.168.123.103:0/1454082647 wait complete. 
2026-03-10T14:03:15.299 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch host ls --format=json 2026-03-10T14:03:15.621 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:03:15.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.938+0000 7f3ebaa22700 1 -- 192.168.123.103:0/305765659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 msgr2=0x7f3eb410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.938+0000 7f3ebaa22700 1 --2- 192.168.123.103:0/305765659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 0x7f3eb410edb0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f3eb0009b00 tx=0x7f3eb0009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.938+0000 7f3ebaa22700 1 -- 192.168.123.103:0/305765659 shutdown_connections 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.938+0000 7f3ebaa22700 1 --2- 192.168.123.103:0/305765659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 0x7f3eb410edb0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.938+0000 7f3ebaa22700 1 -- 192.168.123.103:0/305765659 >> 192.168.123.103:0/305765659 conn(0x7f3eb406c410 msgr2=0x7f3eb406c810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.938+0000 7f3ebaa22700 1 -- 192.168.123.103:0/305765659 shutdown_connections 2026-03-10T14:03:15.938 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.939+0000 7f3ebaa22700 1 -- 192.168.123.103:0/305765659 wait complete. 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.939+0000 7f3ebaa22700 1 Processor -- start 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.939+0000 7f3ebaa22700 1 -- start start 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.939+0000 7f3ebaa22700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 0x7f3eb4114bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.939+0000 7f3ebaa22700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3eb4115110 con 0x7f3eb4072730 2026-03-10T14:03:15.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.940+0000 7f3eb9a20700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 0x7f3eb4114bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:15.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.940+0000 7f3eb9a20700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 0x7f3eb4114bd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:35260/0 (socket says 192.168.123.103:35260) 2026-03-10T14:03:15.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.940+0000 7f3eb9a20700 1 -- 192.168.123.103:0/903231222 learned_addr learned my addr 192.168.123.103:0/903231222 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:15.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.940+0000 
7f3eb9a20700 1 -- 192.168.123.103:0/903231222 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3eb00097e0 con 0x7f3eb4072730 2026-03-10T14:03:15.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.940+0000 7f3eb9a20700 1 --2- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 0x7f3eb4114bd0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f3eb0005950 tx=0x7f3eb0004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:15.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.943+0000 7f3eaaffd700 1 -- 192.168.123.103:0/903231222 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3eb001c070 con 0x7f3eb4072730 2026-03-10T14:03:15.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.943+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3eb4115310 con 0x7f3eb4072730 2026-03-10T14:03:15.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.943+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3eb4115730 con 0x7f3eb4072730 2026-03-10T14:03:15.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.943+0000 7f3eaaffd700 1 -- 192.168.123.103:0/903231222 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3eb00056f0 con 0x7f3eb4072730 2026-03-10T14:03:15.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.943+0000 7f3eaaffd700 1 -- 192.168.123.103:0/903231222 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3eb000f460 con 0x7f3eb4072730 2026-03-10T14:03:15.943 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.944+0000 7f3eaaffd700 1 -- 192.168.123.103:0/903231222 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f3eb000f5e0 con 0x7f3eb4072730 2026-03-10T14:03:15.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.944+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3eb410d520 con 0x7f3eb4072730 2026-03-10T14:03:15.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.945+0000 7f3eaaffd700 1 --2- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3ea00385a0 0x7f3ea003aa50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:15.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.945+0000 7f3eb921f700 1 --2- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3ea00385a0 0x7f3ea003aa50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:15.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.945+0000 7f3eb921f700 1 --2- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3ea00385a0 0x7f3ea003aa50 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f3ea4006fd0 tx=0x7f3ea4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:15.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.945+0000 7f3eaaffd700 1 -- 192.168.123.103:0/903231222 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f3eb004d380 con 0x7f3eb4072730 2026-03-10T14:03:15.953 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:15.948+0000 7f3eaaffd700 1 -- 192.168.123.103:0/903231222 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3eb0029950 con 0x7f3eb4072730 2026-03-10T14:03:16.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.075+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f3eb4062380 con 0x7f3ea00385a0 2026-03-10T14:03:16.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.077+0000 7f3eaaffd700 1 -- 192.168.123.103:0/903231222 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7f3eb4062380 con 0x7f3ea00385a0 2026-03-10T14:03:16.076 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:03:16.076 INFO:teuthology.orchestra.run.vm03.stdout:[{"addr": "192.168.123.103", "hostname": "vm03", "labels": [], "status": ""}, {"addr": "192.168.123.104", "hostname": "vm04", "labels": [], "status": ""}] 2026-03-10T14:03:16.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.080+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3ea00385a0 msgr2=0x7f3ea003aa50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:16.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.080+0000 7f3ebaa22700 1 --2- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3ea00385a0 0x7f3ea003aa50 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f3ea4006fd0 tx=0x7f3ea4006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:16.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.081+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 msgr2=0x7f3eb4114bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:16.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.081+0000 7f3ebaa22700 1 --2- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 0x7f3eb4114bd0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f3eb0005950 tx=0x7f3eb0004dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:16.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.083+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 shutdown_connections 2026-03-10T14:03:16.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.083+0000 7f3ebaa22700 1 --2- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3ea00385a0 0x7f3ea003aa50 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:16.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.083+0000 7f3ebaa22700 1 --2- 192.168.123.103:0/903231222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eb4072730 0x7f3eb4114bd0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:16.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.083+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 >> 192.168.123.103:0/903231222 conn(0x7f3eb406c410 msgr2=0x7f3eb410e6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:16.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.083+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 shutdown_connections 2026-03-10T14:03:16.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.085+0000 7f3ebaa22700 1 -- 192.168.123.103:0/903231222 wait complete. 
2026-03-10T14:03:16.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:15 vm03 ceph-mon[49718]: Deploying daemon crash.vm03 on vm03 2026-03-10T14:03:16.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:15 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:16.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:15 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:16.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:15 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:16.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:15 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:16.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:15 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:16.157 INFO:tasks.cephadm:Setting crush tunables to default 2026-03-10T14:03:16.157 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd crush tunables default 2026-03-10T14:03:16.449 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:03:16.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.735+0000 7f6027a46700 1 -- 192.168.123.103:0/3562196500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 msgr2=0x7f60200fe710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:16.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.735+0000 7f6027a46700 1 --2- 192.168.123.103:0/3562196500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 
0x7f60200fe710 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f6014009b00 tx=0x7f6014009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:16.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.736+0000 7f6027a46700 1 -- 192.168.123.103:0/3562196500 shutdown_connections 2026-03-10T14:03:16.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.736+0000 7f6027a46700 1 --2- 192.168.123.103:0/3562196500 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 0x7f60200fe710 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:16.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.736+0000 7f6027a46700 1 -- 192.168.123.103:0/3562196500 >> 192.168.123.103:0/3562196500 conn(0x7f60200f9d90 msgr2=0x7f60200fc1a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:16.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.737+0000 7f6027a46700 1 -- 192.168.123.103:0/3562196500 shutdown_connections 2026-03-10T14:03:16.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.737+0000 7f6027a46700 1 -- 192.168.123.103:0/3562196500 wait complete. 
2026-03-10T14:03:16.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.737+0000 7f6027a46700 1 Processor -- start 2026-03-10T14:03:16.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f6027a46700 1 -- start start 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f6027a46700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 0x7f60201a1980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f6027a46700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60201a1ec0 con 0x7f60200fe340 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f60257e2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 0x7f60201a1980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f60257e2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 0x7f60201a1980 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:35274/0 (socket says 192.168.123.103:35274) 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f60257e2700 1 -- 192.168.123.103:0/3898289331 learned_addr learned my addr 192.168.123.103:0/3898289331 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f60257e2700 1 -- 192.168.123.103:0/3898289331 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f60140097e0 con 0x7f60200fe340 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f60257e2700 1 --2- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 0x7f60201a1980 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f6014006010 tx=0x7f6014004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.738+0000 7f60127fc700 1 -- 192.168.123.103:0/3898289331 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f601401c070 con 0x7f60200fe340 2026-03-10T14:03:16.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.739+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f60201a20c0 con 0x7f60200fe340 2026-03-10T14:03:16.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.739+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f602019b970 con 0x7f60200fe340 2026-03-10T14:03:16.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.739+0000 7f60127fc700 1 -- 192.168.123.103:0/3898289331 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6014021470 con 0x7f60200fe340 2026-03-10T14:03:16.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.739+0000 7f60127fc700 1 -- 192.168.123.103:0/3898289331 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f601400f460 con 0x7f60200fe340 2026-03-10T14:03:16.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.739+0000 7f60127fc700 1 -- 192.168.123.103:0/3898289331 <== mon.0 
v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f601400f5c0 con 0x7f60200fe340 2026-03-10T14:03:16.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.739+0000 7f60127fc700 1 --2- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f600c038510 0x7f600c03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:16.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.739+0000 7f60127fc700 1 -- 192.168.123.103:0/3898289331 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f601404d3d0 con 0x7f60200fe340 2026-03-10T14:03:16.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.740+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6004005320 con 0x7f60200fe340 2026-03-10T14:03:16.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.743+0000 7f6024fe1700 1 --2- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f600c038510 0x7f600c03a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:16.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.743+0000 7f6024fe1700 1 --2- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f600c038510 0x7f600c03a9c0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f601c006fd0 tx=0x7f601c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:16.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.743+0000 7f60127fc700 1 -- 192.168.123.103:0/3898289331 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6014026070 con 0x7f60200fe340 2026-03-10T14:03:16.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:16.859+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7f60040059f0 con 0x7f60200fe340 2026-03-10T14:03:17.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:16 vm03 ceph-mon[49718]: Added host vm04 2026-03-10T14:03:17.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:16 vm03 ceph-mon[49718]: Deploying daemon node-exporter.vm03 on vm03 2026-03-10T14:03:17.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.859+0000 7f60127fc700 1 -- 192.168.123.103:0/3898289331 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7f6014029720 con 0x7f60200fe340 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.861+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f600c038510 msgr2=0x7f600c03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.861+0000 7f6027a46700 1 --2- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f600c038510 0x7f600c03a9c0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f601c006fd0 tx=0x7f601c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.861+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 msgr2=0x7f60201a1980 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.861+0000 7f6027a46700 1 --2- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 0x7f60201a1980 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f6014006010 tx=0x7f6014004dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.861+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 shutdown_connections 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.861+0000 7f6027a46700 1 --2- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f600c038510 0x7f600c03a9c0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.861+0000 7f6027a46700 1 --2- 192.168.123.103:0/3898289331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60200fe340 0x7f60201a1980 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.861+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 >> 192.168.123.103:0/3898289331 conn(0x7f60200f9d90 msgr2=0x7f60200fba70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:17.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.862+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 shutdown_connections 2026-03-10T14:03:17.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:03:17.862+0000 7f6027a46700 1 -- 192.168.123.103:0/3898289331 wait complete. 
2026-03-10T14:03:17.861 INFO:teuthology.orchestra.run.vm03.stderr:adjusted tunables profile to default 2026-03-10T14:03:17.916 INFO:tasks.cephadm:Adding mon.vm03 on vm03 2026-03-10T14:03:17.916 INFO:tasks.cephadm:Adding mon.vm04 on vm04 2026-03-10T14:03:17.916 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch apply mon '2;vm03:192.168.123.103=vm03;vm04:192.168.123.104=vm04' 2026-03-10T14:03:18.069 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:18.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:17 vm03 ceph-mon[49718]: from='client.14193 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T14:03:18.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:17 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3898289331' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-10T14:03:18.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:17 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:18.110 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:18 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/3898289331' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-10T14:03:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:18 vm03 ceph-mon[49718]: osdmap e4: 0 total, 0 up, 0 in 2026-03-10T14:03:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:18 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:18 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:18 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:18 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:19.218 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.219+0000 7f8a3aa91700 1 -- 192.168.123.104:0/703210015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 msgr2=0x7f8a340fed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:19.218 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.219+0000 7f8a3aa91700 1 --2- 192.168.123.104:0/703210015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 0x7f8a340fed20 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f8a24009b00 tx=0x7f8a24009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:19.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.220+0000 7f8a3aa91700 1 -- 192.168.123.104:0/703210015 shutdown_connections 2026-03-10T14:03:19.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.220+0000 7f8a3aa91700 1 --2- 192.168.123.104:0/703210015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 0x7f8a340fed20 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:19.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.220+0000 7f8a3aa91700 1 -- 192.168.123.104:0/703210015 >> 192.168.123.104:0/703210015 conn(0x7f8a340fa4a0 msgr2=0x7f8a340fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:19.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.220+0000 7f8a3aa91700 1 -- 192.168.123.104:0/703210015 shutdown_connections 2026-03-10T14:03:19.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.221+0000 7f8a3aa91700 1 -- 192.168.123.104:0/703210015 wait complete. 2026-03-10T14:03:19.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.221+0000 7f8a3aa91700 1 Processor -- start 2026-03-10T14:03:19.219 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.221+0000 7f8a3aa91700 1 -- start start 2026-03-10T14:03:19.220 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.221+0000 7f8a3aa91700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 0x7f8a34197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:19.220 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.221+0000 7f8a3aa91700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a341978c0 con 0x7f8a340fe910 2026-03-10T14:03:19.220 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.222+0000 7f8a3882d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 0x7f8a34197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:19.220 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.222+0000 7f8a3882d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 0x7f8a34197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:47442/0 (socket says 192.168.123.104:47442) 2026-03-10T14:03:19.220 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.222+0000 7f8a3882d700 1 -- 192.168.123.104:0/1751308682 learned_addr learned my addr 192.168.123.104:0/1751308682 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:19.220 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.222+0000 7f8a3882d700 1 -- 192.168.123.104:0/1751308682 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8a240097e0 con 0x7f8a340fe910 2026-03-10T14:03:19.220 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.222+0000 7f8a3882d700 1 --2- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 0x7f8a34197380 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f8a24004f40 tx=0x7f8a24005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:19.220 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.222+0000 7f8a31ffb700 1 -- 192.168.123.104:0/1751308682 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a2401c070 con 0x7f8a340fe910 2026-03-10T14:03:19.222 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.222+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8a34197ac0 con 0x7f8a340fe910 2026-03-10T14:03:19.222 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.222+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8a34197f60 con 0x7f8a340fe910 2026-03-10T14:03:19.222 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.223+0000 7f8a31ffb700 1 -- 
192.168.123.104:0/1751308682 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8a240053b0 con 0x7f8a340fe910 2026-03-10T14:03:19.222 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.223+0000 7f8a31ffb700 1 -- 192.168.123.104:0/1751308682 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a2400f460 con 0x7f8a340fe910 2026-03-10T14:03:19.222 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.223+0000 7f8a31ffb700 1 -- 192.168.123.104:0/1751308682 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f8a2400f6c0 con 0x7f8a340fe910 2026-03-10T14:03:19.222 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.223+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8a20005320 con 0x7f8a340fe910 2026-03-10T14:03:19.222 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.223+0000 7f8a31ffb700 1 --2- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8a1c040d20 0x7f8a1c0431d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:19.222 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.223+0000 7f8a31ffb700 1 -- 192.168.123.104:0/1751308682 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8a2404d530 con 0x7f8a340fe910 2026-03-10T14:03:19.223 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.224+0000 7f8a33fff700 1 --2- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8a1c040d20 0x7f8a1c0431d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:19.223 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.225+0000 7f8a33fff700 1 --2- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8a1c040d20 0x7f8a1c0431d0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f8a28006fd0 tx=0x7f8a28006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:19.225 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.227+0000 7f8a31ffb700 1 -- 192.168.123.104:0/1751308682 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8a24029bb0 con 0x7f8a340fe910 2026-03-10T14:03:19.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.336+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm03:192.168.123.103=vm03;vm04:192.168.123.104=vm04", "target": ["mon-mgr", ""]}) v1 -- 0x7f8a20000c90 con 0x7f8a1c040d20 2026-03-10T14:03:19.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.342+0000 7f8a31ffb700 1 -- 192.168.123.104:0/1751308682 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f8a20000c90 con 0x7f8a1c040d20 2026-03-10T14:03:19.340 INFO:teuthology.orchestra.run.vm04.stdout:Scheduled mon update... 
2026-03-10T14:03:19.342 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.344+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8a1c040d20 msgr2=0x7f8a1c0431d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.344+0000 7f8a3aa91700 1 --2- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8a1c040d20 0x7f8a1c0431d0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f8a28006fd0 tx=0x7f8a28006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.344+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 msgr2=0x7f8a34197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.344+0000 7f8a3aa91700 1 --2- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 0x7f8a34197380 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f8a24004f40 tx=0x7f8a24005e70 comp rx=0 tx=0).stop 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.345+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 shutdown_connections 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.345+0000 7f8a3aa91700 1 --2- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8a1c040d20 0x7f8a1c0431d0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.345+0000 7f8a3aa91700 1 --2- 192.168.123.104:0/1751308682 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8a340fe910 0x7f8a34197380 
unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.345+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 >> 192.168.123.104:0/1751308682 conn(0x7f8a340fa4a0 msgr2=0x7f8a340fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.345+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 shutdown_connections 2026-03-10T14:03:19.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.345+0000 7f8a3aa91700 1 -- 192.168.123.104:0/1751308682 wait complete. 2026-03-10T14:03:19.406 DEBUG:teuthology.orchestra.run.vm04:mon.vm04> sudo journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm04.service 2026-03-10T14:03:19.407 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:19.407 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:19.583 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:19.619 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:19.884 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.885+0000 7fdc99d6e700 1 -- 192.168.123.104:0/4241711693 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 msgr2=0x7fdc941025e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:19.884 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.885+0000 7fdc99d6e700 1 --2- 192.168.123.104:0/4241711693 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 0x7fdc941025e0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fdc7c009b00 tx=0x7fdc7c009e10 comp rx=0 
tx=0).stop 2026-03-10T14:03:19.884 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 -- 192.168.123.104:0/4241711693 shutdown_connections 2026-03-10T14:03:19.884 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 --2- 192.168.123.104:0/4241711693 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 0x7fdc941025e0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:19.884 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 -- 192.168.123.104:0/4241711693 >> 192.168.123.104:0/4241711693 conn(0x7fdc940fd760 msgr2=0x7fdc940ffbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:19.884 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 -- 192.168.123.104:0/4241711693 shutdown_connections 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 -- 192.168.123.104:0/4241711693 wait complete. 
2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 Processor -- start 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 -- start start 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 0x7fdc94192f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.886+0000 7fdc99d6e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc94193490 con 0x7fdc941021d0 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.887+0000 7fdc937fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 0x7fdc94192f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.887+0000 7fdc937fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 0x7fdc94192f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:47460/0 (socket says 192.168.123.104:47460) 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.887+0000 7fdc937fe700 1 -- 192.168.123.104:0/865140190 learned_addr learned my addr 192.168.123.104:0/865140190 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.887+0000 7fdc937fe700 1 -- 192.168.123.104:0/865140190 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdc7c0097e0 con 0x7fdc941021d0 2026-03-10T14:03:19.885 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.887+0000 7fdc937fe700 1 --2- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 0x7fdc94192f50 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fdc7c004750 tx=0x7fdc7c005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:19.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.887+0000 7fdc90ff9700 1 -- 192.168.123.104:0/865140190 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdc7c01c070 con 0x7fdc941021d0 2026-03-10T14:03:19.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.888+0000 7fdc99d6e700 1 -- 192.168.123.104:0/865140190 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdc94193690 con 0x7fdc941021d0 2026-03-10T14:03:19.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.888+0000 7fdc99d6e700 1 -- 192.168.123.104:0/865140190 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdc94193b30 con 0x7fdc941021d0 2026-03-10T14:03:19.886 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.888+0000 7fdc90ff9700 1 -- 192.168.123.104:0/865140190 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdc7c021470 con 0x7fdc941021d0 2026-03-10T14:03:19.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.888+0000 7fdc90ff9700 1 -- 192.168.123.104:0/865140190 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdc7c00f460 con 0x7fdc941021d0 2026-03-10T14:03:19.887 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.889+0000 7fdc8a7fc700 1 -- 192.168.123.104:0/865140190 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdc74005320 con 0x7fdc941021d0 2026-03-10T14:03:19.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.889+0000 7fdc90ff9700 1 -- 192.168.123.104:0/865140190 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fdc7c021ac0 con 0x7fdc941021d0 2026-03-10T14:03:19.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.889+0000 7fdc90ff9700 1 --2- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdc800384c0 0x7fdc8003a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:19.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.889+0000 7fdc90ff9700 1 -- 192.168.123.104:0/865140190 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fdc7c04c350 con 0x7fdc941021d0 2026-03-10T14:03:19.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.889+0000 7fdc92ffd700 1 --2- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdc800384c0 0x7fdc8003a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:19.888 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.890+0000 7fdc92ffd700 1 --2- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdc800384c0 0x7fdc8003a970 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fdc84006fd0 tx=0x7fdc84006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:19.891 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:19.892+0000 7fdc90ff9700 1 -- 192.168.123.104:0/865140190 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdc7c029930 con 0x7fdc941021d0 2026-03-10T14:03:20.035 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.036+0000 7fdc8a7fc700 1 -- 192.168.123.104:0/865140190 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fdc74005190 con 0x7fdc941021d0 2026-03-10T14:03:20.037 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:20.037 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:20.037 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.037+0000 7fdc90ff9700 1 -- 192.168.123.104:0/865140190 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fdc7c026030 con 0x7fdc941021d0 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 -- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdc800384c0 msgr2=0x7fdc8003a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:20.038 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 --2- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdc800384c0 0x7fdc8003a970 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fdc84006fd0 tx=0x7fdc84006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 -- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 msgr2=0x7fdc94192f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 --2- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 0x7fdc94192f50 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fdc7c004750 tx=0x7fdc7c005dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 -- 192.168.123.104:0/865140190 shutdown_connections 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 --2- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdc800384c0 0x7fdc8003a970 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 --2- 192.168.123.104:0/865140190 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdc941021d0 0x7fdc94192f50 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 -- 192.168.123.104:0/865140190 >> 192.168.123.104:0/865140190 conn(0x7fdc940fd760 msgr2=0x7fdc940fe430 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 -- 192.168.123.104:0/865140190 shutdown_connections 2026-03-10T14:03:20.038 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:20.040+0000 7fdc8a7fc700 1 -- 192.168.123.104:0/865140190 wait complete. 2026-03-10T14:03:20.039 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:19 vm03 ceph-mon[49718]: Deploying daemon alertmanager.vm03 on vm03 2026-03-10T14:03:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:19 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:21.110 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:21.110 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:21.257 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:21.286 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:20 vm03 ceph-mon[49718]: from='client.14197 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm03:192.168.123.103=vm03;vm04:192.168.123.104=vm04", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:03:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:20 vm03 ceph-mon[49718]: Saving service mon spec with placement vm03:192.168.123.103=vm03;vm04:192.168.123.104=vm04;count:2 2026-03-10T14:03:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:20 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/865140190' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.540+0000 7f93b7cc2700 1 -- 192.168.123.104:0/851694445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 msgr2=0x7f93b0102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.540+0000 7f93b7cc2700 1 --2- 192.168.123.104:0/851694445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 0x7f93b0102e80 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f93a0009b00 tx=0x7f93a0009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.540+0000 7f93b7cc2700 1 -- 192.168.123.104:0/851694445 shutdown_connections 2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.540+0000 7f93b7cc2700 1 --2- 192.168.123.104:0/851694445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 0x7f93b0102e80 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.540+0000 7f93b7cc2700 1 -- 192.168.123.104:0/851694445 >> 192.168.123.104:0/851694445 conn(0x7f93b00fa4a0 msgr2=0x7f93b00fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.541+0000 7f93b7cc2700 1 -- 192.168.123.104:0/851694445 shutdown_connections 2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.541+0000 7f93b7cc2700 1 -- 192.168.123.104:0/851694445 wait complete. 
2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.541+0000 7f93b7cc2700 1 Processor -- start 2026-03-10T14:03:21.539 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.541+0000 7f93b7cc2700 1 -- start start 2026-03-10T14:03:21.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.541+0000 7f93b7cc2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 0x7f93b018f020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:21.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.541+0000 7f93b7cc2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93b018f560 con 0x7f93b0100aa0 2026-03-10T14:03:21.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.542+0000 7f93b5a5e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 0x7f93b018f020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:21.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.542+0000 7f93b5a5e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 0x7f93b018f020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:47482/0 (socket says 192.168.123.104:47482) 2026-03-10T14:03:21.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.542+0000 7f93b5a5e700 1 -- 192.168.123.104:0/3801775537 learned_addr learned my addr 192.168.123.104:0/3801775537 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:21.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.542+0000 7f93b5a5e700 1 -- 192.168.123.104:0/3801775537 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f93a00097e0 con 0x7f93b0100aa0 2026-03-10T14:03:21.540 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.542+0000 7f93b5a5e700 1 --2- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 0x7f93b018f020 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f93a0004f40 tx=0x7f93a0005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:21.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.543+0000 7f93a6ffd700 1 -- 192.168.123.104:0/3801775537 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93a001c070 con 0x7f93b0100aa0 2026-03-10T14:03:21.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.543+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f93b018f760 con 0x7f93b0100aa0 2026-03-10T14:03:21.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.543+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f93b018bcd0 con 0x7f93b0100aa0 2026-03-10T14:03:21.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.543+0000 7f93a6ffd700 1 -- 192.168.123.104:0/3801775537 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f93a00053b0 con 0x7f93b0100aa0 2026-03-10T14:03:21.541 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.543+0000 7f93a6ffd700 1 -- 192.168.123.104:0/3801775537 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93a000f460 con 0x7f93b0100aa0 2026-03-10T14:03:21.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.544+0000 7f93a6ffd700 1 -- 192.168.123.104:0/3801775537 <== mon.0 
v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f93a000f6d0 con 0x7f93b0100aa0 2026-03-10T14:03:21.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.544+0000 7f93a6ffd700 1 --2- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f939c038510 0x7f939c03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:21.542 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.544+0000 7f93a6ffd700 1 -- 192.168.123.104:0/3801775537 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f93a004d680 con 0x7f93b0100aa0 2026-03-10T14:03:21.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.544+0000 7f93b525d700 1 --2- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f939c038510 0x7f939c03a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:21.543 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.545+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9394005320 con 0x7f93b0100aa0 2026-03-10T14:03:21.544 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.546+0000 7f93b525d700 1 --2- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f939c038510 0x7f939c03a9c0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f93ac006fd0 tx=0x7f93ac006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:21.547 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.549+0000 7f93a6ffd700 1 -- 192.168.123.104:0/3801775537 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f93a0026030 con 0x7f93b0100aa0 2026-03-10T14:03:21.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.700+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9394005190 con 0x7f93b0100aa0 2026-03-10T14:03:21.700 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.701+0000 7f93a6ffd700 1 -- 192.168.123.104:0/3801775537 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f93a0017440 con 0x7f93b0100aa0 2026-03-10T14:03:21.700 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:21.700 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:21.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.704+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f939c038510 msgr2=0x7f939c03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:21.702 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.704+0000 7f93b7cc2700 1 --2- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f939c038510 0x7f939c03a9c0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f93ac006fd0 tx=0x7f93ac006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:21.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.704+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 msgr2=0x7f93b018f020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:21.702 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.704+0000 7f93b7cc2700 1 --2- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 0x7f93b018f020 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f93a0004f40 tx=0x7f93a0005e70 comp rx=0 tx=0).stop 2026-03-10T14:03:21.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.704+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 shutdown_connections 2026-03-10T14:03:21.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.704+0000 7f93b7cc2700 1 --2- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f939c038510 0x7f939c03a9c0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:21.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.704+0000 7f93b7cc2700 1 --2- 192.168.123.104:0/3801775537 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f93b0100aa0 0x7f93b018f020 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:21.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.705+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 >> 192.168.123.104:0/3801775537 conn(0x7f93b00fa4a0 msgr2=0x7f93b00fc8f0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:03:21.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.705+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 shutdown_connections 2026-03-10T14:03:21.703 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:21.705+0000 7f93b7cc2700 1 -- 192.168.123.104:0/3801775537 wait complete. 2026-03-10T14:03:21.704 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:22.084 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:21 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/3801775537' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:22.771 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:22.771 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:22.919 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:22.952 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:23.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.199+0000 7f6a7a298700 1 -- 192.168.123.104:0/463713823 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 msgr2=0x7f6a74102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:23.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.199+0000 7f6a7a298700 1 --2- 192.168.123.104:0/463713823 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 0x7f6a74102640 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f6a64009b00 tx=0x7f6a64009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:23.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.200+0000 7f6a7a298700 1 -- 192.168.123.104:0/463713823 shutdown_connections 
2026-03-10T14:03:23.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.200+0000 7f6a7a298700 1 --2- 192.168.123.104:0/463713823 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 0x7f6a74102640 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:23.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.200+0000 7f6a7a298700 1 -- 192.168.123.104:0/463713823 >> 192.168.123.104:0/463713823 conn(0x7f6a740fd8d0 msgr2=0x7f6a740ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:23.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.200+0000 7f6a7a298700 1 -- 192.168.123.104:0/463713823 shutdown_connections 2026-03-10T14:03:23.198 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.200+0000 7f6a7a298700 1 -- 192.168.123.104:0/463713823 wait complete. 2026-03-10T14:03:23.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.200+0000 7f6a7a298700 1 Processor -- start 2026-03-10T14:03:23.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.201+0000 7f6a7a298700 1 -- start start 2026-03-10T14:03:23.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.201+0000 7f6a7a298700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 0x7f6a74197340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:23.199 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.201+0000 7f6a7a298700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a64012070 con 0x7f6a74102230 2026-03-10T14:03:23.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.201+0000 7f6a73fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 0x7f6a74197340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T14:03:23.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.201+0000 7f6a73fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 0x7f6a74197340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:50566/0 (socket says 192.168.123.104:50566) 2026-03-10T14:03:23.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.201+0000 7f6a73fff700 1 -- 192.168.123.104:0/2267165273 learned_addr learned my addr 192.168.123.104:0/2267165273 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:23.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.202+0000 7f6a73fff700 1 -- 192.168.123.104:0/2267165273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a640097e0 con 0x7f6a74102230 2026-03-10T14:03:23.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.202+0000 7f6a73fff700 1 --2- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 0x7f6a74197340 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6a64000c00 tx=0x7f6a64005650 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:23.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.202+0000 7f6a71ffb700 1 -- 192.168.123.104:0/2267165273 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6a6401d070 con 0x7f6a74102230 2026-03-10T14:03:23.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.202+0000 7f6a71ffb700 1 -- 192.168.123.104:0/2267165273 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6a64005dc0 con 0x7f6a74102230 2026-03-10T14:03:23.200 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.202+0000 7f6a7a298700 1 -- 
192.168.123.104:0/2267165273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6a74197880 con 0x7f6a74102230 2026-03-10T14:03:23.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.202+0000 7f6a71ffb700 1 -- 192.168.123.104:0/2267165273 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6a6400f600 con 0x7f6a74102230 2026-03-10T14:03:23.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.202+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6a74197ca0 con 0x7f6a74102230 2026-03-10T14:03:23.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.203+0000 7f6a71ffb700 1 -- 192.168.123.104:0/2267165273 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f6a64003680 con 0x7f6a74102230 2026-03-10T14:03:23.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.204+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6a7404fa50 con 0x7f6a74102230 2026-03-10T14:03:23.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.204+0000 7f6a71ffb700 1 --2- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6a60040d60 0x7f6a60043210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:23.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.204+0000 7f6a71ffb700 1 -- 192.168.123.104:0/2267165273 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f6a6404c3a0 con 0x7f6a74102230 2026-03-10T14:03:23.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.204+0000 7f6a6b5ff700 1 --2- 192.168.123.104:0/2267165273 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6a60040d60 0x7f6a60043210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:23.203 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.205+0000 7f6a6b5ff700 1 --2- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6a60040d60 0x7f6a60043210 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f6a5c006fd0 tx=0x7f6a5c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:23.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.208+0000 7f6a71ffb700 1 -- 192.168.123.104:0/2267165273 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6a640045e0 con 0x7f6a74102230 2026-03-10T14:03:23.357 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.358+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6a7402cea0 con 0x7f6a74102230 2026-03-10T14:03:23.357 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.359+0000 7f6a71ffb700 1 -- 192.168.123.104:0/2267165273 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f6a64026070 con 0x7f6a74102230 2026-03-10T14:03:23.358 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:23.358 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:23.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6a60040d60 msgr2=0x7f6a60043210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:23.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 --2- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6a60040d60 0x7f6a60043210 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f6a5c006fd0 tx=0x7f6a5c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:23.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 msgr2=0x7f6a74197340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:23.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 --2- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 0x7f6a74197340 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6a64000c00 tx=0x7f6a64005650 comp rx=0 tx=0).stop 2026-03-10T14:03:23.360 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 shutdown_connections 2026-03-10T14:03:23.361 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 --2- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6a60040d60 0x7f6a60043210 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:23.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 --2- 192.168.123.104:0/2267165273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a74102230 0x7f6a74197340 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:23.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 >> 192.168.123.104:0/2267165273 conn(0x7f6a740fd8d0 msgr2=0x7f6a740fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:23.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 shutdown_connections 2026-03-10T14:03:23.361 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:23.362+0000 7f6a7a298700 1 -- 192.168.123.104:0/2267165273 wait complete. 
2026-03-10T14:03:23.361 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: Regenerating cephadm self-signed grafana TLS certificates 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: Deploying daemon grafana.vm03 on vm03 2026-03-10T14:03:23.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:23 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:24.431 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:24.431 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:24.577 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:24.595 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:24 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/2267165273' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:24.622 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.906+0000 7f0397e42700 1 -- 192.168.123.104:0/3990410116 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 msgr2=0x7f03900731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.906+0000 7f0397e42700 1 --2- 192.168.123.104:0/3990410116 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 0x7f03900731e0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f038c009b00 tx=0x7f038c009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.906+0000 7f0397e42700 1 -- 192.168.123.104:0/3990410116 shutdown_connections 2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.906+0000 7f0397e42700 1 --2- 192.168.123.104:0/3990410116 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 0x7f03900731e0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.906+0000 7f0397e42700 1 -- 192.168.123.104:0/3990410116 >> 192.168.123.104:0/3990410116 conn(0x7f03900fb7e0 msgr2=0x7f03900fdc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.906+0000 7f0397e42700 1 -- 192.168.123.104:0/3990410116 shutdown_connections 2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.907+0000 7f0397e42700 1 -- 192.168.123.104:0/3990410116 wait complete. 
2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.907+0000 7f0397e42700 1 Processor -- start 2026-03-10T14:03:24.905 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.907+0000 7f0397e42700 1 -- start start 2026-03-10T14:03:24.906 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.907+0000 7f0397e42700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 0x7f0390197550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:24.906 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.907+0000 7f0397e42700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f038c012070 con 0x7f0390074d80 2026-03-10T14:03:24.906 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0395bde700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 0x7f0390197550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:24.906 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0395bde700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 0x7f0390197550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:50578/0 (socket says 192.168.123.104:50578) 2026-03-10T14:03:24.906 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0395bde700 1 -- 192.168.123.104:0/169310567 learned_addr learned my addr 192.168.123.104:0/169310567 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:24.906 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0395bde700 1 -- 192.168.123.104:0/169310567 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f038c0097e0 con 0x7f0390074d80 2026-03-10T14:03:24.906 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0395bde700 1 --2- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 0x7f0390197550 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f038c0055d0 tx=0x7f038c0056b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:24.907 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0386ffd700 1 -- 192.168.123.104:0/169310567 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f038c01d070 con 0x7f0390074d80 2026-03-10T14:03:24.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0390197a90 con 0x7f0390074d80 2026-03-10T14:03:24.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0386ffd700 1 -- 192.168.123.104:0/169310567 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f038c005dc0 con 0x7f0390074d80 2026-03-10T14:03:24.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.908+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0390197eb0 con 0x7f0390074d80 2026-03-10T14:03:24.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.909+0000 7f0386ffd700 1 -- 192.168.123.104:0/169310567 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f038c00f460 con 0x7f0390074d80 2026-03-10T14:03:24.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.910+0000 7f0386ffd700 1 -- 192.168.123.104:0/169310567 <== mon.0 v2:192.168.123.103:3300/0 4 
==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f038c0038a0 con 0x7f0390074d80 2026-03-10T14:03:24.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.910+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f039004fa50 con 0x7f0390074d80 2026-03-10T14:03:24.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.910+0000 7f0386ffd700 1 --2- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f037c038550 0x7f037c03aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:24.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.910+0000 7f0386ffd700 1 -- 192.168.123.104:0/169310567 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f038c04c5d0 con 0x7f0390074d80 2026-03-10T14:03:24.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.910+0000 7f03953dd700 1 --2- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f037c038550 0x7f037c03aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:24.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.911+0000 7f03953dd700 1 --2- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f037c038550 0x7f037c03aa00 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f0380006fd0 tx=0x7f0380006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:24.913 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:24.914+0000 7f0386ffd700 1 -- 192.168.123.104:0/169310567 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f038c0297d0 con 0x7f0390074d80 2026-03-10T14:03:25.059 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.060+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f039002cc30 con 0x7f0390074d80 2026-03-10T14:03:25.060 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.062+0000 7f0386ffd700 1 -- 192.168.123.104:0/169310567 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f038c026070 con 0x7f0390074d80 2026-03-10T14:03:25.060 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:25.060 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:25.062 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.064+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f037c038550 msgr2=0x7f037c03aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:25.062 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.064+0000 7f0397e42700 1 --2- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f037c038550 0x7f037c03aa00 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f0380006fd0 tx=0x7f0380006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.064+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 msgr2=0x7f0390197550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.064+0000 7f0397e42700 1 --2- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 0x7f0390197550 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f038c0055d0 tx=0x7f038c0056b0 comp rx=0 tx=0).stop 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.064+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 shutdown_connections 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.064+0000 7f0397e42700 1 --2- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f037c038550 0x7f037c03aa00 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.064+0000 7f0397e42700 1 --2- 192.168.123.104:0/169310567 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0390074d80 0x7f0390197550 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.064+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 >> 192.168.123.104:0/169310567 conn(0x7f03900fb7e0 msgr2=0x7f03900fc490 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.065+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 shutdown_connections 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:25.065+0000 7f0397e42700 1 -- 192.168.123.104:0/169310567 wait complete. 2026-03-10T14:03:25.063 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:26.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:25 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/169310567' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:26.120 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:26.120 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:26.266 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:26.311 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.592+0000 7f843f16e700 1 -- 192.168.123.104:0/1581059177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 msgr2=0x7f84380feca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.592+0000 7f843f16e700 1 --2- 192.168.123.104:0/1581059177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 0x7f84380feca0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f8428009b00 tx=0x7f8428009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.593+0000 7f843f16e700 1 -- 192.168.123.104:0/1581059177 shutdown_connections 
2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.593+0000 7f843f16e700 1 --2- 192.168.123.104:0/1581059177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 0x7f84380feca0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.593+0000 7f843f16e700 1 -- 192.168.123.104:0/1581059177 >> 192.168.123.104:0/1581059177 conn(0x7f84380fa4a0 msgr2=0x7f84380fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.594+0000 7f843f16e700 1 -- 192.168.123.104:0/1581059177 shutdown_connections 2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.594+0000 7f843f16e700 1 -- 192.168.123.104:0/1581059177 wait complete. 2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.594+0000 7f843f16e700 1 Processor -- start 2026-03-10T14:03:26.592 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.594+0000 7f843f16e700 1 -- start start 2026-03-10T14:03:26.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.594+0000 7f843f16e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 0x7f8438197310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:26.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.594+0000 7f843f16e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8428012070 con 0x7f84380fe890 2026-03-10T14:03:26.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.595+0000 7f843cf0a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 0x7f8438197310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T14:03:26.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.595+0000 7f843cf0a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 0x7f8438197310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:50586/0 (socket says 192.168.123.104:50586) 2026-03-10T14:03:26.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.595+0000 7f843cf0a700 1 -- 192.168.123.104:0/575578355 learned_addr learned my addr 192.168.123.104:0/575578355 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:26.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.595+0000 7f843cf0a700 1 -- 192.168.123.104:0/575578355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f84280097e0 con 0x7f84380fe890 2026-03-10T14:03:26.593 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.595+0000 7f843cf0a700 1 --2- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 0x7f8438197310 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f84280055d0 tx=0x7f84280056b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:26.594 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.595+0000 7f8435ffb700 1 -- 192.168.123.104:0/575578355 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f842801d070 con 0x7f84380fe890 2026-03-10T14:03:26.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.595+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8438197850 con 0x7f84380fe890 2026-03-10T14:03:26.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.595+0000 
7f8435ffb700 1 -- 192.168.123.104:0/575578355 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8428005dc0 con 0x7f84380fe890 2026-03-10T14:03:26.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.596+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8438197c70 con 0x7f84380fe890 2026-03-10T14:03:26.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.597+0000 7f8435ffb700 1 -- 192.168.123.104:0/575578355 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f842800f600 con 0x7f84380fe890 2026-03-10T14:03:26.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.597+0000 7f8435ffb700 1 -- 192.168.123.104:0/575578355 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f842800f870 con 0x7f84380fe890 2026-03-10T14:03:26.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.597+0000 7f8435ffb700 1 --2- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8420038550 0x7f842003aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:26.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.597+0000 7f8435ffb700 1 -- 192.168.123.104:0/575578355 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f842804c6c0 con 0x7f84380fe890 2026-03-10T14:03:26.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.597+0000 7f8437fff700 1 --2- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8420038550 0x7f842003aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:26.595 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.597+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8438191050 con 0x7f84380fe890 2026-03-10T14:03:26.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.598+0000 7f8437fff700 1 --2- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8420038550 0x7f842003aa00 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f842c006fd0 tx=0x7f842c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:26.599 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.600+0000 7f8435ffb700 1 -- 192.168.123.104:0/575578355 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8428026070 con 0x7f84380fe890 2026-03-10T14:03:26.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.753+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8438197f10 con 0x7f84380fe890 2026-03-10T14:03:26.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.755+0000 7f8435ffb700 1 -- 192.168.123.104:0/575578355 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8428015350 con 0x7f84380fe890 2026-03-10T14:03:26.754 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:26.754 
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:26.756 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.758+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8420038550 msgr2=0x7f842003aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:26.756 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.758+0000 7f843f16e700 1 --2- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8420038550 0x7f842003aa00 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f842c006fd0 tx=0x7f842c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:26.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.758+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 msgr2=0x7f8438197310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:26.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.758+0000 7f843f16e700 1 --2- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 0x7f8438197310 secure :-1 s=READY pgs=114 cs=0 l=1 
rev1=1 crypto rx=0x7f84280055d0 tx=0x7f84280056b0 comp rx=0 tx=0).stop 2026-03-10T14:03:26.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.758+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 shutdown_connections 2026-03-10T14:03:26.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.758+0000 7f843f16e700 1 --2- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8420038550 0x7f842003aa00 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:26.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.758+0000 7f843f16e700 1 --2- 192.168.123.104:0/575578355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84380fe890 0x7f8438197310 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:26.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.758+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 >> 192.168.123.104:0/575578355 conn(0x7f84380fa4a0 msgr2=0x7f84380fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:26.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.759+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 shutdown_connections 2026-03-10T14:03:26.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:26.759+0000 7f843f16e700 1 -- 192.168.123.104:0/575578355 wait complete. 2026-03-10T14:03:26.758 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:27.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:26 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/575578355' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:27.819 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T14:03:27.819 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:27.966 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:28.008 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:28.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.281+0000 7f33ae422700 1 -- 192.168.123.104:0/3004429114 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 msgr2=0x7f33a8073160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:28.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.281+0000 7f33ae422700 1 --2- 192.168.123.104:0/3004429114 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 0x7f33a8073160 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f3390009b00 tx=0x7f3390009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:28.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.281+0000 7f33ae422700 1 -- 192.168.123.104:0/3004429114 shutdown_connections 2026-03-10T14:03:28.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.281+0000 7f33ae422700 1 --2- 192.168.123.104:0/3004429114 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 0x7f33a8073160 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:28.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.281+0000 7f33ae422700 1 -- 192.168.123.104:0/3004429114 >> 192.168.123.104:0/3004429114 conn(0x7f33a80fb5a0 msgr2=0x7f33a80fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:28.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.282+0000 7f33ae422700 1 -- 192.168.123.104:0/3004429114 
shutdown_connections 2026-03-10T14:03:28.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.282+0000 7f33ae422700 1 -- 192.168.123.104:0/3004429114 wait complete. 2026-03-10T14:03:28.280 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.282+0000 7f33ae422700 1 Processor -- start 2026-03-10T14:03:28.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.282+0000 7f33ae422700 1 -- start start 2026-03-10T14:03:28.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.283+0000 7f33ae422700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 0x7f33a81973f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:28.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.283+0000 7f33ae422700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33a8197930 con 0x7f33a8074d00 2026-03-10T14:03:28.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.283+0000 7f33a7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 0x7f33a81973f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:28.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.283+0000 7f33a7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 0x7f33a81973f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:50596/0 (socket says 192.168.123.104:50596) 2026-03-10T14:03:28.281 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.283+0000 7f33a7fff700 1 -- 192.168.123.104:0/3449767257 learned_addr learned my addr 192.168.123.104:0/3449767257 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:28.282 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.283+0000 7f33a7fff700 1 -- 192.168.123.104:0/3449767257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33900097e0 con 0x7f33a8074d00 2026-03-10T14:03:28.282 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.284+0000 7f33a7fff700 1 --2- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 0x7f33a81973f0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f3390004f40 tx=0x7f3390005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:28.282 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.284+0000 7f33a57fa700 1 -- 192.168.123.104:0/3449767257 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f339001d070 con 0x7f33a8074d00 2026-03-10T14:03:28.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.284+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f33a8197b30 con 0x7f33a8074d00 2026-03-10T14:03:28.283 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.284+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33a8197fd0 con 0x7f33a8074d00 2026-03-10T14:03:28.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.285+0000 7f33a57fa700 1 -- 192.168.123.104:0/3449767257 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3390022470 con 0x7f33a8074d00 2026-03-10T14:03:28.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.285+0000 7f33a57fa700 1 -- 192.168.123.104:0/3449767257 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f339000f460 con 
0x7f33a8074d00 2026-03-10T14:03:28.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.286+0000 7f33a57fa700 1 -- 192.168.123.104:0/3449767257 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f339000f650 con 0x7f33a8074d00 2026-03-10T14:03:28.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.286+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f33a8191060 con 0x7f33a8074d00 2026-03-10T14:03:28.284 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.286+0000 7f33a57fa700 1 --2- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3394038550 0x7f339403aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:28.285 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.286+0000 7f33a57fa700 1 -- 192.168.123.104:0/3449767257 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f339004d470 con 0x7f33a8074d00 2026-03-10T14:03:28.285 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.286+0000 7f33a77fe700 1 --2- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3394038550 0x7f339403aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:28.285 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.287+0000 7f33a77fe700 1 --2- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3394038550 0x7f339403aa00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3398006fd0 tx=0x7f3398006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:28.288 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.289+0000 7f33a57fa700 1 -- 192.168.123.104:0/3449767257 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f339002a950 con 0x7f33a8074d00 2026-03-10T14:03:28.438 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.438+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f33a802cc30 con 0x7f33a8074d00 2026-03-10T14:03:28.438 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.439+0000 7f33a57fa700 1 -- 192.168.123.104:0/3449767257 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f3390027030 con 0x7f33a8074d00 2026-03-10T14:03:28.439 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:28.439 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:28.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3394038550 msgr2=0x7f339403aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:28.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 --2- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3394038550 0x7f339403aa00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3398006fd0 tx=0x7f3398006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:28.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 msgr2=0x7f33a81973f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:28.440 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 --2- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 0x7f33a81973f0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f3390004f40 tx=0x7f3390005e70 comp rx=0 tx=0).stop 2026-03-10T14:03:28.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 shutdown_connections 2026-03-10T14:03:28.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 --2- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3394038550 0x7f339403aa00 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:28.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 --2- 192.168.123.104:0/3449767257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33a8074d00 0x7f33a81973f0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:28.441 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 >> 192.168.123.104:0/3449767257 conn(0x7f33a80fb5a0 msgr2=0x7f33a80fc270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:28.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 shutdown_connections 2026-03-10T14:03:28.441 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:28.442+0000 7f33ae422700 1 -- 192.168.123.104:0/3449767257 wait complete. 2026-03-10T14:03:28.441 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:29.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:28 vm03 ceph-mon[49718]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:03:29.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:28 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/3449767257' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:29.485 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T14:03:29.486 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:29.625 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:29.660 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.087+0000 7f259c343700 1 -- 192.168.123.104:0/1690815089 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25940fe890 msgr2=0x7f25940feca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.087+0000 7f259c343700 1 --2- 192.168.123.104:0/1690815089 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25940fe890 0x7f25940feca0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f2584009b00 tx=0x7f2584009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.087+0000 7f259c343700 1 -- 192.168.123.104:0/1690815089 shutdown_connections 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.087+0000 7f259c343700 1 --2- 192.168.123.104:0/1690815089 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25940fe890 0x7f25940feca0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.087+0000 7f259c343700 1 -- 192.168.123.104:0/1690815089 >> 192.168.123.104:0/1690815089 conn(0x7f25940fa4a0 msgr2=0x7f25940fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.087+0000 7f259c343700 1 -- 192.168.123.104:0/1690815089 
shutdown_connections 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.087+0000 7f259c343700 1 -- 192.168.123.104:0/1690815089 wait complete. 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.088+0000 7f259c343700 1 Processor -- start 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.088+0000 7f259c343700 1 -- start start 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.088+0000 7f259c343700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2594197590 0x7f25941979a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:30.086 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.088+0000 7f259c343700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2594197ee0 con 0x7f2594197590 2026-03-10T14:03:30.087 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.088+0000 7f259a0df700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2594197590 0x7f25941979a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:30.087 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.088+0000 7f259a0df700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2594197590 0x7f25941979a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:50618/0 (socket says 192.168.123.104:50618) 2026-03-10T14:03:30.087 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.088+0000 7f259a0df700 1 -- 192.168.123.104:0/3928290510 learned_addr learned my addr 192.168.123.104:0/3928290510 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:30.087 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.089+0000 7f259a0df700 1 -- 192.168.123.104:0/3928290510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f25840097e0 con 0x7f2594197590 2026-03-10T14:03:30.087 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.089+0000 7f259a0df700 1 --2- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2594197590 0x7f25941979a0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f2584009fd0 tx=0x7f2584005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:30.087 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.089+0000 7f258b7fe700 1 -- 192.168.123.104:0/3928290510 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f258401d070 con 0x7f2594197590 2026-03-10T14:03:30.088 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.089+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f25941980e0 con 0x7f2594197590 2026-03-10T14:03:30.088 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.089+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f259419ad40 con 0x7f2594197590 2026-03-10T14:03:30.088 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.090+0000 7f258b7fe700 1 -- 192.168.123.104:0/3928290510 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f258400b810 con 0x7f2594197590 2026-03-10T14:03:30.088 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.090+0000 7f258b7fe700 1 -- 192.168.123.104:0/3928290510 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2584022dc0 con 
0x7f2594197590 2026-03-10T14:03:30.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.090+0000 7f258b7fe700 1 -- 192.168.123.104:0/3928290510 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f258400f460 con 0x7f2594197590 2026-03-10T14:03:30.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.090+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2594191050 con 0x7f2594197590 2026-03-10T14:03:30.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.091+0000 7f258b7fe700 1 --2- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2580038550 0x7f258003aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:30.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.091+0000 7f258b7fe700 1 -- 192.168.123.104:0/3928290510 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f258404d3d0 con 0x7f2594197590 2026-03-10T14:03:30.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.091+0000 7f25998de700 1 --2- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2580038550 0x7f258003aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:30.089 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.091+0000 7f25998de700 1 --2- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2580038550 0x7f258003aa00 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f2590006fd0 tx=0x7f2590006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:30.092 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.094+0000 7f258b7fe700 1 -- 192.168.123.104:0/3928290510 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2584020960 con 0x7f2594197590 2026-03-10T14:03:30.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.235+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2594062380 con 0x7f2594197590 2026-03-10T14:03:30.234 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.236+0000 7f258b7fe700 1 -- 192.168.123.104:0/3928290510 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2584027020 con 0x7f2594197590 2026-03-10T14:03:30.234 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:30.234 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.238+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2580038550 msgr2=0x7f258003aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.238+0000 7f259c343700 1 --2- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2580038550 0x7f258003aa00 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f2590006fd0 tx=0x7f2590006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.238+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2594197590 msgr2=0x7f25941979a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.238+0000 7f259c343700 1 --2- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2594197590 0x7f25941979a0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f2584009fd0 tx=0x7f2584005e70 comp rx=0 tx=0).stop 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.239+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 shutdown_connections 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.239+0000 7f259c343700 1 --2- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2580038550 0x7f258003aa00 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.239+0000 7f259c343700 1 --2- 192.168.123.104:0/3928290510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2594197590 0x7f25941979a0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:30.237 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.239+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 >> 192.168.123.104:0/3928290510 conn(0x7f25940fa4a0 msgr2=0x7f25940fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.239+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 shutdown_connections 2026-03-10T14:03:30.237 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:30.239+0000 7f259c343700 1 -- 192.168.123.104:0/3928290510 wait complete. 2026-03-10T14:03:30.238 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:31.298 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:31.299 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:30 vm03 ceph-mon[49718]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:03:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:30 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/3928290510' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:31.438 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:31.477 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:31.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.737+0000 7f2fb27f5700 1 -- 192.168.123.104:0/2141750930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1021d0 msgr2=0x7f2fac1025e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:31.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.737+0000 7f2fb27f5700 1 --2- 192.168.123.104:0/2141750930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1021d0 0x7f2fac1025e0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f2f94009b00 tx=0x7f2f94009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:31.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.738+0000 7f2fb27f5700 1 -- 192.168.123.104:0/2141750930 shutdown_connections 2026-03-10T14:03:31.736 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.738+0000 7f2fb27f5700 1 --2- 192.168.123.104:0/2141750930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1021d0 0x7f2fac1025e0 secure :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f2f94009b00 tx=0x7f2f94009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:31.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.738+0000 7f2fb27f5700 1 -- 192.168.123.104:0/2141750930 >> 192.168.123.104:0/2141750930 conn(0x7f2fac0fd760 msgr2=0x7f2fac0ffbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:31.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.738+0000 7f2fb27f5700 1 -- 192.168.123.104:0/2141750930 shutdown_connections 2026-03-10T14:03:31.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.738+0000 7f2fb27f5700 1 -- 
192.168.123.104:0/2141750930 wait complete. 2026-03-10T14:03:31.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.739+0000 7f2fb27f5700 1 Processor -- start 2026-03-10T14:03:31.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.739+0000 7f2fb27f5700 1 -- start start 2026-03-10T14:03:31.737 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.739+0000 7f2fb27f5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1975d0 0x7f2fac1979e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:31.738 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.739+0000 7f2fb27f5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2fac197f20 con 0x7f2fac1975d0 2026-03-10T14:03:31.738 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.740+0000 7f2fabfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1975d0 0x7f2fac1979e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:31.738 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.740+0000 7f2fabfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1975d0 0x7f2fac1979e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:50644/0 (socket says 192.168.123.104:50644) 2026-03-10T14:03:31.738 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.740+0000 7f2fabfff700 1 -- 192.168.123.104:0/408198466 learned_addr learned my addr 192.168.123.104:0/408198466 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:31.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.740+0000 7f2fabfff700 1 -- 192.168.123.104:0/408198466 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2f940097e0 con 0x7f2fac1975d0 2026-03-10T14:03:31.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.741+0000 7f2fabfff700 1 --2- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1975d0 0x7f2fac1979e0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f2f94009fd0 tx=0x7f2f94004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:31.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.741+0000 7f2fa97fa700 1 -- 192.168.123.104:0/408198466 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2f9401d070 con 0x7f2fac1975d0 2026-03-10T14:03:31.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.741+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2fac198120 con 0x7f2fac1975d0 2026-03-10T14:03:31.739 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.741+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2fac19ad80 con 0x7f2fac1975d0 2026-03-10T14:03:31.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.742+0000 7f2fa97fa700 1 -- 192.168.123.104:0/408198466 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2f940056f0 con 0x7f2fac1975d0 2026-03-10T14:03:31.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.742+0000 7f2fa97fa700 1 -- 192.168.123.104:0/408198466 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2f94022ca0 con 0x7f2fac1975d0 2026-03-10T14:03:31.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.742+0000 7f2fa97fa700 1 -- 
192.168.123.104:0/408198466 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f2f9400f480 con 0x7f2fac1975d0 2026-03-10T14:03:31.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.742+0000 7f2fa97fa700 1 --2- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f98038550 0x7f2f9803aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:31.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.742+0000 7f2fa97fa700 1 -- 192.168.123.104:0/408198466 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f2f9404d380 con 0x7f2fac1975d0 2026-03-10T14:03:31.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.742+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2f8c005320 con 0x7f2fac1975d0 2026-03-10T14:03:31.744 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.745+0000 7f2fa97fa700 1 -- 192.168.123.104:0/408198466 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2f94027030 con 0x7f2fac1975d0 2026-03-10T14:03:31.746 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.748+0000 7f2fab7fe700 1 --2- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f98038550 0x7f2f9803aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:31.746 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.748+0000 7f2fab7fe700 1 --2- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f98038550 0x7f2f9803aa00 secure :-1 s=READY pgs=25 cs=0 
l=1 rev1=1 crypto rx=0x7f2f9c006fd0 tx=0x7f2f9c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:31.907 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:31.907 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:31.907 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.907+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2f8c005190 con 0x7f2fac1975d0 2026-03-10T14:03:31.908 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.908+0000 7f2fa97fa700 1 -- 192.168.123.104:0/408198466 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2f9404a020 con 0x7f2fac1975d0 2026-03-10T14:03:31.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.911+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f98038550 msgr2=0x7f2f9803aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:31.909 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.911+0000 7f2fb27f5700 1 --2- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f98038550 0x7f2f9803aa00 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2f9c006fd0 tx=0x7f2f9c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:31.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.911+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1975d0 msgr2=0x7f2fac1979e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:31.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.911+0000 7f2fb27f5700 1 --2- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1975d0 0x7f2fac1979e0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f2f94009fd0 tx=0x7f2f94004dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:31.909 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.911+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 shutdown_connections 2026-03-10T14:03:31.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.911+0000 7f2fb27f5700 1 --2- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f98038550 0x7f2f9803aa00 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:31.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.911+0000 7f2fb27f5700 1 --2- 192.168.123.104:0/408198466 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fac1975d0 0x7f2fac1979e0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:31.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.911+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 >> 192.168.123.104:0/408198466 conn(0x7f2fac0fd760 msgr2=0x7f2fac0fe430 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:03:31.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.912+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 shutdown_connections 2026-03-10T14:03:31.910 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:31.912+0000 7f2fb27f5700 1 -- 192.168.123.104:0/408198466 wait complete. 2026-03-10T14:03:31.911 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:32.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:31 vm03 ceph-mon[49718]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:03:32.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:31 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/408198466' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:32.958 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:32.958 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:33.117 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:33.161 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:33.421 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.422+0000 7efe76c57700 1 -- 192.168.123.104:0/15115316 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 msgr2=0x7efe70102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.422+0000 7efe76c57700 1 --2- 192.168.123.104:0/15115316 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 0x7efe70102650 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7efe60009b00 tx=0x7efe60009e10 comp rx=0 tx=0).stop 
2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.423+0000 7efe76c57700 1 -- 192.168.123.104:0/15115316 shutdown_connections 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.423+0000 7efe76c57700 1 --2- 192.168.123.104:0/15115316 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 0x7efe70102650 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.423+0000 7efe76c57700 1 -- 192.168.123.104:0/15115316 >> 192.168.123.104:0/15115316 conn(0x7efe700fd8d0 msgr2=0x7efe700ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.423+0000 7efe76c57700 1 -- 192.168.123.104:0/15115316 shutdown_connections 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.423+0000 7efe76c57700 1 -- 192.168.123.104:0/15115316 wait complete. 
2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.423+0000 7efe76c57700 1 Processor -- start 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.424+0000 7efe76c57700 1 -- start start 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.424+0000 7efe76c57700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 0x7efe70197420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.424+0000 7efe76c57700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efe70197960 con 0x7efe70102240 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.424+0000 7efe749f3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 0x7efe70197420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.424+0000 7efe749f3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 0x7efe70197420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:36126/0 (socket says 192.168.123.104:36126) 2026-03-10T14:03:33.422 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.424+0000 7efe749f3700 1 -- 192.168.123.104:0/3613415709 learned_addr learned my addr 192.168.123.104:0/3613415709 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:33.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.424+0000 7efe749f3700 1 -- 192.168.123.104:0/3613415709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efe600097e0 con 0x7efe70102240 2026-03-10T14:03:33.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.424+0000 7efe749f3700 1 --2- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 0x7efe70197420 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7efe60004d10 tx=0x7efe60004df0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:33.423 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.425+0000 7efe6dffb700 1 -- 192.168.123.104:0/3613415709 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe6001d070 con 0x7efe70102240 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.425+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efe70197b60 con 0x7efe70102240 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.425+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efe70198000 con 0x7efe70102240 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.426+0000 7efe6dffb700 1 -- 192.168.123.104:0/3613415709 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efe600056f0 con 0x7efe70102240 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.426+0000 7efe6dffb700 1 -- 192.168.123.104:0/3613415709 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe6000f460 con 0x7efe70102240 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.426+0000 7efe6dffb700 1 -- 192.168.123.104:0/3613415709 <== mon.0 
v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7efe6000f620 con 0x7efe70102240 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.426+0000 7efe6dffb700 1 --2- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efe58038550 0x7efe5803aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.426+0000 7efe6dffb700 1 -- 192.168.123.104:0/3613415709 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7efe6004d440 con 0x7efe70102240 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.427+0000 7efe6ffff700 1 --2- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efe58038550 0x7efe5803aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:33.425 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.427+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efe5c005320 con 0x7efe70102240 2026-03-10T14:03:33.428 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.430+0000 7efe6ffff700 1 --2- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efe58038550 0x7efe5803aa00 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7efe64006fd0 tx=0x7efe64006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:33.428 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.430+0000 7efe6dffb700 1 -- 192.168.123.104:0/3613415709 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7efe60027030 con 0x7efe70102240 2026-03-10T14:03:33.574 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.575+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7efe5c005190 con 0x7efe70102240 2026-03-10T14:03:33.575 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.576+0000 7efe6dffb700 1 -- 192.168.123.104:0/3613415709 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7efe60017440 con 0x7efe70102240 2026-03-10T14:03:33.575 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:33.575 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:33.577 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.579+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efe58038550 msgr2=0x7efe5803aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:33.577 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.579+0000 7efe76c57700 1 --2- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efe58038550 0x7efe5803aa00 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7efe64006fd0 tx=0x7efe64006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:33.577 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.579+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 msgr2=0x7efe70197420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:33.578 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.579+0000 7efe76c57700 1 --2- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 0x7efe70197420 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7efe60004d10 tx=0x7efe60004df0 comp rx=0 tx=0).stop 2026-03-10T14:03:33.578 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.579+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 shutdown_connections 2026-03-10T14:03:33.578 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.579+0000 7efe76c57700 1 --2- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efe58038550 0x7efe5803aa00 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:33.578 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.579+0000 7efe76c57700 1 --2- 192.168.123.104:0/3613415709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe70102240 0x7efe70197420 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:33.578 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.580+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 >> 192.168.123.104:0/3613415709 conn(0x7efe700fd8d0 msgr2=0x7efe700fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:03:33.578 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.580+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 shutdown_connections 2026-03-10T14:03:33.578 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:33.580+0000 7efe76c57700 1 -- 192.168.123.104:0/3613415709 wait complete. 2026-03-10T14:03:33.579 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:33.784 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:33 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:33.784 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:33 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:33.784 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:33 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:33.784 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:33 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:33.784 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:33 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:33.784 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:33 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:33.784 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:33 vm03 ceph-mon[49718]: Deploying daemon prometheus.vm03 on vm03 2026-03-10T14:03:33.784 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:33 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:34.648 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T14:03:34.648 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:34.798 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:34.836 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:34.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:34 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/3613415709' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:34.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:34 vm03 ceph-mon[49718]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:03:35.094 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.095+0000 7f7764b76700 1 -- 192.168.123.104:0/906080445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 msgr2=0x7f7760102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:35.094 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.095+0000 7f7764b76700 1 --2- 192.168.123.104:0/906080445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 0x7f7760102650 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f7750009b00 tx=0x7f7750009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:35.094 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.096+0000 7f7764b76700 1 -- 192.168.123.104:0/906080445 shutdown_connections 2026-03-10T14:03:35.094 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.096+0000 7f7764b76700 1 --2- 192.168.123.104:0/906080445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 0x7f7760102650 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:03:35.094 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.096+0000 7f7764b76700 1 -- 192.168.123.104:0/906080445 >> 192.168.123.104:0/906080445 conn(0x7f77600fd8d0 msgr2=0x7f77600ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.096+0000 7f7764b76700 1 -- 192.168.123.104:0/906080445 shutdown_connections 2026-03-10T14:03:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.096+0000 7f7764b76700 1 -- 192.168.123.104:0/906080445 wait complete. 2026-03-10T14:03:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.096+0000 7f7764b76700 1 Processor -- start 2026-03-10T14:03:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.097+0000 7f7764b76700 1 -- start start 2026-03-10T14:03:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.097+0000 7f7764b76700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 0x7f7760197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.097+0000 7f7764b76700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7760197990 con 0x7f7760102240 2026-03-10T14:03:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.097+0000 7f775e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 0x7f7760197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:35.095 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.097+0000 7f775e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 0x7f7760197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello 
peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:36138/0 (socket says 192.168.123.104:36138) 2026-03-10T14:03:35.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.097+0000 7f775e59c700 1 -- 192.168.123.104:0/97092359 learned_addr learned my addr 192.168.123.104:0/97092359 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:35.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.097+0000 7f775e59c700 1 -- 192.168.123.104:0/97092359 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f77500097e0 con 0x7f7760102240 2026-03-10T14:03:35.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.098+0000 7f775e59c700 1 --2- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 0x7f7760197450 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f7750004d40 tx=0x7f7750004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:35.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.098+0000 7f77577fe700 1 -- 192.168.123.104:0/97092359 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f775001c070 con 0x7f7760102240 2026-03-10T14:03:35.096 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.098+0000 7f77577fe700 1 -- 192.168.123.104:0/97092359 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f77500056f0 con 0x7f7760102240 2026-03-10T14:03:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.098+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7760197b90 con 0x7f7760102240 2026-03-10T14:03:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.098+0000 7f77577fe700 1 -- 192.168.123.104:0/97092359 <== mon.0 v2:192.168.123.103:3300/0 
3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7750017440 con 0x7f7760102240 2026-03-10T14:03:35.097 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.098+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7760198030 con 0x7f7760102240 2026-03-10T14:03:35.098 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.099+0000 7f77577fe700 1 -- 192.168.123.104:0/97092359 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f77500175a0 con 0x7f7760102240 2026-03-10T14:03:35.098 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.099+0000 7f77577fe700 1 --2- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7740038510 0x7f774003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:35.098 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.100+0000 7f77577fe700 1 -- 192.168.123.104:0/97092359 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f775004bff0 con 0x7f7760102240 2026-03-10T14:03:35.099 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.100+0000 7f7757fff700 1 --2- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7740038510 0x7f774003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:35.099 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.100+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7760191090 con 0x7f7760102240 2026-03-10T14:03:35.099 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.100+0000 7f7757fff700 1 --2- 
192.168.123.104:0/97092359 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7740038510 0x7f774003a9c0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f7748006fd0 tx=0x7f7748006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:35.102 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.103+0000 7f77577fe700 1 -- 192.168.123.104:0/97092359 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7750025070 con 0x7f7760102240 2026-03-10T14:03:35.253 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.253+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f7760062380 con 0x7f7760102240 2026-03-10T14:03:35.254 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.256+0000 7f77577fe700 1 -- 192.168.123.104:0/97092359 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f77500289b0 con 0x7f7760102240 2026-03-10T14:03:35.254 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:35.254 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:35.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.258+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7740038510 msgr2=0x7f774003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:35.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.258+0000 7f7764b76700 1 --2- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7740038510 0x7f774003a9c0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f7748006fd0 tx=0x7f7748006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:35.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.258+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 msgr2=0x7f7760197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:35.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.258+0000 7f7764b76700 1 --2- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 0x7f7760197450 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f7750004d40 tx=0x7f7750004e20 comp rx=0 tx=0).stop 2026-03-10T14:03:35.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.259+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 shutdown_connections 2026-03-10T14:03:35.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.259+0000 7f7764b76700 1 
--2- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7740038510 0x7f774003a9c0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:35.257 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.259+0000 7f7764b76700 1 --2- 192.168.123.104:0/97092359 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7760102240 0x7f7760197450 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:35.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.259+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 >> 192.168.123.104:0/97092359 conn(0x7f77600fd8d0 msgr2=0x7f77600fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:35.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.259+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 shutdown_connections 2026-03-10T14:03:35.258 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:35.259+0000 7f7764b76700 1 -- 192.168.123.104:0/97092359 wait complete. 2026-03-10T14:03:35.258 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:36.326 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:36.326 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:36.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:35 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/97092359' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:36.467 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:36.507 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.777+0000 7f2474e83700 1 -- 192.168.123.104:0/232000877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 msgr2=0x7f2470102620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.777+0000 7f2474e83700 1 --2- 192.168.123.104:0/232000877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 0x7f2470102620 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f2458009b00 tx=0x7f2458009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.778+0000 7f2474e83700 1 -- 192.168.123.104:0/232000877 shutdown_connections 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.778+0000 7f2474e83700 1 --2- 192.168.123.104:0/232000877 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 0x7f2470102620 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.778+0000 7f2474e83700 1 -- 192.168.123.104:0/232000877 >> 192.168.123.104:0/232000877 conn(0x7f24700fd8d0 msgr2=0x7f24700ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.778+0000 7f2474e83700 1 -- 192.168.123.104:0/232000877 shutdown_connections 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.778+0000 7f2474e83700 1 -- 192.168.123.104:0/232000877 wait 
complete. 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f2474e83700 1 Processor -- start 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f2474e83700 1 -- start start 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f2474e83700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 0x7f24701973d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f2474e83700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2470197910 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f246e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 0x7f24701973d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f246e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 0x7f24701973d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:36144/0 (socket says 192.168.123.104:36144) 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f246e59c700 1 -- 192.168.123.104:0/4088604064 learned_addr learned my addr 192.168.123.104:0/4088604064 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f246e59c700 1 -- 192.168.123.104:0/4088604064 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24580097e0 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.779+0000 7f246e59c700 1 --2- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 0x7f24701973d0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f2458004750 tx=0x7f2458005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.780+0000 7f24677fe700 1 -- 192.168.123.104:0/4088604064 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f245801c070 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.780+0000 7f24677fe700 1 -- 192.168.123.104:0/4088604064 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2458021470 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.780+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2470197b10 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.780+0000 7f24677fe700 1 -- 192.168.123.104:0/4088604064 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f245800f460 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.780+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2470197fb0 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.781+0000 7f24677fe700 1 -- 192.168.123.104:0/4088604064 <== mon.0 
v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f2458021ac0 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.781+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2470191060 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.781+0000 7f24677fe700 1 --2- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f245c0384c0 0x7f245c03a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.781+0000 7f24677fe700 1 -- 192.168.123.104:0/4088604064 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f245804c3e0 con 0x7f2470102210 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.781+0000 7f246dd9b700 1 --2- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f245c0384c0 0x7f245c03a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.781+0000 7f246dd9b700 1 --2- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f245c0384c0 0x7f245c03a970 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f2460006fd0 tx=0x7f2460006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:36.827 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.783+0000 7f24677fe700 1 -- 192.168.123.104:0/4088604064 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f245800f5c0 con 0x7f2470102210 2026-03-10T14:03:36.922 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.922+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2470062380 con 0x7f2470102210 2026-03-10T14:03:36.923 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.924+0000 7f24677fe700 1 -- 192.168.123.104:0/4088604064 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2458026030 con 0x7f2470102210 2026-03-10T14:03:36.923 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:36.923 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:36.925 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f245c0384c0 msgr2=0x7f245c03a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:36.925 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 --2- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f245c0384c0 0x7f245c03a970 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f2460006fd0 tx=0x7f2460006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:36.925 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 msgr2=0x7f24701973d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:36.925 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 --2- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 0x7f24701973d0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f2458004750 tx=0x7f2458005dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:36.925 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 shutdown_connections 2026-03-10T14:03:36.925 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 --2- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f245c0384c0 0x7f245c03a970 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:36.925 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 --2- 192.168.123.104:0/4088604064 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2470102210 0x7f24701973d0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:36.926 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 >> 192.168.123.104:0/4088604064 conn(0x7f24700fd8d0 msgr2=0x7f24700ff390 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:03:36.926 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.927+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 shutdown_connections 2026-03-10T14:03:36.926 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:36.928+0000 7f2474e83700 1 -- 192.168.123.104:0/4088604064 wait complete. 2026-03-10T14:03:36.927 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:37.129 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:36 vm03 ceph-mon[49718]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:03:37.991 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:37.991 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:38.129 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:38.166 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:38.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:37 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/4088604064' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:38.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:37 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:38.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:37 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:38.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:37 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:38.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:37 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-10T14:03:38.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:37 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.414+0000 7f836af5c700 1 -- 192.168.123.104:0/1016855896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 msgr2=0x7f8364102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.414+0000 7f836af5c700 1 --2- 192.168.123.104:0/1016855896 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 0x7f8364102640 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f8354009b00 tx=0x7f8354009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.415+0000 7f836af5c700 1 -- 192.168.123.104:0/1016855896 shutdown_connections 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.415+0000 7f836af5c700 1 --2- 192.168.123.104:0/1016855896 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 0x7f8364102640 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.415+0000 7f836af5c700 1 -- 192.168.123.104:0/1016855896 >> 192.168.123.104:0/1016855896 conn(0x7f83640fd8d0 msgr2=0x7f83640ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.415+0000 7f836af5c700 1 -- 192.168.123.104:0/1016855896 shutdown_connections 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.415+0000 7f836af5c700 1 -- 192.168.123.104:0/1016855896 wait complete. 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.416+0000 7f836af5c700 1 Processor -- start 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.416+0000 7f836af5c700 1 -- start start 2026-03-10T14:03:38.414 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.416+0000 7f836af5c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 0x7f8364197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:38.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.416+0000 7f836af5c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83641978c0 con 0x7f8364102230 2026-03-10T14:03:38.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.416+0000 7f8368cf8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 0x7f8364197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:38.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f8368cf8700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 0x7f8364197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:36150/0 (socket says 192.168.123.104:36150) 2026-03-10T14:03:38.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f8368cf8700 1 -- 192.168.123.104:0/3031792841 learned_addr learned my addr 192.168.123.104:0/3031792841 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:38.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f8368cf8700 1 -- 192.168.123.104:0/3031792841 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f83540097e0 con 0x7f8364102230 2026-03-10T14:03:38.415 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f8368cf8700 1 --2- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 0x7f8364197380 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f8354004750 tx=0x7f8354005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:38.416 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f8361ffb700 1 -- 192.168.123.104:0/3031792841 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f835401c070 con 0x7f8364102230 2026-03-10T14:03:38.416 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f8361ffb700 1 -- 192.168.123.104:0/3031792841 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8354021470 con 0x7f8364102230 2026-03-10T14:03:38.416 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f8361ffb700 1 -- 192.168.123.104:0/3031792841 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 
0x7f835400f460 con 0x7f8364102230 2026-03-10T14:03:38.416 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8364197ac0 con 0x7f8364102230 2026-03-10T14:03:38.416 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.417+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8364197f60 con 0x7f8364102230 2026-03-10T14:03:38.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.418+0000 7f8361ffb700 1 -- 192.168.123.104:0/3031792841 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f8354021ac0 con 0x7f8364102230 2026-03-10T14:03:38.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.418+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8364191080 con 0x7f8364102230 2026-03-10T14:03:38.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.419+0000 7f8361ffb700 1 --2- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f834c0384c0 0x7f834c03a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:38.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.419+0000 7f8361ffb700 1 -- 192.168.123.104:0/3031792841 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f835404c3a0 con 0x7f8364102230 2026-03-10T14:03:38.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.419+0000 7f8363fff700 1 --2- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f834c0384c0 0x7f834c03a970 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:38.417 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.419+0000 7f8363fff700 1 --2- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f834c0384c0 0x7f834c03a970 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f8358006fd0 tx=0x7f8358006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:38.420 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.422+0000 7f8361ffb700 1 -- 192.168.123.104:0/3031792841 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f835400f5c0 con 0x7f8364102230 2026-03-10T14:03:38.561 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.562+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8364062380 con 0x7f8364102230 2026-03-10T14:03:38.562 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.563+0000 7f8361ffb700 1 -- 192.168.123.104:0/3031792841 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8354026030 con 0x7f8364102230 2026-03-10T14:03:38.562 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:38.562 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:38.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.565+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f834c0384c0 msgr2=0x7f834c03a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:38.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.565+0000 7f836af5c700 1 --2- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f834c0384c0 0x7f834c03a970 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f8358006fd0 tx=0x7f8358006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:38.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.567+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 msgr2=0x7f8364197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:38.565 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.567+0000 7f836af5c700 1 --2- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 0x7f8364197380 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f8354004750 tx=0x7f8354005dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:38.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.567+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 shutdown_connections 2026-03-10T14:03:38.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.567+0000 
7f836af5c700 1 --2- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f834c0384c0 0x7f834c03a970 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:38.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.567+0000 7f836af5c700 1 --2- 192.168.123.104:0/3031792841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8364102230 0x7f8364197380 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:38.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.567+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 >> 192.168.123.104:0/3031792841 conn(0x7f83640fd8d0 msgr2=0x7f83640fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:38.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.568+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 shutdown_connections 2026-03-10T14:03:38.566 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:38.568+0000 7f836af5c700 1 -- 192.168.123.104:0/3031792841 wait complete. 2026-03-10T14:03:38.567 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:38 vm03 ceph-mon[49718]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:03:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:38 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/3031792841' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:38 vm03 ceph-mon[49718]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-10T14:03:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:38 vm03 ceph-mon[49718]: mgrmap e14: vm03.rwbbep(active, since 31s) 2026-03-10T14:03:39.629 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:39.629 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:39.770 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:39.809 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.073+0000 7f4e775ad700 1 -- 192.168.123.104:0/3750525677 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 msgr2=0x7f4e70100420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.073+0000 7f4e775ad700 1 --2- 192.168.123.104:0/3750525677 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 0x7f4e70100420 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f4e60009b00 tx=0x7f4e60009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.074+0000 7f4e775ad700 1 -- 192.168.123.104:0/3750525677 shutdown_connections 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.074+0000 7f4e775ad700 1 --2- 192.168.123.104:0/3750525677 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 0x7f4e70100420 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.074+0000 7f4e775ad700 1 -- 192.168.123.104:0/3750525677 >> 192.168.123.104:0/3750525677 conn(0x7f4e700fb5a0 msgr2=0x7f4e700fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.074+0000 7f4e775ad700 1 -- 192.168.123.104:0/3750525677 shutdown_connections 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.075+0000 7f4e775ad700 1 -- 192.168.123.104:0/3750525677 wait complete. 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.075+0000 7f4e775ad700 1 Processor -- start 2026-03-10T14:03:40.073 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.075+0000 7f4e775ad700 1 -- start start 2026-03-10T14:03:40.074 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.075+0000 7f4e775ad700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 0x7f4e701973a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:40.074 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.075+0000 7f4e775ad700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e701978e0 con 0x7f4e70100010 2026-03-10T14:03:40.074 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.076+0000 7f4e75349700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 0x7f4e701973a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:40.074 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.076+0000 7f4e75349700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 0x7f4e701973a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:36174/0 (socket says 192.168.123.104:36174) 2026-03-10T14:03:40.074 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.076+0000 7f4e75349700 1 -- 192.168.123.104:0/78808770 learned_addr learned my addr 192.168.123.104:0/78808770 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:40.074 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.076+0000 7f4e75349700 1 -- 192.168.123.104:0/78808770 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4e600097e0 con 0x7f4e70100010 2026-03-10T14:03:40.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.076+0000 7f4e75349700 1 --2- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 0x7f4e701973a0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f4e60004750 tx=0x7f4e60005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:40.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.077+0000 7f4e667fc700 1 -- 192.168.123.104:0/78808770 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4e6001c070 con 0x7f4e70100010 2026-03-10T14:03:40.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.077+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4e70197ae0 con 0x7f4e70100010 2026-03-10T14:03:40.075 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.077+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4e70197f80 con 
0x7f4e70100010 2026-03-10T14:03:40.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.077+0000 7f4e667fc700 1 -- 192.168.123.104:0/78808770 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4e60021470 con 0x7f4e70100010 2026-03-10T14:03:40.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.077+0000 7f4e667fc700 1 -- 192.168.123.104:0/78808770 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4e6000f460 con 0x7f4e70100010 2026-03-10T14:03:40.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.078+0000 7f4e667fc700 1 -- 192.168.123.104:0/78808770 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f4e600215e0 con 0x7f4e70100010 2026-03-10T14:03:40.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.078+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4e54005320 con 0x7f4e70100010 2026-03-10T14:03:40.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.078+0000 7f4e667fc700 1 --2- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e5c038560 0x7f4e5c03aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:40.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.078+0000 7f4e667fc700 1 -- 192.168.123.104:0/78808770 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f4e6004c510 con 0x7f4e70100010 2026-03-10T14:03:40.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.078+0000 7f4e74b48700 1 -- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e5c038560 msgr2=0x7f4e5c03aa10 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed 
to v2:192.168.123.103:6800/2 2026-03-10T14:03:40.076 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.078+0000 7f4e74b48700 1 --2- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e5c038560 0x7f4e5c03aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T14:03:40.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.083+0000 7f4e667fc700 1 -- 192.168.123.104:0/78808770 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4e6002aba0 con 0x7f4e70100010 2026-03-10T14:03:40.228 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.228+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f4e54005190 con 0x7f4e70100010 2026-03-10T14:03:40.229 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.230+0000 7f4e667fc700 1 -- 192.168.123.104:0/78808770 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f4e60026020 con 0x7f4e70100010 2026-03-10T14:03:40.229 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:40.229 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:40.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e5c038560 msgr2=0x7f4e5c03aa10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:03:40.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 --2- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e5c038560 0x7f4e5c03aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:40.231 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 msgr2=0x7f4e701973a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:40.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 --2- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 0x7f4e701973a0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f4e60004750 tx=0x7f4e60005dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:40.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 shutdown_connections 2026-03-10T14:03:40.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 --2- 
192.168.123.104:0/78808770 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e5c038560 0x7f4e5c03aa10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:40.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 --2- 192.168.123.104:0/78808770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e70100010 0x7f4e701973a0 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:40.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 >> 192.168.123.104:0/78808770 conn(0x7f4e700fb5a0 msgr2=0x7f4e700fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:40.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.233+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 shutdown_connections 2026-03-10T14:03:40.232 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:40.234+0000 7f4e775ad700 1 -- 192.168.123.104:0/78808770 wait complete. 2026-03-10T14:03:40.233 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:41.064 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:40 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/78808770' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:41.274 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T14:03:41.274 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:41.419 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:41.456 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:41.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.705+0000 7f3d919a3700 1 -- 192.168.123.104:0/422140378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 msgr2=0x7f3d8c102620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:41.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.705+0000 7f3d919a3700 1 --2- 192.168.123.104:0/422140378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 0x7f3d8c102620 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f3d74009b00 tx=0x7f3d74009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:41.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.705+0000 7f3d919a3700 1 -- 192.168.123.104:0/422140378 shutdown_connections 2026-03-10T14:03:41.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.705+0000 7f3d919a3700 1 --2- 192.168.123.104:0/422140378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 0x7f3d8c102620 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:41.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.705+0000 7f3d919a3700 1 -- 192.168.123.104:0/422140378 >> 192.168.123.104:0/422140378 conn(0x7f3d8c0fd8d0 msgr2=0x7f3d8c0ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:41.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.706+0000 7f3d919a3700 1 -- 192.168.123.104:0/422140378 
shutdown_connections 2026-03-10T14:03:41.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.706+0000 7f3d919a3700 1 -- 192.168.123.104:0/422140378 wait complete. 2026-03-10T14:03:41.704 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.706+0000 7f3d919a3700 1 Processor -- start 2026-03-10T14:03:41.705 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.706+0000 7f3d919a3700 1 -- start start 2026-03-10T14:03:41.705 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.706+0000 7f3d919a3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 0x7f3d8c1973d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:41.705 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.706+0000 7f3d919a3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d8c197910 con 0x7f3d8c102210 2026-03-10T14:03:41.705 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.707+0000 7f3d8affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 0x7f3d8c1973d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:41.705 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.707+0000 7f3d8affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 0x7f3d8c1973d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:36194/0 (socket says 192.168.123.104:36194) 2026-03-10T14:03:41.705 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.707+0000 7f3d8affd700 1 -- 192.168.123.104:0/1134832445 learned_addr learned my addr 192.168.123.104:0/1134832445 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:41.706 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.707+0000 7f3d8affd700 1 -- 192.168.123.104:0/1134832445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d740097e0 con 0x7f3d8c102210 2026-03-10T14:03:41.706 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.707+0000 7f3d8affd700 1 --2- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 0x7f3d8c1973d0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f3d74004750 tx=0x7f3d74005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:41.706 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.708+0000 7f3d909a1700 1 -- 192.168.123.104:0/1134832445 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3d7401c070 con 0x7f3d8c102210 2026-03-10T14:03:41.706 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.708+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d8c197b10 con 0x7f3d8c102210 2026-03-10T14:03:41.706 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.708+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d8c197fb0 con 0x7f3d8c102210 2026-03-10T14:03:41.707 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.709+0000 7f3d909a1700 1 -- 192.168.123.104:0/1134832445 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3d74021470 con 0x7f3d8c102210 2026-03-10T14:03:41.707 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.709+0000 7f3d909a1700 1 -- 192.168.123.104:0/1134832445 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3d7400f460 con 
0x7f3d8c102210 2026-03-10T14:03:41.707 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.709+0000 7f3d909a1700 1 -- 192.168.123.104:0/1134832445 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f3d7400f6d0 con 0x7f3d8c102210 2026-03-10T14:03:41.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.709+0000 7f3d909a1700 1 --2- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3d78038510 0x7f3d7803a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:41.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.709+0000 7f3d8a7fc700 1 -- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3d78038510 msgr2=0x7f3d7803a9c0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-10T14:03:41.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.709+0000 7f3d8a7fc700 1 --2- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3d78038510 0x7f3d7803a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T14:03:41.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.709+0000 7f3d909a1700 1 -- 192.168.123.104:0/1134832445 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f3d74029720 con 0x7f3d8c102210 2026-03-10T14:03:41.708 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.710+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d8c191060 con 0x7f3d8c102210 2026-03-10T14:03:41.712 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.714+0000 7f3d909a1700 1 -- 192.168.123.104:0/1134832445 <== 
mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3d74017630 con 0x7f3d8c102210 2026-03-10T14:03:41.853 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.854+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f3d8c062380 con 0x7f3d8c102210 2026-03-10T14:03:41.854 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.855+0000 7f3d909a1700 1 -- 192.168.123.104:0/1134832445 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f3d74017630 con 0x7f3d8c102210 2026-03-10T14:03:41.854 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:41.854 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3d78038510 msgr2=0x7f3d7803a9c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 
2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 --2- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3d78038510 0x7f3d7803a9c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 msgr2=0x7f3d8c1973d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 --2- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 0x7f3d8c1973d0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f3d74004750 tx=0x7f3d74005dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 shutdown_connections 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 --2- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3d78038510 0x7f3d7803a9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 --2- 192.168.123.104:0/1134832445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d8c102210 0x7f3d8c1973d0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 >> 192.168.123.104:0/1134832445 conn(0x7f3d8c0fd8d0 msgr2=0x7f3d8c0ff390 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 shutdown_connections 2026-03-10T14:03:41.856 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:41.858+0000 7f3d919a3700 1 -- 192.168.123.104:0/1134832445 wait complete. 2026-03-10T14:03:41.857 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:42.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:41 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/1134832445' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:42.896 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:42.897 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:43.031 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:43.066 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.337+0000 7fd115ae6700 1 -- 192.168.123.104:0/3152894527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 msgr2=0x7fd1100719e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.337+0000 7fd115ae6700 1 --2- 192.168.123.104:0/3152894527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 0x7fd1100719e0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fd100009b00 tx=0x7fd100009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 -- 192.168.123.104:0/3152894527 
shutdown_connections 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 --2- 192.168.123.104:0/3152894527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 0x7fd1100719e0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 -- 192.168.123.104:0/3152894527 >> 192.168.123.104:0/3152894527 conn(0x7fd11006cf00 msgr2=0x7fd11006f350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 -- 192.168.123.104:0/3152894527 shutdown_connections 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 -- 192.168.123.104:0/3152894527 wait complete. 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 Processor -- start 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 -- start start 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 0x7fd1101acab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:43.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.338+0000 7fd115ae6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd100012070 con 0x7fd1100715f0 2026-03-10T14:03:43.338 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.339+0000 7fd10f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 0x7fd1101acab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:43.338 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.339+0000 7fd10f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 0x7fd1101acab0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:38966/0 (socket says 192.168.123.104:38966) 2026-03-10T14:03:43.338 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.339+0000 7fd10f7fe700 1 -- 192.168.123.104:0/1259954560 learned_addr learned my addr 192.168.123.104:0/1259954560 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:43.338 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.339+0000 7fd10f7fe700 1 -- 192.168.123.104:0/1259954560 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd1000097e0 con 0x7fd1100715f0 2026-03-10T14:03:43.338 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.340+0000 7fd10f7fe700 1 --2- 192.168.123.104:0/1259954560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 0x7fd1101acab0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7fd1000055a0 tx=0x7fd100005680 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:43.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.340+0000 7fd10cff9700 1 -- 192.168.123.104:0/1259954560 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd10001d070 con 0x7fd1100715f0 2026-03-10T14:03:43.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.340+0000 7fd10cff9700 1 -- 192.168.123.104:0/1259954560 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd10000b810 con 0x7fd1100715f0 2026-03-10T14:03:43.340 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.340+0000 7fd10cff9700 1 -- 192.168.123.104:0/1259954560 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd10000f460 con 0x7fd1100715f0 2026-03-10T14:03:43.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.341+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd1101acff0 con 0x7fd1100715f0 2026-03-10T14:03:43.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.341+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd1101ad410 con 0x7fd1100715f0 2026-03-10T14:03:43.340 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.342+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd110110890 con 0x7fd1100715f0 2026-03-10T14:03:43.341 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.342+0000 7fd10cff9700 1 -- 192.168.123.104:0/1259954560 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 15) v1 ==== 44873+0+0 (secure 0 0 0) 0x7fd100003810 con 0x7fd1100715f0 2026-03-10T14:03:43.341 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.342+0000 7fd10cff9700 1 -- 192.168.123.104:0/1259954560 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fd10004c540 con 0x7fd1100715f0 2026-03-10T14:03:43.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.345+0000 7fd10cff9700 1 -- 192.168.123.104:0/1259954560 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd10000edb0 con 0x7fd1100715f0 2026-03-10T14:03:43.344 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: Active manager daemon vm03.rwbbep restarted 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: Activating manager daemon vm03.rwbbep 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: mgrmap e15: vm03.rwbbep(active, starting, since 0.00567924s) 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: Manager daemon vm03.rwbbep is now available 2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 
2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:03:43.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:03:43.488 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.488+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fd110062380 con 0x7fd1100715f0
2026-03-10T14:03:43.489 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.489+0000 7fd10cff9700 1 -- 192.168.123.104:0/1259954560 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fd100026030 con 0x7fd1100715f0
2026-03-10T14:03:43.489 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:03:43.489 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T14:03:43.495 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.495+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 msgr2=0x7fd1101acab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:03:43.495 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.495+0000 7fd115ae6700 1 --2- 192.168.123.104:0/1259954560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 0x7fd1101acab0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7fd1000055a0 tx=0x7fd100005680 comp rx=0 tx=0).stop
2026-03-10T14:03:43.495 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.496+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 shutdown_connections
2026-03-10T14:03:43.495 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.496+0000 7fd115ae6700 1 --2- 192.168.123.104:0/1259954560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1100715f0 0x7fd1101acab0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:03:43.495 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.496+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 >> 192.168.123.104:0/1259954560 conn(0x7fd11006cf00 msgr2=0x7fd11006fc60 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:03:43.495 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.496+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 shutdown_connections
2026-03-10T14:03:43.495 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:43.496+0000 7fd115ae6700 1 -- 192.168.123.104:0/1259954560 wait complete.
2026-03-10T14:03:43.496 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: [10/Mar/2026:14:03:43] ENGINE Bus STARTING
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/1259954560' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: [10/Mar/2026:14:03:43] ENGINE Serving on http://192.168.123.103:8765
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: [10/Mar/2026:14:03:43] ENGINE Serving on https://192.168.123.103:7150
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: [10/Mar/2026:14:03:43] ENGINE Bus STARTED
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:44.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:44 vm03 ceph-mon[49718]: mgrmap e16: vm03.rwbbep(active, since 1.00925s)
2026-03-10T14:03:44.556 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T14:03:44.556 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json
2026-03-10T14:03:44.723 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T14:03:44.771 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T14:03:45.078 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.079+0000 7f97f7f3d700 1 -- 192.168.123.104:0/3746330841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f00713e0 msgr2=0x7f97f00717f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f4cd7700 1 -- 192.168.123.104:0/3746330841 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f97ec00e070 con 0x7f97f00713e0
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 --2- 192.168.123.104:0/3746330841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f00713e0 0x7f97f00717f0 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f97ec00b3a0 tx=0x7f97ec00b6b0 comp rx=0 tx=0).stop
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 -- 192.168.123.104:0/3746330841 shutdown_connections
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 --2- 192.168.123.104:0/3746330841 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f00713e0 0x7f97f00717f0 secure :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f97ec00b3a0 tx=0x7f97ec00b6b0 comp rx=0 tx=0).stop
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 -- 192.168.123.104:0/3746330841 >> 192.168.123.104:0/3746330841 conn(0x7f97f006cd50 msgr2=0x7f97f006f1a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 -- 192.168.123.104:0/3746330841 shutdown_connections
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 -- 192.168.123.104:0/3746330841 wait complete.
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 Processor -- start
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 -- start start
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f01a4050 0x7f97f01a4460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.080+0000 7f97f7f3d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97f01a49a0 con 0x7f97f01a4050
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.081+0000 7f97f5cd9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f01a4050 0x7f97f01a4460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.081+0000 7f97f5cd9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f01a4050 0x7f97f01a4460 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:38988/0 (socket says 192.168.123.104:38988)
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.081+0000 7f97f5cd9700 1 -- 192.168.123.104:0/205580616 learned_addr learned my addr 192.168.123.104:0/205580616 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.081+0000 7f97f5cd9700 1 -- 192.168.123.104:0/205580616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f97ec00b050 con 0x7f97f01a4050
2026-03-10T14:03:45.079 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.081+0000 7f97f5cd9700 1 --2- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f01a4050 0x7f97f01a4460 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f97ec00bd20 tx=0x7f97ec0095c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:03:45.080 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.081+0000 7f97e6ffd700 1 -- 192.168.123.104:0/205580616 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f97ec00e050 con 0x7f97f01a4050
2026-03-10T14:03:45.080 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.081+0000 7f97e6ffd700 1 -- 192.168.123.104:0/205580616 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f97ec012c90 con 0x7f97f01a4050
2026-03-10T14:03:45.080 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.082+0000 7f97e6ffd700 1 -- 192.168.123.104:0/205580616 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f97ec01de40 con 0x7f97f01a4050
2026-03-10T14:03:45.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.082+0000 7f97f7f3d700 1 -- 192.168.123.104:0/205580616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f97f01a4ba0 con 0x7f97f01a4050
2026-03-10T14:03:45.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.082+0000 7f97f7f3d700 1 -- 192.168.123.104:0/205580616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f97f01a7800 con 0x7f97f01a4050
2026-03-10T14:03:45.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.083+0000 7f97f7f3d700 1 -- 192.168.123.104:0/205580616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f97f019e040 con 0x7f97f01a4050
2026-03-10T14:03:45.081 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.083+0000 7f97e6ffd700 1 -- 192.168.123.104:0/205580616 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 16) v1 ==== 45000+0+0 (secure 0 0 0) 0x7f97ec019040 con 0x7f97f01a4050
2026-03-10T14:03:45.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.083+0000 7f97e6ffd700 1 --2- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97dc037fd0 0x7f97dc03a480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:03:45.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.083+0000 7f97e6ffd700 1 -- 192.168.123.104:0/205580616 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f97ec04c4b0 con 0x7f97f01a4050
2026-03-10T14:03:45.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.084+0000 7f97f54d8700 1 --2- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97dc037fd0 0x7f97dc03a480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:03:45.082 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.084+0000 7f97f54d8700 1 --2- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97dc037fd0 0x7f97dc03a480 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f97e0006fd0 tx=0x7f97e0006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:03:45.084 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.086+0000 7f97e6ffd700 1 -- 192.168.123.104:0/205580616 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f97ec012e00 con 0x7f97f01a4050
2026-03-10T14:03:45.243 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.244+0000 7f97f7f3d700 1 -- 192.168.123.104:0/205580616 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f97f004f020 con 0x7f97f01a4050
2026-03-10T14:03:45.244 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.246+0000 7f97e6ffd700 1 -- 192.168.123.104:0/205580616 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f97f004f020 con 0x7f97f01a4050
2026-03-10T14:03:45.244 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:03:45.244 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.249+0000 7f97e4ff9700 1 -- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97dc037fd0 msgr2=0x7f97dc03a480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.249+0000 7f97e4ff9700 1 --2- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97dc037fd0 0x7f97dc03a480 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f97e0006fd0 tx=0x7f97e0006e40 comp rx=0 tx=0).stop
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.249+0000 7f97e4ff9700 1 -- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f01a4050 msgr2=0x7f97f01a4460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.249+0000 7f97e4ff9700 1 --2- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f01a4050 0x7f97f01a4460 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f97ec00bd20 tx=0x7f97ec0095c0 comp rx=0 tx=0).stop
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.249+0000 7f97e4ff9700 1 -- 192.168.123.104:0/205580616 shutdown_connections
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.249+0000 7f97e4ff9700 1 --2- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97dc037fd0 0x7f97dc03a480 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.249+0000 7f97e4ff9700 1 --2- 192.168.123.104:0/205580616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97f01a4050 0x7f97f01a4460 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.249+0000 7f97e4ff9700 1 -- 192.168.123.104:0/205580616 >> 192.168.123.104:0/205580616 conn(0x7f97f006cd50 msgr2=0x7f97f006fa30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.250+0000 7f97e4ff9700 1 -- 192.168.123.104:0/205580616 shutdown_connections
2026-03-10T14:03:45.248 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:45.250+0000 7f97e4ff9700 1 -- 192.168.123.104:0/205580616 wait complete.
2026-03-10T14:03:45.250 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1
2026-03-10T14:03:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:03:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:45 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/205580616' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T14:03:46.315 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T14:03:46.315 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json
2026-03-10T14:03:46.485 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:03:46.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.753+0000 7f8b76840700 1 -- 192.168.123.104:0/3978543889 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 msgr2=0x7f8b70068590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:03:46.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.753+0000 7f8b76840700 1 --2- 192.168.123.104:0/3978543889 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 0x7f8b70068590 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f8b58009b00 tx=0x7f8b58009e10 comp rx=0 tx=0).stop
2026-03-10T14:03:46.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.754+0000 7f8b76840700 1 -- 192.168.123.104:0/3978543889 shutdown_connections
2026-03-10T14:03:46.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.754+0000 7f8b76840700 1 --2- 192.168.123.104:0/3978543889 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 0x7f8b70068590 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:03:46.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.754+0000 7f8b76840700 1 -- 192.168.123.104:0/3978543889 >> 192.168.123.104:0/3978543889 conn(0x7f8b70074e70 msgr2=0x7f8b70075270 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:03:46.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.754+0000 7f8b76840700 1 -- 192.168.123.104:0/3978543889 shutdown_connections
2026-03-10T14:03:46.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.754+0000 7f8b76840700 1 -- 192.168.123.104:0/3978543889 wait complete.
2026-03-10T14:03:46.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.754+0000 7f8b76840700 1 Processor -- start
2026-03-10T14:03:46.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.754+0000 7f8b76840700 1 -- start start
2026-03-10T14:03:46.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.755+0000 7f8b76840700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 0x7f8b701a18f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:03:46.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.755+0000 7f8b76840700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b701a1e30 con 0x7f8b700681c0
2026-03-10T14:03:46.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.755+0000 7f8b6ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 0x7f8b701a18f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:03:46.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.755+0000 7f8b6ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 0x7f8b701a18f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:39002/0 (socket says 192.168.123.104:39002)
2026-03-10T14:03:46.753 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.755+0000 7f8b6ffff700 1 -- 192.168.123.104:0/1248318465 learned_addr learned my addr 192.168.123.104:0/1248318465 (peer_addr_for_me v2:192.168.123.104:0/0)
2026-03-10T14:03:46.754 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.755+0000 7f8b6ffff700 1 -- 192.168.123.104:0/1248318465 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8b580097e0 con 0x7f8b700681c0
2026-03-10T14:03:46.754 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.755+0000 7f8b6ffff700 1 --2- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 0x7f8b701a18f0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f8b58006010 tx=0x7f8b58004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:03:46.754 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.756+0000 7f8b6d7fa700 1 -- 192.168.123.104:0/1248318465 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8b5801c070 con 0x7f8b700681c0
2026-03-10T14:03:46.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.756+0000 7f8b6d7fa700 1 -- 192.168.123.104:0/1248318465 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8b58021470 con 0x7f8b700681c0
2026-03-10T14:03:46.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.756+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8b701a2030 con 0x7f8b700681c0
2026-03-10T14:03:46.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.756+0000 7f8b6d7fa700 1 -- 192.168.123.104:0/1248318465 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8b5800f460 con 0x7f8b700681c0
2026-03-10T14:03:46.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.756+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8b701a2450 con 0x7f8b700681c0
2026-03-10T14:03:46.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.757+0000 7f8b6d7fa700 1 -- 192.168.123.104:0/1248318465 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f8b5800f5c0 con 0x7f8b700681c0
2026-03-10T14:03:46.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.757+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8b7004f9e0 con 0x7f8b700681c0
2026-03-10T14:03:46.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.757+0000 7f8b6d7fa700 1 --2- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8b5c0385a0 0x7f8b5c03aa50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:03:46.755 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.757+0000 7f8b6d7fa700 1 -- 192.168.123.104:0/1248318465 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f8b5804c490 con 0x7f8b700681c0
2026-03-10T14:03:46.756 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.757+0000 7f8b6f7fe700 1 --2- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8b5c0385a0 0x7f8b5c03aa50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:03:46.756 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.758+0000 7f8b6f7fe700 1 --2- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8b5c0385a0 0x7f8b5c03aa50 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f8b60006fd0 tx=0x7f8b60006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:03:46.759 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.761+0000 7f8b6d7fa700 1 -- 192.168.123.104:0/1248318465 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8b58029950 con 0x7f8b700681c0
2026-03-10T14:03:46.912 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.912+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8b70062380 con 0x7f8b700681c0
2026-03-10T14:03:46.912 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.913+0000 7f8b6d7fa700 1 -- 192.168.123.104:0/1248318465 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8b58026070 con 0x7f8b700681c0
2026-03-10T14:03:46.912 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:03:46.912 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T14:03:46.914 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.916+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8b5c0385a0 msgr2=0x7f8b5c03aa50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:03:46.914 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.916+0000 7f8b76840700 1 --2- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8b5c0385a0 0x7f8b5c03aa50 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f8b60006fd0 tx=0x7f8b60006e40 comp rx=0 tx=0).stop
2026-03-10T14:03:46.914 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.916+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 msgr2=0x7f8b701a18f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:03:46.914 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.916+0000 7f8b76840700 1 --2- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 0x7f8b701a18f0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f8b58006010 tx=0x7f8b58004dc0 comp rx=0 tx=0).stop
2026-03-10T14:03:46.914 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.916+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 shutdown_connections
2026-03-10T14:03:46.914 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.916+0000 7f8b76840700 1 --2- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8b5c0385a0 0x7f8b5c03aa50 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:03:46.914 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.916+0000 7f8b76840700 1 --2- 192.168.123.104:0/1248318465 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b700681c0 0x7f8b701a18f0 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:03:46.914 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.916+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 >> 192.168.123.104:0/1248318465 conn(0x7f8b70074e70 msgr2=0x7f8b70103700 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:03:46.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.917+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 shutdown_connections
2026-03-10T14:03:46.915 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:46.917+0000 7f8b76840700 1 -- 192.168.123.104:0/1248318465 wait complete.
2026-03-10T14:03:46.916 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.conf
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T14:03:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:46 vm03 ceph-mon[49718]: mgrmap e17: vm03.rwbbep(active, since 2s)
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: Updating vm04:/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:03:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:47 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/1248318465' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T14:03:47.962 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T14:03:47.962 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:48.111 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:03:48.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.390+0000 7f583e2b7700 1 -- 192.168.123.104:0/3174460546 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 msgr2=0x7f58381015e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:48.390 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.390+0000 7f583e2b7700 1 --2- 192.168.123.104:0/3174460546 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 0x7f58381015e0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f5828009b00 tx=0x7f5828009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:48.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.392+0000 7f583e2b7700 1 -- 192.168.123.104:0/3174460546 shutdown_connections 2026-03-10T14:03:48.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.392+0000 7f583e2b7700 1 --2- 192.168.123.104:0/3174460546 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 0x7f58381015e0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:48.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.392+0000 7f583e2b7700 1 -- 192.168.123.104:0/3174460546 >> 192.168.123.104:0/3174460546 conn(0x7f58380faa70 msgr2=0x7f58380fce80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:48.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.393+0000 7f583e2b7700 1 -- 192.168.123.104:0/3174460546 shutdown_connections 2026-03-10T14:03:48.391 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.393+0000 7f583e2b7700 1 -- 192.168.123.104:0/3174460546 wait complete. 2026-03-10T14:03:48.391 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.393+0000 7f583e2b7700 1 Processor -- start 2026-03-10T14:03:48.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.393+0000 7f583e2b7700 1 -- start start 2026-03-10T14:03:48.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.393+0000 7f583e2b7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 0x7f583810e4a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:48.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.393+0000 7f583e2b7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f583810e9e0 con 0x7f5838101210 2026-03-10T14:03:48.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f5837fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 0x7f583810e4a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:48.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f5837fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 0x7f583810e4a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:39040/0 (socket says 192.168.123.104:39040) 2026-03-10T14:03:48.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f5837fff700 1 -- 192.168.123.104:0/3363101007 learned_addr learned my addr 192.168.123.104:0/3363101007 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:48.392 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f5837fff700 1 -- 192.168.123.104:0/3363101007 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58280097e0 con 0x7f5838101210 2026-03-10T14:03:48.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f5837fff700 1 --2- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 0x7f583810e4a0 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f5828006010 tx=0x7f5828004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:48.392 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f58357fa700 1 -- 192.168.123.104:0/3363101007 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f582801c070 con 0x7f5838101210 2026-03-10T14:03:48.393 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f583810ebe0 con 0x7f5838101210 2026-03-10T14:03:48.393 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f58357fa700 1 -- 192.168.123.104:0/3363101007 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5828021470 con 0x7f5838101210 2026-03-10T14:03:48.393 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f58357fa700 1 -- 192.168.123.104:0/3363101007 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f582800f460 con 0x7f5838101210 2026-03-10T14:03:48.393 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.394+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f583810f000 con 
0x7f5838101210 2026-03-10T14:03:48.395 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.397+0000 7f58357fa700 1 -- 192.168.123.104:0/3363101007 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f58280215e0 con 0x7f5838101210 2026-03-10T14:03:48.395 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.397+0000 7f58357fa700 1 --2- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5820038600 0x7f582003aab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:48.396 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.397+0000 7f58357fa700 1 -- 192.168.123.104:0/3363101007 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f582804c620 con 0x7f5838101210 2026-03-10T14:03:48.396 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.397+0000 7f58377fe700 1 --2- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5820038600 0x7f582003aab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:48.396 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.398+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f583810f2b0 con 0x7f5838101210 2026-03-10T14:03:48.396 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.398+0000 7f58377fe700 1 --2- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5820038600 0x7f582003aab0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f582c006fd0 tx=0x7f582c006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:48.399 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.401+0000 7f58357fa700 1 -- 192.168.123.104:0/3363101007 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5828026030 con 0x7f5838101210 2026-03-10T14:03:48.591 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.592+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f5838062380 con 0x7f5838101210 2026-03-10T14:03:48.591 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.593+0000 7f58357fa700 1 -- 192.168.123.104:0/3363101007 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f582802aa30 con 0x7f5838101210 2026-03-10T14:03:48.593 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:48.593 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:48.594 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.596+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5820038600 msgr2=0x7f582003aab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:48.594 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.596+0000 7f583e2b7700 1 --2- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5820038600 0x7f582003aab0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f582c006fd0 tx=0x7f582c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:48.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.596+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 msgr2=0x7f583810e4a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:48.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.597+0000 7f583e2b7700 1 --2- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 0x7f583810e4a0 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f5828006010 tx=0x7f5828004dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:48.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.597+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 shutdown_connections 2026-03-10T14:03:48.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.597+0000 7f583e2b7700 1 --2- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5820038600 0x7f582003aab0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:48.595 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.597+0000 7f583e2b7700 1 --2- 192.168.123.104:0/3363101007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5838101210 0x7f583810e4a0 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:48.595 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.597+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 >> 192.168.123.104:0/3363101007 conn(0x7f58380faa70 msgr2=0x7f5838104860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:48.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.597+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 shutdown_connections 2026-03-10T14:03:48.596 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:48.597+0000 7f583e2b7700 1 -- 192.168.123.104:0/3363101007 wait complete. 2026-03-10T14:03:48.597 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: Deploying daemon ceph-exporter.vm04 on vm04 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": 
["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:48 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/3363101007' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:49.685 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:49.685 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:49.813 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.044+0000 7fd87a697700 1 -- 192.168.123.104:0/1841034825 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 msgr2=0x7fd874102870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.044+0000 7fd87a697700 1 --2- 192.168.123.104:0/1841034825 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 0x7fd874102870 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7fd864009b00 tx=0x7fd864009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.045+0000 7fd87a697700 1 -- 192.168.123.104:0/1841034825 shutdown_connections 
2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.045+0000 7fd87a697700 1 --2- 192.168.123.104:0/1841034825 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 0x7fd874102870 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.045+0000 7fd87a697700 1 -- 192.168.123.104:0/1841034825 >> 192.168.123.104:0/1841034825 conn(0x7fd8740fdd80 msgr2=0x7fd874100190 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.045+0000 7fd87a697700 1 -- 192.168.123.104:0/1841034825 shutdown_connections 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.045+0000 7fd87a697700 1 -- 192.168.123.104:0/1841034825 wait complete. 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.046+0000 7fd87a697700 1 Processor -- start 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.046+0000 7fd87a697700 1 -- start start 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.046+0000 7fd87a697700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 0x7fd87419d4f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:50.044 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.046+0000 7fd87a697700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd87419da30 con 0x7fd8741024a0 2026-03-10T14:03:50.045 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.046+0000 7fd873fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 0x7fd87419d4f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T14:03:50.045 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.047+0000 7fd873fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 0x7fd87419d4f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:39066/0 (socket says 192.168.123.104:39066) 2026-03-10T14:03:50.045 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.047+0000 7fd873fff700 1 -- 192.168.123.104:0/3971954934 learned_addr learned my addr 192.168.123.104:0/3971954934 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:50.045 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.047+0000 7fd873fff700 1 -- 192.168.123.104:0/3971954934 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8640097e0 con 0x7fd8741024a0 2026-03-10T14:03:50.045 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.047+0000 7fd873fff700 1 --2- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 0x7fd87419d4f0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fd864005b40 tx=0x7fd864004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:50.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.047+0000 7fd871ffb700 1 -- 192.168.123.104:0/3971954934 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd86401c070 con 0x7fd8741024a0 2026-03-10T14:03:50.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.048+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd87419dc30 con 0x7fd8741024a0 2026-03-10T14:03:50.046 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.048+0000 
7fd87a697700 1 -- 192.168.123.104:0/3971954934 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd87419e050 con 0x7fd8741024a0 2026-03-10T14:03:50.047 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.048+0000 7fd871ffb700 1 -- 192.168.123.104:0/3971954934 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd8640056f0 con 0x7fd8741024a0 2026-03-10T14:03:50.047 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.048+0000 7fd871ffb700 1 -- 192.168.123.104:0/3971954934 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd864021e80 con 0x7fd8741024a0 2026-03-10T14:03:50.047 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.049+0000 7fd871ffb700 1 -- 192.168.123.104:0/3971954934 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7fd86400f510 con 0x7fd8741024a0 2026-03-10T14:03:50.048 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.049+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd87404f9e0 con 0x7fd8741024a0 2026-03-10T14:03:50.048 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.049+0000 7fd871ffb700 1 --2- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd8540385b0 0x7fd85403aa60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:50.048 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.049+0000 7fd871ffb700 1 -- 192.168.123.104:0/3971954934 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fd86404c400 con 0x7fd8741024a0 2026-03-10T14:03:50.048 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.049+0000 7fd86bfff700 1 --2- 
192.168.123.104:0/3971954934 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd8540385b0 0x7fd85403aa60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:50.048 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.050+0000 7fd86bfff700 1 --2- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd8540385b0 0x7fd85403aa60 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fd85c006fd0 tx=0x7fd85c006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:50.050 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.052+0000 7fd871ffb700 1 -- 192.168.123.104:0/3971954934 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd864029950 con 0x7fd8741024a0 2026-03-10T14:03:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:49 vm03 ceph-mon[49718]: Deploying daemon crash.vm04 on vm04 2026-03-10T14:03:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:49 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:49 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:49 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:49 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:50.192 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:50.192 
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:50.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.191+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fd87410a100 con 0x7fd8741024a0 2026-03-10T14:03:50.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.192+0000 7fd871ffb700 1 -- 192.168.123.104:0/3971954934 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fd864026070 con 0x7fd8741024a0 2026-03-10T14:03:50.192 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.194+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd8540385b0 msgr2=0x7fd85403aa60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:50.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.194+0000 7fd87a697700 1 --2- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd8540385b0 0x7fd85403aa60 secure :-1 s=READY pgs=9 cs=0 l=1 
rev1=1 crypto rx=0x7fd85c006fd0 tx=0x7fd85c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:50.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.194+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 msgr2=0x7fd87419d4f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:50.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.194+0000 7fd87a697700 1 --2- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 0x7fd87419d4f0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fd864005b40 tx=0x7fd864004dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:50.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.194+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 shutdown_connections 2026-03-10T14:03:50.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.194+0000 7fd87a697700 1 --2- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd8540385b0 0x7fd85403aa60 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:50.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.194+0000 7fd87a697700 1 --2- 192.168.123.104:0/3971954934 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8741024a0 0x7fd87419d4f0 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:50.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.194+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 >> 192.168.123.104:0/3971954934 conn(0x7fd8740fdd80 msgr2=0x7fd874106680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:50.193 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.195+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 shutdown_connections 2026-03-10T14:03:50.193 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:50.195+0000 7fd87a697700 1 -- 192.168.123.104:0/3971954934 wait complete. 2026-03-10T14:03:50.194 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:51.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:50 vm03 ceph-mon[49718]: Deploying daemon node-exporter.vm04 on vm04 2026-03-10T14:03:51.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:50 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/3971954934' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:51.231 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:51.231 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:51.390 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:03:51.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.655+0000 7f5aba25c700 1 -- 192.168.123.104:0/3495932215 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 msgr2=0x7f5ab410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:51.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.655+0000 7f5aba25c700 1 --2- 192.168.123.104:0/3495932215 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 0x7f5ab410edb0 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f5aa4009b00 tx=0x7f5aa4009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:51.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.656+0000 7f5aba25c700 1 -- 192.168.123.104:0/3495932215 shutdown_connections 2026-03-10T14:03:51.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.656+0000 
7f5aba25c700 1 --2- 192.168.123.104:0/3495932215 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 0x7f5ab410edb0 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:51.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.656+0000 7f5aba25c700 1 -- 192.168.123.104:0/3495932215 >> 192.168.123.104:0/3495932215 conn(0x7f5ab406c5e0 msgr2=0x7f5ab406c9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:51.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.656+0000 7f5aba25c700 1 -- 192.168.123.104:0/3495932215 shutdown_connections 2026-03-10T14:03:51.655 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.657+0000 7f5aba25c700 1 -- 192.168.123.104:0/3495932215 wait complete. 2026-03-10T14:03:51.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.657+0000 7f5aba25c700 1 Processor -- start 2026-03-10T14:03:51.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.657+0000 7f5aba25c700 1 -- start start 2026-03-10T14:03:51.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.658+0000 7f5aba25c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 0x7f5ab4114f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:51.656 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.658+0000 7f5aba25c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5aa4012070 con 0x7f5ab406d9f0 2026-03-10T14:03:51.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.658+0000 7f5ab37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 0x7f5ab4114f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:51.657 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.658+0000 7f5ab37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 0x7f5ab4114f40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:39090/0 (socket says 192.168.123.104:39090) 2026-03-10T14:03:51.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.658+0000 7f5ab37fe700 1 -- 192.168.123.104:0/3049974839 learned_addr learned my addr 192.168.123.104:0/3049974839 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:51.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.658+0000 7f5ab37fe700 1 -- 192.168.123.104:0/3049974839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5aa40097e0 con 0x7f5ab406d9f0 2026-03-10T14:03:51.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.658+0000 7f5ab37fe700 1 --2- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 0x7f5ab4114f40 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f5aa4005e00 tx=0x7f5aa400bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:51.657 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.659+0000 7f5ab17fa700 1 -- 192.168.123.104:0/3049974839 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5aa401d070 con 0x7f5ab406d9f0 2026-03-10T14:03:51.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.659+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ab4115480 con 0x7f5ab406d9f0 2026-03-10T14:03:51.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.659+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ab4115920 con 0x7f5ab406d9f0 2026-03-10T14:03:51.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.659+0000 7f5ab17fa700 1 -- 192.168.123.104:0/3049974839 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5aa4003e80 con 0x7f5ab406d9f0 2026-03-10T14:03:51.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.659+0000 7f5ab17fa700 1 -- 192.168.123.104:0/3049974839 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5aa4017440 con 0x7f5ab406d9f0 2026-03-10T14:03:51.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.660+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5a94005320 con 0x7f5ab406d9f0 2026-03-10T14:03:51.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.660+0000 7f5ab17fa700 1 -- 192.168.123.104:0/3049974839 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f5aa40039a0 con 0x7f5ab406d9f0 2026-03-10T14:03:51.658 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.660+0000 7f5ab17fa700 1 --2- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5aa00385f0 0x7f5aa003aaa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:51.659 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.660+0000 7f5ab17fa700 1 -- 192.168.123.104:0/3049974839 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f5aa404c110 con 0x7f5ab406d9f0 2026-03-10T14:03:51.659 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.660+0000 7f5aabfff700 1 --2- 192.168.123.104:0/3049974839 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5aa00385f0 0x7f5aa003aaa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:51.659 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.661+0000 7f5aabfff700 1 --2- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5aa00385f0 0x7f5aa003aaa0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f5a9c006fd0 tx=0x7f5a9c006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:51.661 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.663+0000 7f5ab17fa700 1 -- 192.168.123.104:0/3049974839 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5aa4015350 con 0x7f5ab406d9f0 2026-03-10T14:03:51.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.810+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f5a94005190 con 0x7f5ab406d9f0 2026-03-10T14:03:51.813 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.812+0000 7f5ab17fa700 1 -- 192.168.123.104:0/3049974839 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f5aa4026090 con 0x7f5ab406d9f0 2026-03-10T14:03:51.813 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:51.813 INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:51.824 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5aa00385f0 msgr2=0x7f5aa003aaa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:51.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 --2- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5aa00385f0 0x7f5aa003aaa0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f5a9c006fd0 tx=0x7f5a9c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:51.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 msgr2=0x7f5ab4114f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:51.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 --2- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 0x7f5ab4114f40 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f5aa4005e00 tx=0x7f5aa400bba0 comp rx=0 tx=0).stop 2026-03-10T14:03:51.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 shutdown_connections 2026-03-10T14:03:51.825 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 --2- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5aa00385f0 0x7f5aa003aaa0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:51.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 --2- 192.168.123.104:0/3049974839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ab406d9f0 0x7f5ab4114f40 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:51.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 >> 192.168.123.104:0/3049974839 conn(0x7f5ab406c5e0 msgr2=0x7f5ab410e6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:51.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 shutdown_connections 2026-03-10T14:03:51.825 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:51.824+0000 7f5aba25c700 1 -- 192.168.123.104:0/3049974839 wait complete. 2026-03-10T14:03:51.829 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:52.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:51 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/3049974839' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:52.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:51 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:52.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:51 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:52.878 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T14:03:52.879 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:53.096 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: Deploying daemon mgr.vm04.ywwcto on vm04 2026-03-10T14:03:53.358 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:03:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:52 vm03 ceph-mon[49718]: Deploying daemon mon.vm04 on vm04 2026-03-10T14:03:53.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.464+0000 7f4e43273700 1 -- 192.168.123.104:0/276313048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 msgr2=0x7f4e3c10edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:53.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.464+0000 7f4e43273700 1 --2- 192.168.123.104:0/276313048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 0x7f4e3c10edb0 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f4e38009b00 tx=0x7f4e38009e10 comp rx=0 tx=0).stop 2026-03-10T14:03:53.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.465+0000 7f4e43273700 1 -- 
192.168.123.104:0/276313048 shutdown_connections 2026-03-10T14:03:53.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.465+0000 7f4e43273700 1 --2- 192.168.123.104:0/276313048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 0x7f4e3c10edb0 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:53.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.465+0000 7f4e43273700 1 -- 192.168.123.104:0/276313048 >> 192.168.123.104:0/276313048 conn(0x7f4e3c06c410 msgr2=0x7f4e3c06c810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:53.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.465+0000 7f4e43273700 1 -- 192.168.123.104:0/276313048 shutdown_connections 2026-03-10T14:03:53.464 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.466+0000 7f4e43273700 1 -- 192.168.123.104:0/276313048 wait complete. 2026-03-10T14:03:53.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.466+0000 7f4e43273700 1 Processor -- start 2026-03-10T14:03:53.466 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.468+0000 7f4e43273700 1 -- start start 2026-03-10T14:03:53.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.468+0000 7f4e43273700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 0x7f4e3c114f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:53.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.468+0000 7f4e43273700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e3c117b50 con 0x7f4e3c072730 2026-03-10T14:03:53.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.468+0000 7f4e42271700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 0x7f4e3c114f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:53.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.468+0000 7f4e42271700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 0x7f4e3c114f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:56926/0 (socket says 192.168.123.104:56926) 2026-03-10T14:03:53.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.468+0000 7f4e42271700 1 -- 192.168.123.104:0/2830216005 learned_addr learned my addr 192.168.123.104:0/2830216005 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:53.467 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.469+0000 7f4e42271700 1 -- 192.168.123.104:0/2830216005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4e380097e0 con 0x7f4e3c072730 2026-03-10T14:03:53.468 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.469+0000 7f4e42271700 1 --2- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 0x7f4e3c114f10 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f4e38006010 tx=0x7f4e38004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:53.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.469+0000 7f4e337fe700 1 -- 192.168.123.104:0/2830216005 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4e3801c070 con 0x7f4e3c072730 2026-03-10T14:03:53.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.469+0000 7f4e337fe700 1 -- 192.168.123.104:0/2830216005 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4e38021470 con 0x7f4e3c072730 2026-03-10T14:03:53.469 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.469+0000 7f4e337fe700 1 -- 192.168.123.104:0/2830216005 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4e3800f460 con 0x7f4e3c072730 2026-03-10T14:03:53.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.470+0000 7f4e43273700 1 -- 192.168.123.104:0/2830216005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4e3c1154b0 con 0x7f4e3c072730 2026-03-10T14:03:53.469 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.470+0000 7f4e43273700 1 -- 192.168.123.104:0/2830216005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4e3c115930 con 0x7f4e3c072730 2026-03-10T14:03:53.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.471+0000 7f4e337fe700 1 -- 192.168.123.104:0/2830216005 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f4e380215e0 con 0x7f4e3c072730 2026-03-10T14:03:53.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.471+0000 7f4e337fe700 1 --2- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e28038280 0x7f4e2803a730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:53.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.471+0000 7f4e337fe700 1 -- 192.168.123.104:0/2830216005 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f4e3804c700 con 0x7f4e3c072730 2026-03-10T14:03:53.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.471+0000 7f4e41a70700 1 --2- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e28038280 0x7f4e2803a730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T14:03:53.470 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.472+0000 7f4e317fa700 1 -- 192.168.123.104:0/2830216005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4e240052f0 con 0x7f4e3c072730 2026-03-10T14:03:53.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.474+0000 7f4e337fe700 1 -- 192.168.123.104:0/2830216005 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4e38026030 con 0x7f4e3c072730 2026-03-10T14:03:53.477 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.475+0000 7f4e41a70700 1 --2- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e28038280 0x7f4e2803a730 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f4e2c006fd0 tx=0x7f4e2c006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:53.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.742+0000 7f4e317fa700 1 -- 192.168.123.104:0/2830216005 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f4e240059c0 con 0x7f4e3c072730 2026-03-10T14:03:53.741 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.742+0000 7f4e337fe700 1 -- 192.168.123.104:0/2830216005 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f4e38005320 con 0x7f4e3c072730 2026-03-10T14:03:53.741 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:53.741 
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":1,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:02:36.685064Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T14:03:53.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.753+0000 7f4e317fa700 1 -- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e28038280 msgr2=0x7f4e2803a730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:53.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.753+0000 7f4e317fa700 1 --2- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e28038280 0x7f4e2803a730 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f4e2c006fd0 tx=0x7f4e2c006e40 comp rx=0 tx=0).stop 2026-03-10T14:03:53.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.753+0000 7f4e317fa700 1 -- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 msgr2=0x7f4e3c114f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:53.752 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.753+0000 7f4e317fa700 1 --2- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 0x7f4e3c114f10 secure :-1 s=READY pgs=158 cs=0 
l=1 rev1=1 crypto rx=0x7f4e38006010 tx=0x7f4e38004dc0 comp rx=0 tx=0).stop 2026-03-10T14:03:53.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.757+0000 7f4e317fa700 1 -- 192.168.123.104:0/2830216005 shutdown_connections 2026-03-10T14:03:53.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.757+0000 7f4e317fa700 1 --2- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4e28038280 0x7f4e2803a730 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:53.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.757+0000 7f4e317fa700 1 --2- 192.168.123.104:0/2830216005 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4e3c072730 0x7f4e3c114f10 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:53.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.757+0000 7f4e317fa700 1 -- 192.168.123.104:0/2830216005 >> 192.168.123.104:0/2830216005 conn(0x7f4e3c06c410 msgr2=0x7f4e3c10cea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:53.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.757+0000 7f4e317fa700 1 -- 192.168.123.104:0/2830216005 shutdown_connections 2026-03-10T14:03:53.757 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:53.757+0000 7f4e317fa700 1 -- 192.168.123.104:0/2830216005 wait complete. 2026-03-10T14:03:53.757 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 1 2026-03-10T14:03:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:54 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:54 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/2830216005' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:03:54.963 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T14:03:54.963 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mon dump -f json 2026-03-10T14:03:55.114 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm04/config 2026-03-10T14:03:59.560 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:03:59 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: mon.vm03 calling monitor election 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", 
"id": "vm04"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: mon.vm04 calling monitor election 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.? 192.168.123.104:0/1290177472' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/crt"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: mon.vm03 is new leader, mons vm03,vm04 in quorum (ranks 0,1) 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: monmap e2: 2 mons at {vm03=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0],vm04=[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0]} removed_ranks: {} 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: fsmap 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: osdmap e5: 0 total, 0 up, 0 in 
2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: mgrmap e17: vm03.rwbbep(active, since 16s) 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: Standby manager daemon vm04.ywwcto started 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.? 192.168.123.104:0/1290177472' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: overall HEALTH_OK 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.? 192.168.123.104:0/1290177472' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/key"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.? 
192.168.123.104:0/1290177472' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T14:03:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:03:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:03:59.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.797+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3647314504 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b900035c0 msgr2=0x7f0b90005a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:59.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.797+0000 7f0bacbd9700 1 --2- 192.168.123.104:0/3647314504 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b900035c0 0x7f0b90005a40 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f0ba806cef0 tx=0x7f0b9800fe60 comp rx=0 tx=0).stop 2026-03-10T14:03:59.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3647314504 shutdown_connections 2026-03-10T14:03:59.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 --2- 192.168.123.104:0/3647314504 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b900035c0 0x7f0b90005a40 secure :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f0ba806cef0 tx=0x7f0b9800fe60 comp rx=0 tx=0).stop 2026-03-10T14:03:59.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3647314504 >> 192.168.123.104:0/3647314504 conn(0x7f0ba806b290 msgr2=0x7f0ba806b690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:59.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3647314504 shutdown_connections 2026-03-10T14:03:59.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 -- 
192.168.123.104:0/3647314504 wait complete. 2026-03-10T14:03:59.797 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 Processor -- start 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 -- start start 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 0x7f0ba81b3320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ba81ad9b0 0x7f0ba81ade20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ba81ae360 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.798+0000 7f0bacbd9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ba81ae4a0 con 0x7f0ba81b2f50 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ba81ad9b0 0x7f0ba81ade20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ba81ad9b0 0x7f0ba81ade20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:56994/0 (socket says 192.168.123.104:56994) 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba6ffd700 1 -- 192.168.123.104:0/3596437771 learned_addr learned my addr 192.168.123.104:0/3596437771 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba77fe700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 0x7f0ba81b3320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba77fe700 1 -- 192.168.123.104:0/3596437771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 msgr2=0x7f0ba81b3320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 13 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba77fe700 1 -- 192.168.123.104:0/3596437771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 msgr2=0x7f0ba81b3320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba77fe700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 0x7f0ba81b3320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba77fe700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 0x7f0ba81b3320 unknown :-1 
s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba6ffd700 1 -- 192.168.123.104:0/3596437771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 msgr2=0x7f0ba81b3320 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba6ffd700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 0x7f0ba81b3320 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba6ffd700 1 -- 192.168.123.104:0/3596437771 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b9800f8b0 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba6ffd700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ba81ad9b0 0x7f0ba81ade20 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f0ba000bf40 tx=0x7f0ba000bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:59.798 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0ba4ff9700 1 -- 192.168.123.104:0/3596437771 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ba000ca30 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.799 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ba81ae720 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.799 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.799+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ba81b7d20 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.799 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.800+0000 7f0ba4ff9700 1 -- 192.168.123.104:0/3596437771 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0ba00092e0 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.799 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.800+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ba8110c20 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.799 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.800+0000 7f0ba4ff9700 1 -- 192.168.123.104:0/3596437771 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ba0007600 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.799 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.801+0000 7f0ba4ff9700 1 -- 192.168.123.104:0/3596437771 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f0ba0007760 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.800 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.802+0000 7f0ba4ff9700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9406c820 0x7f0b9406ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:03:59.800 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.802+0000 7f0ba4ff9700 1 -- 192.168.123.104:0/3596437771 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f0ba008aad0 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.802 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.804+0000 7f0ba77fe700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9406c820 0x7f0b9406ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:03:59.802 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.804+0000 7f0ba4ff9700 1 -- 192.168.123.104:0/3596437771 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0ba00563a0 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.804 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.806+0000 7f0ba77fe700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9406c820 0x7f0b9406ecd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f0b98005b40 tx=0x7f0b98005a90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:03:59.967 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.968+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f0ba804efc0 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.967 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.968+0000 7f0ba4ff9700 1 -- 192.168.123.104:0/3596437771 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1032 (secure 0 0 0) 0x7f0ba00599c0 con 0x7f0ba81ad9b0 2026-03-10T14:03:59.967 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:03:59.967 
INFO:teuthology.orchestra.run.vm04.stdout:{"epoch":2,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","modified":"2026-03-10T14:03:54.187855Z","created":"2026-03-10T14:02:36.685064Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm04","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:3300","nonce":0},{"type":"v1","addr":"192.168.123.104:6789","nonce":0}]},"addr":"192.168.123.104:6789/0","public_addr":"192.168.123.104:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]} 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.971+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9406c820 msgr2=0x7f0b9406ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.971+0000 7f0bacbd9700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9406c820 0x7f0b9406ecd0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f0b98005b40 tx=0x7f0b98005a90 comp rx=0 tx=0).stop 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.972+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ba81ad9b0 msgr2=0x7f0ba81ade20 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.972+0000 7f0bacbd9700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ba81ad9b0 0x7f0ba81ade20 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f0ba000bf40 tx=0x7f0ba000bf70 comp rx=0 tx=0).stop 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.972+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 shutdown_connections 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.972+0000 7f0bacbd9700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9406c820 0x7f0b9406ecd0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.972+0000 7f0bacbd9700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ba81b2f50 0x7f0ba81b3320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.972+0000 7f0bacbd9700 1 --2- 192.168.123.104:0/3596437771 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ba81ad9b0 0x7f0ba81ade20 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:03:59.970 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.972+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 >> 192.168.123.104:0/3596437771 conn(0x7f0ba806b290 msgr2=0x7f0ba806fa20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:03:59.971 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.972+0000 7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 shutdown_connections 2026-03-10T14:03:59.971 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:03:59.973+0000 
7f0bacbd9700 1 -- 192.168.123.104:0/3596437771 wait complete. 2026-03-10T14:03:59.972 INFO:teuthology.orchestra.run.vm04.stderr:dumped monmap epoch 2 2026-03-10T14:04:00.031 INFO:tasks.cephadm:Generating final ceph.conf file... 2026-03-10T14:04:00.031 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph config generate-minimal-conf 2026-03-10T14:04:00.208 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:04:00.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:00 vm04 ceph-mon[55966]: mgrmap e18: vm03.rwbbep(active, since 16s), standbys: vm04.ywwcto 2026-03-10T14:04:00.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:00 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm04.ywwcto", "id": "vm04.ywwcto"}]: dispatch 2026-03-10T14:04:00.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:00 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:00.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:00 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:00.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:00 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:00.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:00 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:00.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:00 vm04 ceph-mon[55966]: from='client.? 
192.168.123.104:0/3596437771' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:04:00.278 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:00 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:04:00.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:00 vm03 ceph-mon[49718]: mgrmap e18: vm03.rwbbep(active, since 16s), standbys: vm04.ywwcto 2026-03-10T14:04:00.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:00 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm04.ywwcto", "id": "vm04.ywwcto"}]: dispatch 2026-03-10T14:04:00.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:00 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:00.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:00 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:00.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:00 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:00.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:00 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:00.319 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:00 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/3596437771' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T14:04:00.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:00 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:04:00.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.492+0000 7f9182795700 1 -- 192.168.123.103:0/2527065161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 msgr2=0x7f917c10edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:00.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.492+0000 7f9182795700 1 --2- 192.168.123.103:0/2527065161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 0x7f917c10edb0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f9170009b00 tx=0x7f9170009e10 comp rx=0 tx=0).stop 2026-03-10T14:04:00.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.493+0000 7f9182795700 1 -- 192.168.123.103:0/2527065161 shutdown_connections 2026-03-10T14:04:00.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.493+0000 7f9182795700 1 --2- 192.168.123.103:0/2527065161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 0x7f917c10edb0 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:00.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.493+0000 7f9182795700 1 -- 192.168.123.103:0/2527065161 >> 192.168.123.103:0/2527065161 conn(0x7f917c06c410 msgr2=0x7f917c06c810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:00.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.493+0000 7f9182795700 1 -- 192.168.123.103:0/2527065161 shutdown_connections 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.493+0000 7f9182795700 1 -- 
192.168.123.103:0/2527065161 wait complete. 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9182795700 1 Processor -- start 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9182795700 1 -- start start 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9182795700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 0x7f917c115310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9182795700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f917c115850 0x7f917c1b2890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9182795700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f917c115e10 con 0x7f917c072730 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9182795700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f917c1b2dd0 con 0x7f917c115850 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9181793700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 0x7f917c115310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9181793700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 0x7f917c115310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:50386/0 (socket says 192.168.123.103:50386) 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9181793700 1 -- 192.168.123.103:0/4023503833 learned_addr learned my addr 192.168.123.103:0/4023503833 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9180f92700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f917c115850 0x7f917c1b2890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9181793700 1 -- 192.168.123.103:0/4023503833 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f917c115850 msgr2=0x7f917c1b2890 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9181793700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f917c115850 0x7f917c1b2890 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:00.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.494+0000 7f9181793700 1 -- 192.168.123.103:0/4023503833 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91700097e0 con 0x7f917c072730 2026-03-10T14:04:00.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.495+0000 7f9181793700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 0x7f917c115310 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f917000b5c0 tx=0x7f917000bef0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:00.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.495+0000 7f916e7fc700 1 -- 192.168.123.103:0/4023503833 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f917001d070 con 0x7f917c072730 2026-03-10T14:04:00.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.495+0000 7f916e7fc700 1 -- 192.168.123.103:0/4023503833 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9170005430 con 0x7f917c072730 2026-03-10T14:04:00.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.495+0000 7f916e7fc700 1 -- 192.168.123.103:0/4023503833 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f917000f650 con 0x7f917c072730 2026-03-10T14:04:00.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.495+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f917c1b3050 con 0x7f917c072730 2026-03-10T14:04:00.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.495+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f917c1b3540 con 0x7f917c072730 2026-03-10T14:04:00.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.495+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f917c115120 con 0x7f917c072730 2026-03-10T14:04:00.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.499+0000 7f916e7fc700 1 -- 192.168.123.103:0/4023503833 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f91700055a0 con 0x7f917c072730 2026-03-10T14:04:00.499 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.499+0000 7f916e7fc700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f916806c760 0x7f916806ec10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:00.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.499+0000 7f916e7fc700 1 -- 192.168.123.103:0/4023503833 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f917008c8e0 con 0x7f917c072730 2026-03-10T14:04:00.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.499+0000 7f916e7fc700 1 -- 192.168.123.103:0/4023503833 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f91700b87c0 con 0x7f917c072730 2026-03-10T14:04:00.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.500+0000 7f9180f92700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f916806c760 0x7f916806ec10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:00.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.500+0000 7f9180f92700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f916806c760 0x7f916806ec10 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f9178009850 tx=0x7f9178008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:00.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.618+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f917c04efc0 con 0x7f917c072730 
2026-03-10T14:04:00.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.618+0000 7f916e7fc700 1 -- 192.168.123.103:0/4023503833 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7f9170027070 con 0x7f917c072730 2026-03-10T14:04:00.617 INFO:teuthology.orchestra.run.vm03.stdout:# minimal ceph.conf for b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:04:00.617 INFO:teuthology.orchestra.run.vm03.stdout:[global] 2026-03-10T14:04:00.617 INFO:teuthology.orchestra.run.vm03.stdout: fsid = b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:04:00.617 INFO:teuthology.orchestra.run.vm03.stdout: mon_host = [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 2026-03-10T14:04:00.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.620+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f916806c760 msgr2=0x7f916806ec10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.620+0000 7f9182795700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f916806c760 0x7f916806ec10 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f9178009850 tx=0x7f9178008040 comp rx=0 tx=0).stop 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.620+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 msgr2=0x7f917c115310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.620+0000 7f9182795700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 0x7f917c115310 secure 
:-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f917000b5c0 tx=0x7f917000bef0 comp rx=0 tx=0).stop 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.621+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 shutdown_connections 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.621+0000 7f9182795700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f917c072730 0x7f917c115310 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.621+0000 7f9182795700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f916806c760 0x7f916806ec10 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.621+0000 7f9182795700 1 --2- 192.168.123.103:0/4023503833 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f917c115850 0x7f917c1b2890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.621+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 >> 192.168.123.103:0/4023503833 conn(0x7f917c06c410 msgr2=0x7f917c10df20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.621+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 shutdown_connections 2026-03-10T14:04:00.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:00.621+0000 7f9182795700 1 -- 192.168.123.103:0/4023503833 wait complete. 2026-03-10T14:04:00.682 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring... 
2026-03-10T14:04:00.682 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-10T14:04:00.682 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.conf
2026-03-10T14:04:00.709 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-10T14:04:00.709 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:04:00.773 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T14:04:00.773 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/ceph/ceph.conf
2026-03-10T14:04:00.801 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T14:04:00.801 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:04:00.866 INFO:tasks.cephadm:Deploying OSDs...
2026-03-10T14:04:00.866 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-10T14:04:00.866 DEBUG:teuthology.orchestra.run.vm03:> dd if=/scratch_devs of=/dev/stdout
2026-03-10T14:04:00.883 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T14:04:00.883 DEBUG:teuthology.orchestra.run.vm03:> ls /dev/[sv]d?
2026-03-10T14:04:00.952 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vda
2026-03-10T14:04:00.952 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdb
2026-03-10T14:04:00.952 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdc
2026-03-10T14:04:00.952 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdd
2026-03-10T14:04:00.952 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vde
2026-03-10T14:04:00.952 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-10T14:04:00.952 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-10T14:04:00.952 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdb
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdb
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 254 Links: 1 Device type: fc,10
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-10 14:03:12.023630731 +0000
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-10 14:03:11.891630548 +0000
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-10 14:03:11.891630548 +0000
2026-03-10T14:04:01.032 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-10 13:58:12.271000000 +0000
2026-03-10T14:04:01.033 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-10T14:04:01.102 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in
2026-03-10T14:04:01.102 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out
2026-03-10T14:04:01.103 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000100439 s, 5.1 MB/s
2026-03-10T14:04:01.104 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-10T14:04:01.168 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdc
2026-03-10T14:04:01.226 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdc
2026-03-10T14:04:01.227 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T14:04:01.227 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20
2026-03-10T14:04:01.227 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T14:04:01.227 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T14:04:01.227 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-10 14:03:12.085630817 +0000
2026-03-10T14:04:01.227 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-10 14:03:11.891630548 +0000
2026-03-10T14:04:01.227 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-10 14:03:11.891630548 +0000
2026-03-10T14:04:01.227 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-10 13:58:12.275000000 +0000
2026-03-10T14:04:01.227 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-10T14:04:01.302 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in
2026-03-10T14:04:01.302 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out
2026-03-10T14:04:01.302 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000116398 s, 4.4 MB/s
2026-03-10T14:04:01.303 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-10T14:04:01.383 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdd
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdd
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-10 14:03:12.156630916 +0000
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-10 14:03:11.903630565 +0000
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-10 14:03:11.903630565 +0000
2026-03-10T14:04:01.444 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-10 13:58:12.280000000 +0000
2026-03-10T14:04:01.445 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.conf
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/4023503833' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: Reconfiguring mon.vm03 (unknown last config time)...
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.rwbbep", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T14:04:01.514 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:01.516 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in
2026-03-10T14:04:01.516 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out
2026-03-10T14:04:01.516 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 9.8133e-05 s, 5.2 MB/s
2026-03-10T14:04:01.517 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-10T14:04:01.575 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vde
2026-03-10T14:04:01.643 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vde
2026-03-10T14:04:01.643 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T14:04:01.643 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40
2026-03-10T14:04:01.643 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T14:04:01.643 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T14:04:01.643 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-10 14:03:12.217631001 +0000
2026-03-10T14:04:01.643 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-10 14:03:11.902630563 +0000
2026-03-10T14:04:01.643 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-10 14:03:11.902630563 +0000
2026-03-10T14:04:01.644 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-10 13:58:12.285000000 +0000
2026-03-10T14:04:01.644 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-10T14:04:01.735 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in
2026-03-10T14:04:01.735 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out
2026-03-10T14:04:01.735 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000112651 s, 4.5 MB/s
2026-03-10T14:04:01.736 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-10T14:04:01.758 DEBUG:teuthology.orchestra.run.vm04:> set -ex
2026-03-10T14:04:01.758 DEBUG:teuthology.orchestra.run.vm04:> dd if=/scratch_devs of=/dev/stdout
2026-03-10T14:04:01.772 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T14:04:01.772 DEBUG:teuthology.orchestra.run.vm04:> ls /dev/[sv]d?
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: Updating vm03:/etc/ceph/ceph.conf
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/4023503833' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: Reconfiguring mon.vm03 (unknown last config time)...
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.rwbbep", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:04:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T14:04:01.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:01.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:01.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T14:04:01.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:01.827 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vda
2026-03-10T14:04:01.827 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vdb
2026-03-10T14:04:01.827 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vdc
2026-03-10T14:04:01.827 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vdd
2026-03-10T14:04:01.827 INFO:teuthology.orchestra.run.vm04.stdout:/dev/vde
2026-03-10T14:04:01.827 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-10T14:04:01.827 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-10T14:04:01.827 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vdb
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vdb
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 223 Links: 1 Device type: fc,10
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-10 14:03:45.306609024 +0000
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 14:03:45.211608909 +0000
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 14:03:45.211608909 +0000
2026-03-10T14:04:01.884 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-10 13:57:41.306000000 +0000
2026-03-10T14:04:01.884 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-10T14:04:01.946 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-10T14:04:01.946 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-10T14:04:01.946 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000179727 s, 2.8 MB/s
2026-03-10T14:04:01.947 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-10T14:04:02.003 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vdc
2026-03-10T14:04:02.059 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vdc
2026-03-10T14:04:02.059 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T14:04:02.059 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20
2026-03-10T14:04:02.059 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T14:04:02.059 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T14:04:02.059 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-10 14:03:45.377609109 +0000
2026-03-10T14:04:02.060 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 14:03:45.221608921 +0000
2026-03-10T14:04:02.060 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 14:03:45.221608921 +0000
2026-03-10T14:04:02.060 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-10 13:57:41.321000000 +0000
2026-03-10T14:04:02.060 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-10T14:04:02.123 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-10T14:04:02.123 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-10T14:04:02.123 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000161232 s, 3.2 MB/s
2026-03-10T14:04:02.124 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-10T14:04:02.183 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vdd
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vdd
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-10 14:03:45.450609197 +0000
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 14:03:45.218608917 +0000
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 14:03:45.218608917 +0000
2026-03-10T14:04:02.243 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-10 13:57:41.332000000 +0000
2026-03-10T14:04:02.243 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-10T14:04:02.308 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-10T14:04:02.308 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-10T14:04:02.308 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000171351 s, 3.0 MB/s
2026-03-10T14:04:02.309 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-10T14:04:02.365 DEBUG:teuthology.orchestra.run.vm04:> stat /dev/vde
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout: File: /dev/vde
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout:Access: 2026-03-10 14:03:45.511609271 +0000
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 14:03:45.222608922 +0000
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 14:03:45.222608922 +0000
2026-03-10T14:04:02.421 INFO:teuthology.orchestra.run.vm04.stdout: Birth: 2026-03-10 13:57:41.372000000 +0000
2026-03-10T14:04:02.421 DEBUG:teuthology.orchestra.run.vm04:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-10T14:04:02.482 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records in
2026-03-10T14:04:02.483 INFO:teuthology.orchestra.run.vm04.stderr:1+0 records out
2026-03-10T14:04:02.483 INFO:teuthology.orchestra.run.vm04.stderr:512 bytes copied, 0.000179496 s, 2.9 MB/s
2026-03-10T14:04:02.483 DEBUG:teuthology.orchestra.run.vm04:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-10T14:04:02.542 INFO:tasks.cephadm:Deploying osd.0 on vm03 with /dev/vde...
2026-03-10T14:04:02.542 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- lvm zap /dev/vde
2026-03-10T14:04:02.923 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config
2026-03-10T14:04:03.017 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: Reconfiguring daemon mon.vm03 on vm03
2026-03-10T14:04:03.017 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: Reconfiguring mgr.vm03.rwbbep (unknown last config time)...
2026-03-10T14:04:03.017 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: Reconfiguring daemon mgr.vm03.rwbbep on vm03
2026-03-10T14:04:03.017 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: Reconfiguring ceph-exporter.vm03 (monmap changed)...
2026-03-10T14:04:03.017 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: Reconfiguring daemon ceph-exporter.vm03 on vm03
2026-03-10T14:04:03.017 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.017 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.018 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T14:04:03.018 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:03.018 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.018 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: Reconfiguring daemon mon.vm03 on vm03
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: Reconfiguring mgr.vm03.rwbbep (unknown last config time)...
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: Reconfiguring daemon mgr.vm03.rwbbep on vm03
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: Reconfiguring ceph-exporter.vm03 (monmap changed)...
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: Reconfiguring daemon ceph-exporter.vm03 on vm03
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T14:04:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:04:03.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.750 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:04:03.761 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch daemon add osd vm03:/dev/vde
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: Reconfiguring crash.vm03 (monmap changed)...
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: Reconfiguring daemon crash.vm03 on vm03
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: Reconfiguring alertmanager.vm03 (dependencies changed)...
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: Reconfiguring daemon alertmanager.vm03 on vm03
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: Reconfiguring grafana.vm03 (dependencies changed)...
2026-03-10T14:04:03.945 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:03 vm03 ceph-mon[49718]: Reconfiguring daemon grafana.vm03 on vm03
2026-03-10T14:04:03.971 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: Reconfiguring crash.vm03 (monmap changed)...
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: Reconfiguring daemon crash.vm03 on vm03
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: Reconfiguring alertmanager.vm03 (dependencies changed)...
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: Reconfiguring daemon alertmanager.vm03 on vm03
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: Reconfiguring grafana.vm03 (dependencies changed)...
2026-03-10T14:04:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:03 vm04 ceph-mon[55966]: Reconfiguring daemon grafana.vm03 on vm03
2026-03-10T14:04:04.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.275+0000 7fddbc68b700 1 -- 192.168.123.103:0/4286377637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb40ff450 msgr2=0x7fddb40ff860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:04:04.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.275+0000 7fddbc68b700 1 --2- 192.168.123.103:0/4286377637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb40ff450 0x7fddb40ff860 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7fddb0009b00 tx=0x7fddb0009e10 comp rx=0 tx=0).stop
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.277+0000 7fddbc68b700 1 -- 192.168.123.103:0/4286377637 shutdown_connections
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.277+0000 7fddbc68b700 1 --2- 192.168.123.103:0/4286377637 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fddb4100650 0x7fddb4100aa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.277+0000 7fddbc68b700 1 --2- 192.168.123.103:0/4286377637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb40ff450 0x7fddb40ff860 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.277+0000 7fddbc68b700 1 -- 192.168.123.103:0/4286377637 >> 192.168.123.103:0/4286377637 conn(0x7fddb40fa9e0 msgr2=0x7fddb40fce30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.277+0000 7fddbc68b700 1 -- 192.168.123.103:0/4286377637 shutdown_connections
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.277+0000 7fddbc68b700 1 -- 192.168.123.103:0/4286377637 wait complete.
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.277+0000 7fddbc68b700 1 Processor -- start
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.277+0000 7fddbc68b700 1 -- start start
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.278+0000 7fddbc68b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fddb40ff450 0x7fddb4106660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.278+0000 7fddbc68b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb4100650 0x7fddb4106ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.278+0000 7fddbc68b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fddb41071c0 con 0x7fddb4100650
2026-03-10T14:04:04.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.278+0000 7fddbc68b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fddb4107300 con 0x7fddb40ff450
2026-03-10T14:04:04.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.279+0000 7fddb9c26700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb4100650 0x7fddb4106ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:04:04.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.279+0000 7fddb9c26700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb4100650 0x7fddb4106ba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36128/0 (socket says 192.168.123.103:36128)
2026-03-10T14:04:04.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.279+0000 7fddb9c26700 1 -- 192.168.123.103:0/2931927270 learned_addr learned my addr 192.168.123.103:0/2931927270 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:04:04.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.279+0000 7fddba427700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fddb40ff450 0x7fddb4106660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:04:04.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.279+0000 7fddb9c26700 1 -- 192.168.123.103:0/2931927270 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fddb40ff450 msgr2=0x7fddb4106660 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:04:04.278
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.279+0000 7fddb9c26700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fddb40ff450 0x7fddb4106660 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:04.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.279+0000 7fddb9c26700 1 -- 192.168.123.103:0/2931927270 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fddb00097e0 con 0x7fddb4100650 2026-03-10T14:04:04.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.279+0000 7fddb9c26700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb4100650 0x7fddb4106ba0 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7fdda400d8d0 tx=0x7fdda400dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:04.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.280+0000 7fddab7fe700 1 -- 192.168.123.103:0/2931927270 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdda4009880 con 0x7fddb4100650 2026-03-10T14:04:04.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.280+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fddb410bd60 con 0x7fddb4100650 2026-03-10T14:04:04.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.280+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fddb410c2b0 con 0x7fddb4100650 2026-03-10T14:04:04.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.280+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fddb4071700 con 0x7fddb4100650 2026-03-10T14:04:04.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.280+0000 7fddab7fe700 1 -- 192.168.123.103:0/2931927270 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdda400b6e0 con 0x7fddb4100650 2026-03-10T14:04:04.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.280+0000 7fddab7fe700 1 -- 192.168.123.103:0/2931927270 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdda4009880 con 0x7fddb4100650 2026-03-10T14:04:04.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.282+0000 7fddab7fe700 1 -- 192.168.123.103:0/2931927270 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7fdda4010880 con 0x7fddb4100650 2026-03-10T14:04:04.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.282+0000 7fddab7fe700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdda006c650 0x7fdda006eb00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:04.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.282+0000 7fddab7fe700 1 -- 192.168.123.103:0/2931927270 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fdda40599f0 con 0x7fddb4100650 2026-03-10T14:04:04.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.282+0000 7fddba427700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdda006c650 0x7fdda006eb00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:04.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.284+0000 7fddba427700 1 --2- 
192.168.123.103:0/2931927270 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdda006c650 0x7fdda006eb00 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fddb00052d0 tx=0x7fddb002c040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:04.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.284+0000 7fddab7fe700 1 -- 192.168.123.103:0/2931927270 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdda40564c0 con 0x7fddb4100650 2026-03-10T14:04:04.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:04.408+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7fddb4061190 con 0x7fdda006c650 2026-03-10T14:04:05.443 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:05 vm03 ceph-mon[49718]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:05.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:05.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:05.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:05.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", 
"entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:05.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:05.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:05 vm03 ceph-mon[49718]: Reconfiguring prometheus.vm03 (dependencies changed)... 2026-03-10T14:04:05.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:05 vm03 ceph-mon[49718]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-10T14:04:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:05 vm04 ceph-mon[55966]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:05 vm04 ceph-mon[55966]: Reconfiguring 
prometheus.vm03 (dependencies changed)... 2026-03-10T14:04:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:05 vm04 ceph-mon[55966]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-10T14:04:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:06 vm04 ceph-mon[55966]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:06 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/1511957598' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "32845076-88bb-4ae3-87fc-7369eed22d26"}]: dispatch 2026-03-10T14:04:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:06 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/1511957598' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "32845076-88bb-4ae3-87fc-7369eed22d26"}]': finished 2026-03-10T14:04:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:06 vm04 ceph-mon[55966]: osdmap e6: 1 total, 0 up, 1 in 2026-03-10T14:04:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:06 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:06.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:06 vm03 ceph-mon[49718]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:06 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/1511957598' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "32845076-88bb-4ae3-87fc-7369eed22d26"}]: dispatch 2026-03-10T14:04:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:06 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/1511957598' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "32845076-88bb-4ae3-87fc-7369eed22d26"}]': finished 2026-03-10T14:04:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:06 vm03 ceph-mon[49718]: osdmap e6: 1 total, 0 up, 1 in 2026-03-10T14:04:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:06 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:07.477 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:07 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/2571937434' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:07.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:07 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/2571937434' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:08.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:08 vm03 ceph-mon[49718]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:09.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:08 vm04 ceph-mon[55966]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", 
"caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T14:04:10.747 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:10 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:10.808 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:10.808 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:10.808 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:10.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:04:10.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:10.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:10.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:10.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T14:04:10.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:10.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T14:04:10.809 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:10 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: Reconfiguring ceph-exporter.vm04 (monmap changed)... 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: Reconfiguring daemon ceph-exporter.vm04 on vm04 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: Reconfiguring crash.vm04 (monmap changed)... 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: Reconfiguring daemon crash.vm04 on vm04 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: Deploying daemon osd.0 on vm03 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T14:04:11.670 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:11 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: Reconfiguring ceph-exporter.vm04 (monmap changed)... 2026-03-10T14:04:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: Reconfiguring daemon ceph-exporter.vm04 on vm04 2026-03-10T14:04:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: Reconfiguring crash.vm04 (monmap changed)... 
2026-03-10T14:04:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: Reconfiguring daemon crash.vm04 on vm04 2026-03-10T14:04:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: Deploying daemon osd.0 on vm03 2026-03-10T14:04:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:11.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:04:11.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T14:04:11.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:11.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:11.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:11.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:04:11.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 
ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T14:04:11.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:11 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: Reconfiguring mgr.vm04.ywwcto (monmap changed)... 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: Reconfiguring daemon mgr.vm04.ywwcto on vm04 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: Reconfiguring mon.vm04 (monmap changed)... 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: Reconfiguring daemon mon.vm04 on vm04 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:12 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: Reconfiguring mgr.vm04.ywwcto (monmap changed)... 
2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: Reconfiguring daemon mgr.vm04.ywwcto on vm04 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: Reconfiguring mon.vm04 (monmap changed)... 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: Reconfiguring daemon mon.vm04 on vm04 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:12.997 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:12 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:14.130 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 0 on host 'vm03' 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.129+0000 7fddab7fe700 1 -- 192.168.123.103:0/2931927270 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fddb4061190 con 0x7fdda006c650 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdda006c650 msgr2=0x7fdda006eb00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdda006c650 0x7fdda006eb00 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fddb00052d0 tx=0x7fddb002c040 comp rx=0 tx=0).stop 2026-03-10T14:04:14.131 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb4100650 msgr2=0x7fddb4106ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb4100650 0x7fddb4106ba0 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7fdda400d8d0 tx=0x7fdda400dbe0 comp rx=0 tx=0).stop 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 shutdown_connections 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdda006c650 0x7fdda006eb00 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fddb40ff450 0x7fddb4106660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 --2- 192.168.123.103:0/2931927270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fddb4100650 0x7fddb4106ba0 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 >> 192.168.123.103:0/2931927270 conn(0x7fddb40fa9e0 msgr2=0x7fddb406e840 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 shutdown_connections 2026-03-10T14:04:14.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:14.132+0000 7fddbc68b700 1 -- 192.168.123.103:0/2931927270 wait complete. 2026-03-10T14:04:14.153 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:13 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:14.153 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:13 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:14.153 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:13 vm03 ceph-mon[49718]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:14.153 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:13 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:14.153 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:13 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:04:14.153 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:13 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:14.177 DEBUG:teuthology.orchestra.run.vm03:osd.0> sudo journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.0.service 2026-03-10T14:04:14.178 INFO:tasks.cephadm:Deploying osd.1 on vm03 with /dev/vdd... 
2026-03-10T14:04:14.178 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- lvm zap /dev/vdd 2026-03-10T14:04:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:13 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:13 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:13 vm04 ceph-mon[55966]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:13 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:13 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:04:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:13 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:14.441 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:04:15.059 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:04:15.072 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch daemon add osd vm03:/dev/vdd 2026-03-10T14:04:15.298 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:04:15.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:15 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 
2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:15 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:15.322 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:04:15 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[67663]: 2026-03-10T14:04:15.181+0000 7f29a6a37640 -1 osd.0 0 log_to_monitors true 2026-03-10T14:04:15.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.583+0000 7f82b513e700 
1 -- 192.168.123.103:0/2586895860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0072360 msgr2=0x7f82b00770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:15.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.583+0000 7f82b513e700 1 --2- 192.168.123.103:0/2586895860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0072360 0x7f82b00770e0 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f82a800b3a0 tx=0x7f82a800b6b0 comp rx=0 tx=0).stop 2026-03-10T14:04:15.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 -- 192.168.123.103:0/2586895860 shutdown_connections 2026-03-10T14:04:15.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 --2- 192.168.123.103:0/2586895860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0072360 0x7f82b00770e0 unknown :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:15.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 --2- 192.168.123.103:0/2586895860 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82b0071980 0x7f82b0071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:15.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 -- 192.168.123.103:0/2586895860 >> 192.168.123.103:0/2586895860 conn(0x7f82b006d1a0 msgr2=0x7f82b006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:15.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 -- 192.168.123.103:0/2586895860 shutdown_connections 2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 -- 192.168.123.103:0/2586895860 wait complete. 
2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 Processor -- start 2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 -- start start 2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82b0071980 0x7f82b0082620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0082b60 0x7f82b0082fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82b01b2a90 con 0x7f82b0082b60 2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.585+0000 7f82b513e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82b01b2bd0 con 0x7f82b0071980 2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.586+0000 7f82ae59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0082b60 0x7f82b0082fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:15.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.586+0000 7f82ae59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0082b60 0x7f82b0082fd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:44466/0 (socket says 192.168.123.103:44466) 2026-03-10T14:04:15.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.586+0000 7f82ae59c700 1 -- 192.168.123.103:0/958245294 learned_addr learned my addr 192.168.123.103:0/958245294 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:04:15.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.586+0000 7f82aed9d700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82b0071980 0x7f82b0082620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:15.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.586+0000 7f82ae59c700 1 -- 192.168.123.103:0/958245294 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82b0071980 msgr2=0x7f82b0082620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:15.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.586+0000 7f82ae59c700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82b0071980 0x7f82b0082620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:15.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.586+0000 7f82ae59c700 1 -- 192.168.123.103:0/958245294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f82a800b050 con 0x7f82b0082b60 2026-03-10T14:04:15.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.586+0000 7f82ae59c700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0082b60 0x7f82b0082fd0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f82a8007b90 tx=0x7f82a800bce0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:04:15.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.587+0000 7f8297fff700 1 -- 192.168.123.103:0/958245294 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82a8028070 con 0x7f82b0082b60 2026-03-10T14:04:15.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.588+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f82b01b2dd0 con 0x7f82b0082b60 2026-03-10T14:04:15.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.588+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82b01b32c0 con 0x7f82b0082b60 2026-03-10T14:04:15.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.589+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f82b007c950 con 0x7f82b0082b60 2026-03-10T14:04:15.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.592+0000 7f8297fff700 1 -- 192.168.123.103:0/958245294 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f82a8024d30 con 0x7f82b0082b60 2026-03-10T14:04:15.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.592+0000 7f8297fff700 1 -- 192.168.123.103:0/958245294 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82a800e050 con 0x7f82b0082b60 2026-03-10T14:04:15.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.592+0000 7f8297fff700 1 -- 192.168.123.103:0/958245294 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f82a8019070 con 0x7f82b0082b60 2026-03-10T14:04:15.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.593+0000 
7f8297fff700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f829806c7e0 0x7f829806ec90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:15.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.593+0000 7f8297fff700 1 -- 192.168.123.103:0/958245294 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(6..6 src has 1..6) v4 ==== 1313+0+0 (secure 0 0 0) 0x7f82a8094950 con 0x7f82b0082b60 2026-03-10T14:04:15.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.593+0000 7f82aed9d700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f829806c7e0 0x7f829806ec90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:15.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.594+0000 7f82aed9d700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f829806c7e0 0x7f829806ec90 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f82b0083d50 tx=0x7f82a0008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:15.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.594+0000 7f8297fff700 1 -- 192.168.123.103:0/958245294 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f82a80601a0 con 0x7f82b0082b60 2026-03-10T14:04:15.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:15.727+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f82b0061190 con 0x7f829806c7e0 2026-03-10T14:04:16.301 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:16 vm03 ceph-mon[49718]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:16.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:16 vm03 ceph-mon[49718]: from='osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T14:04:16.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:16 vm03 ceph-mon[49718]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:16.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:16 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:16.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:16 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:16.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:16 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:16.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:16 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:16.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:16 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:16 vm04 ceph-mon[55966]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:16 vm04 ceph-mon[55966]: from='osd.0 
[v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T14:04:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:16 vm04 ceph-mon[55966]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:16 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:16 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:16 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:16 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:16 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: from='osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: osdmap e7: 1 total, 0 up, 1 in 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: 
from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: from='osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3148272554' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0af1d772-f786-4e93-8006-866ee43f51d1"}]: dispatch 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: from='osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/3148272554' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0af1d772-f786-4e93-8006-866ee43f51d1"}]': finished 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: osdmap e8: 2 total, 0 up, 2 in 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:17.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:17 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:17.056 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:04:16 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[67663]: 2026-03-10T14:04:16.749+0000 7f299b8ad700 -1 osd.0 0 waiting for initial osdmap 2026-03-10T14:04:17.056 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:04:16 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[67663]: 2026-03-10T14:04:16.756+0000 7f29976a2700 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: osdmap e7: 1 total, 0 up, 1 in 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/3148272554' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0af1d772-f786-4e93-8006-866ee43f51d1"}]: dispatch 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/3148272554' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0af1d772-f786-4e93-8006-866ee43f51d1"}]': finished 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: osdmap e8: 2 total, 0 up, 2 in 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:17 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:18 vm04 ceph-mon[55966]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:18 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/1630955741' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:18 vm04 ceph-mon[55966]: osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] boot 2026-03-10T14:04:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:18 vm04 ceph-mon[55966]: osdmap e9: 2 total, 1 up, 2 in 2026-03-10T14:04:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:18 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:18 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:18.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:18 vm03 ceph-mon[49718]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T14:04:18.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:18 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/1630955741' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:18.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:18 vm03 ceph-mon[49718]: osd.0 [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] boot 2026-03-10T14:04:18.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:18 vm03 ceph-mon[49718]: osdmap e9: 2 total, 1 up, 2 in 2026-03-10T14:04:18.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:18 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:04:18.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:18 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:19.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:19 vm04 ceph-mon[55966]: purged_snaps scrub starts 2026-03-10T14:04:19.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:19 vm04 ceph-mon[55966]: purged_snaps scrub ok 2026-03-10T14:04:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:19 vm03 ceph-mon[49718]: purged_snaps scrub starts 2026-03-10T14:04:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:19 vm03 ceph-mon[49718]: purged_snaps scrub ok 2026-03-10T14:04:20.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:20 vm03 ceph-mon[49718]: pgmap v15: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:20.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:20 vm03 ceph-mon[49718]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T14:04:20.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:20 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:20.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:20 vm04 ceph-mon[55966]: 
pgmap v15: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:20.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:20 vm04 ceph-mon[55966]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T14:04:20.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:20 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: Detected new or changed devices on vm03 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:22.358 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:22 vm03 ceph-mon[49718]: Deploying daemon osd.1 on vm03 2026-03-10T14:04:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: Detected new or changed devices on vm03 2026-03-10T14:04:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
14:04:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T14:04:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:22 vm04 ceph-mon[55966]: Deploying daemon osd.1 on vm03 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 1 on host 'vm03' 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.271+0000 7f8297fff700 1 -- 192.168.123.103:0/958245294 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f82b0061190 con 0x7f829806c7e0 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.273+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f829806c7e0 msgr2=0x7f829806ec90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.273+0000 7f82b513e700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f829806c7e0 0x7f829806ec90 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f82b0083d50 tx=0x7f82a0008040 comp rx=0 tx=0).stop 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.273+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0082b60 msgr2=0x7f82b0082fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.273+0000 7f82b513e700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0082b60 0x7f82b0082fd0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f82a8007b90 tx=0x7f82a800bce0 comp rx=0 tx=0).stop 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.274+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 shutdown_connections 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.274+0000 7f82b513e700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f829806c7e0 0x7f829806ec90 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.274+0000 7f82b513e700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f82b0071980 0x7f82b0082620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.274+0000 7f82b513e700 1 --2- 192.168.123.103:0/958245294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0082b60 0x7f82b0082fd0 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.274+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 >> 192.168.123.103:0/958245294 conn(0x7f82b006d1a0 msgr2=0x7f82b00764e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.274+0000 7f82b513e700 1 -- 192.168.123.103:0/958245294 shutdown_connections 2026-03-10T14:04:24.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:24.274+0000 7f82b513e700 1 -- 
192.168.123.103:0/958245294 wait complete. 2026-03-10T14:04:24.352 DEBUG:teuthology.orchestra.run.vm03:osd.1> sudo journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.1.service 2026-03-10T14:04:24.356 INFO:tasks.cephadm:Deploying osd.2 on vm03 with /dev/vdc... 2026-03-10T14:04:24.356 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- lvm zap /dev/vdc 2026-03-10T14:04:24.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:24 vm03 ceph-mon[49718]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:24.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:24.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:24.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:24.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:24.432 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:24 vm04 ceph-mon[55966]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:24.563 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:24.617 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:04:25.298 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:04:25.315 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch daemon add osd vm03:/dev/vdc 2026-03-10T14:04:25.562 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:04:25.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:25.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:04:25.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:25.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:25.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:25.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:25.596 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:04:25 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[73415]: 2026-03-10T14:04:25.370+0000 7ff01afc3640 -1 osd.1 0 log_to_monitors true 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.887+0000 7f04a0536700 1 -- 192.168.123.103:0/2390519820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498071a90 msgr2=0x7f0498071ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.887+0000 7f04a0536700 1 --2- 192.168.123.103:0/2390519820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498071a90 0x7f0498071ea0 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f0494009b00 tx=0x7f0494009e10 comp rx=0 tx=0).stop 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.888+0000 7f04a0536700 1 -- 192.168.123.103:0/2390519820 shutdown_connections 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.888+0000 7f04a0536700 1 --2- 192.168.123.103:0/2390519820 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0498072470 0x7f049810beb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.888+0000 7f04a0536700 1 --2- 192.168.123.103:0/2390519820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498071a90 0x7f0498071ea0 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.888+0000 7f04a0536700 1 -- 192.168.123.103:0/2390519820 >> 192.168.123.103:0/2390519820 conn(0x7f049806d1a0 msgr2=0x7f049806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.888+0000 7f04a0536700 1 -- 192.168.123.103:0/2390519820 shutdown_connections 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.888+0000 7f04a0536700 1 -- 192.168.123.103:0/2390519820 wait complete. 2026-03-10T14:04:25.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.889+0000 7f04a0536700 1 Processor -- start 2026-03-10T14:04:25.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.889+0000 7f04a0536700 1 -- start start 2026-03-10T14:04:25.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.889+0000 7f04a0536700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0498072470 0x7f0498116a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:25.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.889+0000 7f04a0536700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498116f80 0x7f04981b2760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:25.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.889+0000 7f04a0536700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f04981174b0 con 0x7f0498116f80 
2026-03-10T14:04:25.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.889+0000 7f04a0536700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0498117620 con 0x7f0498072470 2026-03-10T14:04:25.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.889+0000 7f049dad1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498116f80 0x7f04981b2760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:25.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.890+0000 7f049dad1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498116f80 0x7f04981b2760 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:49370/0 (socket says 192.168.123.103:49370) 2026-03-10T14:04:25.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.890+0000 7f049dad1700 1 -- 192.168.123.103:0/1631808957 learned_addr learned my addr 192.168.123.103:0/1631808957 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:04:25.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.890+0000 7f049dad1700 1 -- 192.168.123.103:0/1631808957 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0498072470 msgr2=0x7f0498116a40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:04:25.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.890+0000 7f049dad1700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0498072470 0x7f0498116a40 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:25.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.890+0000 7f049dad1700 1 -- 192.168.123.103:0/1631808957 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f04940097e0 con 0x7f0498116f80 2026-03-10T14:04:25.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.891+0000 7f049dad1700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498116f80 0x7f04981b2760 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7f049000c370 tx=0x7f049000c730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:25.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.893+0000 7f048f7fe700 1 -- 192.168.123.103:0/1631808957 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f049000e050 con 0x7f0498116f80 2026-03-10T14:04:25.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.893+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f04981b2ca0 con 0x7f0498116f80 2026-03-10T14:04:25.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.893+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f04981b31f0 con 0x7f0498116f80 2026-03-10T14:04:25.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.894+0000 7f048f7fe700 1 -- 192.168.123.103:0/1631808957 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f049000f040 con 0x7f0498116f80 2026-03-10T14:04:25.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.894+0000 7f048f7fe700 1 -- 192.168.123.103:0/1631808957 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0490013610 con 0x7f0498116f80 2026-03-10T14:04:25.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.894+0000 7f04a0536700 1 -- 
192.168.123.103:0/1631808957 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f049804ea50 con 0x7f0498116f80 2026-03-10T14:04:25.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.896+0000 7f048f7fe700 1 -- 192.168.123.103:0/1631808957 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f0490013770 con 0x7f0498116f80 2026-03-10T14:04:25.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.896+0000 7f048f7fe700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f048406c720 0x7f048406ebd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:25.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.896+0000 7f048f7fe700 1 -- 192.168.123.103:0/1631808957 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(10..10 src has 1..10) v4 ==== 1915+0+0 (secure 0 0 0) 0x7f049008b8e0 con 0x7f0498116f80 2026-03-10T14:04:25.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.899+0000 7f049e2d2700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f048406c720 0x7f048406ebd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:25.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.899+0000 7f049e2d2700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f048406c720 0x7f048406ebd0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f04940097b0 tx=0x7f0494009700 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:25.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:25.905+0000 7f048f7fe700 1 -- 192.168.123.103:0/1631808957 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f049005a970 con 0x7f0498116f80 2026-03-10T14:04:26.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:26.027+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f049810b6e0 con 0x7f048406c720 2026-03-10T14:04:26.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:26 vm04 ceph-mon[55966]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:26 vm04 ceph-mon[55966]: from='osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:26 vm03 ceph-mon[49718]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:26 vm03 ceph-mon[49718]: from='osd.1 
[v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:26.564 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:04:27 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[73415]: 2026-03-10T14:04:27.131+0000 7ff00fe39700 -1 osd.1 0 waiting for initial osdmap 2026-03-10T14:04:27.359 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:04:27 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[73415]: 2026-03-10T14:04:27.140+0000 7ff00c42f700 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='client.14296 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T14:04:27.359 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3893386812' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "03fb18a9-019c-440f-9b09-702297165b29"}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3893386812' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "03fb18a9-019c-440f-9b09-702297165b29"}]': finished 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: osdmap e12: 3 total, 1 up, 3 in 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:27.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:27.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='client.14296 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}]: 
dispatch 2026-03-10T14:04:27.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": 
"client.admin"}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/3893386812' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "03fb18a9-019c-440f-9b09-702297165b29"}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/3893386812' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "03fb18a9-019c-440f-9b09-702297165b29"}]': finished 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: osdmap e12: 3 total, 1 up, 3 in 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:27.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: purged_snaps scrub starts 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: purged_snaps scrub ok 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/889295974' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] boot 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: osdmap e13: 3 total, 2 up, 3 in 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:28.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:28 vm04 ceph-mon[55966]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: purged_snaps scrub starts 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: purged_snaps scrub ok 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/889295974' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: osd.1 [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] boot 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: osdmap e13: 3 total, 2 up, 3 in 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 
2}]: dispatch 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:30.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:30 vm04 ceph-mon[55966]: pgmap v24: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:30.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:30 vm04 ceph-mon[55966]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T14:04:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:30 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:30.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:30 vm03 ceph-mon[49718]: pgmap v24: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:30.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:30 vm03 ceph-mon[49718]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T14:04:30.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:30 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 
2}]: dispatch 2026-03-10T14:04:32.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:32 vm03 ceph-mon[49718]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:32.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:32 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T14:04:32.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:32 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:32.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:32 vm03 ceph-mon[49718]: Deploying daemon osd.2 on vm03 2026-03-10T14:04:32.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:32 vm04 ceph-mon[55966]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:32.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:32 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T14:04:32.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:32 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:32.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:32 vm04 ceph-mon[55966]: Deploying daemon osd.2 on vm03 2026-03-10T14:04:34.217 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:34 vm03 ceph-mon[49718]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:34.217 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:34 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:34.217 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:34 
vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:34.217 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:34 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:34 vm04 ceph-mon[55966]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:34 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:34 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:34 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:34.675 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 2 on host 'vm03' 2026-03-10T14:04:34.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.676+0000 7f048f7fe700 1 -- 192.168.123.103:0/1631808957 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f049810b6e0 con 0x7f048406c720 2026-03-10T14:04:34.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f048406c720 msgr2=0x7f048406ebd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:34.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f048406c720 0x7f048406ebd0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f04940097b0 tx=0x7f0494009700 comp rx=0 tx=0).stop 
2026-03-10T14:04:34.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498116f80 msgr2=0x7f04981b2760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:34.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498116f80 0x7f04981b2760 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7f049000c370 tx=0x7f049000c730 comp rx=0 tx=0).stop 2026-03-10T14:04:34.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 shutdown_connections 2026-03-10T14:04:34.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f048406c720 0x7f048406ebd0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:34.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0498072470 0x7f0498116a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:34.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 --2- 192.168.123.103:0/1631808957 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0498116f80 0x7f04981b2760 secure :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7f049000c370 tx=0x7f049000c730 comp rx=0 tx=0).stop 2026-03-10T14:04:34.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.678+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 >> 192.168.123.103:0/1631808957 conn(0x7f049806d1a0 msgr2=0x7f04980705a0 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:34.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.679+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 shutdown_connections 2026-03-10T14:04:34.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:04:34.679+0000 7f04a0536700 1 -- 192.168.123.103:0/1631808957 wait complete. 2026-03-10T14:04:34.792 DEBUG:teuthology.orchestra.run.vm03:osd.2> sudo journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.2.service 2026-03-10T14:04:34.796 INFO:tasks.cephadm:Deploying osd.3 on vm04 with /dev/vde... 2026-03-10T14:04:34.796 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- lvm zap /dev/vde 2026-03-10T14:04:34.973 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm04/config 2026-03-10T14:04:35.359 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:04:35 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[79637]: 2026-03-10T14:04:35.100+0000 7fcd93f04640 -1 osd.2 0 log_to_monitors true 2026-03-10T14:04:35.503 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:04:35.516 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch daemon add osd vm04:/dev/vde 2026-03-10T14:04:35.675 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm04/config 2026-03-10T14:04:35.705 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:35 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:35.705 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:35 vm04 ceph-mon[55966]: 
from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:35.705 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:35 vm04 ceph-mon[55966]: from='osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T14:04:35.705 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:35 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:35.705 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:35 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:35.777 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:35 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:35.777 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:35 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:35.777 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:35 vm03 ceph-mon[49718]: from='osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T14:04:35.777 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:35 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:35.777 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:35 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:35.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.957+0000 7f294f0c0700 1 -- 192.168.123.104:0/409811807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 msgr2=0x7f2948105d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:35.957 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.957+0000 7f294f0c0700 1 --2- 192.168.123.104:0/409811807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 0x7f2948105d90 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f293c009b00 tx=0x7f293c009e10 comp rx=0 tx=0).stop 2026-03-10T14:04:35.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.958+0000 7f294f0c0700 1 -- 192.168.123.104:0/409811807 shutdown_connections 2026-03-10T14:04:35.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.958+0000 7f294f0c0700 1 --2- 192.168.123.104:0/409811807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 0x7f2948105d90 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:35.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.958+0000 7f294f0c0700 1 --2- 192.168.123.104:0/409811807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2948101090 0x7f2948103470 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:35.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.958+0000 7f294f0c0700 1 -- 192.168.123.104:0/409811807 >> 192.168.123.104:0/409811807 conn(0x7f29480faa70 msgr2=0x7f29480fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:35.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.959+0000 7f294f0c0700 1 -- 192.168.123.104:0/409811807 shutdown_connections 2026-03-10T14:04:35.957 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.959+0000 7f294f0c0700 1 -- 192.168.123.104:0/409811807 wait complete. 
2026-03-10T14:04:35.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.959+0000 7f294f0c0700 1 Processor -- start 2026-03-10T14:04:35.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.960+0000 7f294f0c0700 1 -- start start 2026-03-10T14:04:35.958 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.960+0000 7f294f0c0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2948101090 0x7f2948071d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.960+0000 7f294f0c0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 0x7f2948072290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.960+0000 7f294f0c0700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29480728b0 con 0x7f2948101090 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.960+0000 7f294f0c0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29481a38c0 con 0x7f29481039b0 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.960+0000 7f294ce5c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2948101090 0x7f2948071d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.960+0000 7f294ce5c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2948101090 0x7f2948071d50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.104:43368/0 (socket says 192.168.123.104:43368) 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.960+0000 7f294ce5c700 1 -- 192.168.123.104:0/4153087008 learned_addr learned my addr 192.168.123.104:0/4153087008 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.961+0000 7f294ce5c700 1 -- 192.168.123.104:0/4153087008 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 msgr2=0x7f2948072290 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.961+0000 7f2947fff700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 0x7f2948072290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.961+0000 7f294ce5c700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 0x7f2948072290 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:35.959 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.961+0000 7f294ce5c700 1 -- 192.168.123.104:0/4153087008 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f293c0097e0 con 0x7f2948101090 2026-03-10T14:04:35.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.961+0000 7f2947fff700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 0x7f2948072290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:04:35.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.961+0000 7f294ce5c700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2948101090 0x7f2948071d50 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 crypto rx=0x7f293800b700 tx=0x7f293800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:35.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.962+0000 7f2945ffb700 1 -- 192.168.123.104:0/4153087008 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2938010820 con 0x7f2948101090 2026-03-10T14:04:35.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.962+0000 7f2945ffb700 1 -- 192.168.123.104:0/4153087008 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2938010e60 con 0x7f2948101090 2026-03-10T14:04:35.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.962+0000 7f2945ffb700 1 -- 192.168.123.104:0/4153087008 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2938017570 con 0x7f2948101090 2026-03-10T14:04:35.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.962+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f29481a3ba0 con 0x7f2948101090 2026-03-10T14:04:35.960 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.962+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f29481a40f0 con 0x7f2948101090 2026-03-10T14:04:35.961 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.963+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f294806bf10 con 0x7f2948101090 2026-03-10T14:04:35.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.966+0000 7f2945ffb700 1 -- 192.168.123.104:0/4153087008 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f293800f3c0 con 0x7f2948101090 2026-03-10T14:04:35.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.966+0000 7f2945ffb700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f293006c600 0x7f293006eab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:35.964 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.966+0000 7f2945ffb700 1 -- 192.168.123.104:0/4153087008 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(15..15 src has 1..15) v4 ==== 2368+0+0 (secure 0 0 0) 0x7f293808adc0 con 0x7f2948101090 2026-03-10T14:04:35.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.966+0000 7f2947fff700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f293006c600 0x7f293006eab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:35.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.967+0000 7f2947fff700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f293006c600 0x7f293006eab0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f293c009fd0 tx=0x7f293c005fd0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:35.965 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:35.967+0000 7f2945ffb700 1 -- 192.168.123.104:0/4153087008 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 
0 0) 0x7f2938056120 con 0x7f2948101090 2026-03-10T14:04:36.093 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:36.094+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f2948061190 con 0x7f293006c600 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: osdmap e15: 3 total, 2 up, 3 in 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": 
"client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:36.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:36 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T14:04:36.863 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: osdmap e15: 3 total, 2 up, 3 in 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:36.863 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:36.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:36 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:37.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:04:36 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[79637]: 2026-03-10T14:04:36.946+0000 7fcd88d7a700 -1 osd.2 0 waiting for initial osdmap 2026-03-10T14:04:37.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:04:36 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[79637]: 2026-03-10T14:04:36.958+0000 7fcd85370700 -1 osd.2 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='client.14312 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: Detected new or changed devices on vm03 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, 
"weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: osdmap e16: 3 total, 2 up, 3 in 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/1694497151' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb"}]: dispatch 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb"}]: dispatch 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] boot 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb"}]': finished 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: osdmap e17: 4 total, 3 up, 4 in 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:37.737 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:37.738 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-10T14:04:37.738 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:37 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/4018696472' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='client.14312 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: Detected new or changed devices on vm03 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: osdmap e16: 3 total, 2 up, 3 in 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='client.? 
192.168.123.104:0/1694497151' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb"}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb"}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: osd.2 [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] boot 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb"}]': finished 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: osdmap e17: 4 total, 3 up, 4 in 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-10T14:04:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:37 vm04 ceph-mon[55966]: from='client.? 
192.168.123.104:0/4018696472' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:39.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: purged_snaps scrub starts 2026-03-10T14:04:39.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: purged_snaps scrub ok 2026-03-10T14:04:39.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: pgmap v32: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:39.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-10T14:04:39.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: osdmap e18: 4 total, 3 up, 4 in 2026-03-10T14:04:39.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:39.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-10T14:04:39.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:39.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:39.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:39.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:39.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:38 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: purged_snaps scrub starts 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: purged_snaps scrub ok 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: pgmap v32: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: osdmap e18: 4 total, 3 up, 4 in 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: 
from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:38 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:40.311 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:40 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-10T14:04:40.311 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:40 vm03 ceph-mon[49718]: osdmap e19: 4 total, 3 up, 4 in 2026-03-10T14:04:40.311 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:40 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:40.311 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:40 vm03 ceph-mon[49718]: pgmap v35: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:40 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 
2026-03-10T14:04:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:40 vm04 ceph-mon[55966]: osdmap e19: 4 total, 3 up, 4 in 2026-03-10T14:04:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:40 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:40 vm04 ceph-mon[55966]: pgmap v35: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:40.607 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83431]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vdd 2026-03-10T14:04:40.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83431]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T14:04:40.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83431]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T14:04:40.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83431]: pam_unix(sudo:session): session closed for user root 2026-03-10T14:04:40.608 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83428]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vde 2026-03-10T14:04:40.608 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83428]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T14:04:40.608 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83428]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T14:04:40.608 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83428]: pam_unix(sudo:session): session closed for user root 2026-03-10T14:04:41.079 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83434]: ceph : 
TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vdc 2026-03-10T14:04:41.079 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83434]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T14:04:41.079 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83434]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T14:04:41.079 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83434]: pam_unix(sudo:session): session closed for user root 2026-03-10T14:04:41.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83437]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vda 2026-03-10T14:04:41.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83437]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T14:04:41.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83437]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T14:04:41.079 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:40 vm03 sudo[83437]: pam_unix(sudo:session): session closed for user root 2026-03-10T14:04:41.080 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 sudo[61315]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vda 2026-03-10T14:04:41.080 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 sudo[61315]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T14:04:41.080 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 sudo[61315]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T14:04:41.080 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 sudo[61315]: pam_unix(sudo:session): session closed for user root 2026-03-10T14:04:41.081 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 ceph-mon[55966]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T14:04:41.081 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 ceph-mon[55966]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T14:04:41.081 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:04:41.081 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:04:41.081 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:04:41.081 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:41 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:04:41.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:41 vm03 ceph-mon[49718]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T14:04:41.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:41 vm03 ceph-mon[49718]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T14:04:41.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:41 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:04:41.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:41 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon 
metadata", "id": "vm04"}]: dispatch 2026-03-10T14:04:41.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:41 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:04:41.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:41 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:04:42.089 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:42 vm04 ceph-mon[55966]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T14:04:42.089 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:42 vm04 ceph-mon[55966]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T14:04:42.089 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:42 vm04 ceph-mon[55966]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T14:04:42.089 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:42 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:42.089 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:42 vm04 ceph-mon[55966]: pgmap v37: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:42.089 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:42 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T14:04:42.089 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:42 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:42.089 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:42 vm04 ceph-mon[55966]: Deploying daemon osd.3 on vm04 2026-03-10T14:04:42.094 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:42.094+0000 7f2945ffb700 1 -- 192.168.123.104:0/4153087008 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f29380564f0 con 0x7f2948101090 2026-03-10T14:04:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:42 vm03 ceph-mon[49718]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T14:04:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:42 vm03 ceph-mon[49718]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T14:04:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:42 vm03 ceph-mon[49718]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T14:04:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:42 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:42 vm03 ceph-mon[49718]: pgmap v37: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:42 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T14:04:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:42 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:42 vm03 ceph-mon[49718]: Deploying daemon osd.3 on vm04 2026-03-10T14:04:43.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:43 vm04 ceph-mon[55966]: mgrmap e19: vm03.rwbbep(active, since 59s), standbys: vm04.ywwcto 2026-03-10T14:04:43.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:43 vm04 ceph-mon[55966]: 
from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:43.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:43 vm03 ceph-mon[49718]: mgrmap e19: vm03.rwbbep(active, since 59s), standbys: vm04.ywwcto 2026-03-10T14:04:43.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:43 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:44.582 INFO:teuthology.orchestra.run.vm04.stdout:Created osd(s) 3 on host 'vm04' 2026-03-10T14:04:44.582 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.583+0000 7f2945ffb700 1 -- 192.168.123.104:0/4153087008 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f2948061190 con 0x7f293006c600 2026-03-10T14:04:44.585 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f293006c600 msgr2=0x7f293006eab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:44.585 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f293006c600 0x7f293006eab0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f293c009fd0 tx=0x7f293c005fd0 comp rx=0 tx=0).stop 2026-03-10T14:04:44.585 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2948101090 msgr2=0x7f2948071d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:44.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 --2- 192.168.123.104:0/4153087008 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2948101090 0x7f2948071d50 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 crypto rx=0x7f293800b700 tx=0x7f293800bac0 comp rx=0 tx=0).stop 2026-03-10T14:04:44.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 shutdown_connections 2026-03-10T14:04:44.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2948101090 0x7f2948071d50 unknown :-1 s=CLOSED pgs=188 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:44.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f293006c600 0x7f293006eab0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:44.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 --2- 192.168.123.104:0/4153087008 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29481039b0 0x7f2948072290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:44.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 >> 192.168.123.104:0/4153087008 conn(0x7f29480faa70 msgr2=0x7f29480fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:44.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 shutdown_connections 2026-03-10T14:04:44.586 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:44.586+0000 7f294f0c0700 1 -- 192.168.123.104:0/4153087008 wait complete. 
2026-03-10T14:04:44.723 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:44 vm04 ceph-mon[55966]: pgmap v38: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:44.723 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:44 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:04:44.723 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:44 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:44.723 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:44 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:44.753 DEBUG:teuthology.orchestra.run.vm04:osd.3> sudo journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.3.service 2026-03-10T14:04:44.754 INFO:tasks.cephadm:Deploying osd.4 on vm04 with /dev/vdd... 2026-03-10T14:04:44.754 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- lvm zap /dev/vdd 2026-03-10T14:04:44.761 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:44 vm03 ceph-mon[49718]: pgmap v38: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:44.761 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:44 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:04:44.761 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:44 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:44.761 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:44 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:04:44.976 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm04/config 2026-03-10T14:04:45.610 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:45 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:45.611 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:45 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:45.611 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:45 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:45.611 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:45 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:45.677 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:04:45.690 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch daemon add osd vm04:/dev/vdd 2026-03-10T14:04:45.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:45 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:45.858 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config 
/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm04/config 2026-03-10T14:04:45.865 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:04:45 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[61840]: 2026-03-10T14:04:45.728+0000 7f26bc1ff640 -1 osd.3 0 log_to_monitors true 2026-03-10T14:04:46.119 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.119+0000 7f2f5284f700 1 -- 192.168.123.104:0/203543290 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c1038d0 msgr2=0x7f2f4c105cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:46.119 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.119+0000 7f2f5284f700 1 --2- 192.168.123.104:0/203543290 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c1038d0 0x7f2f4c105cb0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f2f3c009b00 tx=0x7f2f3c009e10 comp rx=0 tx=0).stop 2026-03-10T14:04:46.122 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.124+0000 7f2f5284f700 1 -- 192.168.123.104:0/203543290 shutdown_connections 2026-03-10T14:04:46.122 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.124+0000 7f2f5284f700 1 --2- 192.168.123.104:0/203543290 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c1038d0 0x7f2f4c105cb0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:46.122 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.124+0000 7f2f5284f700 1 --2- 192.168.123.104:0/203543290 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f4c100fb0 0x7f2f4c103390 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:46.122 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.124+0000 7f2f5284f700 1 -- 192.168.123.104:0/203543290 >> 192.168.123.104:0/203543290 conn(0x7f2f4c0fa990 msgr2=0x7f2f4c0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:46.123 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.125+0000 7f2f5284f700 1 -- 192.168.123.104:0/203543290 shutdown_connections 2026-03-10T14:04:46.124 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f5284f700 1 -- 192.168.123.104:0/203543290 wait complete. 2026-03-10T14:04:46.124 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f5284f700 1 Processor -- start 2026-03-10T14:04:46.124 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f5284f700 1 -- start start 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f5284f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c100fb0 0x7f2f4c071ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f5284f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f4c1038d0 0x7f2f4c072220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f5284f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f4c072840 con 0x7f2f4c1038d0 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f5284f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f4c072980 con 0x7f2f4c100fb0 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f4b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f4c1038d0 0x7f2f4c072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:46.125 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f4b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f4c1038d0 0x7f2f4c072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:48672/0 (socket says 192.168.123.104:48672) 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.126+0000 7f2f4b7fe700 1 -- 192.168.123.104:0/1476298703 learned_addr learned my addr 192.168.123.104:0/1476298703 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.127+0000 7f2f4bfff700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c100fb0 0x7f2f4c071ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.127+0000 7f2f4b7fe700 1 -- 192.168.123.104:0/1476298703 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c100fb0 msgr2=0x7f2f4c071ce0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.127+0000 7f2f4b7fe700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c100fb0 0x7f2f4c071ce0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.127+0000 7f2f4b7fe700 1 -- 192.168.123.104:0/1476298703 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2f3c0097e0 con 0x7f2f4c1038d0 2026-03-10T14:04:46.125 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.127+0000 7f2f4bfff700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c100fb0 0x7f2f4c071ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:04:46.125 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.127+0000 7f2f4b7fe700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f4c1038d0 0x7f2f4c072220 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f2f3c00bb70 tx=0x7f2f3c00bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:46.127 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.129+0000 7f2f497fa700 1 -- 192.168.123.104:0/1476298703 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f3c01d070 con 0x7f2f4c1038d0 2026-03-10T14:04:46.127 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.129+0000 7f2f497fa700 1 -- 192.168.123.104:0/1476298703 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2f3c004cf0 con 0x7f2f4c1038d0 2026-03-10T14:04:46.127 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.129+0000 7f2f497fa700 1 -- 192.168.123.104:0/1476298703 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f3c00f670 con 0x7f2f4c1038d0 2026-03-10T14:04:46.128 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.130+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2f4c1a39b0 con 0x7f2f4c1038d0 2026-03-10T14:04:46.128 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.130+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7f2f4c0ff220 con 0x7f2f4c1038d0 2026-03-10T14:04:46.129 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.130+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2f4c06bf10 con 0x7f2f4c1038d0 2026-03-10T14:04:46.129 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.131+0000 7f2f497fa700 1 -- 192.168.123.104:0/1476298703 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f2f3c004e60 con 0x7f2f4c1038d0 2026-03-10T14:04:46.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.131+0000 7f2f497fa700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f3806c4c0 0x7f2f3806e970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:46.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.131+0000 7f2f497fa700 1 -- 192.168.123.104:0/1476298703 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(20..20 src has 1..20) v4 ==== 3165+0+0 (secure 0 0 0) 0x7f2f3c08c4d0 con 0x7f2f4c1038d0 2026-03-10T14:04:46.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.131+0000 7f2f4bfff700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f3806c4c0 0x7f2f3806e970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:46.130 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.132+0000 7f2f4bfff700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f3806c4c0 0x7f2f3806e970 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f2f34009fd0 tx=0x7f2f34009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T14:04:46.132 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.134+0000 7f2f497fa700 1 -- 192.168.123.104:0/1476298703 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2f3c05afa0 con 0x7f2f4c1038d0 2026-03-10T14:04:46.259 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:46.261+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f2f4c061190 con 0x7f2f3806c4c0 2026-03-10T14:04:46.374 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: pgmap v39: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:46.374 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='osd.3 [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:46.375 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:46 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: pgmap v39: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='osd.3 [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": 
["3"]}]: dispatch 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:46 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:47.387 
INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:04:47 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[61840]: 2026-03-10T14:04:47.104+0000 7f26b2878700 -1 osd.3 0 waiting for initial osdmap 2026-03-10T14:04:47.387 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:04:47 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[61840]: 2026-03-10T14:04:47.112+0000 7f26ab667700 -1 osd.3 22 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:04:47.387 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='client.14328 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:47.387 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: Detected new or changed devices on vm04 2026-03-10T14:04:47.387 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T14:04:47.387 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T14:04:47.387 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:47.387 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='osd.3 [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:04:47.387 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": 
["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='client.? 192.168.123.104:0/2656133975' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9d0ae930-1cc2-4815-bd24-e9129038b319"}]: dispatch 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9d0ae930-1cc2-4815-bd24-e9129038b319"}]: dispatch 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9d0ae930-1cc2-4815-bd24-e9129038b319"}]': finished 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: osdmap e22: 5 total, 3 up, 5 in 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:47.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:47 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='client.14328 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: Detected new or changed devices on vm04 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='osd.3 
[v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/2656133975' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9d0ae930-1cc2-4815-bd24-e9129038b319"}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "9d0ae930-1cc2-4815-bd24-e9129038b319"}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "9d0ae930-1cc2-4815-bd24-e9129038b319"}]': finished 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: osdmap e22: 5 total, 3 up, 5 in 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:47 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:48.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:48 vm04 ceph-mon[55966]: pgmap v41: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:48.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:48 vm04 ceph-mon[55966]: from='client.? 
192.168.123.104:0/557399317' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:48.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:48 vm04 ceph-mon[55966]: osd.3 [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] boot 2026-03-10T14:04:48.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:48 vm04 ceph-mon[55966]: osdmap e23: 5 total, 4 up, 5 in 2026-03-10T14:04:48.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:48 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:48.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:48 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:48.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:48 vm03 ceph-mon[49718]: pgmap v41: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-10T14:04:48.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:48 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/557399317' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:48.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:48 vm03 ceph-mon[49718]: osd.3 [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] boot 2026-03-10T14:04:48.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:48 vm03 ceph-mon[49718]: osdmap e23: 5 total, 4 up, 5 in 2026-03-10T14:04:48.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:04:48.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:48 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:49.785 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:49 vm03 ceph-mon[49718]: purged_snaps scrub starts 2026-03-10T14:04:49.785 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:49 vm03 ceph-mon[49718]: purged_snaps scrub ok 2026-03-10T14:04:49.785 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:49 vm03 ceph-mon[49718]: osdmap e24: 5 total, 4 up, 5 in 2026-03-10T14:04:49.785 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:49 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:49.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:49 vm04 ceph-mon[55966]: purged_snaps scrub starts 2026-03-10T14:04:49.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:49 vm04 ceph-mon[55966]: purged_snaps scrub ok 2026-03-10T14:04:49.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:49 vm04 ceph-mon[55966]: osdmap e24: 5 total, 4 up, 5 in 2026-03-10T14:04:49.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:49 vm04 ceph-mon[55966]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:50.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:50 vm04 ceph-mon[55966]: pgmap v44: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T14:04:50.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:50 vm04 ceph-mon[55966]: osdmap e25: 5 total, 4 up, 5 in 2026-03-10T14:04:50.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:50 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:50.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:50 vm03 ceph-mon[49718]: pgmap v44: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T14:04:50.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:50 vm03 ceph-mon[49718]: osdmap e25: 5 total, 4 up, 5 in 2026-03-10T14:04:50.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:50 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:52.593 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:52 vm04 ceph-mon[55966]: pgmap v47: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T14:04:52.593 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:52 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T14:04:52.593 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:52 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:52.593 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:52 vm04 ceph-mon[55966]: Deploying daemon osd.4 on vm04 2026-03-10T14:04:52.858 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:52 vm03 ceph-mon[49718]: pgmap v47: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T14:04:52.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T14:04:52.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:52 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:52.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:52 vm03 ceph-mon[49718]: Deploying daemon osd.4 on vm04 2026-03-10T14:04:54.090 INFO:teuthology.orchestra.run.vm04.stdout:Created osd(s) 4 on host 'vm04' 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.088+0000 7f2f497fa700 1 -- 192.168.123.104:0/1476298703 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f2f4c061190 con 0x7f2f3806c4c0 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.091+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f3806c4c0 msgr2=0x7f2f3806e970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.091+0000 7f2f5284f700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f3806c4c0 0x7f2f3806e970 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f2f34009fd0 tx=0x7f2f34009450 comp rx=0 tx=0).stop 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.091+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f4c1038d0 
msgr2=0x7f2f4c072220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.091+0000 7f2f5284f700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f4c1038d0 0x7f2f4c072220 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f2f3c00bb70 tx=0x7f2f3c00bba0 comp rx=0 tx=0).stop 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.092+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 shutdown_connections 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.092+0000 7f2f5284f700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f3806c4c0 0x7f2f3806e970 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.092+0000 7f2f5284f700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f4c100fb0 0x7f2f4c071ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.092+0000 7f2f5284f700 1 --2- 192.168.123.104:0/1476298703 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f4c1038d0 0x7f2f4c072220 unknown :-1 s=CLOSED pgs=192 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.092+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 >> 192.168.123.104:0/1476298703 conn(0x7f2f4c0fa990 msgr2=0x7f2f4c0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:54.091 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.092+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 shutdown_connections 2026-03-10T14:04:54.091 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:54.092+0000 7f2f5284f700 1 -- 192.168.123.104:0/1476298703 wait complete. 2026-03-10T14:04:54.140 DEBUG:teuthology.orchestra.run.vm04:osd.4> sudo journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.4.service 2026-03-10T14:04:54.143 INFO:tasks.cephadm:Deploying osd.5 on vm04 with /dev/vdc... 2026-03-10T14:04:54.143 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- lvm zap /dev/vdc 2026-03-10T14:04:54.353 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm04/config 2026-03-10T14:04:54.384 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:54 vm04 ceph-mon[55966]: pgmap v48: 1 pgs: 1 remapped+peering; 449 KiB data, 106 MiB used, 80 GiB / 80 GiB avail 2026-03-10T14:04:54.384 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:54 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.384 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:54 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:54.384 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:54 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.384 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:54 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.384 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:54 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.384 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:54 vm04 ceph-mon[55966]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.384 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:54 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:54 vm03 ceph-mon[49718]: pgmap v48: 1 pgs: 1 remapped+peering; 449 KiB data, 106 MiB used, 80 GiB / 80 GiB avail 2026-03-10T14:04:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:54 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:54 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:54 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:54 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:54 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:54 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:54 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:54.877 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:04:54.890 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph orch daemon add osd vm04:/dev/vdc 
2026-03-10T14:04:55.066 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm04/config 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.333+0000 7f69e128f700 1 -- 192.168.123.104:0/2341311151 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc103710 msgr2=0x7f69dc105af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.333+0000 7f69e128f700 1 --2- 192.168.123.104:0/2341311151 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc103710 0x7f69dc105af0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f69cc009b00 tx=0x7f69cc009e10 comp rx=0 tx=0).stop 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.333+0000 7f69e128f700 1 -- 192.168.123.104:0/2341311151 shutdown_connections 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.333+0000 7f69e128f700 1 --2- 192.168.123.104:0/2341311151 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc103710 0x7f69dc105af0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.333+0000 7f69e128f700 1 --2- 192.168.123.104:0/2341311151 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f69dc100df0 0x7f69dc1031d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.333+0000 7f69e128f700 1 -- 192.168.123.104:0/2341311151 >> 192.168.123.104:0/2341311151 conn(0x7f69dc0fa7f0 msgr2=0x7f69dc0fcc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.334+0000 7f69e128f700 1 -- 192.168.123.104:0/2341311151 
shutdown_connections 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.334+0000 7f69e128f700 1 -- 192.168.123.104:0/2341311151 wait complete. 2026-03-10T14:04:55.332 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.334+0000 7f69e128f700 1 Processor -- start 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.334+0000 7f69e128f700 1 -- start start 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69e128f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc100df0 0x7f69dc193990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69e128f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f69dc103710 0x7f69dc193ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69e128f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69dc1944f0 con 0x7f69dc103710 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69e128f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69dc194630 con 0x7f69dc100df0 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69da7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f69dc103710 0x7f69dc193ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69da7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f69dc103710 0x7f69dc193ed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.104:38902/0 (socket says 192.168.123.104:38902) 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69da7fc700 1 -- 192.168.123.104:0/116234433 learned_addr learned my addr 192.168.123.104:0/116234433 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:04:55.333 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69da7fc700 1 -- 192.168.123.104:0/116234433 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc100df0 msgr2=0x7f69dc193990 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:04:55.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69daffd700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc100df0 0x7f69dc193990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:55.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69da7fc700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc100df0 0x7f69dc193990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:04:55.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69da7fc700 1 -- 192.168.123.104:0/116234433 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f69cc0097e0 con 0x7f69dc103710 2026-03-10T14:04:55.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.335+0000 7f69daffd700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc100df0 0x7f69dc193990 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:04:55.334 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.336+0000 7f69da7fc700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f69dc103710 0x7f69dc193ed0 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f69cc005230 tx=0x7f69cc0056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:55.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.336+0000 7f69d3fff700 1 -- 192.168.123.104:0/116234433 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f69cc01d070 con 0x7f69dc103710 2026-03-10T14:04:55.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.336+0000 7f69d3fff700 1 -- 192.168.123.104:0/116234433 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f69cc00bc50 con 0x7f69dc103710 2026-03-10T14:04:55.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.336+0000 7f69d3fff700 1 -- 192.168.123.104:0/116234433 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f69cc00f800 con 0x7f69dc103710 2026-03-10T14:04:55.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.336+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f69dc199080 con 0x7f69dc103710 2026-03-10T14:04:55.335 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.336+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f69dc199570 con 0x7f69dc103710 2026-03-10T14:04:55.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.338+0000 7f69d3fff700 1 -- 192.168.123.104:0/116234433 <== mon.0 
v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f69cc022ae0 con 0x7f69dc103710 2026-03-10T14:04:55.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.338+0000 7f69d3fff700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f69c806c7c0 0x7f69c806ec70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:04:55.336 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.338+0000 7f69d3fff700 1 -- 192.168.123.104:0/116234433 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(25..25 src has 1..25) v4 ==== 3697+0+0 (secure 0 0 0) 0x7f69cc08cbd0 con 0x7f69dc103710 2026-03-10T14:04:55.337 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.339+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f69dc18db00 con 0x7f69dc103710 2026-03-10T14:04:55.339 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.340+0000 7f69daffd700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f69c806c7c0 0x7f69c806ec70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:04:55.343 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.341+0000 7f69daffd700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f69c806c7c0 0x7f69c806ec70 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f69c4005fd0 tx=0x7f69c4005dc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:04:55.345 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.346+0000 7f69d3fff700 1 -- 192.168.123.104:0/116234433 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f69cc05b480 con 0x7f69dc103710 2026-03-10T14:04:55.475 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:04:55.477+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f69dc061190 con 0x7f69c806c7c0 2026-03-10T14:04:55.789 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:04:55 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[67504]: 2026-03-10T14:04:55.536+0000 7f16f2ab5640 -1 osd.4 0 log_to_monitors true 2026-03-10T14:04:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 64 KiB/s, 0 objects/s recovering 2026-03-10T14:04:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='client.14340 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='osd.4 
[v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T14:04:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: Detected new or changed devices on vm04 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:56.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:55 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:56.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 64 KiB/s, 0 objects/s recovering 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='client.14340 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm04:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: Detected new or 
changed devices on vm04 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:04:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:04:55 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:04:56.676 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:04:56 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[67504]: 2026-03-10T14:04:56.352+0000 7f16e912e700 -1 osd.4 0 waiting for initial osdmap 2026-03-10T14:04:56.676 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:04:56 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[67504]: 2026-03-10T14:04:56.360+0000 7f16e3f21700 -1 osd.4 27 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996]' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: osdmap e26: 5 total, 4 up, 5 in 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='client.? 192.168.123.104:0/4226027350' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "5c4fe084-183b-43ce-9ebe-daeadbb4f59a"}]: dispatch 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "5c4fe084-183b-43ce-9ebe-daeadbb4f59a"}]: dispatch 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996]' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "5c4fe084-183b-43ce-9ebe-daeadbb4f59a"}]': finished 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: osdmap e27: 6 total, 4 up, 6 in 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:56 vm04 ceph-mon[55966]: from='client.? 
192.168.123.104:0/3120767338' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:57.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996]' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T14:04:57.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: osdmap e26: 5 total, 4 up, 5 in 2026-03-10T14:04:57.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='client.? 192.168.123.104:0/4226027350' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "5c4fe084-183b-43ce-9ebe-daeadbb4f59a"}]: dispatch 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "5c4fe084-183b-43ce-9ebe-daeadbb4f59a"}]: dispatch 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996]' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "5c4fe084-183b-43ce-9ebe-daeadbb4f59a"}]': finished 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: osdmap e27: 6 total, 4 up, 6 in 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:56 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/3120767338' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T14:04:58.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:58 vm03 ceph-mon[49718]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 64 KiB/s, 0 objects/s recovering 2026-03-10T14:04:58.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:58 vm03 ceph-mon[49718]: osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] boot 2026-03-10T14:04:58.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:58 vm03 ceph-mon[49718]: osdmap e28: 6 total, 5 up, 6 in 2026-03-10T14:04:58.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:58 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:58.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:58 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:04:58.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:58 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:04:58.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:58 vm04 ceph-mon[55966]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 64 KiB/s, 0 objects/s recovering 2026-03-10T14:04:58.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:58 vm04 ceph-mon[55966]: osd.4 [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] boot 2026-03-10T14:04:58.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:58 vm04 ceph-mon[55966]: osdmap e28: 6 total, 5 up, 6 in 2026-03-10T14:04:58.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:58 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:04:58.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:58 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:04:58.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:58 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:04:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:59 vm03 ceph-mon[49718]: purged_snaps scrub starts 2026-03-10T14:04:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:59 vm03 ceph-mon[49718]: purged_snaps scrub ok 2026-03-10T14:04:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:59 vm03 ceph-mon[49718]: osdmap e29: 6 total, 5 up, 6 in 2026-03-10T14:04:59.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:04:59 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:04:59.776 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:59 vm04 ceph-mon[55966]: purged_snaps scrub starts 2026-03-10T14:04:59.776 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:59 vm04 ceph-mon[55966]: purged_snaps scrub ok 2026-03-10T14:04:59.776 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:59 vm04 ceph-mon[55966]: osdmap e29: 6 total, 5 up, 6 in 2026-03-10T14:04:59.777 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:04:59 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:00.469 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:00 vm04 ceph-mon[55966]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T14:05:00.858 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:00 vm03 ceph-mon[49718]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T14:05:01.401 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T14:05:01.402 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:01 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:01.402 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:01 vm04 ceph-mon[55966]: Deploying daemon osd.5 on vm04 2026-03-10T14:05:01.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T14:05:01.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:01 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:01.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:01 vm03 ceph-mon[49718]: Deploying daemon osd.5 on vm04 2026-03-10T14:05:02.619 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:02 vm04 ceph-mon[55966]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T14:05:02.619 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:02.619 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:02.619 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:02 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:02.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:02 vm03 ceph-mon[49718]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T14:05:02.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:02.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:02.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:02 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:03.205 INFO:teuthology.orchestra.run.vm04.stdout:Created osd(s) 5 on host 'vm04' 2026-03-10T14:05:03.205 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.206+0000 7f69d3fff700 1 -- 192.168.123.104:0/116234433 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f69dc061190 con 0x7f69c806c7c0 2026-03-10T14:05:03.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f69c806c7c0 msgr2=0x7f69c806ec70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:03.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f69c806c7c0 0x7f69c806ec70 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f69c4005fd0 tx=0x7f69c4005dc0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f69dc103710 msgr2=0x7f69dc193ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:03.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f69dc103710 0x7f69dc193ed0 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f69cc005230 tx=0x7f69cc0056c0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 shutdown_connections 2026-03-10T14:05:03.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f69c806c7c0 0x7f69c806ec70 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.207 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f69dc100df0 0x7f69dc193990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 --2- 192.168.123.104:0/116234433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f69dc103710 0x7f69dc193ed0 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 >> 192.168.123.104:0/116234433 conn(0x7f69dc0fa7f0 msgr2=0x7f69dc0fcc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:03.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 -- 
192.168.123.104:0/116234433 shutdown_connections 2026-03-10T14:05:03.208 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:03.208+0000 7f69e128f700 1 -- 192.168.123.104:0/116234433 wait complete. 2026-03-10T14:05:03.265 DEBUG:teuthology.orchestra.run.vm04:osd.5> sudo journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.5.service 2026-03-10T14:05:03.268 INFO:tasks.cephadm:Waiting for 6 OSDs to come up... 2026-03-10T14:05:03.268 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd stat -f json 2026-03-10T14:05:03.441 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:03.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.713+0000 7f48b03c8700 1 -- 192.168.123.103:0/3144668684 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8103960 msgr2=0x7f48a8103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:03.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.713+0000 7f48b03c8700 1 --2- 192.168.123.103:0/3144668684 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8103960 0x7f48a8103db0 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7f48a4009b00 tx=0x7f48a4009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:03.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.714+0000 7f48b03c8700 1 -- 192.168.123.103:0/3144668684 shutdown_connections 2026-03-10T14:05:03.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.714+0000 7f48b03c8700 1 --2- 192.168.123.103:0/3144668684 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8103960 0x7f48a8103db0 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.713 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.714+0000 7f48b03c8700 1 --2- 192.168.123.103:0/3144668684 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a8102760 0x7f48a8102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.714+0000 7f48b03c8700 1 -- 192.168.123.103:0/3144668684 >> 192.168.123.103:0/3144668684 conn(0x7f48a80fdcf0 msgr2=0x7f48a8100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:03.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.715+0000 7f48b03c8700 1 -- 192.168.123.103:0/3144668684 shutdown_connections 2026-03-10T14:05:03.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.715+0000 7f48b03c8700 1 -- 192.168.123.103:0/3144668684 wait complete. 2026-03-10T14:05:03.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.715+0000 7f48b03c8700 1 Processor -- start 2026-03-10T14:05:03.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.715+0000 7f48b03c8700 1 -- start start 2026-03-10T14:05:03.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48b03c8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8102760 0x7f48a81980e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:03.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48b03c8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a8103960 0x7f48a8198620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:03.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48b03c8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48a8198c40 con 0x7f48a8102760 2026-03-10T14:05:03.715 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48b03c8700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48a8198d80 con 0x7f48a8103960 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48ad963700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a8103960 0x7f48a8198620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48ad963700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a8103960 0x7f48a8198620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:33742/0 (socket says 192.168.123.103:33742) 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48ad963700 1 -- 192.168.123.103:0/3991726688 learned_addr learned my addr 192.168.123.103:0/3991726688 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48ad963700 1 -- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8102760 msgr2=0x7f48a81980e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.716+0000 7f48ae164700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8102760 0x7f48a81980e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f48ad963700 1 --2- 
192.168.123.103:0/3991726688 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8102760 0x7f48a81980e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f48ad963700 1 -- 192.168.123.103:0/3991726688 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f48a40097e0 con 0x7f48a8103960 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f48ae164700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8102760 0x7f48a81980e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:05:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f48ad963700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a8103960 0x7f48a8198620 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f48a40052d0 tx=0x7f48a4004a80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f489f7fe700 1 -- 192.168.123.103:0/3991726688 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f48a401d070 con 0x7f48a8103960 2026-03-10T14:05:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f489f7fe700 1 -- 192.168.123.103:0/3991726688 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f48a4004500 con 0x7f48a8103960 2026-03-10T14:05:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f489f7fe700 1 -- 192.168.123.103:0/3991726688 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 
(secure 0 0 0) 0x7f48a4022470 con 0x7f48a8103960 2026-03-10T14:05:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f48a819d7d0 con 0x7f48a8103960 2026-03-10T14:05:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.717+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f48a819dc60 con 0x7f48a8103960 2026-03-10T14:05:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.718+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f48a8066e40 con 0x7f48a8103960 2026-03-10T14:05:03.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.722+0000 7f489f7fe700 1 -- 192.168.123.103:0/3991726688 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f48a400bc50 con 0x7f48a8103960 2026-03-10T14:05:03.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.722+0000 7f489f7fe700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f489406c750 0x7f489406ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:03.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.722+0000 7f489f7fe700 1 -- 192.168.123.103:0/3991726688 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(29..29 src has 1..29) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f48a408d7e0 con 0x7f48a8103960 2026-03-10T14:05:03.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.723+0000 7f48ae164700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f489406c750 0x7f489406ec00 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.723+0000 7f48ae164700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f489406c750 0x7f489406ec00 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f4898006fd0 tx=0x7f4898008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.723+0000 7f489f7fe700 1 -- 192.168.123.103:0/3991726688 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f48a405bee0 con 0x7f48a8103960 2026-03-10T14:05:03.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.825+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f48a819e030 con 0x7f48a8103960 2026-03-10T14:05:03.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.826+0000 7f489f7fe700 1 -- 192.168.123.103:0/3991726688 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v29) v1 ==== 74+0+130 (secure 0 0 0) 0x7f48a405ba70 con 0x7f48a8103960 2026-03-10T14:05:03.825 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:03.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.829+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f489406c750 msgr2=0x7f489406ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:03.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.829+0000 7f48b03c8700 1 --2- 192.168.123.103:0/3991726688 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f489406c750 0x7f489406ec00 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f4898006fd0 tx=0x7f4898008040 comp rx=0 tx=0).stop 2026-03-10T14:05:03.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.829+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a8103960 msgr2=0x7f48a8198620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:03.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.829+0000 7f48b03c8700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a8103960 0x7f48a8198620 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f48a40052d0 tx=0x7f48a4004a80 comp rx=0 tx=0).stop 2026-03-10T14:05:03.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.829+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 shutdown_connections 2026-03-10T14:05:03.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.829+0000 7f48b03c8700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a8102760 0x7f48a81980e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.830+0000 7f48b03c8700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f489406c750 0x7f489406ec00 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.830+0000 7f48b03c8700 1 --2- 192.168.123.103:0/3991726688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a8103960 0x7f48a8198620 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:03.828 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.830+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 >> 192.168.123.103:0/3991726688 conn(0x7f48a80fdcf0 msgr2=0x7f48a8106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:03.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.830+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 shutdown_connections 2026-03-10T14:05:03.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:03.830+0000 7f48b03c8700 1 -- 192.168.123.103:0/3991726688 wait complete. 2026-03-10T14:05:03.889 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":29,"num_osds":6,"num_up_osds":5,"osd_up_since":1773151497,"num_in_osds":6,"osd_in_since":1773151496,"num_remapped_pgs":0} 2026-03-10T14:05:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:04 vm04 ceph-mon[55966]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T14:05:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:04 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:04 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:04 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:04 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:04 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/3991726688' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T14:05:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:04 vm04 ceph-mon[55966]: from='osd.5 [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T14:05:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:04 vm04 ceph-mon[55966]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T14:05:04.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:05:04 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[73087]: 2026-03-10T14:05:04.100+0000 7f4272166640 -1 osd.5 0 log_to_monitors true 2026-03-10T14:05:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:04 vm03 ceph-mon[49718]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T14:05:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:04 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:04 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:04 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:04 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:04 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/3991726688' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T14:05:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:04 vm03 ceph-mon[49718]: from='osd.5 [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T14:05:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:04 vm03 ceph-mon[49718]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T14:05:04.889 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd stat -f json 2026-03-10T14:05:05.034 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:05.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.270+0000 7fbf767e8700 1 -- 192.168.123.103:0/3905696633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf700691a0 msgr2=0x7fbf70105520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:05.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.270+0000 7fbf767e8700 1 --2- 192.168.123.103:0/3905696633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf700691a0 0x7fbf70105520 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7fbf58009b00 tx=0x7fbf58009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:05.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.271+0000 7fbf767e8700 1 -- 192.168.123.103:0/3905696633 shutdown_connections 2026-03-10T14:05:05.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.271+0000 7fbf767e8700 1 --2- 192.168.123.103:0/3905696633 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf70105a60 0x7fbf70107e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:05.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.271+0000 7fbf767e8700 1 --2- 192.168.123.103:0/3905696633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf700691a0 0x7fbf70105520 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:05.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.271+0000 7fbf767e8700 1 -- 192.168.123.103:0/3905696633 >> 192.168.123.103:0/3905696633 conn(0x7fbf700faa70 msgr2=0x7fbf700fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:05.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.272+0000 7fbf767e8700 1 -- 192.168.123.103:0/3905696633 shutdown_connections 2026-03-10T14:05:05.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.272+0000 7fbf767e8700 1 -- 192.168.123.103:0/3905696633 wait complete. 
2026-03-10T14:05:05.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.272+0000 7fbf767e8700 1 Processor -- start 2026-03-10T14:05:05.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.272+0000 7fbf767e8700 1 -- start start 2026-03-10T14:05:05.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.272+0000 7fbf767e8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf700691a0 0x7fbf70197fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:05.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf767e8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf70105a60 0x7fbf701984f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:05.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf767e8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf70198b10 con 0x7fbf70105a60 2026-03-10T14:05:05.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf767e8700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf70198c50 con 0x7fbf700691a0 2026-03-10T14:05:05.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf6ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf700691a0 0x7fbf70197fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:05.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf6ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf700691a0 0x7fbf70197fb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:33756/0 (socket says 192.168.123.103:33756) 2026-03-10T14:05:05.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf6ffff700 1 -- 192.168.123.103:0/1015719721 learned_addr learned my addr 192.168.123.103:0/1015719721 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:05.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf6ffff700 1 -- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf70105a60 msgr2=0x7fbf701984f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:05.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf6f7fe700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf70105a60 0x7fbf701984f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:05.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf6ffff700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf70105a60 0x7fbf701984f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:05.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.273+0000 7fbf6ffff700 1 -- 192.168.123.103:0/1015719721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf580097e0 con 0x7fbf700691a0 2026-03-10T14:05:05.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.274+0000 7fbf6ffff700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf700691a0 0x7fbf70197fb0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fbf58009fd0 tx=0x7fbf58004ab0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:05:05.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.274+0000 7fbf6f7fe700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf70105a60 0x7fbf701984f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:05:05.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.274+0000 7fbf6d7fa700 1 -- 192.168.123.103:0/1015719721 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf5801d070 con 0x7fbf700691a0 2026-03-10T14:05:05.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.275+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf7019d6a0 con 0x7fbf700691a0 2026-03-10T14:05:05.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.275+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf7019db90 con 0x7fbf700691a0 2026-03-10T14:05:05.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.274+0000 7fbf6d7fa700 1 -- 192.168.123.103:0/1015719721 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbf5800bc50 con 0x7fbf700691a0 2026-03-10T14:05:05.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.276+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf701921a0 con 0x7fbf700691a0 2026-03-10T14:05:05.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.276+0000 7fbf6d7fa700 1 -- 192.168.123.103:0/1015719721 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf5800f810 con 0x7fbf700691a0 
2026-03-10T14:05:05.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.277+0000 7fbf6d7fa700 1 -- 192.168.123.103:0/1015719721 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fbf5800fa30 con 0x7fbf700691a0 2026-03-10T14:05:05.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.277+0000 7fbf6d7fa700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf5c06c820 0x7fbf5c06ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:05.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.277+0000 7fbf6f7fe700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf5c06c820 0x7fbf5c06ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:05.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.278+0000 7fbf6f7fe700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf5c06c820 0x7fbf5c06ecd0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fbf60005fd0 tx=0x7fbf60005dc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:05.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.278+0000 7fbf6d7fa700 1 -- 192.168.123.103:0/1015719721 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(30..30 src has 1..30) v4 ==== 4150+0+0 (secure 0 0 0) 0x7fbf5808dc20 con 0x7fbf700691a0 2026-03-10T14:05:05.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.279+0000 7fbf6d7fa700 1 -- 192.168.123.103:0/1015719721 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbf5805c3c0 con 0x7fbf700691a0 2026-03-10T14:05:05.390 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.391+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fbf70066e40 con 0x7fbf700691a0 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: Detected new or changed devices on vm04 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: osdmap e30: 6 total, 5 up, 6 in 
2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='osd.5 [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:05.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:05 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:05.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.393+0000 7fbf6d7fa700 1 -- 192.168.123.103:0/1015719721 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v31) v1 ==== 74+0+130 (secure 0 0 0) 0x7fbf580270a0 con 0x7fbf700691a0 2026-03-10T14:05:05.392 
INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:05.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf5c06c820 msgr2=0x7fbf5c06ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:05.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf5c06c820 0x7fbf5c06ecd0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fbf60005fd0 tx=0x7fbf60005dc0 comp rx=0 tx=0).stop 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf700691a0 msgr2=0x7fbf70197fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf700691a0 0x7fbf70197fb0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fbf58009fd0 tx=0x7fbf58004ab0 comp rx=0 tx=0).stop 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 shutdown_connections 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf5c06c820 0x7fbf5c06ecd0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 --2- 192.168.123.103:0/1015719721 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbf700691a0 0x7fbf70197fb0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 --2- 192.168.123.103:0/1015719721 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf70105a60 0x7fbf701984f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 >> 192.168.123.103:0/1015719721 conn(0x7fbf700faa70 msgr2=0x7fbf700fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.396+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 shutdown_connections 2026-03-10T14:05:05.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:05.397+0000 7fbf767e8700 1 -- 192.168.123.103:0/1015719721 wait complete. 
2026-03-10T14:05:05.460 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":31,"num_osds":6,"num_up_osds":5,"osd_up_since":1773151497,"num_in_osds":6,"osd_in_since":1773151496,"num_remapped_pgs":0} 2026-03-10T14:05:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: Detected new or changed devices on vm04 2026-03-10T14:05:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:05:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: osdmap e30: 6 total, 5 up, 6 in 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 
ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='osd.5 [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:05 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:05.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:05:05 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[73087]: 2026-03-10T14:05:05.385+0000 7f4266fdc700 -1 osd.5 0 waiting for initial osdmap 2026-03-10T14:05:05.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:05:05 vm04 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[73087]: 2026-03-10T14:05:05.392+0000 7f42635d2700 -1 osd.5 31 set_numa_affinity unable to 
identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:05:06.461 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd stat -f json 2026-03-10T14:05:06.601 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:06.646 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:06 vm03 ceph-mon[49718]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T14:05:06.646 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:06 vm03 ceph-mon[49718]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T14:05:06.646 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:06 vm03 ceph-mon[49718]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T14:05:06.646 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:06 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:06.646 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:06 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:06.646 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:06 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/1015719721' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T14:05:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:06 vm04 ceph-mon[55966]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T14:05:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:06 vm04 ceph-mon[55966]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]': finished 2026-03-10T14:05:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:06 vm04 ceph-mon[55966]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T14:05:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:06 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:06 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:06 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/1015719721' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T14:05:06.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.857+0000 7fc0c7495700 1 -- 192.168.123.103:0/3207615800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 msgr2=0x7fc0c0102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:06.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.857+0000 7fc0c7495700 1 --2- 192.168.123.103:0/3207615800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 0x7fc0c0102b70 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7fc0b4009b00 tx=0x7fc0b4009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:06.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.858+0000 7fc0c7495700 1 -- 192.168.123.103:0/3207615800 shutdown_connections 2026-03-10T14:05:06.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.858+0000 7fc0c7495700 1 --2- 192.168.123.103:0/3207615800 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0c0103960 0x7fc0c0103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:06.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.858+0000 7fc0c7495700 1 --2- 192.168.123.103:0/3207615800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 0x7fc0c0102b70 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:06.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.858+0000 7fc0c7495700 1 -- 192.168.123.103:0/3207615800 >> 192.168.123.103:0/3207615800 conn(0x7fc0c00fdcf0 msgr2=0x7fc0c0100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:06.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.858+0000 7fc0c7495700 1 -- 192.168.123.103:0/3207615800 shutdown_connections 2026-03-10T14:05:06.857 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.858+0000 7fc0c7495700 1 -- 192.168.123.103:0/3207615800 wait complete. 2026-03-10T14:05:06.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c7495700 1 Processor -- start 2026-03-10T14:05:06.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c7495700 1 -- start start 2026-03-10T14:05:06.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c7495700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 0x7fc0c0078b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:06.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c7495700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0c0103960 0x7fc0c0079040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:06.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c7495700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0c0075560 con 0x7fc0c0102760 2026-03-10T14:05:06.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c7495700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0c00756d0 con 0x7fc0c0103960 2026-03-10T14:05:06.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c5231700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 0x7fc0c0078b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:06.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c5231700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 0x7fc0c0078b00 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41818/0 (socket says 192.168.123.103:41818) 2026-03-10T14:05:06.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c5231700 1 -- 192.168.123.103:0/1509250290 learned_addr learned my addr 192.168.123.103:0/1509250290 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:06.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.859+0000 7fc0c5231700 1 -- 192.168.123.103:0/1509250290 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0c0103960 msgr2=0x7fc0c0079040 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:06.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0c4a30700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0c0103960 0x7fc0c0079040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:06.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0c5231700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0c0103960 0x7fc0c0079040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:06.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0c5231700 1 -- 192.168.123.103:0/1509250290 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc0b40097e0 con 0x7fc0c0102760 2026-03-10T14:05:06.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0c5231700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 0x7fc0c0078b00 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto 
rx=0x7fc0b4009fd0 tx=0x7fc0b4004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:06.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0b27fc700 1 -- 192.168.123.103:0/1509250290 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc0b401d070 con 0x7fc0c0102760 2026-03-10T14:05:06.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0c0075950 con 0x7fc0c0102760 2026-03-10T14:05:06.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0c0075e40 con 0x7fc0c0102760 2026-03-10T14:05:06.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0b27fc700 1 -- 192.168.123.103:0/1509250290 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc0b400bc50 con 0x7fc0c0102760 2026-03-10T14:05:06.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.860+0000 7fc0b27fc700 1 -- 192.168.123.103:0/1509250290 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc0b400f870 con 0x7fc0c0102760 2026-03-10T14:05:06.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.861+0000 7fc0b27fc700 1 -- 192.168.123.103:0/1509250290 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc0b400f9d0 con 0x7fc0c0102760 2026-03-10T14:05:06.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.862+0000 7fc0b27fc700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0ac06c750 0x7fc0ac06ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:06.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.862+0000 7fc0b27fc700 1 -- 192.168.123.103:0/1509250290 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(32..32 src has 1..32) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc0b408dcf0 con 0x7fc0c0102760 2026-03-10T14:05:06.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.862+0000 7fc0c4a30700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0ac06c750 0x7fc0ac06ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:06.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.863+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc0a4005320 con 0x7fc0c0102760 2026-03-10T14:05:06.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.863+0000 7fc0c4a30700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0ac06c750 0x7fc0ac06ec00 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fc0bc00ba60 tx=0x7fc0bc005d50 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:06.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.866+0000 7fc0b27fc700 1 -- 192.168.123.103:0/1509250290 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc0b405c2f0 con 0x7fc0c0102760 2026-03-10T14:05:06.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.968+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 
0x7fc0a4005190 con 0x7fc0c0102760 2026-03-10T14:05:06.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.970+0000 7fc0b27fc700 1 -- 192.168.123.103:0/1509250290 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v32) v1 ==== 74+0+130 (secure 0 0 0) 0x7fc0b4027030 con 0x7fc0c0102760 2026-03-10T14:05:06.969 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:06.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0ac06c750 msgr2=0x7fc0ac06ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:06.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0ac06c750 0x7fc0ac06ec00 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fc0bc00ba60 tx=0x7fc0bc005d50 comp rx=0 tx=0).stop 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 msgr2=0x7fc0c0078b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 0x7fc0c0078b00 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7fc0b4009fd0 tx=0x7fc0b4004ab0 comp rx=0 tx=0).stop 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 shutdown_connections 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 --2- 
192.168.123.103:0/1509250290 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0c0102760 0x7fc0c0078b00 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0ac06c750 0x7fc0ac06ec00 secure :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fc0bc00ba60 tx=0x7fc0bc005d50 comp rx=0 tx=0).stop 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 --2- 192.168.123.103:0/1509250290 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0c0103960 0x7fc0c0079040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 >> 192.168.123.103:0/1509250290 conn(0x7fc0c00fdcf0 msgr2=0x7fc0c0106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 shutdown_connections 2026-03-10T14:05:06.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:06.973+0000 7fc0c7495700 1 -- 192.168.123.103:0/1509250290 wait complete. 
2026-03-10T14:05:07.013 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":32,"num_osds":6,"num_up_osds":6,"osd_up_since":1773151506,"num_in_osds":6,"osd_in_since":1773151496,"num_remapped_pgs":0} 2026-03-10T14:05:07.013 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd dump --format=json 2026-03-10T14:05:07.152 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.401+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3901796009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 msgr2=0x7fa3d4102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.401+0000 7fa3d8d92700 1 --2- 192.168.123.103:0/3901796009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 0x7fa3d4102bf0 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7fa3c4009b50 tx=0x7fa3c4009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.401+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3901796009 shutdown_connections 2026-03-10T14:05:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.401+0000 7fa3d8d92700 1 --2- 192.168.123.103:0/3901796009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 0x7fa3d4102bf0 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.401+0000 7fa3d8d92700 1 --2- 192.168.123.103:0/3901796009 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3d4108780 0x7fa3d4108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:05:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.401+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3901796009 >> 192.168.123.103:0/3901796009 conn(0x7fa3d40fe280 msgr2=0x7fa3d4100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.402+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3901796009 shutdown_connections 2026-03-10T14:05:07.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.402+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3901796009 wait complete. 2026-03-10T14:05:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.402+0000 7fa3d8d92700 1 Processor -- start 2026-03-10T14:05:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.402+0000 7fa3d8d92700 1 -- start start 2026-03-10T14:05:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.402+0000 7fa3d8d92700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 0x7fa3d4198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.402+0000 7fa3d8d92700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3d4108780 0x7fa3d41988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:07.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.402+0000 7fa3d8d92700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3d4198fb0 con 0x7fa3d4102780 2026-03-10T14:05:07.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.402+0000 7fa3d8d92700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3d419ccf0 con 0x7fa3d4108780 2026-03-10T14:05:07.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.403+0000 7fa3d259c700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 0x7fa3d4198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:07.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.403+0000 7fa3d259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 0x7fa3d4198390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41832/0 (socket says 192.168.123.103:41832) 2026-03-10T14:05:07.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.403+0000 7fa3d259c700 1 -- 192.168.123.103:0/3584684489 learned_addr learned my addr 192.168.123.103:0/3584684489 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:07.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.403+0000 7fa3d259c700 1 -- 192.168.123.103:0/3584684489 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3d4108780 msgr2=0x7fa3d41988d0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:07.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.403+0000 7fa3d259c700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3d4108780 0x7fa3d41988d0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.403+0000 7fa3d259c700 1 -- 192.168.123.103:0/3584684489 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa3c40097e0 con 0x7fa3d4102780 2026-03-10T14:05:07.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.403+0000 7fa3d259c700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 
0x7fa3d4198390 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7fa3bc009fd0 tx=0x7fa3bc00eea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:07.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.403+0000 7fa3cb7fe700 1 -- 192.168.123.103:0/3584684489 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3bc00cca0 con 0x7fa3d4102780 2026-03-10T14:05:07.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.404+0000 7fa3cb7fe700 1 -- 192.168.123.103:0/3584684489 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa3bc004500 con 0x7fa3d4102780 2026-03-10T14:05:07.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.404+0000 7fa3cb7fe700 1 -- 192.168.123.103:0/3584684489 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3bc010470 con 0x7fa3d4102780 2026-03-10T14:05:07.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.404+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa3d419cf70 con 0x7fa3d4102780 2026-03-10T14:05:07.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.404+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa3d419d4c0 con 0x7fa3d4102780 2026-03-10T14:05:07.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.405+0000 7fa3cb7fe700 1 -- 192.168.123.103:0/3584684489 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa3bc010630 con 0x7fa3d4102780 2026-03-10T14:05:07.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.406+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa3d404ea50 con 0x7fa3d4102780 2026-03-10T14:05:07.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.406+0000 7fa3cb7fe700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3c0074fb0 0x7fa3c0077460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:07.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.406+0000 7fa3cb7fe700 1 -- 192.168.123.103:0/3584684489 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa3bc014070 con 0x7fa3d4102780 2026-03-10T14:05:07.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.407+0000 7fa3d1d9b700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3c0074fb0 0x7fa3c0077460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:07.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.407+0000 7fa3d1d9b700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3c0074fb0 0x7fa3c0077460 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fa3c4005e50 tx=0x7fa3c4005dc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:07.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.409+0000 7fa3cb7fe700 1 -- 192.168.123.103:0/3584684489 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa3bc057610 con 0x7fa3d4102780 2026-03-10T14:05:07.479 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:07 vm03 ceph-mon[49718]: purged_snaps scrub starts 2026-03-10T14:05:07.479 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:05:07 vm03 ceph-mon[49718]: purged_snaps scrub ok 2026-03-10T14:05:07.479 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:07 vm03 ceph-mon[49718]: osd.5 [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] boot 2026-03-10T14:05:07.479 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:07 vm03 ceph-mon[49718]: osdmap e32: 6 total, 6 up, 6 in 2026-03-10T14:05:07.479 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:07 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:07.479 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:07 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/1509250290' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T14:05:07.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.516+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7fa3d419d7d0 con 0x7fa3d4102780 2026-03-10T14:05:07.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.517+0000 7fa3cb7fe700 1 -- 192.168.123.103:0/3584684489 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11272 (secure 0 0 0) 0x7fa3bc05ac30 con 0x7fa3d4102780 2026-03-10T14:05:07.516 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:07.516 
INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":33,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","created":"2026-03-10T14:02:37.819782+0000","modified":"2026-03-10T14:05:07.388081+0000","last_up_change":"2026-03-10T14:05:06.380447+0000","last_in_change":"2026-03-10T14:04:56.344082+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T14:04:37.304178+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"32845076-88bb-4ae3-87fc-7369eed22d26","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6803","nonce":121236340}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6805","nonce":121236340}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6809","nonce":121236340}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6807","nonce":121236340}]},"public_addr":"192.168.123.103:6803/121236340","cluster_addr":"192.168.123.103:6805/121236340","heartbeat_back_addr":"192.168.123.103:6809/121236340","heartbeat_front_addr":"192.168.123.103:6807/121236340","state":["exists","up"]},{"osd":1,"uuid":"0af1d772-f786-4e93-8006-866ee43f51d1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6811","nonce":3922533653}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812
","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6813","nonce":3922533653}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6817","nonce":3922533653}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6815","nonce":3922533653}]},"public_addr":"192.168.123.103:6811/3922533653","cluster_addr":"192.168.123.103:6813/3922533653","heartbeat_back_addr":"192.168.123.103:6817/3922533653","heartbeat_front_addr":"192.168.123.103:6815/3922533653","state":["exists","up"]},{"osd":2,"uuid":"03fb18a9-019c-440f-9b09-702297165b29","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6819","nonce":1891998236}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6821","nonce":1891998236}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6824","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6825","nonce":1891998236}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6823","nonce":1891998236}]},"public_addr":"192.168.123.103:6819/1891998236","cluster_addr":"192.168.123.103:6821/1891998236","heartbeat_back_addr":"192.168.123.103:6825/1891998236","heartbeat_front_addr":"192.168.123.103:6823/1891998236","state":["exists","up"]},{"osd":3,"uuid":"9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":3770653735},{"type"
:"v1","addr":"192.168.123.104:6801","nonce":3770653735}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6803","nonce":3770653735}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6807","nonce":3770653735}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6805","nonce":3770653735}]},"public_addr":"192.168.123.104:6801/3770653735","cluster_addr":"192.168.123.104:6803/3770653735","heartbeat_back_addr":"192.168.123.104:6807/3770653735","heartbeat_front_addr":"192.168.123.104:6805/3770653735","state":["exists","up"]},{"osd":4,"uuid":"9d0ae930-1cc2-4815-bd24-e9129038b319","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6809","nonce":1824767996}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6811","nonce":1824767996}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6815","nonce":1824767996}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6813","nonce":1824767996}]},"public_addr":"192.168.123.104:6809/1824767996","cluster_addr":"192.168.123.104:6811/1824767996","heartbeat_back_addr":"192.168.123.104:6815/1824767996","heartbeat_front_addr":"192.168.123.104:6813/1824767996","state":["exists","up"]},{"osd":5,"uuid":"5c4fe084-183b-43ce-9ebe-daeadbb4f59a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6817","nonce":3736804423}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6819","nonce":3736804423}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6823","nonce":3736804423}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6821","nonce":3736804423}]},"public_addr":"192.168.123.104:6817/3736804423","cluster_addr":"192.168.123.104:6819/3736804423","heartbeat_back_addr":"192.168.123.104:6823/3736804423","heartbeat_front_addr":"192.168.123.104:6821/3736804423","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:16.138307+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:26.330857+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:36.153297+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:46.727100+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:56.580723+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015
,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:05:05.150006+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:0/330658303":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/1628969362":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/1542797438":"2026-03-11T14:03:07.554401+0000","192.168.123.103:0/621911225":"2026-03-11T14:03:07.554401+0000","192.168.123.103:6801/2":"2026-03-11T14:02:53.370766+0000","192.168.123.103:6800/2":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/1489630015":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/4124605323":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/1771049444":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/3848481828":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/3383031364":"2026-03-11T14:03:07.554401+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T14:05:07.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3c0074fb0 msgr2=0x7fa3c0077460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3c0074fb0 0x7fa3c0077460 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fa3c4005e50 tx=0x7fa3c4005dc0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.519 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 msgr2=0x7fa3d4198390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 0x7fa3d4198390 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7fa3bc009fd0 tx=0x7fa3bc00eea0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 shutdown_connections 2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3d4102780 0x7fa3d4198390 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3c0074fb0 0x7fa3c0077460 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 --2- 192.168.123.103:0/3584684489 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3d4108780 0x7fa3d41988d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 >> 192.168.123.103:0/3584684489 conn(0x7fa3d40fe280 msgr2=0x7fa3d40ffbd0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 shutdown_connections 2026-03-10T14:05:07.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.520+0000 7fa3d8d92700 1 -- 192.168.123.103:0/3584684489 wait complete. 2026-03-10T14:05:07.582 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-10T14:04:37.304178+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '20', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 'hit_set_params': {'type': 'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': 
{'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-10T14:05:07.582 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd pool get .mgr pg_num 2026-03-10T14:05:07.733 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:07.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:07 vm04 ceph-mon[55966]: purged_snaps scrub starts 2026-03-10T14:05:07.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:07 vm04 ceph-mon[55966]: purged_snaps scrub ok 2026-03-10T14:05:07.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:07 vm04 ceph-mon[55966]: osd.5 [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] boot 2026-03-10T14:05:07.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:07 vm04 ceph-mon[55966]: osdmap e32: 6 total, 6 up, 6 in 2026-03-10T14:05:07.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:07 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:05:07.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:07 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/1509250290' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T14:05:07.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.969+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/3305784234 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 msgr2=0x7fe7b4101770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:07.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.969+0000 7fe7bc9c2700 1 --2- 192.168.123.103:0/3305784234 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 0x7fe7b4101770 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7fe7ac009b00 tx=0x7fe7ac009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:07.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.969+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/3305784234 shutdown_connections 2026-03-10T14:05:07.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.969+0000 7fe7bc9c2700 1 --2- 192.168.123.103:0/3305784234 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7b4068490 0x7fe7b4068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.969+0000 7fe7bc9c2700 1 --2- 192.168.123.103:0/3305784234 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 0x7fe7b4101770 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.969+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/3305784234 >> 192.168.123.103:0/3305784234 conn(0x7fe7b40754a0 msgr2=0x7fe7b40758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:07.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.970+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/3305784234 shutdown_connections 2026-03-10T14:05:07.968 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.970+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/3305784234 wait complete. 2026-03-10T14:05:07.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.970+0000 7fe7bc9c2700 1 Processor -- start 2026-03-10T14:05:07.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.970+0000 7fe7bc9c2700 1 -- start start 2026-03-10T14:05:07.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.970+0000 7fe7bc9c2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7b4068490 0x7fe7b4198330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:07.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7bc9c2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 0x7fe7b4198870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:07.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7bc9c2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7b4198ec0 con 0x7fe7b41013a0 2026-03-10T14:05:07.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7bc9c2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7b4199000 con 0x7fe7b4068490 2026-03-10T14:05:07.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7b9f5d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 0x7fe7b4198870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:07.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7b9f5d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 0x7fe7b4198870 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41840/0 (socket says 192.168.123.103:41840) 2026-03-10T14:05:07.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7b9f5d700 1 -- 192.168.123.103:0/4234870288 learned_addr learned my addr 192.168.123.103:0/4234870288 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:07.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7b9f5d700 1 -- 192.168.123.103:0/4234870288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7b4068490 msgr2=0x7fe7b4198330 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:05:07.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7b9f5d700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7b4068490 0x7fe7b4198330 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:07.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7b9f5d700 1 -- 192.168.123.103:0/4234870288 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe7ac0097e0 con 0x7fe7b41013a0 2026-03-10T14:05:07.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7b9f5d700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 0x7fe7b4198870 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7fe7a800cc60 tx=0x7fe7a80074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:07.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7a77fe700 1 -- 192.168.123.103:0/4234870288 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe7a8007af0 con 
0x7fe7b41013a0 2026-03-10T14:05:07.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7a77fe700 1 -- 192.168.123.103:0/4234870288 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe7a8007c50 con 0x7fe7b41013a0 2026-03-10T14:05:07.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7a77fe700 1 -- 192.168.123.103:0/4234870288 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe7a80187b0 con 0x7fe7b41013a0 2026-03-10T14:05:07.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe7b419ce50 con 0x7fe7b41013a0 2026-03-10T14:05:07.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.971+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe7b419d3a0 con 0x7fe7b41013a0 2026-03-10T14:05:07.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.972+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe7b404ea50 con 0x7fe7b41013a0 2026-03-10T14:05:07.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.976+0000 7fe7a77fe700 1 -- 192.168.123.103:0/4234870288 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe7a801f030 con 0x7fe7b41013a0 2026-03-10T14:05:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.976+0000 7fe7a77fe700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe7a006c750 0x7fe7a006ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:07.975 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.976+0000 7fe7a77fe700 1 -- 192.168.123.103:0/4234870288 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fe7a808b830 con 0x7fe7b41013a0 2026-03-10T14:05:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.976+0000 7fe7ba75e700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe7a006c750 0x7fe7a006ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.976+0000 7fe7a77fe700 1 -- 192.168.123.103:0/4234870288 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe7a80b77c0 con 0x7fe7b41013a0 2026-03-10T14:05:07.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:07.977+0000 7fe7ba75e700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe7a006c750 0x7fe7a006ec00 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fe7ac005f50 tx=0x7fe7ac005dc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:08.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.073+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7fe7b419cfe0 con 0x7fe7b41013a0 2026-03-10T14:05:08.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.074+0000 7fe7a77fe700 1 -- 192.168.123.103:0/4234870288 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v33) v1 ==== 93+0+10 (secure 0 0 0) 0x7fe7b419cfe0 con 
0x7fe7b41013a0 2026-03-10T14:05:08.074 INFO:teuthology.orchestra.run.vm03.stdout:pg_num: 1 2026-03-10T14:05:08.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.077+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe7a006c750 msgr2=0x7fe7a006ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:08.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.077+0000 7fe7bc9c2700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe7a006c750 0x7fe7a006ec00 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fe7ac005f50 tx=0x7fe7ac005dc0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.077+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 msgr2=0x7fe7b4198870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:08.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.077+0000 7fe7bc9c2700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 0x7fe7b4198870 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7fe7a800cc60 tx=0x7fe7a80074a0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.078+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 shutdown_connections 2026-03-10T14:05:08.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.078+0000 7fe7bc9c2700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe7a006c750 0x7fe7a006ec00 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.078+0000 7fe7bc9c2700 1 --2- 192.168.123.103:0/4234870288 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7b4068490 0x7fe7b4198330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.078+0000 7fe7bc9c2700 1 --2- 192.168.123.103:0/4234870288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7b41013a0 0x7fe7b4198870 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.078+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 >> 192.168.123.103:0/4234870288 conn(0x7fe7b40754a0 msgr2=0x7fe7b40fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:08.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.078+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 shutdown_connections 2026-03-10T14:05:08.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.078+0000 7fe7bc9c2700 1 -- 192.168.123.103:0/4234870288 wait complete. 2026-03-10T14:05:08.138 INFO:tasks.cephadm:Setting up client nodes... 
2026-03-10T14:05:08.138 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T14:05:08.278 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:08.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.532+0000 7f2497a21700 1 -- 192.168.123.103:0/1801812103 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490102760 msgr2=0x7f2490102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:08.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.532+0000 7f2497a21700 1 --2- 192.168.123.103:0/1801812103 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490102760 0x7f2490102b70 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f2484009b50 tx=0x7f2484009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:08.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.532+0000 7f2497a21700 1 -- 192.168.123.103:0/1801812103 shutdown_connections 2026-03-10T14:05:08.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.532+0000 7f2497a21700 1 --2- 192.168.123.103:0/1801812103 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2490103960 0x7f2490103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.532+0000 7f2497a21700 1 --2- 192.168.123.103:0/1801812103 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490102760 0x7f2490102b70 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.531 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.532+0000 7f2497a21700 1 -- 192.168.123.103:0/1801812103 >> 192.168.123.103:0/1801812103 conn(0x7f24900fdcf0 msgr2=0x7f2490100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:08.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.533+0000 7f2497a21700 1 -- 192.168.123.103:0/1801812103 shutdown_connections 2026-03-10T14:05:08.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.533+0000 7f2497a21700 1 -- 192.168.123.103:0/1801812103 wait complete. 2026-03-10T14:05:08.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.533+0000 7f2497a21700 1 Processor -- start 2026-03-10T14:05:08.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2497a21700 1 -- start start 2026-03-10T14:05:08.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2497a21700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2490103960 0x7f2490198340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:08.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2497a21700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490198880 0x7f249019d8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:08.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2497a21700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2490198d80 con 0x7f2490198880 2026-03-10T14:05:08.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2497a21700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2490198ef0 con 0x7f2490103960 2026-03-10T14:05:08.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2494fbc700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490198880 0x7f249019d8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:08.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2494fbc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490198880 0x7f249019d8f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41846/0 (socket says 192.168.123.103:41846) 2026-03-10T14:05:08.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2494fbc700 1 -- 192.168.123.103:0/93954319 learned_addr learned my addr 192.168.123.103:0/93954319 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:08.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.534+0000 7f2494fbc700 1 -- 192.168.123.103:0/93954319 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2490103960 msgr2=0x7f2490198340 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:05:08.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f24957bd700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2490103960 0x7f2490198340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:08.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f2494fbc700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2490103960 0x7f2490198340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f2494fbc700 1 -- 192.168.123.103:0/93954319 
--> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24840097e0 con 0x7f2490198880 2026-03-10T14:05:08.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f2494fbc700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490198880 0x7f249019d8f0 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7f248c00eb10 tx=0x7f248c00eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:08.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f24827fc700 1 -- 192.168.123.103:0/93954319 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f248c00cca0 con 0x7f2490198880 2026-03-10T14:05:08.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f24827fc700 1 -- 192.168.123.103:0/93954319 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f248c00ce00 con 0x7f2490198880 2026-03-10T14:05:08.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f24827fc700 1 -- 192.168.123.103:0/93954319 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f248c018910 con 0x7f2490198880 2026-03-10T14:05:08.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f249019de90 con 0x7f2490198880 2026-03-10T14:05:08.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.535+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f249019e3b0 con 0x7f2490198880 2026-03-10T14:05:08.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.537+0000 7f2497a21700 1 -- 
192.168.123.103:0/93954319 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2490066e40 con 0x7f2490198880 2026-03-10T14:05:08.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.537+0000 7f24827fc700 1 -- 192.168.123.103:0/93954319 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f248c018a70 con 0x7f2490198880 2026-03-10T14:05:08.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.538+0000 7f24827fc700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f247c06c750 0x7f247c06ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:08.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.538+0000 7f24827fc700 1 -- 192.168.123.103:0/93954319 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f248c014070 con 0x7f2490198880 2026-03-10T14:05:08.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.538+0000 7f24957bd700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f247c06c750 0x7f247c06ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:08.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.538+0000 7f24957bd700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f247c06c750 0x7f247c06ec00 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f24840053b0 tx=0x7f2484005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:08.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.540+0000 7f24827fc700 1 -- 192.168.123.103:0/93954319 <== mon.0 v2:192.168.123.103:3300/0 
6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f248c05ab70 con 0x7f2490198880 2026-03-10T14:05:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:08 vm03 ceph-mon[49718]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:08 vm03 ceph-mon[49718]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T14:05:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:08 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3584684489' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T14:05:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:08 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/4234870288' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T14:05:08.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.692+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f249019e700 con 0x7f2490198880 2026-03-10T14:05:08.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.699+0000 7f24827fc700 1 -- 192.168.123.103:0/93954319 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7f248c05a700 con 0x7f2490198880 2026-03-10T14:05:08.698 INFO:teuthology.orchestra.run.vm03.stdout:[client.0] 2026-03-10T14:05:08.698 INFO:teuthology.orchestra.run.vm03.stdout: key = AQAUJbBplQpeKRAAadC2BFnosjXKi9aRRalu5A== 2026-03-10T14:05:08.700 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f247c06c750 msgr2=0x7f247c06ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:08.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f247c06c750 0x7f247c06ec00 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f24840053b0 tx=0x7f2484005fb0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490198880 msgr2=0x7f249019d8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:08.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490198880 0x7f249019d8f0 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7f248c00eb10 tx=0x7f248c00eed0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 shutdown_connections 2026-03-10T14:05:08.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f247c06c750 0x7f247c06ec00 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2490103960 0x7f2490198340 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 --2- 192.168.123.103:0/93954319 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2490198880 0x7f249019d8f0 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:08.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.701+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 >> 192.168.123.103:0/93954319 conn(0x7f24900fdcf0 msgr2=0x7f2490106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:08.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.702+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 shutdown_connections 2026-03-10T14:05:08.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:08.702+0000 7f2497a21700 1 -- 192.168.123.103:0/93954319 wait complete. 2026-03-10T14:05:08.743 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:05:08.744 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-10T14:05:08.744 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-10T14:05:08.777 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T14:05:08.801 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:08 vm04 ceph-mon[55966]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:08.801 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:08 vm04 ceph-mon[55966]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T14:05:08.801 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:08 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/3584684489' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T14:05:08.801 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:08 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/4234870288' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T14:05:08.922 INFO:teuthology.orchestra.run.vm04.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm04/config 2026-03-10T14:05:09.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.166+0000 7f880ca44700 1 -- 192.168.123.104:0/3661112748 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808101710 msgr2=0x7f8808101b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:09.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.166+0000 7f880ca44700 1 --2- 192.168.123.104:0/3661112748 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808101710 0x7f8808101b60 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f87f8009b00 tx=0x7f87f8009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:09.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.167+0000 7f880ca44700 1 -- 192.168.123.104:0/3661112748 shutdown_connections 2026-03-10T14:05:09.165 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.167+0000 7f880ca44700 1 --2- 192.168.123.104:0/3661112748 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808101710 0x7f8808101b60 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.167+0000 7f880ca44700 1 --2- 192.168.123.104:0/3661112748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8808100510 0x7f8808100920 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.166 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.167+0000 7f880ca44700 1 -- 192.168.123.104:0/3661112748 >> 192.168.123.104:0/3661112748 conn(0x7f88080fba80 msgr2=0x7f88080fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:09.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.167+0000 7f880ca44700 1 -- 192.168.123.104:0/3661112748 shutdown_connections 2026-03-10T14:05:09.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880ca44700 1 -- 192.168.123.104:0/3661112748 wait complete. 2026-03-10T14:05:09.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880ca44700 1 Processor -- start 2026-03-10T14:05:09.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880ca44700 1 -- start start 2026-03-10T14:05:09.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880ca44700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808100510 0x7f8808074af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:09.166 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880ca44700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8808101710 0x7f8808073140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:09.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880ca44700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8808199a20 con 0x7f8808101710 2026-03-10T14:05:09.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880ca44700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8808199b60 con 0x7f8808100510 2026-03-10T14:05:09.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880659c700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808100510 0x7f8808074af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:09.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808100510 0x7f8808074af0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.104:51414/0 (socket says 192.168.123.104:51414) 2026-03-10T14:05:09.167 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880659c700 1 -- 192.168.123.104:0/548996123 learned_addr learned my addr 192.168.123.104:0/548996123 (peer_addr_for_me v2:192.168.123.104:0/0) 2026-03-10T14:05:09.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f8805d9b700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8808101710 0x7f8808073140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:09.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880659c700 1 -- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8808101710 msgr2=0x7f8808073140 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:09.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880659c700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8808101710 0x7f8808073140 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.168+0000 7f880659c700 1 -- 
192.168.123.104:0/548996123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f87f80097e0 con 0x7f8808100510 2026-03-10T14:05:09.169 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.169+0000 7f880659c700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808100510 0x7f8808074af0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f87f000da40 tx=0x7f87f000de00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:09.169 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.169+0000 7f8805d9b700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8808101710 0x7f8808073140 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:05:09.169 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.169+0000 7f87ff7fe700 1 -- 192.168.123.104:0/548996123 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87f00041d0 con 0x7f8808100510 2026-03-10T14:05:09.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.169+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8808073680 con 0x7f8808100510 2026-03-10T14:05:09.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.169+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8808073ba0 con 0x7f8808100510 2026-03-10T14:05:09.171 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.169+0000 7f87ff7fe700 1 -- 192.168.123.104:0/548996123 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f87f0009c70 con 0x7f8808100510 
2026-03-10T14:05:09.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.169+0000 7f87ff7fe700 1 -- 192.168.123.104:0/548996123 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87f0003e40 con 0x7f8808100510 2026-03-10T14:05:09.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.170+0000 7f87ff7fe700 1 -- 192.168.123.104:0/548996123 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f87f0010460 con 0x7f8808100510 2026-03-10T14:05:09.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.170+0000 7f87ff7fe700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87f406c7a0 0x7f87f406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:09.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.170+0000 7f87ff7fe700 1 -- 192.168.123.104:0/548996123 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f87f0021030 con 0x7f8808100510 2026-03-10T14:05:09.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.171+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87e8005320 con 0x7f8808100510 2026-03-10T14:05:09.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.171+0000 7f8805d9b700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87f406c7a0 0x7f87f406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:09.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.173+0000 7f8805d9b700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7f87f406c7a0 0x7f87f406ec50 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f87f8009fd0 tx=0x7f87f8005c00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:09.172 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.173+0000 7f87ff7fe700 1 -- 192.168.123.104:0/548996123 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f87f0059b10 con 0x7f8808100510 2026-03-10T14:05:09.313 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.313+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f87e8005190 con 0x7f8808100510 2026-03-10T14:05:09.319 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.321+0000 7f87ff7fe700 1 -- 192.168.123.104:0/548996123 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7f87f0059930 con 0x7f8808100510 2026-03-10T14:05:09.319 INFO:teuthology.orchestra.run.vm04.stdout:[client.1] 2026-03-10T14:05:09.319 INFO:teuthology.orchestra.run.vm04.stdout: key = AQAVJbBpCCbdEhAA72opa/aikzXIlN48qMlY7A== 2026-03-10T14:05:09.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87f406c7a0 msgr2=0x7f87f406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:09.321 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 --2- 192.168.123.104:0/548996123 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87f406c7a0 0x7f87f406ec50 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f87f8009fd0 tx=0x7f87f8005c00 comp rx=0 tx=0).stop 2026-03-10T14:05:09.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808100510 msgr2=0x7f8808074af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:09.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808100510 0x7f8808074af0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f87f000da40 tx=0x7f87f000de00 comp rx=0 tx=0).stop 2026-03-10T14:05:09.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 shutdown_connections 2026-03-10T14:05:09.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87f406c7a0 0x7f87f406ec50 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8808100510 0x7f8808074af0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 --2- 192.168.123.104:0/548996123 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8808101710 0x7f8808073140 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.322 
INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 >> 192.168.123.104:0/548996123 conn(0x7f88080fba80 msgr2=0x7f8808104940 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:09.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 shutdown_connections 2026-03-10T14:05:09.322 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:09.323+0000 7f880ca44700 1 -- 192.168.123.104:0/548996123 wait complete. 2026-03-10T14:05:09.360 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:05:09.360 DEBUG:teuthology.orchestra.run.vm04:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-10T14:05:09.360 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-10T14:05:09.435 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-10T14:05:09.435 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-10T14:05:09.435 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mgr dump --format=json 2026-03-10T14:05:09.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:09 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/93954319' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T14:05:09.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:09 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/93954319' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T14:05:09.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:09 vm04 ceph-mon[55966]: from='client.? 
192.168.123.104:0/548996123' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T14:05:09.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:09 vm04 ceph-mon[55966]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T14:05:09.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:09 vm04 ceph-mon[55966]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T14:05:09.586 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:09.651 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:09 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/93954319' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T14:05:09.651 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:09 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/93954319' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T14:05:09.651 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:09 vm03 ceph-mon[49718]: from='client.? 
192.168.123.104:0/548996123' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T14:05:09.651 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:09 vm03 ceph-mon[49718]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T14:05:09.651 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:09 vm03 ceph-mon[49718]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T14:05:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.837+0000 7fb9a2623700 1 -- 192.168.123.103:0/2177199223 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c104520 msgr2=0x7fb99c1048f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.837+0000 7fb9a2623700 1 --2- 192.168.123.103:0/2177199223 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c104520 0x7fb99c1048f0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fb984009b00 tx=0x7fb984009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.838+0000 7fb9a2623700 1 -- 192.168.123.103:0/2177199223 shutdown_connections 2026-03-10T14:05:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.838+0000 7fb9a2623700 1 --2- 192.168.123.103:0/2177199223 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb99c0fff00 0x7fb99c100370 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.837 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.838+0000 7fb9a2623700 1 --2- 192.168.123.103:0/2177199223 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c104520 0x7fb99c1048f0 secure :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fb984009b00 tx=0x7fb984009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.838+0000 7fb9a2623700 1 -- 192.168.123.103:0/2177199223 >> 192.168.123.103:0/2177199223 conn(0x7fb99c0754a0 msgr2=0x7fb99c0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.838+0000 7fb9a2623700 1 -- 192.168.123.103:0/2177199223 shutdown_connections 2026-03-10T14:05:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.839+0000 7fb9a2623700 1 -- 192.168.123.103:0/2177199223 wait complete. 2026-03-10T14:05:09.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.839+0000 7fb9a2623700 1 Processor -- start 2026-03-10T14:05:09.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.839+0000 7fb9a2623700 1 -- start start 2026-03-10T14:05:09.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb9a2623700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb99c0fff00 0x7fb99c198580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb9a2623700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c198ac0 0x7fb99c19cf30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb9a2623700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb99c1990d0 con 0x7fb99c198ac0 2026-03-10T14:05:09.839 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb9a2623700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb99c199240 con 0x7fb99c0fff00 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb99b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c198ac0 0x7fb99c19cf30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb99b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c198ac0 0x7fb99c19cf30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41856/0 (socket says 192.168.123.103:41856) 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb99b7fe700 1 -- 192.168.123.103:0/1716304344 learned_addr learned my addr 192.168.123.103:0/1716304344 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb99b7fe700 1 -- 192.168.123.103:0/1716304344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb99c0fff00 msgr2=0x7fb99c198580 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb99b7fe700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb99c0fff00 0x7fb99c198580 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb99b7fe700 1 -- 192.168.123.103:0/1716304344 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9840097e0 con 0x7fb99c198ac0 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb99b7fe700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c198ac0 0x7fb99c19cf30 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7fb98c00dc70 tx=0x7fb98c009520 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:09.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.840+0000 7fb9997fa700 1 -- 192.168.123.103:0/1716304344 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb98c009b30 con 0x7fb99c198ac0 2026-03-10T14:05:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.841+0000 7fb9997fa700 1 -- 192.168.123.103:0/1716304344 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb98c009c90 con 0x7fb99c198ac0 2026-03-10T14:05:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.841+0000 7fb9997fa700 1 -- 192.168.123.103:0/1716304344 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb98c00f6f0 con 0x7fb99c198ac0 2026-03-10T14:05:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.841+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb99c19d530 con 0x7fb99c198ac0 2026-03-10T14:05:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.841+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb99c19da80 con 0x7fb99c198ac0 2026-03-10T14:05:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.842+0000 7fb9a2623700 1 -- 
192.168.123.103:0/1716304344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb99c103990 con 0x7fb99c198ac0 2026-03-10T14:05:09.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.843+0000 7fb9997fa700 1 -- 192.168.123.103:0/1716304344 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb98c010460 con 0x7fb99c198ac0 2026-03-10T14:05:09.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.843+0000 7fb9997fa700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb98806c680 0x7fb98806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:09.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.844+0000 7fb9997fa700 1 -- 192.168.123.103:0/1716304344 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb98c08b0e0 con 0x7fb99c198ac0 2026-03-10T14:05:09.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.844+0000 7fb99bfff700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb98806c680 0x7fb98806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:09.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.844+0000 7fb99bfff700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb98806c680 0x7fb98806eb30 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fb98400b5c0 tx=0x7fb984005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:09.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.846+0000 7fb9997fa700 1 -- 192.168.123.103:0/1716304344 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb98c055ba0 con 0x7fb99c198ac0 2026-03-10T14:05:09.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.990+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7fb99c066e40 con 0x7fb99c198ac0 2026-03-10T14:05:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.993+0000 7fb9997fa700 1 -- 192.168.123.103:0/1716304344 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v19) v1 ==== 74+0+172845 (secure 0 0 0) 0x7fb98c0591c0 con 0x7fb99c198ac0 2026-03-10T14:05:09.992 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:09.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.999+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb98806c680 msgr2=0x7fb98806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:09.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.999+0000 7fb9a2623700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb98806c680 0x7fb98806eb30 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fb98400b5c0 tx=0x7fb984005fb0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.999+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c198ac0 msgr2=0x7fb99c19cf30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:09.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.999+0000 7fb9a2623700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fb99c198ac0 0x7fb99c19cf30 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7fb98c00dc70 tx=0x7fb98c009520 comp rx=0 tx=0).stop 2026-03-10T14:05:09.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.999+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 shutdown_connections 2026-03-10T14:05:09.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.999+0000 7fb9a2623700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb98806c680 0x7fb98806eb30 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:09.999+0000 7fb9a2623700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb99c0fff00 0x7fb99c198580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.000+0000 7fb9a2623700 1 --2- 192.168.123.103:0/1716304344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb99c198ac0 0x7fb99c19cf30 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:09.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.000+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 >> 192.168.123.103:0/1716304344 conn(0x7fb99c0754a0 msgr2=0x7fb99c109970 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:09.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.000+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 shutdown_connections 2026-03-10T14:05:09.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.000+0000 7fb9a2623700 1 -- 192.168.123.103:0/1716304344 wait complete. 
2026-03-10T14:05:10.067 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":19,"active_gid":14223,"active_name":"vm03.rwbbep","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":2},{"type":"v1","addr":"192.168.123.103:6801","nonce":2}]},"active_addr":"192.168.123.103:6801/2","active_change":"2026-03-10T14:03:43.076424+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14248,"name":"vm04.ywwcto","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts 
to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format 
HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_password":{"name":"alertmanager_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web password","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_user":{"name":"alertmanager_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web user","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on 
ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph:v18","min":"","max":"","enum_allowed":[],"desc":"Container image 
name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) 
inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage 
/etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"prometheus_web_password":{"name":"prometheus_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web password","long_desc":"","tags":[],"see_also":[]},"prometheus_web_user":{"name":"prometheus_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web user","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no 
TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"noautoscale":{"name":"noautoscale","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"global autoscale flag","long_desc":"Option to turn on/off the autoscaler for all pools","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","m
in":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"serv
er_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP 
server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_password":{"name":"alertmanager_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web password","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_user":{"name":"alertmanager_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web user","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on 
ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph:v18","min":"","max":"","enum_allowed":[],"desc":"Container image 
name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) 
inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage 
/etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"prometheus_web_password":{"name":"prometheus_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web password","long_desc":"","tags":[],"see_also":[]},"prometheus_web_user":{"name":"prometheus_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web user","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no 
TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"noautoscale":{"name":"noautoscale","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"global autoscale flag","long_desc":"Option to turn on/off the autoscaler for all pools","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","m
in":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"serv
er_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.103:8443/","prometheus":"http://192.168.123.103:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":1586580482}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":2268959518}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":3410780149}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":1546295591}]}]} 2026-03-10T14:05:10.068 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 
2026-03-10T14:05:10.068 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-10T14:05:10.068 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd dump --format=json 2026-03-10T14:05:10.222 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.456+0000 7f3537d26700 1 -- 192.168.123.103:0/1018435577 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530108780 msgr2=0x7f3530108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.456+0000 7f3537d26700 1 --2- 192.168.123.103:0/1018435577 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530108780 0x7f3530108b50 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f3520009b50 tx=0x7f3520009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.457+0000 7f3537d26700 1 -- 192.168.123.103:0/1018435577 shutdown_connections 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.457+0000 7f3537d26700 1 --2- 192.168.123.103:0/1018435577 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3530102780 0x7f3530102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.457+0000 7f3537d26700 1 --2- 192.168.123.103:0/1018435577 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530108780 0x7f3530108b50 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.457+0000 7f3537d26700 1 -- 
192.168.123.103:0/1018435577 >> 192.168.123.103:0/1018435577 conn(0x7f35300fe280 msgr2=0x7f3530100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.457+0000 7f3537d26700 1 -- 192.168.123.103:0/1018435577 shutdown_connections 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3537d26700 1 -- 192.168.123.103:0/1018435577 wait complete. 2026-03-10T14:05:10.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3537d26700 1 Processor -- start 2026-03-10T14:05:10.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3537d26700 1 -- start start 2026-03-10T14:05:10.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3537d26700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530102780 0x7f3530075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:10.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3535ac2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530102780 0x7f3530075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:10.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3535ac2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530102780 0x7f3530075260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41876/0 (socket says 192.168.123.103:41876) 2026-03-10T14:05:10.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3537d26700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3530108780 0x7f35300757a0 unknown :-1 s=NONE pgs=0 cs=0 
l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:10.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3537d26700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f35300793a0 con 0x7f3530102780 2026-03-10T14:05:10.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3537d26700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3530075ce0 con 0x7f3530108780 2026-03-10T14:05:10.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3535ac2700 1 -- 192.168.123.103:0/3780296297 learned_addr learned my addr 192.168.123.103:0/3780296297 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3535ac2700 1 -- 192.168.123.103:0/3780296297 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3530108780 msgr2=0x7f35300757a0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3535ac2700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3530108780 0x7f35300757a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.458+0000 7f3535ac2700 1 -- 192.168.123.103:0/3780296297 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f35200097e0 con 0x7f3530102780 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.459+0000 7f3535ac2700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530102780 0x7f3530075260 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f3520004cb0 tx=0x7f3520005dc0 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.459+0000 7f3526ffd700 1 -- 192.168.123.103:0/3780296297 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f352001d070 con 0x7f3530102780 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.459+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3530075f60 con 0x7f3530102780 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.459+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f35301a6b90 con 0x7f3530102780 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.459+0000 7f3526ffd700 1 -- 192.168.123.103:0/3780296297 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3520022470 con 0x7f3530102780 2026-03-10T14:05:10.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.459+0000 7f3526ffd700 1 -- 192.168.123.103:0/3780296297 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f352000f650 con 0x7f3530102780 2026-03-10T14:05:10.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.460+0000 7f3526ffd700 1 -- 192.168.123.103:0/3780296297 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f352000f870 con 0x7f3530102780 2026-03-10T14:05:10.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.460+0000 7f3526ffd700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f351c06c680 0x7f351c06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T14:05:10.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.461+0000 7f35352c1700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f351c06c680 0x7f351c06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:10.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.461+0000 7f3526ffd700 1 -- 192.168.123.103:0/3780296297 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f352008dbf0 con 0x7f3530102780 2026-03-10T14:05:10.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.461+0000 7f35352c1700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f351c06c680 0x7f351c06eb30 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f3530076a00 tx=0x7f352c00b3f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:10.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.461+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3514005320 con 0x7f3530102780 2026-03-10T14:05:10.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.464+0000 7f3526ffd700 1 -- 192.168.123.103:0/3780296297 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3520058760 con 0x7f3530102780 2026-03-10T14:05:10.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.569+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f3514005190 con 0x7f3530102780 
2026-03-10T14:05:10.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.570+0000 7f3526ffd700 1 -- 192.168.123.103:0/3780296297 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11272 (secure 0 0 0) 0x7f3520027020 con 0x7f3530102780 2026-03-10T14:05:10.570 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:10.570 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":33,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","created":"2026-03-10T14:02:37.819782+0000","modified":"2026-03-10T14:05:07.388081+0000","last_up_change":"2026-03-10T14:05:06.380447+0000","last_in_change":"2026-03-10T14:04:56.344082+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T14:04:37.304178+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","q
uota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"32845076-88bb-4ae3-87fc-7369eed22d26","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6803","nonce":121236340}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6805","nonce":121236340}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6809","nonce":121236340}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6807","nonce":121236340}]},"public_addr":"192.168.123.103:6803/121236340","cluster_addr":"192.168.123.103:6805/121236340","heartbeat_back_addr":"192.168.123.103:6809/121236340","heartbeat_front_addr":"192.168.123.103:6807/121236340","state":["exists","up"]},{"osd
":1,"uuid":"0af1d772-f786-4e93-8006-866ee43f51d1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6811","nonce":3922533653}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6813","nonce":3922533653}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6817","nonce":3922533653}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6815","nonce":3922533653}]},"public_addr":"192.168.123.103:6811/3922533653","cluster_addr":"192.168.123.103:6813/3922533653","heartbeat_back_addr":"192.168.123.103:6817/3922533653","heartbeat_front_addr":"192.168.123.103:6815/3922533653","state":["exists","up"]},{"osd":2,"uuid":"03fb18a9-019c-440f-9b09-702297165b29","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6819","nonce":1891998236}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6821","nonce":1891998236}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6824","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6825","nonce":1891998236}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6823","nonce":1891998236}]},"public_addr":"192.168.123.103:6819/1891998236","cluster_addr":"192.168.123.103:6821/1891998236","heartbeat_back_
addr":"192.168.123.103:6825/1891998236","heartbeat_front_addr":"192.168.123.103:6823/1891998236","state":["exists","up"]},{"osd":3,"uuid":"9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6801","nonce":3770653735}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6803","nonce":3770653735}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6807","nonce":3770653735}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6805","nonce":3770653735}]},"public_addr":"192.168.123.104:6801/3770653735","cluster_addr":"192.168.123.104:6803/3770653735","heartbeat_back_addr":"192.168.123.104:6807/3770653735","heartbeat_front_addr":"192.168.123.104:6805/3770653735","state":["exists","up"]},{"osd":4,"uuid":"9d0ae930-1cc2-4815-bd24-e9129038b319","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6809","nonce":1824767996}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6810","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6811","nonce":1824767996}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6815","nonce":1824767996}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6813","nonce":
1824767996}]},"public_addr":"192.168.123.104:6809/1824767996","cluster_addr":"192.168.123.104:6811/1824767996","heartbeat_back_addr":"192.168.123.104:6815/1824767996","heartbeat_front_addr":"192.168.123.104:6813/1824767996","state":["exists","up"]},{"osd":5,"uuid":"5c4fe084-183b-43ce-9ebe-daeadbb4f59a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6817","nonce":3736804423}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6819","nonce":3736804423}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6823","nonce":3736804423}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6821","nonce":3736804423}]},"public_addr":"192.168.123.104:6817/3736804423","cluster_addr":"192.168.123.104:6819/3736804423","heartbeat_back_addr":"192.168.123.104:6823/3736804423","heartbeat_front_addr":"192.168.123.104:6821/3736804423","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:16.138307+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:26.330857+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:36.153297+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4
540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:46.727100+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:56.580723+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:05:05.150006+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:0/330658303":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/1628969362":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/1542797438":"2026-03-11T14:03:07.554401+0000","192.168.123.103:0/621911225":"2026-03-11T14:03:07.554401+0000","192.168.123.103:6801/2":"2026-03-11T14:02:53.370766+0000","192.168.123.103:6800/2":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/1489630015":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/4124605323":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/1771049444":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/3848481828":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/3383031364":"2026-03-11T14:03:07.554401+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T14:05:10.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.574+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f351c06c680 msgr2=0x7f351c06eb30 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:10.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.574+0000 7f3537d26700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f351c06c680 0x7f351c06eb30 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f3530076a00 tx=0x7f352c00b3f0 comp rx=0 tx=0).stop 2026-03-10T14:05:10.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.574+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530102780 msgr2=0x7f3530075260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:10.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.574+0000 7f3537d26700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530102780 0x7f3530075260 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f3520004cb0 tx=0x7f3520005dc0 comp rx=0 tx=0).stop 2026-03-10T14:05:10.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.574+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 shutdown_connections 2026-03-10T14:05:10.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.575+0000 7f3537d26700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3530102780 0x7f3530075260 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:10.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.575+0000 7f3537d26700 1 --2- 192.168.123.103:0/3780296297 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f351c06c680 0x7f351c06eb30 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:10.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.575+0000 7f3537d26700 1 --2- 192.168.123.103:0/3780296297 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3530108780 0x7f35300757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:10.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.575+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 >> 192.168.123.103:0/3780296297 conn(0x7f35300fe280 msgr2=0x7f35300ffbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:10.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.575+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 shutdown_connections 2026-03-10T14:05:10.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:10.575+0000 7f3537d26700 1 -- 192.168.123.103:0/3780296297 wait complete. 2026-03-10T14:05:10.635 INFO:tasks.cephadm.ceph_manager.ceph:all up! 2026-03-10T14:05:10.635 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd dump --format=json 2026-03-10T14:05:10.780 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:10.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:10 vm03 ceph-mon[49718]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:10.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:10 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/1716304344' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T14:05:10.820 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:10 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/3780296297' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T14:05:11.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.015+0000 7fc93c9c5700 1 -- 192.168.123.103:0/2784917587 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 msgr2=0x7fc934108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:11.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.015+0000 7fc93c9c5700 1 --2- 192.168.123.103:0/2784917587 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 0x7fc934108b50 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fc920009b00 tx=0x7fc920009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:11.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.016+0000 7fc93c9c5700 1 -- 192.168.123.103:0/2784917587 shutdown_connections 2026-03-10T14:05:11.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.016+0000 7fc93c9c5700 1 --2- 192.168.123.103:0/2784917587 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc934102780 0x7fc934102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:11.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.016+0000 7fc93c9c5700 1 --2- 192.168.123.103:0/2784917587 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 0x7fc934108b50 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:11.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.016+0000 7fc93c9c5700 1 -- 192.168.123.103:0/2784917587 >> 192.168.123.103:0/2784917587 conn(0x7fc9340fe280 msgr2=0x7fc934100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:11.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.016+0000 7fc93c9c5700 1 -- 192.168.123.103:0/2784917587 shutdown_connections 2026-03-10T14:05:11.015 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.016+0000 7fc93c9c5700 1 -- 192.168.123.103:0/2784917587 wait complete. 2026-03-10T14:05:11.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.017+0000 7fc93c9c5700 1 Processor -- start 2026-03-10T14:05:11.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.017+0000 7fc93c9c5700 1 -- start start 2026-03-10T14:05:11.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.017+0000 7fc93c9c5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc934102780 0x7fc9341983d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:11.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc93c9c5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 0x7fc934198910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:11.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc93c9c5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc934198ff0 con 0x7fc934108780 2026-03-10T14:05:11.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc939f60700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 0x7fc934198910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:11.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc93c9c5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc93419cd30 con 0x7fc934102780 2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc939f60700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 0x7fc934198910 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41890/0 (socket says 192.168.123.103:41890) 2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc939f60700 1 -- 192.168.123.103:0/596291033 learned_addr learned my addr 192.168.123.103:0/596291033 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc939f60700 1 -- 192.168.123.103:0/596291033 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc934102780 msgr2=0x7fc9341983d0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc939f60700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc934102780 0x7fc9341983d0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc939f60700 1 -- 192.168.123.103:0/596291033 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9200097e0 con 0x7fc934108780 2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc939f60700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 0x7fc934198910 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7fc92800cc60 tx=0x7fc9280074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc92f7fe700 1 -- 192.168.123.103:0/596291033 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc928007af0 con 0x7fc934108780 
2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc92f7fe700 1 -- 192.168.123.103:0/596291033 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc928004d10 con 0x7fc934108780 2026-03-10T14:05:11.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.018+0000 7fc92f7fe700 1 -- 192.168.123.103:0/596291033 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc9280056e0 con 0x7fc934108780 2026-03-10T14:05:11.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.019+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc93419d010 con 0x7fc934108780 2026-03-10T14:05:11.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.019+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc93419d530 con 0x7fc934108780 2026-03-10T14:05:11.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.021+0000 7fc92f7fe700 1 -- 192.168.123.103:0/596291033 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc928004750 con 0x7fc934108780 2026-03-10T14:05:11.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.021+0000 7fc92f7fe700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc92406c7a0 0x7fc92406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:11.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.021+0000 7fc92f7fe700 1 -- 192.168.123.103:0/596291033 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc92808b310 con 0x7fc934108780 2026-03-10T14:05:11.020 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.021+0000 7fc93a761700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc92406c7a0 0x7fc92406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:11.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.022+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc93410aca0 con 0x7fc934108780 2026-03-10T14:05:11.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.022+0000 7fc93a761700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc92406c7a0 0x7fc92406ec50 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fc920006010 tx=0x7fc920005780 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:11.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.024+0000 7fc92f7fe700 1 -- 192.168.123.103:0/596291033 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc928055e50 con 0x7fc934108780 2026-03-10T14:05:11.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:10 vm04 ceph-mon[55966]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:11.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:10 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/1716304344' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T14:05:11.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:10 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/3780296297' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T14:05:11.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.127+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7fc9341996e0 con 0x7fc934108780 2026-03-10T14:05:11.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.128+0000 7fc92f7fe700 1 -- 192.168.123.103:0/596291033 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11272 (secure 0 0 0) 0x7fc928059470 con 0x7fc934108780 2026-03-10T14:05:11.127 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:11.127 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":33,"fsid":"b81bf660-1c89-11f1-b612-27d302cdb124","created":"2026-03-10T14:02:37.819782+0000","modified":"2026-03-10T14:05:07.388081+0000","last_up_change":"2026-03-10T14:05:06.380447+0000","last_in_change":"2026-03-10T14:04:56.344082+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T14:04:37.304178+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_
num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"32845076-88bb-4ae3-87fc-7369eed22d26","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6803","nonce":121236340}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6805","nonce":121236340}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6809","nonce":121
236340}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":121236340},{"type":"v1","addr":"192.168.123.103:6807","nonce":121236340}]},"public_addr":"192.168.123.103:6803/121236340","cluster_addr":"192.168.123.103:6805/121236340","heartbeat_back_addr":"192.168.123.103:6809/121236340","heartbeat_front_addr":"192.168.123.103:6807/121236340","state":["exists","up"]},{"osd":1,"uuid":"0af1d772-f786-4e93-8006-866ee43f51d1","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6811","nonce":3922533653}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6813","nonce":3922533653}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6817","nonce":3922533653}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3922533653},{"type":"v1","addr":"192.168.123.103:6815","nonce":3922533653}]},"public_addr":"192.168.123.103:6811/3922533653","cluster_addr":"192.168.123.103:6813/3922533653","heartbeat_back_addr":"192.168.123.103:6817/3922533653","heartbeat_front_addr":"192.168.123.103:6815/3922533653","state":["exists","up"]},{"osd":2,"uuid":"03fb18a9-019c-440f-9b09-702297165b29","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6819","nonce":1891998236}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6821","nonce":1891998236}]},"heartbeat_back_addrs":{"addrvec
":[{"type":"v2","addr":"192.168.123.103:6824","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6825","nonce":1891998236}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":1891998236},{"type":"v1","addr":"192.168.123.103:6823","nonce":1891998236}]},"public_addr":"192.168.123.103:6819/1891998236","cluster_addr":"192.168.123.103:6821/1891998236","heartbeat_back_addr":"192.168.123.103:6825/1891998236","heartbeat_front_addr":"192.168.123.103:6823/1891998236","state":["exists","up"]},{"osd":3,"uuid":"9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6800","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6801","nonce":3770653735}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6802","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6803","nonce":3770653735}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6806","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6807","nonce":3770653735}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6804","nonce":3770653735},{"type":"v1","addr":"192.168.123.104:6805","nonce":3770653735}]},"public_addr":"192.168.123.104:6801/3770653735","cluster_addr":"192.168.123.104:6803/3770653735","heartbeat_back_addr":"192.168.123.104:6807/3770653735","heartbeat_front_addr":"192.168.123.104:6805/3770653735","state":["exists","up"]},{"osd":4,"uuid":"9d0ae930-1cc2-4815-bd24-e9129038b319","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6808","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6809","nonce":1824767996}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.
104:6810","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6811","nonce":1824767996}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6814","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6815","nonce":1824767996}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6812","nonce":1824767996},{"type":"v1","addr":"192.168.123.104:6813","nonce":1824767996}]},"public_addr":"192.168.123.104:6809/1824767996","cluster_addr":"192.168.123.104:6811/1824767996","heartbeat_back_addr":"192.168.123.104:6815/1824767996","heartbeat_front_addr":"192.168.123.104:6813/1824767996","state":["exists","up"]},{"osd":5,"uuid":"5c4fe084-183b-43ce-9ebe-daeadbb4f59a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6816","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6817","nonce":3736804423}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6818","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6819","nonce":3736804423}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6822","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6823","nonce":3736804423}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6820","nonce":3736804423},{"type":"v1","addr":"192.168.123.104:6821","nonce":3736804423}]},"public_addr":"192.168.123.104:6817/3736804423","cluster_addr":"192.168.123.104:6819/3736804423","heartbeat_back_addr":"192.168.123.104:6823/3736804423","heartbeat_front_addr":"192.168.123.104:6821/3736804423","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:16.138307+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval
":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:26.330857+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:36.153297+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:46.727100+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:04:56.580723+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T14:05:05.150006+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:0/330658303":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/1628969362":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/1542797438":"2026-03-11T14:03:07.554401+0000","192.168.123.103:0/621911225":"2026-03-11T14:03:07.554401+0000","192.168.123.103:6801/2":"2026-03-11T14:02:53.370766+0000","192.168.123.103:6800/2":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/1489630015":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/4124605323":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/1771049444":"2026-03-11T14:02:53.370766+0000","192.168.123.103:0/3848481828":"2026-03-11T14:03:43.076138+0000","192.168.123.103:0/3383031364":"2026-03-11T14:03:07.554401+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"s
tretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.130+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc92406c7a0 msgr2=0x7fc92406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.130+0000 7fc93c9c5700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc92406c7a0 0x7fc92406ec50 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fc920006010 tx=0x7fc920005780 comp rx=0 tx=0).stop 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.130+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 msgr2=0x7fc934198910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.130+0000 7fc93c9c5700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 0x7fc934198910 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7fc92800cc60 tx=0x7fc9280074a0 comp rx=0 tx=0).stop 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.130+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 shutdown_connections 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.130+0000 7fc93c9c5700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc92406c7a0 0x7fc92406ec50 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.130+0000 7fc93c9c5700 1 
--2- 192.168.123.103:0/596291033 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc934102780 0x7fc9341983d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.130+0000 7fc93c9c5700 1 --2- 192.168.123.103:0/596291033 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc934108780 0x7fc934198910 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:11.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.131+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 >> 192.168.123.103:0/596291033 conn(0x7fc9340fe280 msgr2=0x7fc9340ffbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:11.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.131+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 shutdown_connections 2026-03-10T14:05:11.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:11.131+0000 7fc93c9c5700 1 -- 192.168.123.103:0/596291033 wait complete. 
2026-03-10T14:05:11.194 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph tell osd.0 flush_pg_stats 2026-03-10T14:05:11.194 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph tell osd.1 flush_pg_stats 2026-03-10T14:05:11.194 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph tell osd.2 flush_pg_stats 2026-03-10T14:05:11.194 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph tell osd.3 flush_pg_stats 2026-03-10T14:05:11.194 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph tell osd.4 flush_pg_stats 2026-03-10T14:05:11.194 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph tell osd.5 flush_pg_stats 2026-03-10T14:05:11.638 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:11.668 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:11.669 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:11.703 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:11.706 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config 
/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:11.731 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:11.958 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:11 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/596291033' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T14:05:12.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:11 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/596291033' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T14:05:12.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.405+0000 7fac89538700 1 -- 192.168.123.103:0/1326624235 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac8410ed80 msgr2=0x7fac8406d260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.405+0000 7fac89538700 1 --2- 192.168.123.103:0/1326624235 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac8410ed80 0x7fac8406d260 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fac74009b00 tx=0x7fac74009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:12.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.413+0000 7fac89538700 1 -- 192.168.123.103:0/1326624235 shutdown_connections 2026-03-10T14:05:12.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.413+0000 7fac89538700 1 --2- 192.168.123.103:0/1326624235 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac8406d7a0 0x7fac8406dc10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.413+0000 7fac89538700 1 --2- 192.168.123.103:0/1326624235 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fac8410ed80 0x7fac8406d260 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.413+0000 7fac89538700 1 -- 192.168.123.103:0/1326624235 >> 192.168.123.103:0/1326624235 conn(0x7fac8406c830 msgr2=0x7fac84071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:12.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.422+0000 7fac89538700 1 -- 192.168.123.103:0/1326624235 shutdown_connections 2026-03-10T14:05:12.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.422+0000 7fac89538700 1 -- 192.168.123.103:0/1326624235 wait complete. 2026-03-10T14:05:12.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.422+0000 7fac89538700 1 Processor -- start 2026-03-10T14:05:12.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.422+0000 7fac89538700 1 -- start start 2026-03-10T14:05:12.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.423+0000 7fac89538700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac8406d7a0 0x7fac841a4e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.423+0000 7fac89538700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac8410ed80 0x7fac841a53a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.423+0000 7fac89538700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac841a5a80 con 0x7fac8410ed80 2026-03-10T14:05:12.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.423+0000 7fac89538700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac841a9810 con 0x7fac8406d7a0 
2026-03-10T14:05:12.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.423+0000 7fac827fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac8410ed80 0x7fac841a53a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.423+0000 7fac82ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac8406d7a0 0x7fac841a4e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.423+0000 7fac827fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac8410ed80 0x7fac841a53a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41908/0 (socket says 192.168.123.103:41908) 2026-03-10T14:05:12.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.423+0000 7fac827fc700 1 -- 192.168.123.103:0/4263361807 learned_addr learned my addr 192.168.123.103:0/4263361807 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:12.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.424+0000 7fac827fc700 1 -- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac8406d7a0 msgr2=0x7fac841a4e60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.424+0000 7fac827fc700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac8406d7a0 0x7fac841a4e60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.423 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.424+0000 7fac827fc700 1 -- 192.168.123.103:0/4263361807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac78009710 con 0x7fac8410ed80 2026-03-10T14:05:12.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.424+0000 7fac827fc700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac8410ed80 0x7fac841a53a0 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7fac7800eee0 tx=0x7fac7800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.424+0000 7fac6bfff700 1 -- 192.168.123.103:0/4263361807 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac78009a70 con 0x7fac8410ed80 2026-03-10T14:05:12.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.424+0000 7fac6bfff700 1 -- 192.168.123.103:0/4263361807 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fac78004500 con 0x7fac8410ed80 2026-03-10T14:05:12.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.425+0000 7fac6bfff700 1 -- 192.168.123.103:0/4263361807 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac78005350 con 0x7fac8410ed80 2026-03-10T14:05:12.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.425+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fac740097e0 con 0x7fac8410ed80 2026-03-10T14:05:12.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.425+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fac841a9e30 con 
0x7fac8410ed80 2026-03-10T14:05:12.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.426+0000 7fac6bfff700 1 -- 192.168.123.103:0/4263361807 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fac7801e030 con 0x7fac8410ed80 2026-03-10T14:05:12.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.427+0000 7fac6bfff700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fac6c06c520 0x7fac6c06e9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.427+0000 7fac6bfff700 1 -- 192.168.123.103:0/4263361807 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fac78014070 con 0x7fac8410ed80 2026-03-10T14:05:12.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.427+0000 7fac82ffd700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fac6c06c520 0x7fac6c06e9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.427+0000 7fac82ffd700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fac6c06c520 0x7fac6c06e9d0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fac7400b5c0 tx=0x7fac74005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.427+0000 7fac69ffb700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] conn(0x7fac84061190 0x7fac84061560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T14:05:12.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.428+0000 7fac69ffb700 1 -- 192.168.123.103:0/4263361807 --> [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fac841aa670 con 0x7fac84061190 2026-03-10T14:05:12.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.428+0000 7fac837fe700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] conn(0x7fac84061190 0x7fac84061560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.428+0000 7fac837fe700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] conn(0x7fac84061190 0x7fac84061560 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.429+0000 7fac6bfff700 1 -- 192.168.123.103:0/4263361807 <== osd.4 v2:192.168.123.104:6808/1824767996 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7fac841aa670 con 0x7fac84061190 2026-03-10T14:05:12.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.452+0000 7fac69ffb700 1 -- 192.168.123.103:0/4263361807 --> [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fac8404ea50 con 0x7fac84061190 2026-03-10T14:05:12.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.452+0000 7fac6bfff700 1 -- 192.168.123.103:0/4263361807 <== osd.4 v2:192.168.123.104:6808/1824767996 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7fac8404ea50 con 0x7fac84061190 2026-03-10T14:05:12.452 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.453+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] conn(0x7fac84061190 msgr2=0x7fac84061560 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.453+0000 7fac89538700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] conn(0x7fac84061190 0x7fac84061560 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.453+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fac6c06c520 msgr2=0x7fac6c06e9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.453+0000 7fac89538700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fac6c06c520 0x7fac6c06e9d0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fac7400b5c0 tx=0x7fac74005fb0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.453+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac8410ed80 msgr2=0x7fac841a53a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.453+0000 7fac89538700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac8410ed80 0x7fac841a53a0 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7fac7800eee0 tx=0x7fac7800c5b0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.454+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 shutdown_connections
2026-03-10T14:05:12.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.454+0000 7fac89538700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:6808/1824767996,v1:192.168.123.104:6809/1824767996] conn(0x7fac84061190 0x7fac84061560 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.454+0000 7fac89538700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fac6c06c520 0x7fac6c06e9d0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.454+0000 7fac89538700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac8406d7a0 0x7fac841a4e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.454+0000 7fac89538700 1 --2- 192.168.123.103:0/4263361807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac8410ed80 0x7fac841a53a0 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.454+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 >> 192.168.123.103:0/4263361807 conn(0x7fac8406c830 msgr2=0x7fac84118860 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:12.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.454+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 shutdown_connections
2026-03-10T14:05:12.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.454+0000 7fac89538700 1 -- 192.168.123.103:0/4263361807 wait complete.
2026-03-10T14:05:12.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.493+0000 7f27b640b700 1 -- 192.168.123.103:0/1755666463 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0108780 msgr2=0x7f27b0108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.504+0000 7f27b640b700 1 --2- 192.168.123.103:0/1755666463 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0108780 0x7f27b0108b50 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f27a0009b00 tx=0x7f27a0009e10 comp rx=0 tx=0).stop
2026-03-10T14:05:12.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.504+0000 7f27b640b700 1 -- 192.168.123.103:0/1755666463 shutdown_connections
2026-03-10T14:05:12.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.504+0000 7f27b640b700 1 --2- 192.168.123.103:0/1755666463 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f27b0102780 0x7f27b0102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.504+0000 7f27b640b700 1 --2- 192.168.123.103:0/1755666463 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0108780 0x7f27b0108b50 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.504+0000 7f27b640b700 1 -- 192.168.123.103:0/1755666463 >> 192.168.123.103:0/1755666463 conn(0x7f27b00fe280 msgr2=0x7f27b0100690 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:12.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.504+0000 7f27b640b700 1 -- 192.168.123.103:0/1755666463 shutdown_connections
2026-03-10T14:05:12.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.504+0000 7f27b640b700 1 -- 192.168.123.103:0/1755666463 wait complete.
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.505+0000 7f27b640b700 1 Processor -- start
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.505+0000 7f27b640b700 1 -- start start
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.505+0000 7f27b640b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0102780 0x7f27b0198360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27b640b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f27b0108780 0x7f27b01988a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27b640b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27b0198ef0 con 0x7f27b0108780
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27b640b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27b0199030 con 0x7f27b0102780
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27affff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0102780 0x7f27b0198360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27affff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0102780 0x7f27b0198360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:33944/0 (socket says 192.168.123.103:33944)
2026-03-10T14:05:12.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27affff700 1 -- 192.168.123.103:0/328333344 learned_addr learned my addr 192.168.123.103:0/328333344 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:05:12.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27af7fe700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f27b0108780 0x7f27b01988a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27af7fe700 1 -- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0102780 msgr2=0x7f27b0198360 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.506+0000 7f27af7fe700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0102780 0x7f27b0198360 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.507+0000 7f27af7fe700 1 -- 192.168.123.103:0/328333344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f27a00097e0 con 0x7f27b0108780
2026-03-10T14:05:12.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.508+0000 7f27af7fe700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f27b0108780 0x7f27b01988a0 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f27a400cc60 tx=0x7f27a40074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.508+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f27a4007af0 con 0x7f27b0108780
2026-03-10T14:05:12.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.508+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f27a4004d10 con 0x7f27b0108780
2026-03-10T14:05:12.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.508+0000 7f27affff700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0102780 0x7f27b0198360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T14:05:12.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.509+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f27a40056a0 con 0x7f27b0108780
2026-03-10T14:05:12.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.510+0000 7f27b640b700 1 -- 192.168.123.103:0/328333344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f27b019cc70 con 0x7f27b0108780
2026-03-10T14:05:12.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.510+0000 7f27b640b700 1 -- 192.168.123.103:0/328333344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f27b019d190 con 0x7f27b0108780
2026-03-10T14:05:12.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.511+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f27b010fd00 con 0x7f27b0108780
2026-03-10T14:05:12.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.516+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f27a4004e80 con 0x7f27b0108780
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.519+0000 7f27ad7fa700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f279806c7a0 0x7f279806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.519+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f27a408b430 con 0x7f27b0108780
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.519+0000 7f27ad7fa700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] conn(0x7f2798072330 0x7f2798074740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.519+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 --> [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f2798074df0 con 0x7f2798072330
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.521+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f27a408b760 con 0x7f27b0108780
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.522+0000 7f27affff700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f279806c7a0 0x7f279806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.522+0000 7f27b49a8700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] conn(0x7f2798072330 0x7f2798074740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.522+0000 7f27affff700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f279806c7a0 0x7f279806ec50 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f27a000b5c0 tx=0x7f27a0009f90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.522+0000 7f27b49a8700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] conn(0x7f2798072330 0x7f2798074740 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.523+0000 7fa21631e700 1 -- 192.168.123.103:0/3028733540 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac3c0 msgr2=0x7fa2080a4cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.523+0000 7fa21631e700 1 --2- 192.168.123.103:0/3028733540 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac3c0 0x7fa2080a4cd0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fa2100669f0 tx=0x7fa2100699f0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.526+0000 7fda94e58700 1 -- 192.168.123.103:0/506409378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 msgr2=0x7fda9006d260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.524+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 <== osd.5 v2:192.168.123.104:6816/3736804423 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f2798074df0 con 0x7f2798072330
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.527+0000 7fa21631e700 1 -- 192.168.123.103:0/3028733540 shutdown_connections
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.527+0000 7fa21631e700 1 --2- 192.168.123.103:0/3028733540 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2080ac790 0x7fa2080a5210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.527+0000 7fa21631e700 1 --2- 192.168.123.103:0/3028733540 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac3c0 0x7fa2080a4cd0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.527+0000 7fa21631e700 1 -- 192.168.123.103:0/3028733540 >> 192.168.123.103:0/3028733540 conn(0x7fa20801a290 msgr2=0x7fa20801a690 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.526+0000 7fda94e58700 1 --2- 192.168.123.103:0/506409378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 0x7fda9006d260 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7fda8800b470 tx=0x7fda8800b780 comp rx=0 tx=0).stop
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.527+0000 7fa21631e700 1 -- 192.168.123.103:0/3028733540 shutdown_connections
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.527+0000 7fa21631e700 1 -- 192.168.123.103:0/3028733540 wait complete.
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.528+0000 7fda94e58700 1 -- 192.168.123.103:0/506409378 shutdown_connections
2026-03-10T14:05:12.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.528+0000 7fda94e58700 1 --2- 192.168.123.103:0/506409378 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda9006d7a0 0x7fda9006dc10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.528+0000 7fda94e58700 1 --2- 192.168.123.103:0/506409378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 0x7fda9006d260 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.528+0000 7fda94e58700 1 -- 192.168.123.103:0/506409378 >> 192.168.123.103:0/506409378 conn(0x7fda9006c830 msgr2=0x7fda90071830 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:12.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.527+0000 7fa21631e700 1 Processor -- start
2026-03-10T14:05:12.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.528+0000 7fda94e58700 1 -- 192.168.123.103:0/506409378 shutdown_connections
2026-03-10T14:05:12.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.528+0000 7fda94e58700 1 -- 192.168.123.103:0/506409378 wait complete.
2026-03-10T14:05:12.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.529+0000 7fda94e58700 1 Processor -- start
2026-03-10T14:05:12.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.528+0000 7fa21631e700 1 -- start start
2026-03-10T14:05:12.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.529+0000 7fa21631e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2080ac3c0 0x7fa2080b5040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.529+0000 7fa21631e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac790 0x7fa2080b0040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.529+0000 7fa21631e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa2080b0580 con 0x7fa2080ac3c0
2026-03-10T14:05:12.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.529+0000 7fa21631e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa2080b06f0 con 0x7fa2080ac790
2026-03-10T14:05:12.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.529+0000 7fa214b1b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac790 0x7fa2080b0040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fa214b1b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac790 0x7fa2080b0040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:33968/0 (socket says 192.168.123.103:33968)
2026-03-10T14:05:12.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fa214b1b700 1 -- 192.168.123.103:0/1623774762 learned_addr learned my addr 192.168.123.103:0/1623774762 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:05:12.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fa21531c700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2080ac3c0 0x7fa2080b5040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fda94e58700 1 -- start start
2026-03-10T14:05:12.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fa214b1b700 1 -- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2080ac3c0 msgr2=0x7fa2080b5040 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fa214b1b700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2080ac3c0 0x7fa2080b5040 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fa214b1b700 1 -- 192.168.123.103:0/1623774762 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa20c009710 con 0x7fa2080ac790
2026-03-10T14:05:12.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fda94e58700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda9006d7a0 0x7fda901195c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fa21531c700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2080ac3c0 0x7fa2080b5040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T14:05:12.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fda94e58700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 0x7fda901145e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fda94e58700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda90114b20 con 0x7fda9010ed80
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.530+0000 7fda94e58700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda90114c90 con 0x7fda9006d7a0
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.533+0000 7fa214b1b700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac790 0x7fa2080b0040 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fa20c00ec80 tx=0x7fa20c00c5b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.533+0000 7fda8dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 0x7fda901145e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.533+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa20c00cd50 con 0x7fa2080ac790
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.533+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa20c00ceb0 con 0x7fa2080ac790
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.533+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa20c005320 con 0x7fa2080ac790
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.534+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa210067050 con 0x7fa2080ac790
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.534+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa2080b0d90 con 0x7fa2080ac790
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.533+0000 7fda8dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 0x7fda901145e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41980/0 (socket says 192.168.123.103:41980)
2026-03-10T14:05:12.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.533+0000 7fda8dd9b700 1 -- 192.168.123.103:0/1010758390 learned_addr learned my addr 192.168.123.103:0/1010758390 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:05:12.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.534+0000 7fda8dd9b700 1 -- 192.168.123.103:0/1010758390 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda9006d7a0 msgr2=0x7fda901195c0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-10T14:05:12.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.534+0000 7fda8dd9b700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda9006d7a0 0x7fda901195c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.534+0000 7fda8dd9b700 1 -- 192.168.123.103:0/1010758390 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fda88009e30 con 0x7fda9010ed80
2026-03-10T14:05:12.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.536+0000 7fda8dd9b700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 0x7fda901145e0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7fda8000ec90 tx=0x7fda8000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.537+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda8000cbc0 con 0x7fda9010ed80
2026-03-10T14:05:12.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.537+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fda8000cd20 con 0x7fda9010ed80
2026-03-10T14:05:12.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.537+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda90114f20 con 0x7fda9010ed80
2026-03-10T14:05:12.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.537+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda901b7bd0 con 0x7fda9010ed80
2026-03-10T14:05:12.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.537+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda80010640 con 0x7fda9010ed80
2026-03-10T14:05:12.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.536+0000 7fa1fbfff700 1 -- 192.168.123.103:0/1623774762 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fa1f4000ff0 con 0x7fa2080ac790
2026-03-10T14:05:12.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.540+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa20c005500 con 0x7fa2080ac790
2026-03-10T14:05:12.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.540+0000 7fa2067fc700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1fc06c860 0x7fa1fc06ed10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.540+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa20c014070 con 0x7fa2080ac790
2026-03-10T14:05:12.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.540+0000 7fa2067fc700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] conn(0x7fa1fc0723f0 0x7fa1fc074800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.539+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fda70000fc0 con 0x7fda9010ed80
2026-03-10T14:05:12.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.543+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fda800107a0 con 0x7fda9010ed80
2026-03-10T14:05:12.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.544+0000 7fda7f7fe700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fda7806c750 0x7fda7806ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.546+0000 7fda8e59c700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fda7806c750 0x7fda7806ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.546+0000 7fa21531c700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1fc06c860 0x7fa1fc06ed10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.546+0000 7fa215b1d700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] conn(0x7fa1fc0723f0 0x7fa1fc074800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.546+0000 7fa215b1d700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] conn(0x7fa1fc0723f0 0x7fa1fc074800 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.547+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fda80014070 con 0x7fda9010ed80
2026-03-10T14:05:12.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.547+0000 7fda8e59c700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fda7806c750 0x7fda7806ec00 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fda88015040 tx=0x7fda880094c0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.547+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 --> [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fa1fc074eb0 con 0x7fa1fc0723f0
2026-03-10T14:05:12.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.547+0000 7fda7f7fe700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] conn(0x7fda780722e0 0x7fda780746f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.548+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7fa20c004cb0 con 0x7fa2080ac790
2026-03-10T14:05:12.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.548+0000 7fa21531c700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1fc06c860 0x7fa1fc06ed10 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fa210064170 tx=0x7fa210075040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.548+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 <== osd.0 v2:192.168.123.103:6802/121236340 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7fa1fc074eb0 con 0x7fa1fc0723f0
2026-03-10T14:05:12.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.548+0000 7fda8ed9d700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] conn(0x7fda780722e0 0x7fda780746f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:12.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.548+0000 7fda8ed9d700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] conn(0x7fda780722e0 0x7fda780746f0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:12.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.549+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 --> [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fda78074da0 con 0x7fda780722e0
2026-03-10T14:05:12.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.549+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7fda800902a0 con 0x7fda9010ed80
2026-03-10T14:05:12.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.549+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 <== osd.1 v2:192.168.123.103:6810/3922533653 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7fda78074da0 con 0x7fda780722e0
2026-03-10T14:05:12.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.560+0000 7f7c266a0700 1 -- 192.168.123.103:0/2765847044 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c20071e40 msgr2=0x7f7c200722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:12.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.560+0000 7f7c266a0700 1 --2- 192.168.123.103:0/2765847044 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c20071e40 0x7f7c200722b0 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f7c1800d3f0 tx=0x7f7c1800d700 comp rx=0 tx=0).stop
2026-03-10T14:05:12.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.560+0000 7f7c266a0700 1 -- 192.168.123.103:0/2765847044 shutdown_connections
2026-03-10T14:05:12.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.560+0000 7f7c266a0700 1 --2- 192.168.123.103:0/2765847044 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c20071e40 0x7f7c200722b0 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.560+0000 7f7c266a0700 1 --2- 192.168.123.103:0/2765847044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c2010c8b0 0x7f7c2010cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:12.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.560+0000 7f7c266a0700 1 -- 192.168.123.103:0/2765847044 >> 192.168.123.103:0/2765847044 conn(0x7f7c2006c6c0 msgr2=0x7f7c2006cac0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.560+0000 7f7c266a0700 1 -- 192.168.123.103:0/2765847044 shutdown_connections
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.560+0000 7f7c266a0700 1 -- 192.168.123.103:0/2765847044 wait complete.
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c266a0700 1 Processor -- start
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c266a0700 1 -- start start
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c266a0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c2010c8b0 0x7f7c201328e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c266a0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c20132e20 0x7f7c20133290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c266a0700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c2007ee70 con 0x7f7c2010c8b0
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c266a0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c2007efe0 con 0x7f7c20132e20
2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c1ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c2010c8b0 0x7f7c201328e0 unknown :-1
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c1ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c2010c8b0 0x7f7c201328e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41994/0 (socket says 192.168.123.103:41994) 2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c1ffff700 1 -- 192.168.123.103:0/2722466003 learned_addr learned my addr 192.168.123.103:0/2722466003 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c1f7fe700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c20132e20 0x7f7c20133290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c1ffff700 1 -- 192.168.123.103:0/2722466003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c20132e20 msgr2=0x7f7c20133290 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c1ffff700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c20132e20 0x7f7c20133290 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.561+0000 7f7c1ffff700 1 -- 192.168.123.103:0/2722466003 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7c18007ed0 con 0x7f7c2010c8b0 2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.562+0000 7f7c1ffff700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c2010c8b0 0x7f7c201328e0 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f7c1000d8d0 tx=0x7f7c1000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.562+0000 7f7c1d7fa700 1 -- 192.168.123.103:0/2722466003 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7c10009880 con 0x7f7c2010c8b0 2026-03-10T14:05:12.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.562+0000 7f7c266a0700 1 -- 192.168.123.103:0/2722466003 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7c2007f260 con 0x7f7c2010c8b0 2026-03-10T14:05:12.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.562+0000 7f7c266a0700 1 -- 192.168.123.103:0/2722466003 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7c2007f7b0 con 0x7f7c2010c8b0 2026-03-10T14:05:12.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.563+0000 7f7c1d7fa700 1 -- 192.168.123.103:0/2722466003 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7c10010460 con 0x7f7c2010c8b0 2026-03-10T14:05:12.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.563+0000 7f7c1d7fa700 1 -- 192.168.123.103:0/2722466003 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7c1000f5d0 con 0x7f7c2010c8b0 2026-03-10T14:05:12.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.564+0000 7f7c1d7fa700 1 -- 192.168.123.103:0/2722466003 <== mon.0 
v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7c1000f730 con 0x7f7c2010c8b0 2026-03-10T14:05:12.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.564+0000 7f7c1d7fa700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7c0806c7a0 0x7f7c0806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.564+0000 7f7c1f7fe700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7c0806c7a0 0x7f7c0806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.565+0000 7f7c1f7fe700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7c0806c7a0 0x7f7c0806ec50 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7c18000f80 tx=0x7f7c1800db00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.565+0000 7f7c1d7fa700 1 -- 192.168.123.103:0/2722466003 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7c1008c2c0 con 0x7f7c2010c8b0 2026-03-10T14:05:12.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.565+0000 7f7c266a0700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] conn(0x7f7c0c001610 0x7f7c0c003ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.565+0000 7f7c266a0700 1 -- 192.168.123.103:0/2722466003 --> 
[v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f7c0c006bf0 con 0x7f7c0c001610 2026-03-10T14:05:12.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.565+0000 7f7c24c3d700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] conn(0x7f7c0c001610 0x7f7c0c003ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.567+0000 7f7c24c3d700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] conn(0x7f7c0c001610 0x7f7c0c003ac0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.567+0000 7f7c1d7fa700 1 -- 192.168.123.103:0/2722466003 <== osd.2 v2:192.168.123.103:6818/1891998236 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f7c0c006bf0 con 0x7f7c0c001610 2026-03-10T14:05:12.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.577+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 --> [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f27b019d730 con 0x7f2798072330 2026-03-10T14:05:12.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.578+0000 7f27ad7fa700 1 -- 192.168.123.103:0/328333344 <== osd.5 v2:192.168.123.104:6816/3736804423 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f27b019d730 con 0x7f2798072330 2026-03-10T14:05:12.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.578+0000 7f7c266a0700 1 -- 192.168.123.103:0/2722466003 --> 
[v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f7c0c005cd0 con 0x7f7c0c001610 2026-03-10T14:05:12.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.578+0000 7f7c1d7fa700 1 -- 192.168.123.103:0/2722466003 <== osd.2 v2:192.168.123.103:6818/1891998236 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f7c0c005cd0 con 0x7f7c0c001610 2026-03-10T14:05:12.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] conn(0x7f2798072330 msgr2=0x7f2798074740 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] conn(0x7f2798072330 0x7f2798074740 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f7c06ffd700 1 -- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] conn(0x7f7c0c001610 msgr2=0x7f7c0c003ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f7c06ffd700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] conn(0x7f7c0c001610 0x7f7c0c003ac0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f279806c7a0 msgr2=0x7f279806ec50 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f279806c7a0 0x7f279806ec50 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f27a000b5c0 tx=0x7f27a0009f90 comp rx=0 tx=0).stop 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f27b0108780 msgr2=0x7f27b01988a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f27b0108780 0x7f27b01988a0 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f27a400cc60 tx=0x7f27a40074a0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f7c06ffd700 1 -- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7c0806c7a0 msgr2=0x7f7c0806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f7c06ffd700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7c0806c7a0 0x7f7c0806ec50 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7c18000f80 tx=0x7f7c1800db00 comp rx=0 tx=0).stop 2026-03-10T14:05:12.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f7c06ffd700 1 -- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c2010c8b0 msgr2=0x7f7c201328e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.579 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f7c06ffd700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c2010c8b0 0x7f7c201328e0 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f7c1000d8d0 tx=0x7f7c1000dbe0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f7c06ffd700 1 -- 192.168.123.103:0/2722466003 shutdown_connections 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f7c06ffd700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6818/1891998236,v1:192.168.123.103:6819/1891998236] conn(0x7f7c0c001610 0x7f7c0c003ac0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f7c06ffd700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7c2010c8b0 0x7f7c201328e0 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f7c06ffd700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7c0806c7a0 0x7f7c0806ec50 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f7c06ffd700 1 --2- 192.168.123.103:0/2722466003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7c20132e20 0x7f7c20133290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f7c06ffd700 1 -- 192.168.123.103:0/2722466003 >> 192.168.123.103:0/2722466003 conn(0x7f7c2006c6c0 msgr2=0x7f7c20070070 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f7c06ffd700 1 -- 192.168.123.103:0/2722466003 shutdown_connections 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f7c06ffd700 1 -- 192.168.123.103:0/2722466003 wait complete. 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 shutdown_connections 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:6816/3736804423,v1:192.168.123.104:6817/3736804423] conn(0x7f2798072330 0x7f2798074740 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f279806c7a0 0x7f279806ec50 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f27b0102780 0x7f27b0198360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 --2- 192.168.123.103:0/328333344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f27b0108780 0x7f27b01988a0 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.579+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 >> 192.168.123.103:0/328333344 conn(0x7f27b00fe280 
msgr2=0x7f27b00ff990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 shutdown_connections 2026-03-10T14:05:12.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.580+0000 7f2796ffd700 1 -- 192.168.123.103:0/328333344 wait complete. 2026-03-10T14:05:12.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.583+0000 7fa1fbfff700 1 -- 192.168.123.103:0/1623774762 --> [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fa1f4002ce0 con 0x7fa1fc0723f0 2026-03-10T14:05:12.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.584+0000 7fa2067fc700 1 -- 192.168.123.103:0/1623774762 <== osd.0 v2:192.168.123.103:6802/121236340 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7fa1f4002ce0 con 0x7fa1fc0723f0 2026-03-10T14:05:12.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.588+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] conn(0x7fa1fc0723f0 msgr2=0x7fa1fc074800 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.588+0000 7fa21631e700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] conn(0x7fa1fc0723f0 0x7fa1fc074800 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.588+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1fc06c860 msgr2=0x7fa1fc06ed10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.588+0000 
7fa21631e700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1fc06c860 0x7fa1fc06ed10 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fa210064170 tx=0x7fa210075040 comp rx=0 tx=0).stop 2026-03-10T14:05:12.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.588+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac790 msgr2=0x7fa2080b0040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.588+0000 7fa21631e700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac790 0x7fa2080b0040 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fa20c00ec80 tx=0x7fa20c00c5b0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.606 INFO:teuthology.orchestra.run.vm03.stdout:120259084293 2026-03-10T14:05:12.606 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd last-stat-seq osd.4 2026-03-10T14:05:12.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.607+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 shutdown_connections 2026-03-10T14:05:12.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.607+0000 7fa21631e700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6802/121236340,v1:192.168.123.103:6803/121236340] conn(0x7fa1fc0723f0 0x7fa1fc074800 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.607+0000 7fa21631e700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2080ac3c0 0x7fa2080b5040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:05:12.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.607+0000 7fa21631e700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1fc06c860 0x7fa1fc06ed10 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.607+0000 7fa21631e700 1 --2- 192.168.123.103:0/1623774762 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa2080ac790 0x7fa2080b0040 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.607+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 >> 192.168.123.103:0/1623774762 conn(0x7fa20801a290 msgr2=0x7fa2080a3cc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:12.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.609+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 shutdown_connections 2026-03-10T14:05:12.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.609+0000 7fa21631e700 1 -- 192.168.123.103:0/1623774762 wait complete. 
2026-03-10T14:05:12.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.637+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 --> [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fda70002d40 con 0x7fda780722e0 2026-03-10T14:05:12.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.639+0000 7fda7f7fe700 1 -- 192.168.123.103:0/1010758390 <== osd.1 v2:192.168.123.103:6810/3922533653 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7fda70002d40 con 0x7fda780722e0 2026-03-10T14:05:12.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.643+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] conn(0x7fda780722e0 msgr2=0x7fda780746f0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.643+0000 7fda94e58700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] conn(0x7fda780722e0 0x7fda780746f0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.644+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fda7806c750 msgr2=0x7fda7806ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.644+0000 7fda94e58700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fda7806c750 0x7fda7806ec00 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fda88015040 tx=0x7fda880094c0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.644+0000 7fda94e58700 1 -- 
192.168.123.103:0/1010758390 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 msgr2=0x7fda901145e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.644+0000 7fda94e58700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 0x7fda901145e0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7fda8000ec90 tx=0x7fda8000c5b0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.646+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 shutdown_connections 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.646+0000 7fda94e58700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6810/3922533653,v1:192.168.123.103:6811/3922533653] conn(0x7fda780722e0 0x7fda780746f0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.646+0000 7fda94e58700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fda7806c750 0x7fda7806ec00 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.646+0000 7fda94e58700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fda9006d7a0 0x7fda901195c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.646+0000 7fda94e58700 1 --2- 192.168.123.103:0/1010758390 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda9010ed80 0x7fda901145e0 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.646+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 >> 192.168.123.103:0/1010758390 conn(0x7fda9006c830 msgr2=0x7fda90071340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.647+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 shutdown_connections 2026-03-10T14:05:12.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.647+0000 7fda94e58700 1 -- 192.168.123.103:0/1010758390 wait complete. 2026-03-10T14:05:12.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.657+0000 7f13c7397700 1 -- 192.168.123.103:0/4171034431 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 msgr2=0x7f13c0071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.657+0000 7f13c7397700 1 --2- 192.168.123.103:0/4171034431 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 0x7f13c0071fd0 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f13bc009b50 tx=0x7f13bc009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:12.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.670+0000 7f13c7397700 1 -- 192.168.123.103:0/4171034431 shutdown_connections 2026-03-10T14:05:12.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.670+0000 7f13c7397700 1 --2- 192.168.123.103:0/4171034431 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 0x7f13c0071fd0 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.670+0000 7f13c7397700 1 --2- 192.168.123.103:0/4171034431 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13c010eab0 0x7f13c010ee80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T14:05:12.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.671+0000 7f13c7397700 1 -- 192.168.123.103:0/4171034431 >> 192.168.123.103:0/4171034431 conn(0x7f13c006c6c0 msgr2=0x7f13c006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:12.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.671+0000 7f13c7397700 1 -- 192.168.123.103:0/4171034431 shutdown_connections 2026-03-10T14:05:12.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.671+0000 7f13c7397700 1 -- 192.168.123.103:0/4171034431 wait complete. 2026-03-10T14:05:12.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.673+0000 7f13c7397700 1 Processor -- start 2026-03-10T14:05:12.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.676+0000 7f13c7397700 1 -- start start 2026-03-10T14:05:12.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.676+0000 7f13c7397700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 0x7f13c0119750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.676+0000 7f13c7397700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13c010eab0 0x7f13c01147a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.676+0000 7f13c7397700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f13c0114e80 con 0x7f13c0071b60 2026-03-10T14:05:12.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.676+0000 7f13c7397700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f13c0114ff0 con 0x7f13c010eab0 2026-03-10T14:05:12.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.680+0000 7f13c5133700 
1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 0x7f13c0119750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.680+0000 7f13c5133700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 0x7f13c0119750 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42016/0 (socket says 192.168.123.103:42016) 2026-03-10T14:05:12.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.680+0000 7f13c5133700 1 -- 192.168.123.103:0/2217592409 learned_addr learned my addr 192.168.123.103:0/2217592409 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:12.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.681+0000 7f13c5133700 1 -- 192.168.123.103:0/2217592409 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13c010eab0 msgr2=0x7f13c01147a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:12.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.681+0000 7f13c5133700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13c010eab0 0x7f13c01147a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.681+0000 7f13c5133700 1 -- 192.168.123.103:0/2217592409 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13bc0097e0 con 0x7f13c0071b60 2026-03-10T14:05:12.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.681+0000 7f13c5133700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 
0x7f13c0119750 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f13b800c420 tx=0x7f13b800c7e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.682+0000 7f13b67fc700 1 -- 192.168.123.103:0/2217592409 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13b800ce90 con 0x7f13c0071b60 2026-03-10T14:05:12.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.682+0000 7f13c7397700 1 -- 192.168.123.103:0/2217592409 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f13c0115270 con 0x7f13c0071b60 2026-03-10T14:05:12.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.682+0000 7f13c7397700 1 -- 192.168.123.103:0/2217592409 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f13c01155f0 con 0x7f13c0071b60 2026-03-10T14:05:12.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.682+0000 7f13b67fc700 1 -- 192.168.123.103:0/2217592409 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f13b800f040 con 0x7f13c0071b60 2026-03-10T14:05:12.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.682+0000 7f13b67fc700 1 -- 192.168.123.103:0/2217592409 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13b80147b0 con 0x7f13c0071b60 2026-03-10T14:05:12.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.692+0000 7f13b67fc700 1 -- 192.168.123.103:0/2217592409 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f13b8014910 con 0x7f13c0071b60 2026-03-10T14:05:12.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.692+0000 7f13b67fc700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7f13ac06c6d0 0x7f13ac06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.692+0000 7f13b67fc700 1 -- 192.168.123.103:0/2217592409 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f13b808c8b0 con 0x7f13c0071b60 2026-03-10T14:05:12.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.693+0000 7f13c4932700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f13ac06c6d0 0x7f13ac06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.695+0000 7f13c7397700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] conn(0x7f13a4001610 0x7f13a4003ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:12.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.696+0000 7f13c7397700 1 -- 192.168.123.103:0/2217592409 --> [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f13a4006bf0 con 0x7f13a4001610 2026-03-10T14:05:12.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.696+0000 7f13c5934700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] conn(0x7f13a4001610 0x7f13a4003ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:12.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.696+0000 7f13c5934700 1 --2- 192.168.123.103:0/2217592409 >> 
[v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] conn(0x7f13a4001610 0x7f13a4003ac0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.697+0000 7f13b67fc700 1 -- 192.168.123.103:0/2217592409 <== osd.3 v2:192.168.123.104:6800/3770653735 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f13a4006bf0 con 0x7f13a4001610 2026-03-10T14:05:12.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.698+0000 7f13c4932700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f13ac06c6d0 0x7f13ac06eb80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f13bc005fd0 tx=0x7f13bc005bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:12.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.706+0000 7f13c7397700 1 -- 192.168.123.103:0/2217592409 --> [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f13a4005cd0 con 0x7f13a4001610 2026-03-10T14:05:12.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.707+0000 7f13b67fc700 1 -- 192.168.123.103:0/2217592409 <== osd.3 v2:192.168.123.104:6800/3770653735 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f13a4005cd0 con 0x7f13a4001610 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 -- 192.168.123.103:0/2217592409 >> [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] conn(0x7f13a4001610 msgr2=0x7f13a4003ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 --2- 192.168.123.103:0/2217592409 >> 
[v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] conn(0x7f13a4001610 0x7f13a4003ac0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 -- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f13ac06c6d0 msgr2=0x7f13ac06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f13ac06c6d0 0x7f13ac06eb80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f13bc005fd0 tx=0x7f13bc005bc0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 -- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 msgr2=0x7f13c0119750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 0x7f13c0119750 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f13b800c420 tx=0x7f13b800c7e0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 -- 192.168.123.103:0/2217592409 shutdown_connections 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.104:6800/3770653735,v1:192.168.123.104:6801/3770653735] conn(0x7f13a4001610 0x7f13a4003ac0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.719 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13c0071b60 0x7f13c0119750 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f13ac06c6d0 0x7f13ac06eb80 secure :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f13bc005fd0 tx=0x7f13bc005bc0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 --2- 192.168.123.103:0/2217592409 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f13c010eab0 0x7f13c01147a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:12.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.720+0000 7f13abfff700 1 -- 192.168.123.103:0/2217592409 >> 192.168.123.103:0/2217592409 conn(0x7f13c006c6c0 msgr2=0x7f13c0070520 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:12.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.724+0000 7f13abfff700 1 -- 192.168.123.103:0/2217592409 shutdown_connections 2026-03-10T14:05:12.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:12.726+0000 7f13abfff700 1 -- 192.168.123.103:0/2217592409 wait complete. 
2026-03-10T14:05:12.734 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:12 vm03 ceph-mon[49718]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:12.764 INFO:teuthology.orchestra.run.vm03.stdout:55834574858 2026-03-10T14:05:12.764 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd last-stat-seq osd.1 2026-03-10T14:05:12.787 INFO:teuthology.orchestra.run.vm03.stdout:38654705677 2026-03-10T14:05:12.787 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd last-stat-seq osd.0 2026-03-10T14:05:12.787 INFO:teuthology.orchestra.run.vm03.stdout:73014444040 2026-03-10T14:05:12.787 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd last-stat-seq osd.2 2026-03-10T14:05:12.791 INFO:teuthology.orchestra.run.vm03.stdout:137438953475 2026-03-10T14:05:12.791 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd last-stat-seq osd.5 2026-03-10T14:05:12.824 INFO:teuthology.orchestra.run.vm03.stdout:98784247814 2026-03-10T14:05:12.824 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd last-stat-seq osd.3 2026-03-10T14:05:13.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:12 vm04 ceph-mon[55966]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:13.072 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 
2026-03-10T14:05:13.297 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:13.350 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:13.385 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:13.399 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:13.566 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:13.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.651+0000 7f4c9bfff700 1 -- 192.168.123.103:0/644878118 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c071e40 msgr2=0x7f4c9c0722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.651+0000 7f4c9bfff700 1 --2- 192.168.123.103:0/644878118 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c071e40 0x7f4c9c0722b0 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f4c9400b600 tx=0x7f4c9400b910 comp rx=0 tx=0).stop 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.651+0000 7f4c9bfff700 1 -- 192.168.123.103:0/644878118 shutdown_connections 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.651+0000 7f4c9bfff700 1 --2- 192.168.123.103:0/644878118 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c071e40 0x7f4c9c0722b0 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.651+0000 7f4c9bfff700 1 --2- 
192.168.123.103:0/644878118 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c10c8f0 0x7f4c9c10ccc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.651+0000 7f4c9bfff700 1 -- 192.168.123.103:0/644878118 >> 192.168.123.103:0/644878118 conn(0x7f4c9c06c6c0 msgr2=0x7f4c9c06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.651+0000 7f4c9bfff700 1 -- 192.168.123.103:0/644878118 shutdown_connections 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.651+0000 7f4c9bfff700 1 -- 192.168.123.103:0/644878118 wait complete. 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.653+0000 7f4c9bfff700 1 Processor -- start 2026-03-10T14:05:13.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.654+0000 7f4c9bfff700 1 -- start start 2026-03-10T14:05:13.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.654+0000 7f4c9bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c10c8f0 0x7f4c9c075f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:13.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.654+0000 7f4c9bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c076490 0x7f4c9c076900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:13.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.655+0000 7f4c9bfff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c9c074300 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.655+0000 7f4c9bfff700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c9c074440 con 0x7f4c9c076490 2026-03-10T14:05:13.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.655+0000 7f4c9affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c10c8f0 0x7f4c9c075f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:13.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.655+0000 7f4c9affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c10c8f0 0x7f4c9c075f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42262/0 (socket says 192.168.123.103:42262) 2026-03-10T14:05:13.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.655+0000 7f4c9affd700 1 -- 192.168.123.103:0/751383164 learned_addr learned my addr 192.168.123.103:0/751383164 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:13.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.657+0000 7f4c9a7fc700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c076490 0x7f4c9c076900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:13.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.657+0000 7f4c9affd700 1 -- 192.168.123.103:0/751383164 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c076490 msgr2=0x7f4c9c076900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:13.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.657+0000 7f4c9affd700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f4c9c076490 0x7f4c9c076900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:13.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.657+0000 7f4c9affd700 1 -- 192.168.123.103:0/751383164 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4c9400b050 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.657+0000 7f4c9affd700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c10c8f0 0x7f4c9c075f50 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f4c8c00ed70 tx=0x7f4c8c00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:13.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.657+0000 7f4ca08b4700 1 -- 192.168.123.103:0/751383164 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c8c009980 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.657+0000 7f4ca08b4700 1 -- 192.168.123.103:0/751383164 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4c8c00cd70 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.657+0000 7f4ca08b4700 1 -- 192.168.123.103:0/751383164 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c8c0189c0 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.658+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c9c0746d0 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.658+0000 7f4c9bfff700 1 -- 
192.168.123.103:0/751383164 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c9c074bf0 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.658+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4c9c04f2a0 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.661+0000 7f4ca08b4700 1 -- 192.168.123.103:0/751383164 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4c8c018b20 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.665+0000 7f4ca08b4700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c8406c6e0 0x7f4c8406eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:13.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.665+0000 7f4ca08b4700 1 -- 192.168.123.103:0/751383164 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4c8c014070 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.665+0000 7f4ca08b4700 1 -- 192.168.123.103:0/751383164 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4c8c08c660 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.666+0000 7f4c9a7fc700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c8406c6e0 0x7f4c8406eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T14:05:13.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.666+0000 7f4c9a7fc700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c8406c6e0 0x7f4c8406eb90 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f4c9400bce0 tx=0x7f4c94006040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:13.804 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:13 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:05:13.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.902+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7f4c9c04ea50 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.904 INFO:teuthology.orchestra.run.vm03.stdout:120259084293 2026-03-10T14:05:13.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.903+0000 7f4ca08b4700 1 -- 192.168.123.103:0/751383164 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f4c8c057040 con 0x7f4c9c10c8f0 2026-03-10T14:05:13.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.927+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c8406c6e0 msgr2=0x7f4c8406eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:13.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.927+0000 7f4c9bfff700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c8406c6e0 0x7f4c8406eb90 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f4c9400bce0 tx=0x7f4c94006040 comp rx=0 tx=0).stop 
2026-03-10T14:05:13.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.927+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c10c8f0 msgr2=0x7f4c9c075f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:13.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.927+0000 7f4c9bfff700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c10c8f0 0x7f4c9c075f50 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f4c8c00ed70 tx=0x7f4c8c00c5b0 comp rx=0 tx=0).stop 2026-03-10T14:05:13.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.931+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 shutdown_connections 2026-03-10T14:05:13.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.931+0000 7f4c9bfff700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c9c10c8f0 0x7f4c9c075f50 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:13.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.931+0000 7f4c9bfff700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c8406c6e0 0x7f4c8406eb90 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:13.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.931+0000 7f4c9bfff700 1 --2- 192.168.123.103:0/751383164 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4c9c076490 0x7f4c9c076900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:13.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.931+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 >> 192.168.123.103:0/751383164 conn(0x7f4c9c06c6c0 msgr2=0x7f4c9c06fff0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:05:13.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.931+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 shutdown_connections 2026-03-10T14:05:13.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:13.931+0000 7f4c9bfff700 1 -- 192.168.123.103:0/751383164 wait complete. 2026-03-10T14:05:14.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.028+0000 7fd79669f700 1 -- 192.168.123.103:0/3926321704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd79006d7a0 msgr2=0x7fd79006dc10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.028+0000 7fd79669f700 1 --2- 192.168.123.103:0/3926321704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd79006d7a0 0x7fd79006dc10 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fd780009b00 tx=0x7fd780009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:14.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.028+0000 7fd78ffff700 1 --2- 192.168.123.103:0/3926321704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 0x7fd79006d260 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:05:14.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.029+0000 7fd79669f700 1 -- 192.168.123.103:0/3926321704 shutdown_connections 2026-03-10T14:05:14.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.029+0000 7fd79669f700 1 --2- 192.168.123.103:0/3926321704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd79006d7a0 0x7fd79006dc10 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.029+0000 7fd79669f700 1 --2- 192.168.123.103:0/3926321704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 0x7fd79006d260 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.029+0000 7fd79669f700 1 -- 192.168.123.103:0/3926321704 >> 192.168.123.103:0/3926321704 conn(0x7fd79006c830 msgr2=0x7fd790071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.037+0000 7fd79669f700 1 -- 192.168.123.103:0/3926321704 shutdown_connections 2026-03-10T14:05:14.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.037+0000 7fd79669f700 1 -- 192.168.123.103:0/3926321704 wait complete. 
2026-03-10T14:05:14.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.038+0000 7fd79669f700 1 Processor -- start 2026-03-10T14:05:14.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.039+0000 7fd79669f700 1 -- start start 2026-03-10T14:05:14.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.039+0000 7fd79669f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd79006d7a0 0x7fd790115c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.039+0000 7fd79669f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 0x7fd790116150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.039+0000 7fd79669f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd790116690 con 0x7fd79010ed80 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.039+0000 7fd79669f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd790116800 con 0x7fd79006d7a0 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.041+0000 7fd78f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 0x7fd790116150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.041+0000 7fd78f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 0x7fd790116150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:42290/0 (socket says 192.168.123.103:42290) 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.041+0000 7fd78f7fe700 1 -- 192.168.123.103:0/223012811 learned_addr learned my addr 192.168.123.103:0/223012811 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd78f7fe700 1 -- 192.168.123.103:0/223012811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd79006d7a0 msgr2=0x7fd790115c10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd78f7fe700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd79006d7a0 0x7fd790115c10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd78f7fe700 1 -- 192.168.123.103:0/223012811 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd7800097e0 con 0x7fd79010ed80 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd78f7fe700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 0x7fd790116150 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7fd78000b5c0 tx=0x7fd780004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd78d7fa700 1 -- 192.168.123.103:0/223012811 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd78001d070 con 0x7fd79010ed80 2026-03-10T14:05:14.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd78d7fa700 1 -- 
192.168.123.103:0/223012811 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd78000bc50 con 0x7fd79010ed80 2026-03-10T14:05:14.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd78d7fa700 1 -- 192.168.123.103:0/223012811 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd78000f860 con 0x7fd79010ed80 2026-03-10T14:05:14.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd79669f700 1 -- 192.168.123.103:0/223012811 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd7901b7550 con 0x7fd79010ed80 2026-03-10T14:05:14.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.042+0000 7fd79669f700 1 -- 192.168.123.103:0/223012811 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd7901b7900 con 0x7fd79010ed80 2026-03-10T14:05:14.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.045+0000 7fd79669f700 1 -- 192.168.123.103:0/223012811 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd79004ea50 con 0x7fd79010ed80 2026-03-10T14:05:14.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.048+0000 7fd78d7fa700 1 -- 192.168.123.103:0/223012811 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd78000f9c0 con 0x7fd79010ed80 2026-03-10T14:05:14.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.048+0000 7fd78d7fa700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd77806c680 0x7fd77806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.048+0000 7fd78d7fa700 1 -- 192.168.123.103:0/223012811 <== mon.0 
v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd78008cad0 con 0x7fd79010ed80 2026-03-10T14:05:14.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.048+0000 7fd78d7fa700 1 -- 192.168.123.103:0/223012811 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd780090280 con 0x7fd79010ed80 2026-03-10T14:05:14.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.051+0000 7fd78ffff700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd77806c680 0x7fd77806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.052+0000 7fd78ffff700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd77806c680 0x7fd77806eb30 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fd78400a9b0 tx=0x7fd784005c90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:13 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:05:14.142 INFO:tasks.cephadm.ceph_manager.ceph:need seq 120259084293 got 120259084293 for osd.4 2026-03-10T14:05:14.142 DEBUG:teuthology.parallel:result is None 2026-03-10T14:05:14.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.232+0000 7f8ec72c1700 1 -- 192.168.123.103:0/1313931004 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec0071b60 msgr2=0x7f8ec0071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.232 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.232+0000 7f8ec72c1700 1 --2- 192.168.123.103:0/1313931004 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec0071b60 0x7f8ec0071fd0 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f8eb0009b50 tx=0x7f8eb0009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:14.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.245+0000 7f8ec72c1700 1 -- 192.168.123.103:0/1313931004 shutdown_connections 2026-03-10T14:05:14.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.245+0000 7f8ec72c1700 1 --2- 192.168.123.103:0/1313931004 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec0071b60 0x7f8ec0071fd0 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.246+0000 7f8ec72c1700 1 --2- 192.168.123.103:0/1313931004 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8ec010e9e0 0x7f8ec010edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.246+0000 7f8ec72c1700 1 -- 192.168.123.103:0/1313931004 >> 192.168.123.103:0/1313931004 conn(0x7f8ec006c6c0 msgr2=0x7f8ec006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.252+0000 7f8ec72c1700 1 -- 192.168.123.103:0/1313931004 shutdown_connections 2026-03-10T14:05:14.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.252+0000 7f8ec72c1700 1 -- 192.168.123.103:0/1313931004 wait complete. 
2026-03-10T14:05:14.252 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.253+0000 7f8ec72c1700 1 Processor -- start 2026-03-10T14:05:14.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.258+0000 7f8ec72c1700 1 -- start start 2026-03-10T14:05:14.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.258+0000 7f8ec72c1700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8ec0071b60 0x7f8ec019c510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.258+0000 7f8ec72c1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec010e9e0 0x7f8ec019ca50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.258+0000 7f8ec62bf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8ec0071b60 0x7f8ec019c510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.258+0000 7f8ec62bf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8ec0071b60 0x7f8ec019c510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:52364/0 (socket says 192.168.123.103:52364) 2026-03-10T14:05:14.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.259+0000 7f8ec72c1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ec019d130 con 0x7f8ec010e9e0 2026-03-10T14:05:14.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.259+0000 7f8ec72c1700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f8ec01a0ec0 con 0x7f8ec0071b60 2026-03-10T14:05:14.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.260+0000 7f8ec5abe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec010e9e0 0x7f8ec019ca50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.260+0000 7f8ec5abe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec010e9e0 0x7f8ec019ca50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42322/0 (socket says 192.168.123.103:42322) 2026-03-10T14:05:14.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.260+0000 7f8ec5abe700 1 -- 192.168.123.103:0/2760340398 learned_addr learned my addr 192.168.123.103:0/2760340398 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:14.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.260+0000 7f8ec5abe700 1 -- 192.168.123.103:0/2760340398 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8ec0071b60 msgr2=0x7f8ec019c510 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.260+0000 7f8ec5abe700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8ec0071b60 0x7f8ec019c510 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.260+0000 7f8ec5abe700 1 -- 192.168.123.103:0/2760340398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8eb00097e0 con 0x7f8ec010e9e0 2026-03-10T14:05:14.259 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.261+0000 7f8ec5abe700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec010e9e0 0x7f8ec019ca50 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f8eb0005950 tx=0x7f8eb0004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.261+0000 7f8eb77fe700 1 -- 192.168.123.103:0/2760340398 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8eb001d070 con 0x7f8ec010e9e0 2026-03-10T14:05:14.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.261+0000 7f8eb77fe700 1 -- 192.168.123.103:0/2760340398 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8eb0022470 con 0x7f8ec010e9e0 2026-03-10T14:05:14.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.262+0000 7f8eb77fe700 1 -- 192.168.123.103:0/2760340398 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8eb000f670 con 0x7f8ec010e9e0 2026-03-10T14:05:14.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.264+0000 7f8ec72c1700 1 -- 192.168.123.103:0/2760340398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ec01a1140 con 0x7f8ec010e9e0 2026-03-10T14:05:14.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.264+0000 7f8ec72c1700 1 -- 192.168.123.103:0/2760340398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ec01a15b0 con 0x7f8ec010e9e0 2026-03-10T14:05:14.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.264+0000 7f8eb57fa700 1 -- 192.168.123.103:0/2760340398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f8ea80052f0 con 0x7f8ec010e9e0 2026-03-10T14:05:14.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.265+0000 7f8eb77fe700 1 -- 192.168.123.103:0/2760340398 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8eb000bac0 con 0x7f8ec010e9e0 2026-03-10T14:05:14.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.266+0000 7f8eb77fe700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8eac06c520 0x7f8eac06e9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.266+0000 7f8eb77fe700 1 -- 192.168.123.103:0/2760340398 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f8eb008d340 con 0x7f8ec010e9e0 2026-03-10T14:05:14.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.266+0000 7f8ec62bf700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8eac06c520 0x7f8eac06e9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.266+0000 7f8ec62bf700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8eac06c520 0x7f8eac06e9d0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f8ebc01ae60 tx=0x7f8ebc01a460 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.268+0000 7f8eb77fe700 1 -- 192.168.123.103:0/2760340398 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8eb0057e80 con 0x7f8ec010e9e0 
2026-03-10T14:05:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.322+0000 7f5ae3fff700 1 -- 192.168.123.103:0/3098381424 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 msgr2=0x7f5ae4071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.322+0000 7f5ae3fff700 1 --2- 192.168.123.103:0/3098381424 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 0x7f5ae4071fd0 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f5ad8009b50 tx=0x7f5ad8009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.325+0000 7f5ae3fff700 1 -- 192.168.123.103:0/3098381424 shutdown_connections 2026-03-10T14:05:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.325+0000 7f5ae3fff700 1 --2- 192.168.123.103:0/3098381424 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 0x7f5ae4071fd0 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.325+0000 7f5ae3fff700 1 --2- 192.168.123.103:0/3098381424 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ae410e9e0 0x7f5ae410edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.325+0000 7f5ae3fff700 1 -- 192.168.123.103:0/3098381424 >> 192.168.123.103:0/3098381424 conn(0x7f5ae406c6c0 msgr2=0x7f5ae406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.326+0000 7f5ae3fff700 1 -- 192.168.123.103:0/3098381424 shutdown_connections 2026-03-10T14:05:14.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.326+0000 7f5ae3fff700 1 -- 192.168.123.103:0/3098381424 
wait complete. 2026-03-10T14:05:14.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.326+0000 7f5ae3fff700 1 Processor -- start 2026-03-10T14:05:14.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.327+0000 7f5ae3fff700 1 -- start start 2026-03-10T14:05:14.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.327+0000 7f5ae3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 0x7f5ae419c510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.327+0000 7f5ae3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ae410e9e0 0x7f5ae419ca50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.327+0000 7f5ae3fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ae419d130 con 0x7f5ae4071b60 2026-03-10T14:05:14.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.327+0000 7f5ae3fff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ae41a0ec0 con 0x7f5ae410e9e0 2026-03-10T14:05:14.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.328+0000 7f5ae2ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 0x7f5ae419c510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.328+0000 7f5ae2ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 0x7f5ae419c510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:42332/0 (socket says 192.168.123.103:42332) 2026-03-10T14:05:14.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.328+0000 7f5ae2ffd700 1 -- 192.168.123.103:0/4027366355 learned_addr learned my addr 192.168.123.103:0/4027366355 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:14.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.328+0000 7f5ae27fc700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ae410e9e0 0x7f5ae419ca50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.328+0000 7f5ae2ffd700 1 -- 192.168.123.103:0/4027366355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ae410e9e0 msgr2=0x7f5ae419ca50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.329+0000 7f5ae2ffd700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ae410e9e0 0x7f5ae419ca50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.329+0000 7f5ae2ffd700 1 -- 192.168.123.103:0/4027366355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5ad80097e0 con 0x7f5ae4071b60 2026-03-10T14:05:14.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.329+0000 7f5ae27fc700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ae410e9e0 0x7f5ae419ca50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:05:14.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.330+0000 7f5ae2ffd700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 0x7f5ae419c510 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f5ad401afb0 tx=0x7f5ad401d670 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.331+0000 7f5acbfff700 1 -- 192.168.123.103:0/4027366355 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ad4021440 con 0x7f5ae4071b60 2026-03-10T14:05:14.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.331+0000 7f5acbfff700 1 -- 192.168.123.103:0/4027366355 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5ad4021a80 con 0x7f5ae4071b60 2026-03-10T14:05:14.330 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.331+0000 7f5acbfff700 1 -- 192.168.123.103:0/4027366355 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ad4029c20 con 0x7f5ae4071b60 2026-03-10T14:05:14.330 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.331+0000 7f5ae3fff700 1 -- 192.168.123.103:0/4027366355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ae41a11a0 con 0x7f5ae4071b60 2026-03-10T14:05:14.331 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.332+0000 7f5ae3fff700 1 -- 192.168.123.103:0/4027366355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ae41a1670 con 0x7f5ae4071b60 2026-03-10T14:05:14.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.335+0000 7f5ae3fff700 1 -- 192.168.123.103:0/4027366355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f5ae404f2a0 con 0x7f5ae4071b60 2026-03-10T14:05:14.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.340+0000 7f5acbfff700 1 -- 192.168.123.103:0/4027366355 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5ad40215a0 con 0x7f5ae4071b60 2026-03-10T14:05:14.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.340+0000 7f5acbfff700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5acc06c480 0x7f5acc06e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.342+0000 7f5acbfff700 1 -- 192.168.123.103:0/4027366355 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f5ad4025070 con 0x7f5ae4071b60 2026-03-10T14:05:14.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.344+0000 7f5ae27fc700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5acc06c480 0x7f5acc06e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.344+0000 7f5acbfff700 1 -- 192.168.123.103:0/4027366355 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5ad4067d80 con 0x7f5ae4071b60 2026-03-10T14:05:14.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.346+0000 7fd79669f700 1 -- 192.168.123.103:0/223012811 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7fd790066e40 con 0x7fd79010ed80 2026-03-10T14:05:14.345 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.346+0000 7f5ae27fc700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5acc06c480 0x7f5acc06e930 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f5ad800b5c0 tx=0x7f5ad8005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.359+0000 7fd78d7fa700 1 -- 192.168.123.103:0/223012811 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fd780057640 con 0x7fd79010ed80 2026-03-10T14:05:14.358 INFO:teuthology.orchestra.run.vm03.stdout:73014444041 2026-03-10T14:05:14.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.363+0000 7fd776ffd700 1 -- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd77806c680 msgr2=0x7fd77806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.363+0000 7fd776ffd700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd77806c680 0x7fd77806eb30 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fd78400a9b0 tx=0x7fd784005c90 comp rx=0 tx=0).stop 2026-03-10T14:05:14.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.363+0000 7fd776ffd700 1 -- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 msgr2=0x7fd790116150 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.363+0000 7fd776ffd700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 0x7fd790116150 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto 
rx=0x7fd78000b5c0 tx=0x7fd780004990 comp rx=0 tx=0).stop 2026-03-10T14:05:14.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.366+0000 7fd776ffd700 1 -- 192.168.123.103:0/223012811 shutdown_connections 2026-03-10T14:05:14.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.366+0000 7fd776ffd700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd77806c680 0x7fd77806eb30 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.366+0000 7fd776ffd700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd79006d7a0 0x7fd790115c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.366+0000 7fd776ffd700 1 --2- 192.168.123.103:0/223012811 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd79010ed80 0x7fd790116150 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.366+0000 7fd776ffd700 1 -- 192.168.123.103:0/223012811 >> 192.168.123.103:0/223012811 conn(0x7fd79006c830 msgr2=0x7fd7900711a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.368+0000 7fd776ffd700 1 -- 192.168.123.103:0/223012811 shutdown_connections 2026-03-10T14:05:14.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.369+0000 7fd776ffd700 1 -- 192.168.123.103:0/223012811 wait complete. 
2026-03-10T14:05:14.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.371+0000 7f3fe696a700 1 -- 192.168.123.103:0/4024419796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fe010c8b0 msgr2=0x7f3fe010cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.371+0000 7f3fe696a700 1 --2- 192.168.123.103:0/4024419796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fe010c8b0 0x7f3fe010cc80 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f3fd800b3a0 tx=0x7f3fd800b6b0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.371+0000 7f3fe696a700 1 -- 192.168.123.103:0/4024419796 shutdown_connections 2026-03-10T14:05:14.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.371+0000 7f3fe696a700 1 --2- 192.168.123.103:0/4024419796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3fe0071e40 0x7f3fe00722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.371+0000 7f3fe696a700 1 --2- 192.168.123.103:0/4024419796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fe010c8b0 0x7f3fe010cc80 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.371+0000 7f3fe696a700 1 -- 192.168.123.103:0/4024419796 >> 192.168.123.103:0/4024419796 conn(0x7f3fe006c6c0 msgr2=0x7f3fe006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.372+0000 7f3fe696a700 1 -- 192.168.123.103:0/4024419796 shutdown_connections 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.373+0000 7f3fe696a700 1 -- 192.168.123.103:0/4024419796 
wait complete. 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.373+0000 7f3fe696a700 1 Processor -- start 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.373+0000 7f3fe696a700 1 -- start start 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fe696a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3fe0071e40 0x7f3fe007d1a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fe696a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fe010c8b0 0x7f3fe007d6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fe696a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3fe007dcb0 con 0x7f3fe0071e40 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fe696a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3fe007de20 con 0x7f3fe010c8b0 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fdffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3fe0071e40 0x7f3fe007d1a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fdffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3fe0071e40 0x7f3fe007d1a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:42342/0 (socket says 192.168.123.103:42342) 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fdffff700 1 -- 192.168.123.103:0/1154027704 learned_addr learned my addr 192.168.123.103:0/1154027704 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fdffff700 1 -- 192.168.123.103:0/1154027704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fe010c8b0 msgr2=0x7f3fe007d6e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fdffff700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fe010c8b0 0x7f3fe007d6e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.374+0000 7f3fdffff700 1 -- 192.168.123.103:0/1154027704 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3fd0009710 con 0x7f3fe0071e40 2026-03-10T14:05:14.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.375+0000 7f3fdffff700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3fe0071e40 0x7f3fe007d1a0 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f3fd8015040 tx=0x7f3fd8003fe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.376+0000 7f3fdd7fa700 1 -- 192.168.123.103:0/1154027704 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3fd800e040 con 0x7f3fe0071e40 2026-03-10T14:05:14.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.378+0000 7f3fe696a700 1 -- 
192.168.123.103:0/1154027704 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3fd800b050 con 0x7f3fe0071e40 2026-03-10T14:05:14.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.378+0000 7f3fe696a700 1 -- 192.168.123.103:0/1154027704 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3fe0081f70 con 0x7f3fe0071e40 2026-03-10T14:05:14.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.379+0000 7f3fdd7fa700 1 -- 192.168.123.103:0/1154027704 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3fd80047d0 con 0x7f3fe0071e40 2026-03-10T14:05:14.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.379+0000 7f3fdd7fa700 1 -- 192.168.123.103:0/1154027704 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3fd801dcd0 con 0x7f3fe0071e40 2026-03-10T14:05:14.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.380+0000 7f3fdd7fa700 1 -- 192.168.123.103:0/1154027704 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f3fd8019070 con 0x7f3fe0071e40 2026-03-10T14:05:14.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.381+0000 7f3fdd7fa700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3fc806c7a0 0x7f3fc806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.381+0000 7f3fdf7fe700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3fc806c7a0 0x7f3fc806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.381 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.383+0000 7f3fdd7fa700 1 -- 192.168.123.103:0/1154027704 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f3fd808e2d0 con 0x7f3fe0071e40 2026-03-10T14:05:14.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.383+0000 7f3fdf7fe700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3fc806c7a0 0x7f3fc806ec50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f3fe007e7c0 tx=0x7f3fd0009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.383+0000 7f3fc6ffd700 1 -- 192.168.123.103:0/1154027704 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3fcc005320 con 0x7f3fe0071e40 2026-03-10T14:05:14.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.386+0000 7f3fdd7fa700 1 -- 192.168.123.103:0/1154027704 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3fd8058e10 con 0x7f3fe0071e40 2026-03-10T14:05:14.475 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.476+0000 7f40f5668700 1 -- 192.168.123.103:0/2817000373 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f010e9e0 msgr2=0x7f40f010edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.475 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.476+0000 7f40f5668700 1 --2- 192.168.123.103:0/2817000373 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f010e9e0 0x7f40f010edb0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f40e0009a60 tx=0x7f40e0009d70 comp rx=0 tx=0).stop 2026-03-10T14:05:14.475 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.477+0000 7f40f5668700 1 -- 192.168.123.103:0/2817000373 shutdown_connections 2026-03-10T14:05:14.475 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.477+0000 7f40f5668700 1 --2- 192.168.123.103:0/2817000373 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40f0071b60 0x7f40f0071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.477+0000 7f40f5668700 1 --2- 192.168.123.103:0/2817000373 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f010e9e0 0x7f40f010edb0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.477+0000 7f40f5668700 1 -- 192.168.123.103:0/2817000373 >> 192.168.123.103:0/2817000373 conn(0x7f40f006c6c0 msgr2=0x7f40f006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.477+0000 7f40f5668700 1 -- 192.168.123.103:0/2817000373 shutdown_connections 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.477+0000 7f40f5668700 1 -- 192.168.123.103:0/2817000373 wait complete. 
2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.477+0000 7f40f5668700 1 Processor -- start 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.477+0000 7f40f5668700 1 -- start start 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40f5668700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f0071b60 0x7f40f01194b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40f5668700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40f010e9e0 0x7f40f01144b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40f5668700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40f01149f0 con 0x7f40f010e9e0 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40f5668700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40f0114b60 con 0x7f40f0071b60 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40effff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f0071b60 0x7f40f01194b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40effff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f0071b60 0x7f40f01194b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:52390/0 (socket says 192.168.123.103:52390) 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40effff700 1 -- 192.168.123.103:0/2951734797 learned_addr learned my addr 192.168.123.103:0/2951734797 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40ef7fe700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40f010e9e0 0x7f40f01144b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40effff700 1 -- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40f010e9e0 msgr2=0x7f40f01144b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40effff700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40f010e9e0 0x7f40f01144b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40effff700 1 -- 192.168.123.103:0/2951734797 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40e0009710 con 0x7f40f0071b60 2026-03-10T14:05:14.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.478+0000 7f40effff700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f0071b60 0x7f40f01194b0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f40e0004150 tx=0x7f40e0004230 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:05:14.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.488+0000 7f40ed7fa700 1 -- 192.168.123.103:0/2951734797 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40e001d070 con 0x7f40f0071b60 2026-03-10T14:05:14.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.488+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f40f0114de0 con 0x7f40f0071b60 2026-03-10T14:05:14.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.488+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f40f01152d0 con 0x7f40f0071b60 2026-03-10T14:05:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.488+0000 7f40ed7fa700 1 -- 192.168.123.103:0/2951734797 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f40e00037c0 con 0x7f40f0071b60 2026-03-10T14:05:14.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.488+0000 7f40ed7fa700 1 -- 192.168.123.103:0/2951734797 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40e000f9b0 con 0x7f40f0071b60 2026-03-10T14:05:14.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.490+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f40dc005320 con 0x7f40f0071b60 2026-03-10T14:05:14.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.490+0000 7f40ed7fa700 1 -- 192.168.123.103:0/2951734797 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f40e0022620 con 0x7f40f0071b60 2026-03-10T14:05:14.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.496+0000 
7f40ed7fa700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f40d806c7a0 0x7f40d806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:14.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.496+0000 7f40ef7fe700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f40d806c7a0 0x7f40d806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:14.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.496+0000 7f40ef7fe700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f40d806c7a0 0x7f40d806ec50 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f40f0115ca0 tx=0x7f40e4009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:14.497 INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444040 got 73014444041 for osd.2 2026-03-10T14:05:14.498 DEBUG:teuthology.parallel:result is None 2026-03-10T14:05:14.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.497+0000 7f40ed7fa700 1 -- 192.168.123.103:0/2951734797 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f40e008c7c0 con 0x7f40f0071b60 2026-03-10T14:05:14.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.497+0000 7f40ed7fa700 1 -- 192.168.123.103:0/2951734797 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f40e008ed30 con 0x7f40f0071b60 2026-03-10T14:05:14.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.564+0000 7f8eb57fa700 1 -- 192.168.123.103:0/2760340398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7f8ea8005c90 con 0x7f8ec010e9e0 2026-03-10T14:05:14.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.565+0000 7f8eb77fe700 1 -- 192.168.123.103:0/2760340398 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f8eb005b4a0 con 0x7f8ec010e9e0 2026-03-10T14:05:14.564 INFO:teuthology.orchestra.run.vm03.stdout:55834574858 2026-03-10T14:05:14.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 -- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8eac06c520 msgr2=0x7f8eac06e9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8eac06c520 0x7f8eac06e9d0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f8ebc01ae60 tx=0x7f8ebc01a460 comp rx=0 tx=0).stop 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 -- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec010e9e0 msgr2=0x7f8ec019ca50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec010e9e0 0x7f8ec019ca50 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f8eb0005950 tx=0x7f8eb0004c30 comp rx=0 tx=0).stop 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 -- 192.168.123.103:0/2760340398 shutdown_connections 2026-03-10T14:05:14.567 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8eac06c520 0x7f8eac06e9d0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8ec0071b60 0x7f8ec019c510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 --2- 192.168.123.103:0/2760340398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ec010e9e0 0x7f8ec019ca50 secure :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7f8eb0005950 tx=0x7f8eb0004c30 comp rx=0 tx=0).stop 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 -- 192.168.123.103:0/2760340398 >> 192.168.123.103:0/2760340398 conn(0x7f8ec006c6c0 msgr2=0x7f8ec006cfa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.568+0000 7f8ec72c1700 1 -- 192.168.123.103:0/2760340398 shutdown_connections 2026-03-10T14:05:14.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.569+0000 7f8ec72c1700 1 -- 192.168.123.103:0/2760340398 wait complete. 
2026-03-10T14:05:14.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.626+0000 7f5ae3fff700 1 -- 192.168.123.103:0/4027366355 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f5ae404ea50 con 0x7f5ae4071b60 2026-03-10T14:05:14.627 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574858 got 55834574858 for osd.1 2026-03-10T14:05:14.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.628+0000 7f5acbfff700 1 -- 192.168.123.103:0/4027366355 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f5ad406b3a0 con 0x7f5ae4071b60 2026-03-10T14:05:14.627 INFO:teuthology.orchestra.run.vm03.stdout:137438953475 2026-03-10T14:05:14.627 DEBUG:teuthology.parallel:result is None 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.632+0000 7f5ac9ffb700 1 -- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5acc06c480 msgr2=0x7f5acc06e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.632+0000 7f5ac9ffb700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5acc06c480 0x7f5acc06e930 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f5ad800b5c0 tx=0x7f5ad8005fb0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.632+0000 7f5ac9ffb700 1 -- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 msgr2=0x7f5ae419c510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.632+0000 7f5ac9ffb700 1 --2- 192.168.123.103:0/4027366355 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 0x7f5ae419c510 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f5ad401afb0 tx=0x7f5ad401d670 comp rx=0 tx=0).stop 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.636+0000 7f5ac9ffb700 1 -- 192.168.123.103:0/4027366355 shutdown_connections 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.636+0000 7f5ac9ffb700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae4071b60 0x7f5ae419c510 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.636+0000 7f5ac9ffb700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5acc06c480 0x7f5acc06e930 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.636+0000 7f5ac9ffb700 1 --2- 192.168.123.103:0/4027366355 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ae410e9e0 0x7f5ae419ca50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.636+0000 7f5ac9ffb700 1 -- 192.168.123.103:0/4027366355 >> 192.168.123.103:0/4027366355 conn(0x7f5ae406c6c0 msgr2=0x7f5ae406cfa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.636+0000 7f5ac9ffb700 1 -- 192.168.123.103:0/4027366355 shutdown_connections 2026-03-10T14:05:14.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.636+0000 7f5ac9ffb700 1 -- 192.168.123.103:0/4027366355 wait complete. 
2026-03-10T14:05:14.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.690+0000 7f3fc6ffd700 1 -- 192.168.123.103:0/1154027704 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7f3fcc005190 con 0x7f3fe0071e40 2026-03-10T14:05:14.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.690+0000 7f3fdd7fa700 1 -- 192.168.123.103:0/1154027704 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f3fd8021020 con 0x7f3fe0071e40 2026-03-10T14:05:14.689 INFO:teuthology.orchestra.run.vm03.stdout:38654705677 2026-03-10T14:05:14.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.691+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7f40dc005190 con 0x7f40f0071b60 2026-03-10T14:05:14.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.692+0000 7f40ed7fa700 1 -- 192.168.123.103:0/2951734797 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f40e00573b0 con 0x7f40f0071b60 2026-03-10T14:05:14.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 -- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3fc806c7a0 msgr2=0x7f3fc806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3fc806c7a0 0x7f3fc806ec50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f3fe007e7c0 tx=0x7f3fd0009450 comp rx=0 tx=0).stop 2026-03-10T14:05:14.693 
INFO:teuthology.orchestra.run.vm03.stdout:98784247814 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 -- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3fe0071e40 msgr2=0x7f3fe007d1a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3fe0071e40 0x7f3fe007d1a0 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f3fd8015040 tx=0x7f3fd8003fe0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 -- 192.168.123.103:0/1154027704 shutdown_connections 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3fe0071e40 0x7f3fe007d1a0 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3fc806c7a0 0x7f3fc806ec50 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 --2- 192.168.123.103:0/1154027704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3fe010c8b0 0x7f3fe007d6e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.695+0000 7f3fc6ffd700 1 -- 192.168.123.103:0/1154027704 >> 192.168.123.103:0/1154027704 
conn(0x7f3fe006c6c0 msgr2=0x7f3fe00703e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.696+0000 7f3fc6ffd700 1 -- 192.168.123.103:0/1154027704 shutdown_connections 2026-03-10T14:05:14.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.696+0000 7f3fc6ffd700 1 -- 192.168.123.103:0/1154027704 wait complete. 2026-03-10T14:05:14.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.696+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f40d806c7a0 msgr2=0x7f40d806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.696+0000 7f40f5668700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f40d806c7a0 0x7f40d806ec50 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f40f0115ca0 tx=0x7f40e4009500 comp rx=0 tx=0).stop 2026-03-10T14:05:14.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.696+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f0071b60 msgr2=0x7f40f01194b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:14.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.696+0000 7f40f5668700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f0071b60 0x7f40f01194b0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f40e0004150 tx=0x7f40e0004230 comp rx=0 tx=0).stop 2026-03-10T14:05:14.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.699+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 shutdown_connections 2026-03-10T14:05:14.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.699+0000 7f40f5668700 1 --2- 192.168.123.103:0/2951734797 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f40d806c7a0 0x7f40d806ec50 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.699+0000 7f40f5668700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40f0071b60 0x7f40f01194b0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.699+0000 7f40f5668700 1 --2- 192.168.123.103:0/2951734797 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40f010e9e0 0x7f40f01144b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:14.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.699+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 >> 192.168.123.103:0/2951734797 conn(0x7f40f006c6c0 msgr2=0x7f40f006cf60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:14.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.699+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 shutdown_connections 2026-03-10T14:05:14.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:14.699+0000 7f40f5668700 1 -- 192.168.123.103:0/2951734797 wait complete. 
2026-03-10T14:05:14.765 INFO:tasks.cephadm.ceph_manager.ceph:need seq 98784247814 got 98784247814 for osd.3 2026-03-10T14:05:14.765 DEBUG:teuthology.parallel:result is None 2026-03-10T14:05:14.772 INFO:tasks.cephadm.ceph_manager.ceph:need seq 137438953475 got 137438953475 for osd.5 2026-03-10T14:05:14.772 DEBUG:teuthology.parallel:result is None 2026-03-10T14:05:14.801 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705677 got 38654705677 for osd.0 2026-03-10T14:05:14.801 DEBUG:teuthology.parallel:result is None 2026-03-10T14:05:14.801 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean 2026-03-10T14:05:14.801 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph pg dump --format=json 2026-03-10T14:05:14.859 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:14 vm03 ceph-mon[49718]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:14.860 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:14 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/751383164' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T14:05:14.860 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:14 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/223012811' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T14:05:14.860 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:14 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/2760340398' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T14:05:14.860 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:14 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/4027366355' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T14:05:14.860 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:14 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/1154027704' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T14:05:14.860 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:14 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/2951734797' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T14:05:15.000 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:15.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:14 vm04 ceph-mon[55966]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:15.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:14 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/751383164' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T14:05:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:14 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/223012811' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T14:05:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:14 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/2760340398' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T14:05:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:14 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/4027366355' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T14:05:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:14 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/1154027704' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T14:05:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:14 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/2951734797' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T14:05:15.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.245+0000 7f87db1ef700 1 -- 192.168.123.103:0/3940182105 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 msgr2=0x7f87d4072fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:15.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.245+0000 7f87db1ef700 1 --2- 192.168.123.103:0/3940182105 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 0x7f87d4072fc0 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f87c4009b00 tx=0x7f87c4009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:15.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.246+0000 7f87db1ef700 1 -- 192.168.123.103:0/3940182105 shutdown_connections 2026-03-10T14:05:15.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.246+0000 7f87db1ef700 1 --2- 192.168.123.103:0/3940182105 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87d4073500 0x7f87d4073970 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.246+0000 7f87db1ef700 1 --2- 192.168.123.103:0/3940182105 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 0x7f87d4072fc0 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.246+0000 7f87db1ef700 1 -- 192.168.123.103:0/3940182105 >> 192.168.123.103:0/3940182105 conn(0x7f87d4078580 msgr2=0x7f87d4078980 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:15.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.246+0000 7f87db1ef700 1 -- 192.168.123.103:0/3940182105 shutdown_connections 2026-03-10T14:05:15.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.246+0000 7f87db1ef700 1 -- 192.168.123.103:0/3940182105 wait complete. 2026-03-10T14:05:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87db1ef700 1 Processor -- start 2026-03-10T14:05:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87db1ef700 1 -- start start 2026-03-10T14:05:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87db1ef700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87d4073500 0x7f87d41a2570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87db1ef700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 0x7f87d41a2ab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87db1ef700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87d41a30b0 con 0x7f87d4074d90 2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87db1ef700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87d419c5f0 con 0x7f87d4073500 2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87d8f8b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87d4073500 0x7f87d41a2570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87d3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 0x7f87d41a2ab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87d8f8b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87d4073500 0x7f87d41a2570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:52408/0 (socket says 192.168.123.103:52408) 2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.247+0000 7f87d8f8b700 1 -- 192.168.123.103:0/1907750131 learned_addr learned my addr 192.168.123.103:0/1907750131 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.248+0000 7f87d8f8b700 1 -- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 msgr2=0x7f87d41a2ab0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.248+0000 7f87d8f8b700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 0x7f87d41a2ab0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.248+0000 7f87d8f8b700 1 -- 192.168.123.103:0/1907750131 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f87c40097e0 con 0x7f87d4073500 2026-03-10T14:05:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.248+0000 
7f87d3fff700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 0x7f87d41a2ab0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:05:15.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.248+0000 7f87d8f8b700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87d4073500 0x7f87d41a2570 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f87c4004930 tx=0x7f87c4004960 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:15.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.248+0000 7f87d1ffb700 1 -- 192.168.123.103:0/1907750131 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87c401d070 con 0x7f87d4073500 2026-03-10T14:05:15.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.248+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f87d419c870 con 0x7f87d4073500 2026-03-10T14:05:15.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.248+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f87d419cd60 con 0x7f87d4073500 2026-03-10T14:05:15.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.249+0000 7f87d1ffb700 1 -- 192.168.123.103:0/1907750131 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f87c400bc50 con 0x7f87d4073500 2026-03-10T14:05:15.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.249+0000 7f87d1ffb700 1 -- 192.168.123.103:0/1907750131 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87c400f700 con 0x7f87d4073500 
2026-03-10T14:05:15.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.250+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87d4101d20 con 0x7f87d4073500 2026-03-10T14:05:15.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.250+0000 7f87d1ffb700 1 -- 192.168.123.103:0/1907750131 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f87c400f860 con 0x7f87d4073500 2026-03-10T14:05:15.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.250+0000 7f87d1ffb700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87bc06c630 0x7f87bc06eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:15.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.250+0000 7f87d1ffb700 1 -- 192.168.123.103:0/1907750131 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f87c408c980 con 0x7f87d4073500 2026-03-10T14:05:15.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.250+0000 7f87d3fff700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87bc06c630 0x7f87bc06eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:15.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.251+0000 7f87d3fff700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87bc06c630 0x7f87bc06eae0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f87d40fea80 tx=0x7f87c800a380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:15.251 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.253+0000 7f87d1ffb700 1 -- 192.168.123.103:0/1907750131 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f87c405b020 con 0x7f87d4073500 2026-03-10T14:05:15.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.354+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f87d419db00 con 0x7f87bc06c630 2026-03-10T14:05:15.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.355+0000 7f87d1ffb700 1 -- 192.168.123.103:0/1907750131 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19119 (secure 0 0 0) 0x7f87d419db00 con 0x7f87bc06c630 2026-03-10T14:05:15.355 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:15.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.357+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87bc06c630 msgr2=0x7f87bc06eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:15.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.357+0000 7f87db1ef700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87bc06c630 0x7f87bc06eae0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f87d40fea80 tx=0x7f87c800a380 comp rx=0 tx=0).stop 2026-03-10T14:05:15.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87d4073500 msgr2=0x7f87d41a2570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:15.356 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87d4073500 0x7f87d41a2570 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f87c4004930 tx=0x7f87c4004960 comp rx=0 tx=0).stop 2026-03-10T14:05:15.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 shutdown_connections 2026-03-10T14:05:15.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87bc06c630 0x7f87bc06eae0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87d4073500 0x7f87d41a2570 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 --2- 192.168.123.103:0/1907750131 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87d4074d90 0x7f87d41a2ab0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 >> 192.168.123.103:0/1907750131 conn(0x7f87d4078580 msgr2=0x7f87d40fedc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:15.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 shutdown_connections 2026-03-10T14:05:15.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.358+0000 7f87db1ef700 1 -- 192.168.123.103:0/1907750131 wait 
complete. 2026-03-10T14:05:15.357 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-10T14:05:15.417 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":67,"stamp":"2026-03-10T14:05:15.096139+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163644,"kb_used_data":3084,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640900,"statfs":{"total":128823853056,"available":128656281600,"internally_reserved":0,"allocated":3158016,"data_stored":2042028,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_t
rimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"8.712956"},"pg_stats":[{"pgid":"1.0","version":"20'76","reported_seq":138,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-10T14:05:06.385915+0000","last_change":"2026-03-10T14:04:57.353937+0000","last_active":"2026-03-10T14:05:06.385915+0000","last_peered":"2026-03-10T14:05:06.385915+0000","last_clean":"2026-03-10T14:05:06.385915+0000","last_became_active":"2026-03-10T14:04:57.353524+0000","last_became_peered":"2026-03-10T14:04:57.353524+0000","last_unstale":"2026-03-10T14:05:06.385915+0000","last_undegraded":"2026-03-10T14:05:06.385915+0000","las
t_fullsized":"2026-03-10T14:05:06.385915+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T14:04:38.022847+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T14:04:38.022847+0000","last_clean_scrub_stamp":"2026-03-10T14:04:38.022847+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-12T00:29:05.927542+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":
0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110698,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54100000000000004}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55200000000000005}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59999999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63900000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51800000000000002}]}]},{"osd":4,"up_from":28,"seq":120259084293,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110698,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40400000000000003}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54700000000000004}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39300000000000002}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.32600000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.374}]}]},{"osd":3,"up_from":23,"seq":98784247815,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":569978,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.93799999999999994}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57899999999999996}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.81200000000000006}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.311}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39400000000000002}]}]},{"osd":2,"up_from":17,"seq":73014444041,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110698,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.373}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42399999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.50700000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":1.0349999999999999}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":1.1000000000000001}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":569978,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.33400000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60899999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59999999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64400000000000002}]}]},{"osd":1,"up_from":13,"seq":55834574859,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":569978,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.435}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47399999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.76000000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.45100000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64000000000000001}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 
2026-03-10T14:05:15.418 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph pg dump --format=json 2026-03-10T14:05:15.604 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:15.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.833+0000 7f406cf5d700 1 -- 192.168.123.103:0/1798606977 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 msgr2=0x7f40681009b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:15.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.833+0000 7f406cf5d700 1 --2- 192.168.123.103:0/1798606977 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 0x7f40681009b0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f4058009b50 tx=0x7f4058009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:15.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.834+0000 7f406cf5d700 1 -- 192.168.123.103:0/1798606977 shutdown_connections 2026-03-10T14:05:15.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.834+0000 7f406cf5d700 1 --2- 192.168.123.103:0/1798606977 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 0x7f40681009b0 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.834+0000 7f406cf5d700 1 --2- 192.168.123.103:0/1798606977 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4068106560 0x7f4068106930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.834+0000 7f406cf5d700 1 -- 192.168.123.103:0/1798606977 >> 192.168.123.103:0/1798606977 conn(0x7f40680fc000 
msgr2=0x7f40680fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:15.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.835+0000 7f406cf5d700 1 -- 192.168.123.103:0/1798606977 shutdown_connections 2026-03-10T14:05:15.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.835+0000 7f406cf5d700 1 -- 192.168.123.103:0/1798606977 wait complete. 2026-03-10T14:05:15.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.835+0000 7f406cf5d700 1 Processor -- start 2026-03-10T14:05:15.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.835+0000 7f406cf5d700 1 -- start start 2026-03-10T14:05:15.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.836+0000 7f406cf5d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 0x7f4068193ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.836+0000 7f406cf5d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4068106560 0x7f4068194420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.836+0000 7f406cf5d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4068194b00 con 0x7f4068100540 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.836+0000 7f4065d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4068106560 0x7f4068194420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.836+0000 7f406cf5d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f4068198890 con 0x7f4068106560 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.836+0000 7f4065d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4068106560 0x7f4068194420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:52428/0 (socket says 192.168.123.103:52428) 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.836+0000 7f4065d9b700 1 -- 192.168.123.103:0/3822188482 learned_addr learned my addr 192.168.123.103:0/3822188482 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.836+0000 7f4065d9b700 1 -- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 msgr2=0x7f4068193ee0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f406659c700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 0x7f4068193ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f4065d9b700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 0x7f4068193ee0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f4065d9b700 1 -- 192.168.123.103:0/3822188482 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40580097e0 con 0x7f4068106560 2026-03-10T14:05:15.835 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f406659c700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 0x7f4068193ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:05:15.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f4065d9b700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4068106560 0x7f4068194420 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f4058005950 tx=0x7f40580049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:15.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f405f7fe700 1 -- 192.168.123.103:0/3822188482 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f405801d070 con 0x7f4068106560 2026-03-10T14:05:15.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f405f7fe700 1 -- 192.168.123.103:0/3822188482 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4058004b80 con 0x7f4068106560 2026-03-10T14:05:15.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4068198b10 con 0x7f4068106560 2026-03-10T14:05:15.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f405f7fe700 1 -- 192.168.123.103:0/3822188482 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f405800f670 con 0x7f4068106560 2026-03-10T14:05:15.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.837+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7f4068199000 con 0x7f4068106560 2026-03-10T14:05:15.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.838+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f406804ea50 con 0x7f4068106560 2026-03-10T14:05:15.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.839+0000 7f405f7fe700 1 -- 192.168.123.103:0/3822188482 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f405800bc50 con 0x7f4068106560 2026-03-10T14:05:15.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.840+0000 7f405f7fe700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4054070b00 0x7f4054072fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:15.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.840+0000 7f405f7fe700 1 -- 192.168.123.103:0/3822188482 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f405808d700 con 0x7f4068106560 2026-03-10T14:05:15.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.840+0000 7f406659c700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4054070b00 0x7f4054072fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:15.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.840+0000 7f406659c700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4054070b00 0x7f4054072fb0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f4050009dd0 tx=0x7f4050009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T14:05:15.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.841+0000 7f405f7fe700 1 -- 192.168.123.103:0/3822188482 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4058058240 con 0x7f4068106560 2026-03-10T14:05:15.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.937+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f40681992e0 con 0x7f4054070b00 2026-03-10T14:05:15.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.940+0000 7f405f7fe700 1 -- 192.168.123.103:0/3822188482 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19119 (secure 0 0 0) 0x7f40681992e0 con 0x7f4054070b00 2026-03-10T14:05:15.939 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:15.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.942+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4054070b00 msgr2=0x7f4054072fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:15.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.942+0000 7f406cf5d700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4054070b00 0x7f4054072fb0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f4050009dd0 tx=0x7f4050009450 comp rx=0 tx=0).stop 2026-03-10T14:05:15.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.942+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4068106560 msgr2=0x7f4068194420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:15.941 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.942+0000 7f406cf5d700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4068106560 0x7f4068194420 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f4058005950 tx=0x7f40580049e0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.942+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 shutdown_connections 2026-03-10T14:05:15.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.942+0000 7f406cf5d700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4068100540 0x7f4068193ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.942+0000 7f406cf5d700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4054070b00 0x7f4054072fb0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.943+0000 7f406cf5d700 1 --2- 192.168.123.103:0/3822188482 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4068106560 0x7f4068194420 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:15.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.943+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 >> 192.168.123.103:0/3822188482 conn(0x7f40680fc000 msgr2=0x7f40680fd880 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:15.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.943+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 shutdown_connections 2026-03-10T14:05:15.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:15.943+0000 7f406cf5d700 1 -- 192.168.123.103:0/3822188482 wait 
complete. 2026-03-10T14:05:15.943 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-10T14:05:16.009 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":67,"stamp":"2026-03-10T14:05:15.096139+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163644,"kb_used_data":3084,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640900,"statfs":{"total":128823853056,"available":128656281600,"internally_reserved":0,"allocated":3158016,"data_stored":2042028,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_t
rimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"8.712956"},"pg_stats":[{"pgid":"1.0","version":"20'76","reported_seq":138,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-10T14:05:06.385915+0000","last_change":"2026-03-10T14:04:57.353937+0000","last_active":"2026-03-10T14:05:06.385915+0000","last_peered":"2026-03-10T14:05:06.385915+0000","last_clean":"2026-03-10T14:05:06.385915+0000","last_became_active":"2026-03-10T14:04:57.353524+0000","last_became_peered":"2026-03-10T14:04:57.353524+0000","last_unstale":"2026-03-10T14:05:06.385915+0000","last_undegraded":"2026-03-10T14:05:06.385915+0000","las
t_fullsized":"2026-03-10T14:05:06.385915+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T14:04:38.022847+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T14:04:38.022847+0000","last_clean_scrub_stamp":"2026-03-10T14:04:38.022847+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-12T00:29:05.927542+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":
0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110698,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54100000000000004}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55200000000000005}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59999999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63900000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51800000000000002}]}]},{"osd":4,"up_from":28,"seq":120259084293,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110698,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40400000000000003}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54700000000000004}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39300000000000002}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.32600000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.374}]}]},{"osd":3,"up_from":23,"seq":98784247815,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":569978,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.93799999999999994}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57899999999999996}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.81200000000000006}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.311}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39400000000000002}]}]},{"osd":2,"up_from":17,"seq":73014444041,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110698,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.373}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42399999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.50700000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":1.0349999999999999}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":1.1000000000000001}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":569978,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.33400000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.60899999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59999999999999998}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64400000000000002}]}]},{"osd":1,"up_from":13,"seq":55834574859,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":569978,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.435}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47399999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.76000000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.45100000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64000000000000001}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 
2026-03-10T14:05:16.009 INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-10T14:05:16.009 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 2026-03-10T14:05:16.009 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-10T14:05:16.009 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph health --format=json 2026-03-10T14:05:16.158 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:16.200 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:15 vm03 ceph-mon[49718]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:16.200 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:15 vm03 ceph-mon[49718]: from='client.24233 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T14:05:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:15 vm04 ceph-mon[55966]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:15 vm04 ceph-mon[55966]: from='client.24233 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T14:05:16.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.420+0000 7f15353b6700 1 -- 192.168.123.103:0/2492921800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 msgr2=0x7f153010a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:16.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.420+0000 7f15353b6700 1 --2- 192.168.123.103:0/2492921800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 0x7f153010a590 secure :-1 s=READY pgs=231 
cs=0 l=1 rev1=1 crypto rx=0x7f1520009b00 tx=0x7f1520009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:16.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.420+0000 7f15353b6700 1 -- 192.168.123.103:0/2492921800 shutdown_connections 2026-03-10T14:05:16.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.420+0000 7f15353b6700 1 --2- 192.168.123.103:0/2492921800 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 0x7f153010a590 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:16.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.420+0000 7f15353b6700 1 --2- 192.168.123.103:0/2492921800 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15301015b0 0x7f1530101980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:16.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.420+0000 7f15353b6700 1 -- 192.168.123.103:0/2492921800 >> 192.168.123.103:0/2492921800 conn(0x7f15300faf00 msgr2=0x7f15300fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:16.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.421+0000 7f15353b6700 1 -- 192.168.123.103:0/2492921800 shutdown_connections 2026-03-10T14:05:16.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.421+0000 7f15353b6700 1 -- 192.168.123.103:0/2492921800 wait complete. 
2026-03-10T14:05:16.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.421+0000 7f15353b6700 1 Processor -- start 2026-03-10T14:05:16.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.421+0000 7f15353b6700 1 -- start start 2026-03-10T14:05:16.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.421+0000 7f15353b6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15301015b0 0x7f1530193f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:16.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.421+0000 7f15353b6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 0x7f15301944b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:16.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.421+0000 7f15353b6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1530194b00 con 0x7f1530101ec0 2026-03-10T14:05:16.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.421+0000 7f15353b6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1530194c40 con 0x7f15301015b0 2026-03-10T14:05:16.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.422+0000 7f152effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15301015b0 0x7f1530193f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:16.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.422+0000 7f152e7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 0x7f15301944b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:05:16.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.422+0000 7f152e7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 0x7f15301944b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42402/0 (socket says 192.168.123.103:42402) 2026-03-10T14:05:16.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.422+0000 7f152e7fc700 1 -- 192.168.123.103:0/1254911374 learned_addr learned my addr 192.168.123.103:0/1254911374 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:16.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.422+0000 7f152e7fc700 1 -- 192.168.123.103:0/1254911374 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15301015b0 msgr2=0x7f1530193f70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:16.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.422+0000 7f152e7fc700 1 --2- 192.168.123.103:0/1254911374 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15301015b0 0x7f1530193f70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:16.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.422+0000 7f152e7fc700 1 -- 192.168.123.103:0/1254911374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15200097e0 con 0x7f1530101ec0 2026-03-10T14:05:16.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.422+0000 7f152e7fc700 1 --2- 192.168.123.103:0/1254911374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 0x7f15301944b0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f1520009ad0 tx=0x7f1520004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:16.421 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.423+0000 7f1527fff700 1 -- 192.168.123.103:0/1254911374 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f152001d070 con 0x7f1530101ec0 2026-03-10T14:05:16.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.423+0000 7f1527fff700 1 -- 192.168.123.103:0/1254911374 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f152000bc50 con 0x7f1530101ec0 2026-03-10T14:05:16.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.423+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1530198a30 con 0x7f1530101ec0 2026-03-10T14:05:16.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.423+0000 7f1527fff700 1 -- 192.168.123.103:0/1254911374 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f152000f650 con 0x7f1530101ec0 2026-03-10T14:05:16.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.423+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1530198f20 con 0x7f1530101ec0 2026-03-10T14:05:16.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.424+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1530107c30 con 0x7f1530101ec0 2026-03-10T14:05:16.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.425+0000 7f1527fff700 1 -- 192.168.123.103:0/1254911374 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f1520022470 con 0x7f1530101ec0 2026-03-10T14:05:16.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.425+0000 7f1527fff700 1 --2- 
192.168.123.103:0/1254911374 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f151c070bb0 0x7f151c073060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:16.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.425+0000 7f1527fff700 1 -- 192.168.123.103:0/1254911374 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f152008cc10 con 0x7f1530101ec0 2026-03-10T14:05:16.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.428+0000 7f152effd700 1 --2- 192.168.123.103:0/1254911374 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f151c070bb0 0x7f151c073060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:16.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.428+0000 7f1527fff700 1 -- 192.168.123.103:0/1254911374 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1520057750 con 0x7f1530101ec0 2026-03-10T14:05:16.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.428+0000 7f152effd700 1 --2- 192.168.123.103:0/1254911374 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f151c070bb0 0x7f151c073060 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f151800ac50 tx=0x7f151800a380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:16.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.552+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7f15301952d0 con 0x7f1530101ec0 2026-03-10T14:05:16.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.553+0000 7f1527fff700 1 -- 
192.168.123.103:0/1254911374 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7f152005ad70 con 0x7f1530101ec0 2026-03-10T14:05:16.553 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:16.553 INFO:teuthology.orchestra.run.vm03.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-10T14:05:16.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.556+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f151c070bb0 msgr2=0x7f151c073060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:16.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.556+0000 7f15353b6700 1 --2- 192.168.123.103:0/1254911374 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f151c070bb0 0x7f151c073060 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f151800ac50 tx=0x7f151800a380 comp rx=0 tx=0).stop 2026-03-10T14:05:16.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.557+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 msgr2=0x7f15301944b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:16.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.557+0000 7f15353b6700 1 --2- 192.168.123.103:0/1254911374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 0x7f15301944b0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f1520009ad0 tx=0x7f1520004c30 comp rx=0 tx=0).stop 2026-03-10T14:05:16.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.557+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 shutdown_connections 2026-03-10T14:05:16.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.557+0000 7f15353b6700 1 --2- 192.168.123.103:0/1254911374 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f151c070bb0 0x7f151c073060 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:16.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.557+0000 7f15353b6700 1 --2- 192.168.123.103:0/1254911374 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15301015b0 0x7f1530193f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:16.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.558+0000 7f15353b6700 1 --2- 192.168.123.103:0/1254911374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1530101ec0 0x7f15301944b0 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:16.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.558+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 >> 192.168.123.103:0/1254911374 conn(0x7f15300faf00 msgr2=0x7f1530104d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:16.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.558+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 shutdown_connections 2026-03-10T14:05:16.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:16.558+0000 7f15353b6700 1 -- 192.168.123.103:0/1254911374 wait complete. 2026-03-10T14:05:16.619 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-10T14:05:16.620 INFO:tasks.cephadm:Setup complete, yielding 2026-03-10T14:05:16.620 INFO:teuthology.run_tasks:Running task print... 2026-03-10T14:05:16.622 INFO:teuthology.task.print:**** done end installing v18.2.0 cephadm ... 2026-03-10T14:05:16.622 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T14:05:16.624 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:16.624 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-10T14:05:16.776 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:17.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.037+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/4178508061 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c8103a00 msgr2=0x7fa8c8103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:17.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.037+0000 7fa8ce0ae700 1 --2- 192.168.123.103:0/4178508061 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c8103a00 0x7fa8c8103e70 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7fa8b8009b00 tx=0x7fa8b8009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:17.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.038+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/4178508061 shutdown_connections 2026-03-10T14:05:17.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.038+0000 7fa8ce0ae700 1 --2- 192.168.123.103:0/4178508061 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c8103a00 0x7fa8c8103e70 secure :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7fa8b8009b00 tx=0x7fa8b8009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:17.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.038+0000 7fa8ce0ae700 1 --2- 192.168.123.103:0/4178508061 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa8c8102760 0x7fa8c8102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.038+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/4178508061 >> 192.168.123.103:0/4178508061 conn(0x7fa8c80fddb0 msgr2=0x7fa8c81001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:17.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.039+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/4178508061 shutdown_connections 2026-03-10T14:05:17.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.039+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/4178508061 wait complete. 2026-03-10T14:05:17.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.039+0000 7fa8ce0ae700 1 Processor -- start 2026-03-10T14:05:17.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.039+0000 7fa8ce0ae700 1 -- start start 2026-03-10T14:05:17.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.039+0000 7fa8ce0ae700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa8c8102760 0x7fa8c81982d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:17.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.039+0000 7fa8ce0ae700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c8198810 0x7fa8c81a1f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:17.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.039+0000 7fa8ce0ae700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa8c8079320 con 0x7fa8c8198810 2026-03-10T14:05:17.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.039+0000 7fa8ce0ae700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa8c8198c80 con 0x7fa8c8102760 2026-03-10T14:05:17.039 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.040+0000 7fa8c77fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa8c8102760 0x7fa8c81982d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:17.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.040+0000 7fa8c77fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa8c8102760 0x7fa8c81982d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:52478/0 (socket says 192.168.123.103:52478) 2026-03-10T14:05:17.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.040+0000 7fa8c77fe700 1 -- 192.168.123.103:0/938904124 learned_addr learned my addr 192.168.123.103:0/938904124 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:17.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.040+0000 7fa8c6ffd700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c8198810 0x7fa8c81a1f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:17.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.040+0000 7fa8c77fe700 1 -- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c8198810 msgr2=0x7fa8c81a1f90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:17.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.040+0000 7fa8c77fe700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c8198810 0x7fa8c81a1f90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.040 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.040+0000 7fa8c77fe700 1 -- 192.168.123.103:0/938904124 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa8b80097e0 con 0x7fa8c8102760 2026-03-10T14:05:17.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.040+0000 7fa8c77fe700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa8c8102760 0x7fa8c81982d0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fa8b000b700 tx=0x7fa8b000bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:17.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.041+0000 7fa8c4ff9700 1 -- 192.168.123.103:0/938904124 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa8b0010840 con 0x7fa8c8102760 2026-03-10T14:05:17.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.041+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa8c81a24d0 con 0x7fa8c8102760 2026-03-10T14:05:17.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.041+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa8c81a2830 con 0x7fa8c8102760 2026-03-10T14:05:17.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.041+0000 7fa8c4ff9700 1 -- 192.168.123.103:0/938904124 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa8b0010e80 con 0x7fa8c8102760 2026-03-10T14:05:17.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.041+0000 7fa8c4ff9700 1 -- 192.168.123.103:0/938904124 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa8b000d590 con 0x7fa8c8102760 
2026-03-10T14:05:17.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.042+0000 7fa8c4ff9700 1 -- 192.168.123.103:0/938904124 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa8b00109a0 con 0x7fa8c8102760 2026-03-10T14:05:17.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.042+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa8a8005320 con 0x7fa8c8102760 2026-03-10T14:05:17.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.043+0000 7fa8c4ff9700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa8b406c680 0x7fa8b406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:17.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.043+0000 7fa8c4ff9700 1 -- 192.168.123.103:0/938904124 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa8b008af70 con 0x7fa8c8102760 2026-03-10T14:05:17.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.043+0000 7fa8c6ffd700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa8b406c680 0x7fa8b406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:17.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.044+0000 7fa8c6ffd700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa8b406c680 0x7fa8b406eb30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fa8b8000c00 tx=0x7fa8b8005fd0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:17.044 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.045+0000 7fa8c4ff9700 1 -- 192.168.123.103:0/938904124 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa8b0059440 con 0x7fa8c8102760 2026-03-10T14:05:17.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:16 vm03 ceph-mon[49718]: from='client.24237 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T14:05:17.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:16 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/1254911374' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T14:05:17.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.159+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7fa8a8005190 con 0x7fa8c8102760 2026-03-10T14:05:17.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.167+0000 7fa8c4ff9700 1 -- 192.168.123.103:0/938904124 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v14) v1 ==== 143+0+0 (secure 0 0 0) 0x7fa8b0058fd0 con 0x7fa8c8102760 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.172+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa8b406c680 msgr2=0x7fa8b406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.172+0000 7fa8ce0ae700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa8b406c680 0x7fa8b406eb30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fa8b8000c00 
tx=0x7fa8b8005fd0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.172+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa8c8102760 msgr2=0x7fa8c81982d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.172+0000 7fa8ce0ae700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa8c8102760 0x7fa8c81982d0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fa8b000b700 tx=0x7fa8b000bac0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.173+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 shutdown_connections 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.173+0000 7fa8ce0ae700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa8b406c680 0x7fa8b406eb30 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.173+0000 7fa8ce0ae700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa8c8102760 0x7fa8c81982d0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.173+0000 7fa8ce0ae700 1 --2- 192.168.123.103:0/938904124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c8198810 0x7fa8c81a1f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.173+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 >> 192.168.123.103:0/938904124 conn(0x7fa8c80fddb0 
msgr2=0x7fa8c8106c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:17.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.174+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 shutdown_connections 2026-03-10T14:05:17.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.174+0000 7fa8ce0ae700 1 -- 192.168.123.103:0/938904124 wait complete. 2026-03-10T14:05:17.228 INFO:teuthology.run_tasks:Running task print... 2026-03-10T14:05:17.231 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 2026-03-10T14:05:17.231 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T14:05:17.234 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:17.234 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph orch status' 2026-03-10T14:05:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:16 vm04 ceph-mon[55966]: from='client.24237 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T14:05:17.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:16 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/1254911374' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T14:05:17.398 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:17.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.655+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3875913507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4d8102780 msgr2=0x7fb4d8102b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:17.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.655+0000 7fb4dd40a700 1 --2- 192.168.123.103:0/3875913507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4d8102780 0x7fb4d8102b90 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7fb4c0009b00 tx=0x7fb4c0009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:17.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.658+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3875913507 shutdown_connections 2026-03-10T14:05:17.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.658+0000 7fb4dd40a700 1 --2- 192.168.123.103:0/3875913507 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4d8103980 0x7fb4d8103dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.658+0000 7fb4dd40a700 1 --2- 192.168.123.103:0/3875913507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4d8102780 0x7fb4d8102b90 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.658+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3875913507 >> 192.168.123.103:0/3875913507 conn(0x7fb4d80fdd50 msgr2=0x7fb4d8100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:17.660 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.661+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3875913507 shutdown_connections 2026-03-10T14:05:17.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.661+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3875913507 wait complete. 2026-03-10T14:05:17.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.662+0000 7fb4dd40a700 1 Processor -- start 2026-03-10T14:05:17.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.662+0000 7fb4dd40a700 1 -- start start 2026-03-10T14:05:17.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.662+0000 7fb4dd40a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4d8102780 0x7fb4d819c400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:17.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.662+0000 7fb4dd40a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4d8103980 0x7fb4d819c940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:17.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.662+0000 7fb4dd40a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb4d819ced0 con 0x7fb4d8102780 2026-03-10T14:05:17.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.662+0000 7fb4dd40a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb4d819d010 con 0x7fb4d8103980 2026-03-10T14:05:17.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.663+0000 7fb4d6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4d8102780 0x7fb4d819c400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:17.662 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.663+0000 7fb4d67fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4d8103980 0x7fb4d819c940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:17.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.663+0000 7fb4d67fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4d8103980 0x7fb4d819c940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:52494/0 (socket says 192.168.123.103:52494) 2026-03-10T14:05:17.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.663+0000 7fb4d67fc700 1 -- 192.168.123.103:0/3396017437 learned_addr learned my addr 192.168.123.103:0/3396017437 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:17.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4d67fc700 1 -- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4d8102780 msgr2=0x7fb4d819c400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:17.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4d67fc700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4d8102780 0x7fb4d819c400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4d67fc700 1 -- 192.168.123.103:0/3396017437 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb4c00097e0 con 0x7fb4d8103980 2026-03-10T14:05:17.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4d67fc700 1 --2- 
192.168.123.103:0/3396017437 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4d8103980 0x7fb4d819c940 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fb4c800c930 tx=0x7fb4c800cc40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:17.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4cffff700 1 -- 192.168.123.103:0/3396017437 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb4c8007ab0 con 0x7fb4d8103980 2026-03-10T14:05:17.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb4d806a890 con 0x7fb4d8103980 2026-03-10T14:05:17.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb4d806adb0 con 0x7fb4d8103980 2026-03-10T14:05:17.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4cffff700 1 -- 192.168.123.103:0/3396017437 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb4c800ce80 con 0x7fb4d8103980 2026-03-10T14:05:17.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.664+0000 7fb4cffff700 1 -- 192.168.123.103:0/3396017437 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb4c80186a0 con 0x7fb4d8103980 2026-03-10T14:05:17.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.666+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb4d8066e40 con 0x7fb4d8103980 2026-03-10T14:05:17.665 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.667+0000 7fb4cffff700 1 -- 192.168.123.103:0/3396017437 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb4c8007c10 con 0x7fb4d8103980 2026-03-10T14:05:17.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.667+0000 7fb4cffff700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb4c406c630 0x7fb4c406eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:17.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.667+0000 7fb4cffff700 1 -- 192.168.123.103:0/3396017437 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb4c808b900 con 0x7fb4d8103980 2026-03-10T14:05:17.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.667+0000 7fb4d6ffd700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb4c406c630 0x7fb4c406eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:17.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.668+0000 7fb4d6ffd700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb4c406c630 0x7fb4c406eae0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fb4c00052d0 tx=0x7fb4c000b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:17.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.669+0000 7fb4cffff700 1 -- 192.168.123.103:0/3396017437 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb4c8051720 con 0x7fb4d8103980 2026-03-10T14:05:17.784 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.785+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb4d81082d0 con 0x7fb4c406c630 2026-03-10T14:05:17.787 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.788+0000 7fb4cffff700 1 -- 192.168.123.103:0/3396017437 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7fb4d81082d0 con 0x7fb4c406c630 2026-03-10T14:05:17.787 INFO:teuthology.orchestra.run.vm03.stdout:Backend: cephadm 2026-03-10T14:05:17.787 INFO:teuthology.orchestra.run.vm03.stdout:Available: Yes 2026-03-10T14:05:17.787 INFO:teuthology.orchestra.run.vm03.stdout:Paused: No 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.790+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb4c406c630 msgr2=0x7fb4c406eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.790+0000 7fb4dd40a700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb4c406c630 0x7fb4c406eae0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fb4c00052d0 tx=0x7fb4c000b540 comp rx=0 tx=0).stop 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.790+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4d8103980 msgr2=0x7fb4d819c940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.790+0000 7fb4dd40a700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4d8103980 0x7fb4d819c940 secure :-1 
s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fb4c800c930 tx=0x7fb4c800cc40 comp rx=0 tx=0).stop 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.791+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 shutdown_connections 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.791+0000 7fb4dd40a700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4d8102780 0x7fb4d819c400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.791+0000 7fb4dd40a700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb4c406c630 0x7fb4c406eae0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.791+0000 7fb4dd40a700 1 --2- 192.168.123.103:0/3396017437 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4d8103980 0x7fb4d819c940 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:17.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.791+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 >> 192.168.123.103:0/3396017437 conn(0x7fb4d80fdd50 msgr2=0x7fb4d8106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:17.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.791+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 shutdown_connections 2026-03-10T14:05:17.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:17.791+0000 7fb4dd40a700 1 -- 192.168.123.103:0/3396017437 wait complete. 
2026-03-10T14:05:17.856 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph orch ps' 2026-03-10T14:05:18.018 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:18.300 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:18 vm03 ceph-mon[49718]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:18.300 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:18 vm03 ceph-mon[49718]: from='client.? ' entity='client.admin' 2026-03-10T14:05:18.300 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:18 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:18.300 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:18 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:18.300 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:18 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:18.300 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:18 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.299+0000 7f92cf72b700 1 -- 192.168.123.103:0/2801616907 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 msgr2=0x7f92c8103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.299+0000 
7f92cf72b700 1 --2- 192.168.123.103:0/2801616907 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 0x7f92c8103dd0 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f92c4009b50 tx=0x7f92c4009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.301+0000 7f92cf72b700 1 -- 192.168.123.103:0/2801616907 shutdown_connections 2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.301+0000 7f92cf72b700 1 --2- 192.168.123.103:0/2801616907 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 0x7f92c8103dd0 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.301+0000 7f92cf72b700 1 --2- 192.168.123.103:0/2801616907 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c8102780 0x7f92c8102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.301+0000 7f92cf72b700 1 -- 192.168.123.103:0/2801616907 >> 192.168.123.103:0/2801616907 conn(0x7f92c80fdd50 msgr2=0x7f92c8100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.301+0000 7f92cf72b700 1 -- 192.168.123.103:0/2801616907 shutdown_connections 2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.301+0000 7f92cf72b700 1 -- 192.168.123.103:0/2801616907 wait complete. 
2026-03-10T14:05:18.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.302+0000 7f92cf72b700 1 Processor -- start 2026-03-10T14:05:18.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.302+0000 7f92cf72b700 1 -- start start 2026-03-10T14:05:18.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.302+0000 7f92cf72b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c8102780 0x7f92c8198040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:18.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.302+0000 7f92cf72b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 0x7f92c8198580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:18.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.302+0000 7f92cf72b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92c8198ba0 con 0x7f92c8103980 2026-03-10T14:05:18.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.302+0000 7f92cf72b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92c8198ce0 con 0x7f92c8102780 2026-03-10T14:05:18.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cccc6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 0x7f92c8198580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:18.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cccc6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 0x7f92c8198580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:42462/0 (socket says 192.168.123.103:42462) 2026-03-10T14:05:18.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cccc6700 1 -- 192.168.123.103:0/3036480629 learned_addr learned my addr 192.168.123.103:0/3036480629 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:18.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cccc6700 1 -- 192.168.123.103:0/3036480629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c8102780 msgr2=0x7f92c8198040 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:05:18.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cccc6700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c8102780 0x7f92c8198040 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cccc6700 1 -- 192.168.123.103:0/3036480629 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f92c40097e0 con 0x7f92c8103980 2026-03-10T14:05:18.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cccc6700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 0x7f92c8198580 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f92c4009b20 tx=0x7f92c400b920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:18.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92be7fc700 1 -- 192.168.123.103:0/3036480629 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92c401d070 con 0x7f92c8103980 2026-03-10T14:05:18.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92be7fc700 1 -- 
192.168.123.103:0/3036480629 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f92c400bc70 con 0x7f92c8103980 2026-03-10T14:05:18.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92be7fc700 1 -- 192.168.123.103:0/3036480629 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92c400f8e0 con 0x7f92c8103980 2026-03-10T14:05:18.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f92c819d730 con 0x7f92c8103980 2026-03-10T14:05:18.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.303+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f92c819dc20 con 0x7f92c8103980 2026-03-10T14:05:18.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.305+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f92c8066e40 con 0x7f92c8103980 2026-03-10T14:05:18.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.305+0000 7f92be7fc700 1 -- 192.168.123.103:0/3036480629 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f92c4022c50 con 0x7f92c8103980 2026-03-10T14:05:18.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.306+0000 7f92be7fc700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f92b406c680 0x7f92b406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:18.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.306+0000 7f92be7fc700 1 -- 192.168.123.103:0/3036480629 <== mon.0 
v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f92c408ce80 con 0x7f92c8103980 2026-03-10T14:05:18.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.308+0000 7f92cd4c7700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f92b406c680 0x7f92b406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:18.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.308+0000 7f92be7fc700 1 -- 192.168.123.103:0/3036480629 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f92c405b3d0 con 0x7f92c8103980 2026-03-10T14:05:18.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.309+0000 7f92cd4c7700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f92b406c680 0x7f92b406eb30 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f92c81037e0 tx=0x7f92b8005c30 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:18.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.413+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f92c81082d0 con 0x7f92b406c680 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.418+0000 7f92be7fc700 1 -- 192.168.123.103:0/3036480629 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2660 (secure 0 0 0) 0x7f92c81082d0 con 0x7f92b406c680 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:05:18.417 
INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (75s) 40s ago 116s 21.3M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (2m) 40s ago 2m 7776k - 18.2.0 dc2bc1663786 34b6dfc459c5 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (90s) 15s ago 90s 8052k - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 40s ago 2m 7419k - 18.2.0 dc2bc1663786 57962aef7443 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (89s) 15s ago 89s 7407k - 18.2.0 dc2bc1663786 0918365fa827 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (74s) 40s ago 105s 77.8M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:9283,8765,8443 running (2m) 40s ago 2m 486M - 18.2.0 dc2bc1663786 378306a7bb3c 2026-03-10T14:05:18.417 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (85s) 15s ago 85s 445M - 18.2.0 dc2bc1663786 f2d79432e040 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 40s ago 2m 48.7M 2048M 18.2.0 dc2bc1663786 f59cc7d5bdfd 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (84s) 15s ago 84s 40.9M 2048M 18.2.0 dc2bc1663786 4113774b34c7 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (119s) 40s ago 119s 13.7M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (86s) 15s ago 86s 13.9M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (65s) 40s ago 65s 40.8M 4096M 18.2.0 
dc2bc1663786 5a222b855ee3 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (55s) 40s ago 55s 37.4M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (45s) 40s ago 44s 32.9M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (35s) 15s ago 35s 43.4M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (25s) 15s ago 25s 41.8M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (16s) 15s ago 16s 12.4M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:05:18.418 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (68s) 40s ago 100s 29.7M - 2.43.0 a07b618ecd1d fcef697ff8c4 2026-03-10T14:05:18.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f92b406c680 msgr2=0x7f92b406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:18.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f92b406c680 0x7f92b406eb30 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f92c81037e0 tx=0x7f92b8005c30 comp rx=0 tx=0).stop 2026-03-10T14:05:18.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 msgr2=0x7f92c8198580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:18.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 --2- 
192.168.123.103:0/3036480629 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 0x7f92c8198580 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f92c4009b20 tx=0x7f92c400b920 comp rx=0 tx=0).stop 2026-03-10T14:05:18.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 shutdown_connections 2026-03-10T14:05:18.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f92b406c680 0x7f92b406eb30 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c8102780 0x7f92c8198040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 --2- 192.168.123.103:0/3036480629 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c8103980 0x7f92c8198580 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 >> 192.168.123.103:0/3036480629 conn(0x7f92c80fdd50 msgr2=0x7f92c8106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:18.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.421+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 shutdown_connections 2026-03-10T14:05:18.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.422+0000 7f92cf72b700 1 -- 192.168.123.103:0/3036480629 wait complete. 
2026-03-10T14:05:18.480 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph orch ls' 2026-03-10T14:05:18.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:18 vm04 ceph-mon[55966]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:18.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:18 vm04 ceph-mon[55966]: from='client.? ' entity='client.admin' 2026-03-10T14:05:18.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:18 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:18.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:18 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:18.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:18 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:18.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:18 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:18.615 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:18.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.850+0000 7ff7aa191700 1 -- 192.168.123.103:0/378324158 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 msgr2=0x7ff7a4073c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:18.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.850+0000 
7ff7aa191700 1 --2- 192.168.123.103:0/378324158 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 0x7ff7a4073c20 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7ff794009b00 tx=0x7ff794009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:18.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.850+0000 7ff7aa191700 1 -- 192.168.123.103:0/378324158 shutdown_connections 2026-03-10T14:05:18.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.850+0000 7ff7aa191700 1 --2- 192.168.123.103:0/378324158 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 0x7ff7a4073c20 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.850+0000 7ff7aa191700 1 --2- 192.168.123.103:0/378324158 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff7a4074d80 0x7ff7a40731e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.850+0000 7ff7aa191700 1 -- 192.168.123.103:0/378324158 >> 192.168.123.103:0/378324158 conn(0x7ff7a40fbb10 msgr2=0x7ff7a40fdf40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:18.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.851+0000 7ff7aa191700 1 -- 192.168.123.103:0/378324158 shutdown_connections 2026-03-10T14:05:18.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.851+0000 7ff7aa191700 1 -- 192.168.123.103:0/378324158 wait complete. 
2026-03-10T14:05:18.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.851+0000 7ff7aa191700 1 Processor -- start 2026-03-10T14:05:18.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.851+0000 7ff7aa191700 1 -- start start 2026-03-10T14:05:18.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7aa191700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 0x7ff7a419c430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:18.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7a37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 0x7ff7a419c430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7a37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 0x7ff7a419c430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42478/0 (socket says 192.168.123.103:42478) 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7aa191700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff7a4074d80 0x7ff7a419c970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7aa191700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7a419cf90 con 0x7ff7a40737b0 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7aa191700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7ff7a419d0d0 con 0x7ff7a4074d80 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7a37fe700 1 -- 192.168.123.103:0/1024103748 learned_addr learned my addr 192.168.123.103:0/1024103748 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7a37fe700 1 -- 192.168.123.103:0/1024103748 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff7a4074d80 msgr2=0x7ff7a419c970 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7a37fe700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff7a4074d80 0x7ff7a419c970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7a37fe700 1 -- 192.168.123.103:0/1024103748 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7940097e0 con 0x7ff7a40737b0 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.852+0000 7ff7a37fe700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 0x7ff7a419c430 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7ff78c00ba70 tx=0x7ff78c00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.853+0000 7ff7a0ff9700 1 -- 192.168.123.103:0/1024103748 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff78c00c780 con 0x7ff7a40737b0 2026-03-10T14:05:18.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.853+0000 7ff7aa191700 1 -- 
192.168.123.103:0/1024103748 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff7a41a1b80 con 0x7ff7a40737b0 2026-03-10T14:05:18.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.853+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff7a41a20d0 con 0x7ff7a40737b0 2026-03-10T14:05:18.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.853+0000 7ff7a0ff9700 1 -- 192.168.123.103:0/1024103748 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff78c00cdc0 con 0x7ff7a40737b0 2026-03-10T14:05:18.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.853+0000 7ff7a0ff9700 1 -- 192.168.123.103:0/1024103748 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff78c012550 con 0x7ff7a40737b0 2026-03-10T14:05:18.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.854+0000 7ff7a0ff9700 1 -- 192.168.123.103:0/1024103748 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7ff78c0126b0 con 0x7ff7a40737b0 2026-03-10T14:05:18.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.855+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff784005320 con 0x7ff7a40737b0 2026-03-10T14:05:18.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.855+0000 7ff7a0ff9700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff79006c6d0 0x7ff79006eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:18.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.855+0000 7ff7a0ff9700 1 -- 192.168.123.103:0/1024103748 <== mon.0 
v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ff78c0147d0 con 0x7ff7a40737b0 2026-03-10T14:05:18.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.855+0000 7ff7a2ffd700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff79006c6d0 0x7ff79006eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:18.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.858+0000 7ff7a0ff9700 1 -- 192.168.123.103:0/1024103748 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ff78c019080 con 0x7ff7a40737b0 2026-03-10T14:05:18.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.859+0000 7ff7a2ffd700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff79006c6d0 0x7ff79006eb80 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7ff794009ad0 tx=0x7ff794009f90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:18.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.969+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7ff784000bf0 con 0x7ff79006c6d0 2026-03-10T14:05:18.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.973+0000 7ff7a0ff9700 1 -- 192.168.123.103:0/1024103748 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1160 (secure 0 0 0) 0x7ff784000bf0 con 0x7ff79006c6d0 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT 2026-03-10T14:05:18.972 
INFO:teuthology.orchestra.run.vm03.stdout:alertmanager ?:9093,9094 1/1 40s ago 2m count:1 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter 2/2 40s ago 2m * 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:crash 2/2 40s ago 2m * 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:grafana ?:3000 1/1 40s ago 2m count:1 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:mgr 2/2 40s ago 2m count:2 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:mon 2/2 40s ago 119s vm03:192.168.123.103=vm03;vm04:192.168.123.104=vm04;count:2 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter ?:9100 2/2 40s ago 2m * 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:osd 6 40s ago - 2026-03-10T14:05:18.972 INFO:teuthology.orchestra.run.vm03.stdout:prometheus ?:9095 1/1 40s ago 2m count:1 2026-03-10T14:05:18.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.975+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff79006c6d0 msgr2=0x7ff79006eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:18.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.975+0000 7ff7aa191700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff79006c6d0 0x7ff79006eb80 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7ff794009ad0 tx=0x7ff794009f90 comp rx=0 tx=0).stop 2026-03-10T14:05:18.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.976+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 msgr2=0x7ff7a419c430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:18.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.976+0000 7ff7aa191700 1 --2- 192.168.123.103:0/1024103748 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 0x7ff7a419c430 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7ff78c00ba70 tx=0x7ff78c00be30 comp rx=0 tx=0).stop 2026-03-10T14:05:18.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.976+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 shutdown_connections 2026-03-10T14:05:18.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.976+0000 7ff7aa191700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff7a40737b0 0x7ff7a419c430 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.976+0000 7ff7aa191700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff79006c6d0 0x7ff79006eb80 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.976+0000 7ff7aa191700 1 --2- 192.168.123.103:0/1024103748 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff7a4074d80 0x7ff7a419c970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:18.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.976+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 >> 192.168.123.103:0/1024103748 conn(0x7ff7a40fbb10 msgr2=0x7ff7a4101ef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:18.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.977+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 shutdown_connections 2026-03-10T14:05:18.975 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:18.977+0000 7ff7aa191700 1 -- 192.168.123.103:0/1024103748 wait complete. 
2026-03-10T14:05:19.030 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph orch host ls' 2026-03-10T14:05:19.177 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:19.220 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:19 vm03 ceph-mon[49718]: from='client.24243 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:19.220 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:19 vm03 ceph-mon[49718]: from='client.14448 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.419+0000 7f99f1a8e700 1 -- 192.168.123.103:0/1342904942 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 msgr2=0x7f99ec103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.419+0000 7f99f1a8e700 1 --2- 192.168.123.103:0/1342904942 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 0x7f99ec103db0 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7f99dc009b50 tx=0x7f99dc009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.419+0000 7f99f1a8e700 1 -- 192.168.123.103:0/1342904942 shutdown_connections 2026-03-10T14:05:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.419+0000 7f99f1a8e700 1 --2- 192.168.123.103:0/1342904942 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 0x7f99ec103db0 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:19.418 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.419+0000 7f99f1a8e700 1 --2- 192.168.123.103:0/1342904942 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f99ec102760 0x7f99ec102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.419+0000 7f99f1a8e700 1 -- 192.168.123.103:0/1342904942 >> 192.168.123.103:0/1342904942 conn(0x7f99ec0fdcf0 msgr2=0x7f99ec100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.420+0000 7f99f1a8e700 1 -- 192.168.123.103:0/1342904942 shutdown_connections 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.420+0000 7f99f1a8e700 1 -- 192.168.123.103:0/1342904942 wait complete. 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.420+0000 7f99f1a8e700 1 Processor -- start 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.420+0000 7f99f1a8e700 1 -- start start 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.420+0000 7f99f1a8e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f99ec102760 0x7f99ec193ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.420+0000 7f99f1a8e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 0x7f99ec194220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.420+0000 7f99f1a8e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99ec1947f0 con 0x7f99ec103960 2026-03-10T14:05:19.419 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.420+0000 7f99f1a8e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99ec194930 con 0x7f99ec102760 2026-03-10T14:05:19.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.421+0000 7f99eaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 0x7f99ec194220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:19.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.421+0000 7f99eaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 0x7f99ec194220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42494/0 (socket says 192.168.123.103:42494) 2026-03-10T14:05:19.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.421+0000 7f99eaffd700 1 -- 192.168.123.103:0/4087214381 learned_addr learned my addr 192.168.123.103:0/4087214381 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:19.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.421+0000 7f99eaffd700 1 -- 192.168.123.103:0/4087214381 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f99ec102760 msgr2=0x7f99ec193ce0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:19.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.421+0000 7f99eaffd700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f99ec102760 0x7f99ec193ce0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:19.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.422+0000 7f99eaffd700 1 -- 192.168.123.103:0/4087214381 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f99dc0097e0 con 0x7f99ec103960 2026-03-10T14:05:19.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.422+0000 7f99eaffd700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 0x7f99ec194220 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f99dc005850 tx=0x7f99dc00b920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:19.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.422+0000 7f99e8ff9700 1 -- 192.168.123.103:0/4087214381 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f99dc01d070 con 0x7f99ec103960 2026-03-10T14:05:19.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.422+0000 7f99e8ff9700 1 -- 192.168.123.103:0/4087214381 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f99dc00bd20 con 0x7f99ec103960 2026-03-10T14:05:19.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.422+0000 7f99e8ff9700 1 -- 192.168.123.103:0/4087214381 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f99dc0219b0 con 0x7f99ec103960 2026-03-10T14:05:19.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.423+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f99ec06a830 con 0x7f99ec103960 2026-03-10T14:05:19.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.423+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f99ec06ad20 con 0x7f99ec103960 2026-03-10T14:05:19.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.423+0000 7f99f1a8e700 1 -- 
192.168.123.103:0/4087214381 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f99ec066e40 con 0x7f99ec103960 2026-03-10T14:05:19.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.424+0000 7f99e8ff9700 1 -- 192.168.123.103:0/4087214381 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f99dc00f460 con 0x7f99ec103960 2026-03-10T14:05:19.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.425+0000 7f99e8ff9700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f99d406c630 0x7f99d406eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:19.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.425+0000 7f99e8ff9700 1 -- 192.168.123.103:0/4087214381 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f99dc05f8c0 con 0x7f99ec103960 2026-03-10T14:05:19.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.426+0000 7f99e8ff9700 1 -- 192.168.123.103:0/4087214381 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f99dc058360 con 0x7f99ec103960 2026-03-10T14:05:19.425 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.427+0000 7f99eb7fe700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f99d406c630 0x7f99d406eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:19.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.427+0000 7f99eb7fe700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f99d406c630 0x7f99d406eae0 secure :-1 s=READY 
pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f99ec1037c0 tx=0x7f99e0006c60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:19.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.533+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f99ec1082b0 con 0x7f99d406c630 2026-03-10T14:05:19.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.535+0000 7f99e8ff9700 1 -- 192.168.123.103:0/4087214381 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7f99ec1082b0 con 0x7f99d406c630 2026-03-10T14:05:19.534 INFO:teuthology.orchestra.run.vm03.stdout:HOST ADDR LABELS STATUS 2026-03-10T14:05:19.534 INFO:teuthology.orchestra.run.vm03.stdout:vm03 192.168.123.103 2026-03-10T14:05:19.534 INFO:teuthology.orchestra.run.vm03.stdout:vm04 192.168.123.104 2026-03-10T14:05:19.534 INFO:teuthology.orchestra.run.vm03.stdout:2 hosts in cluster 2026-03-10T14:05:19.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.538+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f99d406c630 msgr2=0x7f99d406eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:19.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.538+0000 7f99f1a8e700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f99d406c630 0x7f99d406eae0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f99ec1037c0 tx=0x7f99e0006c60 comp rx=0 tx=0).stop 2026-03-10T14:05:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.538+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 msgr2=0x7f99ec194220 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.538+0000 7f99f1a8e700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 0x7f99ec194220 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f99dc005850 tx=0x7f99dc00b920 comp rx=0 tx=0).stop 2026-03-10T14:05:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.539+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 shutdown_connections 2026-03-10T14:05:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.539+0000 7f99f1a8e700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f99d406c630 0x7f99d406eae0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.539+0000 7f99f1a8e700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f99ec102760 0x7f99ec193ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.539+0000 7f99f1a8e700 1 --2- 192.168.123.103:0/4087214381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99ec103960 0x7f99ec194220 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.539+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 >> 192.168.123.103:0/4087214381 conn(0x7f99ec0fdcf0 msgr2=0x7f99ec106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.539+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 shutdown_connections 2026-03-10T14:05:19.538 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:19.540+0000 7f99f1a8e700 1 -- 192.168.123.103:0/4087214381 wait complete. 2026-03-10T14:05:19.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:19 vm04 ceph-mon[55966]: from='client.24243 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:19.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:19 vm04 ceph-mon[55966]: from='client.14448 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:19.600 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph orch device ls' 2026-03-10T14:05:19.753 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.002+0000 7f743f510700 1 -- 192.168.123.103:0/1782190112 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438102760 msgr2=0x7f7438102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.002+0000 7f743f510700 1 --2- 192.168.123.103:0/1782190112 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438102760 0x7f7438102b70 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f7428009b50 tx=0x7f7428009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.003+0000 7f743f510700 1 -- 192.168.123.103:0/1782190112 shutdown_connections 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.003+0000 7f743f510700 1 --2- 192.168.123.103:0/1782190112 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f7438103960 0x7f7438103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.003+0000 7f743f510700 1 --2- 192.168.123.103:0/1782190112 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438102760 0x7f7438102b70 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.003+0000 7f743f510700 1 -- 192.168.123.103:0/1782190112 >> 192.168.123.103:0/1782190112 conn(0x7f74380fdcf0 msgr2=0x7f7438100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.003+0000 7f743f510700 1 -- 192.168.123.103:0/1782190112 shutdown_connections 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.003+0000 7f743f510700 1 -- 192.168.123.103:0/1782190112 wait complete. 
2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743f510700 1 Processor -- start 2026-03-10T14:05:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743f510700 1 -- start start 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743f510700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7438102760 0x7f7438193c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743f510700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438103960 0x7f74381941b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743f510700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7438194780 con 0x7f7438103960 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743f510700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74381948c0 con 0x7f7438102760 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743d2ac700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7438102760 0x7f7438193c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743d2ac700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7438102760 0x7f7438193c70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:52524/0 (socket says 192.168.123.103:52524) 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743d2ac700 1 -- 192.168.123.103:0/2699237536 learned_addr learned my addr 192.168.123.103:0/2699237536 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.004+0000 7f743d2ac700 1 -- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438103960 msgr2=0x7f74381941b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f743caab700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438103960 0x7f74381941b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f743d2ac700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438103960 0x7f74381941b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f743d2ac700 1 -- 192.168.123.103:0/2699237536 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f74280097e0 con 0x7f7438102760 2026-03-10T14:05:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f743d2ac700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7438102760 0x7f7438193c70 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f7428005250 tx=0x7f7428004c80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:05:20.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f743caab700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438103960 0x7f74381941b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T14:05:20.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f742e7fc700 1 -- 192.168.123.103:0/2699237536 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f742801d070 con 0x7f7438102760
2026-03-10T14:05:20.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f742e7fc700 1 -- 192.168.123.103:0/2699237536 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f742800bc50 con 0x7f7438102760
2026-03-10T14:05:20.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f74381aa370 con 0x7f7438102760
2026-03-10T14:05:20.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f742e7fc700 1 -- 192.168.123.103:0/2699237536 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f742800f870 con 0x7f7438102760
2026-03-10T14:05:20.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.005+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f74381aa860 con 0x7f7438102760
2026-03-10T14:05:20.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.006+0000 7f742e7fc700 1 -- 192.168.123.103:0/2699237536 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f742800f9d0 con 0x7f7438102760
2026-03-10T14:05:20.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.006+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f741c005320 con 0x7f7438102760
2026-03-10T14:05:20.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.007+0000 7f742e7fc700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f742406c630 0x7f742406eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:20.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.007+0000 7f743caab700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f742406c630 0x7f742406eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:20.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.008+0000 7f742e7fc700 1 -- 192.168.123.103:0/2699237536 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f742808ca90 con 0x7f7438102760
2026-03-10T14:05:20.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.008+0000 7f743caab700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f742406c630 0x7f742406eae0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f7434009dd0 tx=0x7f7434009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:20.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.010+0000 7f742e7fc700 1 -- 192.168.123.103:0/2699237536 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f742805c2a0 con 0x7f7438102760
2026-03-10T14:05:20.119 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.119+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f741c000bf0 con 0x7f742406c630
2026-03-10T14:05:20.119 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.121+0000 7f742e7fc700 1 -- 192.168.123.103:0/2699237536 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1188 (secure 0 0 0) 0x7f741c000bf0 con 0x7f742406c630
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:HOST PATH TYPE DEVICE ID SIZE AVAILABLE REFRESHED REJECT REASONS
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdb hdd DWNBRSTVMM03001 20.0G Yes 43s ago
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdc hdd DWNBRSTVMM03002 20.0G No 43s ago Insufficient space (<10 extents) on vgs, LVM detected, locked
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdd hdd DWNBRSTVMM03003 20.0G No 43s ago Insufficient space (<10 extents) on vgs, LVM detected, locked
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vde hdd DWNBRSTVMM03004 20.0G No 43s ago Insufficient space (<10 extents) on vgs, LVM detected, locked
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:vm04 /dev/vdb hdd DWNBRSTVMM04001 20.0G Yes 15s ago
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:vm04 /dev/vdc hdd DWNBRSTVMM04002 20.0G No 15s ago Insufficient space (<10 extents) on vgs, LVM detected, locked
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:vm04 /dev/vdd hdd DWNBRSTVMM04003 20.0G No 15s ago Insufficient space (<10 extents) on vgs, LVM detected, locked
2026-03-10T14:05:20.120 INFO:teuthology.orchestra.run.vm03.stdout:vm04 /dev/vde hdd DWNBRSTVMM04004 20.0G No 15s ago Insufficient space (<10 extents) on vgs, LVM detected, locked
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f742406c630 msgr2=0x7f742406eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f742406c630 0x7f742406eae0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f7434009dd0 tx=0x7f7434009450 comp rx=0 tx=0).stop
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7438102760 msgr2=0x7f7438193c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7438102760 0x7f7438193c70 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f7428005250 tx=0x7f7428004c80 comp rx=0 tx=0).stop
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 shutdown_connections
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f742406c630 0x7f742406eae0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7438102760 0x7f7438193c70 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 --2- 192.168.123.103:0/2699237536 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7438103960 0x7f74381941b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.123+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 >> 192.168.123.103:0/2699237536 conn(0x7f74380fdcf0 msgr2=0x7f7438106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.124+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 shutdown_connections
2026-03-10T14:05:20.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.124+0000 7f743f510700 1 -- 192.168.123.103:0/2699237536 wait complete.
2026-03-10T14:05:20.193 INFO:teuthology.run_tasks:Running task cephadm.shell...
2026-03-10T14:05:20.196 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local
2026-03-10T14:05:20.196 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph fs volume create cephfs --placement=4'
2026-03-10T14:05:20.345 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config
2026-03-10T14:05:20.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:20 vm03 ceph-mon[49718]: from='client.14452 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:05:20.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:20 vm03 ceph-mon[49718]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T14:05:20.375 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:20 vm03 ceph-mon[49718]: from='client.14456 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:05:20.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:20 vm04 ceph-mon[55966]: from='client.14452 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:05:20.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:20 vm04 ceph-mon[55966]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T14:05:20.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:20 vm04 ceph-mon[55966]: from='client.14456 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.600+0000 7fbd75258700 1 -- 192.168.123.103:0/3308426633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 msgr2=0x7fbd70102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.600+0000 7fbd75258700 1 --2- 192.168.123.103:0/3308426633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 0x7fbd70102b70 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7fbd58009b50 tx=0x7fbd58009e60 comp rx=0 tx=0).stop
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.601+0000 7fbd75258700 1 -- 192.168.123.103:0/3308426633 shutdown_connections
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.601+0000 7fbd75258700 1 --2- 192.168.123.103:0/3308426633 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd70103960 0x7fbd70103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.601+0000 7fbd75258700 1 --2- 192.168.123.103:0/3308426633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 0x7fbd70102b70 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.601+0000 7fbd75258700 1 -- 192.168.123.103:0/3308426633 >> 192.168.123.103:0/3308426633 conn(0x7fbd700fdcf0 msgr2=0x7fbd70100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.601+0000 7fbd75258700 1 -- 192.168.123.103:0/3308426633 shutdown_connections
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.601+0000 7fbd75258700 1 -- 192.168.123.103:0/3308426633 wait complete.
2026-03-10T14:05:20.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd75258700 1 Processor -- start
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd75258700 1 -- start start
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd75258700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 0x7fbd701980c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd6ed9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 0x7fbd701980c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd6ed9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 0x7fbd701980c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42542/0 (socket says 192.168.123.103:42542)
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd75258700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd70103960 0x7fbd70198600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd75258700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd70198c20 con 0x7fbd70102760
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd75258700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd70198d60 con 0x7fbd70103960
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.602+0000 7fbd6ed9d700 1 -- 192.168.123.103:0/3548747057 learned_addr learned my addr 192.168.123.103:0/3548747057 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:05:20.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.603+0000 7fbd6e59c700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd70103960 0x7fbd70198600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:20.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.603+0000 7fbd6ed9d700 1 -- 192.168.123.103:0/3548747057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd70103960 msgr2=0x7fbd70198600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:20.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.603+0000 7fbd6ed9d700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd70103960 0x7fbd70198600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:20.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.603+0000 7fbd6ed9d700 1 -- 192.168.123.103:0/3548747057 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbd580097e0 con 0x7fbd70102760
2026-03-10T14:05:20.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.603+0000 7fbd6ed9d700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 0x7fbd701980c0 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7fbd58004ce0 tx=0x7fbd58005f00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:20.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.603+0000 7fbd67fff700 1 -- 192.168.123.103:0/3548747057 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd5801d070 con 0x7fbd70102760
2026-03-10T14:05:20.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.603+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbd7019d7b0 con 0x7fbd70102760
2026-03-10T14:05:20.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.603+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbd7019dca0 con 0x7fbd70102760
2026-03-10T14:05:20.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.604+0000 7fbd67fff700 1 -- 192.168.123.103:0/3548747057 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbd5800bc00 con 0x7fbd70102760
2026-03-10T14:05:20.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.604+0000 7fbd67fff700 1 -- 192.168.123.103:0/3548747057 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd580176f0 con 0x7fbd70102760
2026-03-10T14:05:20.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.605+0000 7fbd67fff700 1 -- 192.168.123.103:0/3548747057 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fbd58017850 con 0x7fbd70102760
2026-03-10T14:05:20.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.606+0000 7fbd67fff700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbd5c06c680 0x7fbd5c06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:20.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.606+0000 7fbd67fff700 1 -- 192.168.123.103:0/3548747057 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fbd5808cb10 con 0x7fbd70102760
2026-03-10T14:05:20.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.606+0000 7fbd6e59c700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbd5c06c680 0x7fbd5c06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:20.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.606+0000 7fbd6e59c700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbd5c06c680 0x7fbd5c06eb30 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fbd60005950 tx=0x7fbd600058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:20.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.606+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbd50005320 con 0x7fbd70102760
2026-03-10T14:05:20.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.609+0000 7fbd67fff700 1 -- 192.168.123.103:0/3548747057 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbd5805c340 con 0x7fbd70102760
2026-03-10T14:05:20.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:20.737+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7fbd50000bf0 con 0x7fbd5c06c680
2026-03-10T14:05:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:21 vm04 ceph-mon[55966]: from='client.24247 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:05:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:21 vm04 ceph-mon[55966]: from='client.14462 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:05:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:21 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
2026-03-10T14:05:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:21 vm03 ceph-mon[49718]: from='client.24247 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:05:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:21 vm03 ceph-mon[49718]: from='client.14462 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:05:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:21 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
2026-03-10T14:05:22.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.266+0000 7fbd67fff700 1 -- 192.168.123.103:0/3548747057 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7fbd50000bf0 con 0x7fbd5c06c680
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbd5c06c680 msgr2=0x7fbd5c06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbd5c06c680 0x7fbd5c06eb30 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fbd60005950 tx=0x7fbd600058e0 comp rx=0 tx=0).stop
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 msgr2=0x7fbd701980c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 0x7fbd701980c0 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7fbd58004ce0 tx=0x7fbd58005f00 comp rx=0 tx=0).stop
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 shutdown_connections
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd70102760 0x7fbd701980c0 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbd5c06c680 0x7fbd5c06eb30 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 --2- 192.168.123.103:0/3548747057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd70103960 0x7fbd70198600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 >> 192.168.123.103:0/3548747057 conn(0x7fbd700fdcf0 msgr2=0x7fbd70106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 shutdown_connections
2026-03-10T14:05:22.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.269+0000 7fbd75258700 1 -- 192.168.123.103:0/3548747057 wait complete.
2026-03-10T14:05:22.320 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph fs dump'
2026-03-10T14:05:22.511 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config
2026-03-10T14:05:22.552 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:22 vm03 ceph-mon[49718]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T14:05:22.552 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished
2026-03-10T14:05:22.552 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:22 vm03 ceph-mon[49718]: osdmap e34: 6 total, 6 up, 6 in
2026-03-10T14:05:22.552 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:22 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
2026-03-10T14:05:22.552 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:22 vm03 ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[49714]: 2026-03-10T14:05:22.221+0000 7ffba27ac700 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-10T14:05:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:22 vm04 ceph-mon[55966]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T14:05:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished
2026-03-10T14:05:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:22 vm04 ceph-mon[55966]: osdmap e34: 6 total, 6 up, 6 in
2026-03-10T14:05:22.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:22 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
2026-03-10T14:05:22.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.845+0000 7f48f794a700 1 -- 192.168.123.103:0/2005929436 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 msgr2=0x7f48f01081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:22.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.845+0000 7f48f794a700 1 --2- 192.168.123.103:0/2005929436 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 0x7f48f01081c0 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7f48ec009b00 tx=0x7f48ec009e10 comp rx=0 tx=0).stop
2026-03-10T14:05:22.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.849+0000 7f48f794a700 1 -- 192.168.123.103:0/2005929436 shutdown_connections
2026-03-10T14:05:22.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.849+0000 7f48f794a700 1 --2- 192.168.123.103:0/2005929436 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 0x7f48f01081c0 unknown :-1 s=CLOSED pgs=244 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:22.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.849+0000 7f48f794a700 1 --2- 192.168.123.103:0/2005929436 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48f0071db0 0x7f48f00721c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:22.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.849+0000 7f48f794a700 1 -- 192.168.123.103:0/2005929436 >> 192.168.123.103:0/2005929436 conn(0x7f48f006d3e0 msgr2=0x7f48f006f830 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:22.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.849+0000 7f48f794a700 1 -- 192.168.123.103:0/2005929436 shutdown_connections
2026-03-10T14:05:22.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f794a700 1 -- 192.168.123.103:0/2005929436 wait complete.
2026-03-10T14:05:22.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f794a700 1 Processor -- start
2026-03-10T14:05:22.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f794a700 1 -- start start
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f794a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48f0071db0 0x7f48f0116d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f794a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 0x7f48f01172b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f794a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48f01178d0 con 0x7f48f0107d50
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f794a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48f01b2d50 con 0x7f48f0071db0
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f4ee5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 0x7f48f01172b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f4ee5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 0x7f48f01172b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42562/0 (socket says 192.168.123.103:42562)
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.850+0000 7f48f4ee5700 1 -- 192.168.123.103:0/2696486457 learned_addr learned my addr 192.168.123.103:0/2696486457 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.851+0000 7f48f56e6700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48f0071db0 0x7f48f0116d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.851+0000 7f48f4ee5700 1 -- 192.168.123.103:0/2696486457 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48f0071db0 msgr2=0x7f48f0116d70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.851+0000 7f48f4ee5700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48f0071db0 0x7f48f0116d70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.851+0000 7f48f4ee5700 1 -- 192.168.123.103:0/2696486457 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f48ec0097e0 con 0x7f48f0107d50
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.851+0000 7f48f4ee5700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 0x7f48f01172b0 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f48ec005280 tx=0x7f48ec003680 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:22.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.852+0000 7f48e67fc700 1 -- 192.168.123.103:0/2696486457 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f48ec01d070 con 0x7f48f0107d50 2026-03-10T14:05:22.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.852+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f48f01b2e90 con 0x7f48f0107d50 2026-03-10T14:05:22.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.852+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f48f01b3300 con 0x7f48f0107d50 2026-03-10T14:05:22.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.852+0000 7f48e67fc700 1 -- 192.168.123.103:0/2696486457 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f48ec003c10 con 0x7f48f0107d50 2026-03-10T14:05:22.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.852+0000 7f48e67fc700 1 -- 192.168.123.103:0/2696486457 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f48ec021880 con 0x7f48f0107d50 2026-03-10T14:05:22.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.854+0000 7f48e67fc700 1 -- 192.168.123.103:0/2696486457 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f48ec005470 con 0x7f48f0107d50 2026-03-10T14:05:22.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.855+0000 7f48e67fc700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f48dc06c7a0 0x7f48dc06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:22.855 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.857+0000 7f48f56e6700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f48dc06c7a0 0x7f48dc06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:22.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.857+0000 7f48e67fc700 1 -- 192.168.123.103:0/2696486457 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(36..36 src has 1..36) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f48ec08d940 con 0x7f48f0107d50 2026-03-10T14:05:22.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.857+0000 7f48f56e6700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f48dc06c7a0 0x7f48dc06ec50 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f48e0005950 tx=0x7f48e00058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:22.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.857+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f48f0118040 con 0x7f48f0107d50 2026-03-10T14:05:22.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:22.862+0000 7f48e67fc700 1 -- 192.168.123.103:0/2696486457 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f48ec058140 con 0x7f48f0107d50 2026-03-10T14:05:23.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.004+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f48f01181d0 con 0x7f48f0107d50 2026-03-10T14:05:23.004 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.005+0000 7f48e67fc700 1 -- 192.168.123.103:0/2696486457 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1093 (secure 0 0 0) 0x7f48ec026070 con 0x7f48f0107d50 2026-03-10T14:05:23.004 INFO:teuthology.orchestra.run.vm03.stdout:e2 2026-03-10T14:05:23.004 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:epoch 2 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:05:22.222346+0000 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:05:23.005 
INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:in 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:up {} 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 0 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:23.005 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f48dc06c7a0 msgr2=0x7f48dc06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:23.007 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f48dc06c7a0 0x7f48dc06ec50 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f48e0005950 tx=0x7f48e00058e0 comp rx=0 tx=0).stop 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 msgr2=0x7f48f01172b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 0x7f48f01172b0 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f48ec005280 tx=0x7f48ec003680 comp rx=0 tx=0).stop 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 shutdown_connections 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f48dc06c7a0 0x7f48dc06ec50 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48f0071db0 0x7f48f0116d70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 --2- 192.168.123.103:0/2696486457 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48f0107d50 0x7f48f01172b0 unknown 
:-1 s=CLOSED pgs=245 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.008+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 >> 192.168.123.103:0/2696486457 conn(0x7f48f006d3e0 msgr2=0x7f48f010af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.009+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 shutdown_connections 2026-03-10T14:05:23.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.009+0000 7f48f794a700 1 -- 192.168.123.103:0/2696486457 wait complete. 2026-03-10T14:05:23.008 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 2 2026-03-10T14:05:23.189 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T14:05:23.192 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:23.192 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph fs set cephfs max_mds 2' 2026-03-10T14:05:23.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: Health check failed: 1 
filesystem is offline (MDS_ALL_DOWN) 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: fsmap cephfs:0 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: Saving service mds.cephfs spec with placement count:4 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:23.418 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.aqaspa", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.aqaspa", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: Deploying daemon mds.cephfs.vm03.aqaspa on vm03 2026-03-10T14:05:23.418 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:23 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/2696486457' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:05:23.430 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:23.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: fsmap cephfs:0 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: Saving service mds.cephfs spec with placement count:4 2026-03-10T14:05:23.564 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.aqaspa", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.aqaspa", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: Deploying 
daemon mds.cephfs.vm03.aqaspa on vm03 2026-03-10T14:05:23.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:23 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/2696486457' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:05:23.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.727+0000 7f0de0873700 1 -- 192.168.123.103:0/374226911 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 msgr2=0x7f0dd8102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:23.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.727+0000 7f0de0873700 1 --2- 192.168.123.103:0/374226911 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 0x7f0dd8102b70 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f0dd0009b00 tx=0x7f0dd0009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:23.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.728+0000 7f0de0873700 1 -- 192.168.123.103:0/374226911 shutdown_connections 2026-03-10T14:05:23.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.728+0000 7f0de0873700 1 --2- 192.168.123.103:0/374226911 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dd8103960 0x7f0dd8103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:23.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.728+0000 7f0de0873700 1 --2- 192.168.123.103:0/374226911 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 0x7f0dd8102b70 unknown :-1 s=CLOSED pgs=248 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:23.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.728+0000 7f0de0873700 1 -- 192.168.123.103:0/374226911 >> 192.168.123.103:0/374226911 conn(0x7f0dd80fdcf0 msgr2=0x7f0dd8100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:23.727 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.728+0000 7f0de0873700 1 -- 192.168.123.103:0/374226911 shutdown_connections 2026-03-10T14:05:23.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.728+0000 7f0de0873700 1 -- 192.168.123.103:0/374226911 wait complete. 2026-03-10T14:05:23.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0de0873700 1 Processor -- start 2026-03-10T14:05:23.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0de0873700 1 -- start start 2026-03-10T14:05:23.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0de0873700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 0x7f0dd8198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:23.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0de0873700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dd8103960 0x7f0dd8198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:23.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0de0873700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0dd8198af0 con 0x7f0dd8102760 2026-03-10T14:05:23.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0de0873700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0dd8198c30 con 0x7f0dd8103960 2026-03-10T14:05:23.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0dde60f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 0x7f0dd8198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:23.728 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0dde60f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 0x7f0dd8198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57544/0 (socket says 192.168.123.103:57544) 2026-03-10T14:05:23.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0dde60f700 1 -- 192.168.123.103:0/175519246 learned_addr learned my addr 192.168.123.103:0/175519246 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:23.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.729+0000 7f0dde60f700 1 -- 192.168.123.103:0/175519246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dd8103960 msgr2=0x7f0dd8198560 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:05:23.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.730+0000 7f0ddde0e700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dd8103960 0x7f0dd8198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:23.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.730+0000 7f0dde60f700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dd8103960 0x7f0dd8198560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:23.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.730+0000 7f0dde60f700 1 -- 192.168.123.103:0/175519246 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0dd00097e0 con 0x7f0dd8102760 2026-03-10T14:05:23.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.730+0000 7f0dde60f700 1 
--2- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 0x7f0dd8198020 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f0dd0009fd0 tx=0x7f0dd0004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.730+0000 7f0dcb7fe700 1 -- 192.168.123.103:0/175519246 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0dd001d070 con 0x7f0dd8102760 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.730+0000 7f0ddde0e700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dd8103960 0x7f0dd8198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.731+0000 7f0dcb7fe700 1 -- 192.168.123.103:0/175519246 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0dd000bc50 con 0x7f0dd8102760 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.731+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0dd819d690 con 0x7f0dd8102760 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.731+0000 7f0dcb7fe700 1 -- 192.168.123.103:0/175519246 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0dd000f870 con 0x7f0dd8102760 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.731+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0dd819db50 con 0x7f0dd8102760 2026-03-10T14:05:23.732 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.732+0000 7f0dcb7fe700 1 -- 192.168.123.103:0/175519246 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f0dd000f9d0 con 0x7f0dd8102760 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.732+0000 7f0dcb7fe700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0dc406c680 0x7f0dc406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.733+0000 7f0dcb7fe700 1 -- 192.168.123.103:0/175519246 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f0dd008d390 con 0x7f0dd8102760 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.733+0000 7f0ddde0e700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0dc406c680 0x7f0dc406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:23.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.733+0000 7f0ddde0e700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0dc406c680 0x7f0dc406eb30 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f0dcc005fd0 tx=0x7f0dcc005e20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:23.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.734+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0dd8066e40 con 0x7f0dd8102760 2026-03-10T14:05:23.735 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.737+0000 7f0dcb7fe700 1 -- 192.168.123.103:0/175519246 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0dd005b5b0 con 0x7f0dd8102760 2026-03-10T14:05:23.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:23.862+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"} v 0) v1 -- 0x7f0dd819dee0 con 0x7f0dd8102760 2026-03-10T14:05:24.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.415+0000 7f0dcb7fe700 1 -- 192.168.123.103:0/175519246 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]=0 v3) v1 ==== 105+0+0 (secure 0 0 0) 0x7f0dd0027070 con 0x7f0dd8102760 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0dc406c680 msgr2=0x7f0dc406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0dc406c680 0x7f0dc406eb30 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f0dcc005fd0 tx=0x7f0dcc005e20 comp rx=0 tx=0).stop 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 msgr2=0x7f0dd8198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 
7f0de0873700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 0x7f0dd8198020 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f0dd0009fd0 tx=0x7f0dd0004ab0 comp rx=0 tx=0).stop 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 shutdown_connections 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0dd8102760 0x7f0dd8198020 unknown :-1 s=CLOSED pgs=249 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0dc406c680 0x7f0dc406eb30 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 --2- 192.168.123.103:0/175519246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0dd8103960 0x7f0dd8198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 >> 192.168.123.103:0/175519246 conn(0x7f0dd80fdcf0 msgr2=0x7f0dd8106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 shutdown_connections 2026-03-10T14:05:24.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:24.418+0000 7f0de0873700 1 -- 192.168.123.103:0/175519246 wait complete. 
2026-03-10T14:05:24.479 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T14:05:24.481 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:24.481 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph fs set cephfs allow_standby_replay true' 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: pgmap v74: 65 pgs: 5 creating+peering, 16 active+clean, 44 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.sslxuq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.sslxuq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': 
finished 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: Deploying daemon mds.cephfs.vm04.sslxuq on vm04 2026-03-10T14:05:24.508 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:24 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/175519246' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: pgmap v74: 65 pgs: 5 creating+peering, 16 active+clean, 44 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.sslxuq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' 
entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.sslxuq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: Deploying daemon mds.cephfs.vm04.sslxuq on vm04 2026-03-10T14:05:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:24 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/175519246' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T14:05:24.671 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:25.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.224+0000 7fb404db9700 1 -- 192.168.123.103:0/2372102860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001024f0 msgr2=0x7fb400102900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.224+0000 7fb404db9700 1 --2- 192.168.123.103:0/2372102860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001024f0 0x7fb400102900 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7fb3f800b3a0 tx=0x7fb3f800b6b0 comp rx=0 tx=0).stop 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.225+0000 7fb404db9700 1 -- 192.168.123.103:0/2372102860 shutdown_connections 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.225+0000 7fb404db9700 1 --2- 192.168.123.103:0/2372102860 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fb4001036f0 0x7fb400103b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.225+0000 7fb404db9700 1 --2- 192.168.123.103:0/2372102860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001024f0 0x7fb400102900 unknown :-1 s=CLOSED pgs=250 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.225+0000 7fb404db9700 1 -- 192.168.123.103:0/2372102860 >> 192.168.123.103:0/2372102860 conn(0x7fb4000fdac0 msgr2=0x7fb4000ffed0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.225+0000 7fb404db9700 1 -- 192.168.123.103:0/2372102860 shutdown_connections 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.225+0000 7fb404db9700 1 -- 192.168.123.103:0/2372102860 wait complete. 
2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb404db9700 1 Processor -- start 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb404db9700 1 -- start start 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb404db9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4001024f0 0x7fb40006d990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb404db9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001036f0 0x7fb40006ded0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb404db9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb40006e4a0 con 0x7fb4001036f0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb404db9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb40006e610 con 0x7fb4001024f0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb3fdd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001036f0 0x7fb40006ded0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb3fdd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001036f0 0x7fb40006ded0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:57576/0 (socket says 192.168.123.103:57576) 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb3fe59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4001024f0 0x7fb40006d990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb3fe59c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4001024f0 0x7fb40006d990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:59480/0 (socket says 192.168.123.103:59480) 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.226+0000 7fb3fe59c700 1 -- 192.168.123.103:0/3217792192 learned_addr learned my addr 192.168.123.103:0/3217792192 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.227+0000 7fb3fe59c700 1 -- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001036f0 msgr2=0x7fb40006ded0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.227+0000 7fb3fe59c700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001036f0 0x7fb40006ded0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.227+0000 7fb3fe59c700 1 -- 192.168.123.103:0/3217792192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3f800b050 con 0x7fb4001024f0 2026-03-10T14:05:25.227 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.227+0000 7fb3fdd9b700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001036f0 0x7fb40006ded0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.227+0000 7fb3fe59c700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4001024f0 0x7fb40006d990 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fb3f8015040 tx=0x7fb3f8008f40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.228+0000 7fb3ef7fe700 1 -- 192.168.123.103:0/3217792192 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3f800e050 con 0x7fb4001024f0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.228+0000 7fb404db9700 1 -- 192.168.123.103:0/3217792192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb4001b2950 con 0x7fb4001024f0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.228+0000 7fb404db9700 1 -- 192.168.123.103:0/3217792192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb4001b2ea0 con 0x7fb4001024f0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.228+0000 7fb3ef7fe700 1 -- 192.168.123.103:0/3217792192 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb3f8003f80 con 0x7fb4001024f0 2026-03-10T14:05:25.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.228+0000 7fb3ef7fe700 1 -- 192.168.123.103:0/3217792192 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 
0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3f801db10 con 0x7fb4001024f0 2026-03-10T14:05:25.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.230+0000 7fb3ef7fe700 1 -- 192.168.123.103:0/3217792192 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb3f8019040 con 0x7fb4001024f0 2026-03-10T14:05:25.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.230+0000 7fb3ef7fe700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb3e806c7a0 0x7fb3e806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:25.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.230+0000 7fb3fdd9b700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb3e806c7a0 0x7fb3e806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:25.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.230+0000 7fb3ef7fe700 1 -- 192.168.123.103:0/3217792192 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb3f808cf90 con 0x7fb4001024f0 2026-03-10T14:05:25.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.230+0000 7fb3fdd9b700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb3e806c7a0 0x7fb3e806ec50 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fb3f0005950 tx=0x7fb3f000a300 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:25.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.232+0000 7fb404db9700 1 -- 192.168.123.103:0/3217792192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb3e0005320 con 
0x7fb4001024f0 2026-03-10T14:05:25.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.235+0000 7fb3ef7fe700 1 -- 192.168.123.103:0/3217792192 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb3f805b220 con 0x7fb4001024f0 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.itwezo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.itwezo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: Deploying daemon 
mds.cephfs.vm03.itwezo on vm03 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] up:boot 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/175519246' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] up:boot 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: daemon mds.cephfs.vm04.sslxuq assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: fsmap cephfs:0 2 up:standby 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:creating} 1 up:standby 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: daemon mds.cephfs.vm04.sslxuq is now active in filesystem cephfs as rank 0 
2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] up:active 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: daemon mds.cephfs.vm03.aqaspa assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: Health check failed: insufficient standby MDS daemons available (MDS_INSUFFICIENT_STANDBY) 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 1 up:standby 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:creating} 2026-03-10T14:05:25.279 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:25 vm03 ceph-mon[49718]: daemon mds.cephfs.vm03.aqaspa is now active in filesystem cephfs as rank 1 2026-03-10T14:05:25.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.382+0000 7fb404db9700 1 -- 192.168.123.103:0/3217792192 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"} v 0) v1 -- 0x7fb3e0005f70 con 0x7fb4001024f0 2026-03-10T14:05:25.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T14:05:25.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:25.563 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.itwezo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.itwezo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: Deploying daemon mds.cephfs.vm03.itwezo on vm03 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: mds.? [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] up:boot 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/175519246' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: mds.? 
[v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] up:boot 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: daemon mds.cephfs.vm04.sslxuq assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: fsmap cephfs:0 2 up:standby 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:creating} 1 up:standby 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: daemon mds.cephfs.vm04.sslxuq is now active in filesystem cephfs as rank 0 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: mds.? 
[v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] up:active 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: daemon mds.cephfs.vm03.aqaspa assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: Health check failed: insufficient standby MDS daemons available (MDS_INSUFFICIENT_STANDBY) 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 1 up:standby 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:creating} 2026-03-10T14:05:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:25 vm04 ceph-mon[55966]: daemon mds.cephfs.vm03.aqaspa is now active in filesystem cephfs as rank 1 2026-03-10T14:05:25.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.961+0000 7fb3ef7fe700 1 -- 192.168.123.103:0/3217792192 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]=0 v7) v1 ==== 121+0+0 (secure 0 0 0) 0x7fb3f80170c0 con 0x7fb4001024f0 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 -- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb3e806c7a0 msgr2=0x7fb3e806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 --2- 
192.168.123.103:0/3217792192 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb3e806c7a0 0x7fb3e806ec50 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fb3f0005950 tx=0x7fb3f000a300 comp rx=0 tx=0).stop 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 -- 192.168.123.103:0/3217792192 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4001024f0 msgr2=0x7fb40006d990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4001024f0 0x7fb40006d990 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fb3f8015040 tx=0x7fb3f8008f40 comp rx=0 tx=0).stop 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 -- 192.168.123.103:0/3217792192 shutdown_connections 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb3e806c7a0 0x7fb3e806ec50 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4001024f0 0x7fb40006d990 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 --2- 192.168.123.103:0/3217792192 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4001036f0 0x7fb40006ded0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 -- 192.168.123.103:0/3217792192 >> 192.168.123.103:0/3217792192 conn(0x7fb4000fdac0 msgr2=0x7fb400106920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 -- 192.168.123.103:0/3217792192 shutdown_connections 2026-03-10T14:05:25.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:25.964+0000 7fb3ed7fa700 1 -- 192.168.123.103:0/3217792192 wait complete. 2026-03-10T14:05:26.037 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T14:05:26.040 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:26.040 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph fs set cephfs inline_data false' 2026-03-10T14:05:26.235 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: pgmap v77: 65 pgs: 5 creating+peering, 52 active+clean, 8 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/3217792192' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]: dispatch 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]: dispatch 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.puavjd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.puavjd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: Deploying daemon mds.cephfs.vm04.puavjd on vm04 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: Health check cleared: MDS_INSUFFICIENT_STANDBY (was: insufficient standby MDS daemons available) 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: Cluster is now healthy 
2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] up:active 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]': finished 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] up:boot 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 1 up:standby 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:05:26.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:26 vm03 ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 1 up:standby-replay 2026-03-10T14:05:26.354 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: pgmap v77: 65 pgs: 5 creating+peering, 52 active+clean, 8 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T14:05:26.354 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/3217792192' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]: dispatch 2026-03-10T14:05:26.354 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]: dispatch 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.puavjd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.puavjd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: Deploying daemon mds.cephfs.vm04.puavjd on vm04 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: Health check cleared: MDS_INSUFFICIENT_STANDBY (was: insufficient standby MDS daemons available) 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: Cluster is now healthy 
2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: mds.? [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] up:active 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "true"}]': finished 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: mds.? [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] up:boot 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 1 up:standby 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:05:26.355 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:26 vm04 ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 1 up:standby-replay 2026-03-10T14:05:26.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.528+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/3250016888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec107d50 msgr2=0x7fa5ec1081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:26.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.528+0000 7fa5f1c9d700 1 --2- 192.168.123.103:0/3250016888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec107d50 0x7fa5ec1081c0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fa5e400b210 tx=0x7fa5e400b520 comp rx=0 tx=0).stop 2026-03-10T14:05:26.529 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.530+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/3250016888 shutdown_connections 2026-03-10T14:05:26.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.530+0000 7fa5f1c9d700 1 --2- 192.168.123.103:0/3250016888 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec107d50 0x7fa5ec1081c0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:26.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.530+0000 7fa5f1c9d700 1 --2- 192.168.123.103:0/3250016888 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5ec071db0 0x7fa5ec0721c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:26.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.530+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/3250016888 >> 192.168.123.103:0/3250016888 conn(0x7fa5ec06d3e0 msgr2=0x7fa5ec06f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:26.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.530+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/3250016888 shutdown_connections 2026-03-10T14:05:26.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.530+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/3250016888 wait complete. 
2026-03-10T14:05:26.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.531+0000 7fa5f1c9d700 1 Processor -- start 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5f1c9d700 1 -- start start 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5f1c9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5ec071db0 0x7fa5ec116ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5f1c9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec117010 0x7fa5ec1b2a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5f1c9d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5ec117510 con 0x7fa5ec071db0 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5f1c9d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5ec117680 con 0x7fa5ec117010 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5eaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec117010 0x7fa5ec1b2a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5eaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec117010 0x7fa5ec1b2a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:59516/0 (socket says 192.168.123.103:59516) 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5eaffd700 1 -- 192.168.123.103:0/2366990349 learned_addr learned my addr 192.168.123.103:0/2366990349 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5eb7fe700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5ec071db0 0x7fa5ec116ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5eaffd700 1 -- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5ec071db0 msgr2=0x7fa5ec116ad0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5eaffd700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5ec071db0 0x7fa5ec116ad0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:26.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.532+0000 7fa5eaffd700 1 -- 192.168.123.103:0/2366990349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5e4009e30 con 0x7fa5ec117010 2026-03-10T14:05:26.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.533+0000 7fa5eaffd700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec117010 0x7fa5ec1b2a50 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fa5e4007770 tx=0x7fa5e40077a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:05:26.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.533+0000 7fa5e8ff9700 1 -- 192.168.123.103:0/2366990349 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5e400e050 con 0x7fa5ec117010 2026-03-10T14:05:26.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.533+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa5ec1b2f90 con 0x7fa5ec117010 2026-03-10T14:05:26.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.533+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa5ec1b34b0 con 0x7fa5ec117010 2026-03-10T14:05:26.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.534+0000 7fa5e8ff9700 1 -- 192.168.123.103:0/2366990349 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa5e4007c80 con 0x7fa5ec117010 2026-03-10T14:05:26.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.534+0000 7fa5e8ff9700 1 -- 192.168.123.103:0/2366990349 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5e401b840 con 0x7fa5ec117010 2026-03-10T14:05:26.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.535+0000 7fa5e8ff9700 1 -- 192.168.123.103:0/2366990349 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa5e4019040 con 0x7fa5ec117010 2026-03-10T14:05:26.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.537+0000 7fa5e8ff9700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa5d406c6d0 0x7fa5d406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:26.536 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.537+0000 7fa5e8ff9700 1 -- 192.168.123.103:0/2366990349 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fa5e408c400 con 0x7fa5ec117010 2026-03-10T14:05:26.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.537+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa5d8005320 con 0x7fa5ec117010 2026-03-10T14:05:26.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.537+0000 7fa5eb7fe700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa5d406c6d0 0x7fa5d406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:26.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.538+0000 7fa5eb7fe700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa5d406c6d0 0x7fa5d406eb80 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fa5dc005950 tx=0x7fa5dc00b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:26.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.541+0000 7fa5e8ff9700 1 -- 192.168.123.103:0/2366990349 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa5e408ae60 con 0x7fa5ec117010 2026-03-10T14:05:26.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:26.700+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"} v 0) v1 -- 0x7fa5d8005f70 con 0x7fa5ec117010 
2026-03-10T14:05:27.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.428+0000 7fa5e8ff9700 1 -- 192.168.123.103:0/2366990349 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]=0 inline data disabled v9) v1 ==== 133+0+0 (secure 0 0 0) 0x7fa5e405a380 con 0x7fa5ec117010 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.432+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa5d406c6d0 msgr2=0x7fa5d406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.432+0000 7fa5f1c9d700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa5d406c6d0 0x7fa5d406eb80 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fa5dc005950 tx=0x7fa5dc00b410 comp rx=0 tx=0).stop 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.432+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec117010 msgr2=0x7fa5ec1b2a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.432+0000 7fa5f1c9d700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec117010 0x7fa5ec1b2a50 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fa5e4007770 tx=0x7fa5e40077a0 comp rx=0 tx=0).stop 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.433+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 shutdown_connections 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.433+0000 7fa5f1c9d700 1 --2- 192.168.123.103:0/2366990349 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5ec071db0 0x7fa5ec116ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.433+0000 7fa5f1c9d700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa5d406c6d0 0x7fa5d406eb80 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.433+0000 7fa5f1c9d700 1 --2- 192.168.123.103:0/2366990349 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5ec117010 0x7fa5ec1b2a50 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.433+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 >> 192.168.123.103:0/2366990349 conn(0x7fa5ec06d3e0 msgr2=0x7fa5ec070700 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.433+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 shutdown_connections 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.433+0000 7fa5f1c9d700 1 -- 192.168.123.103:0/2366990349 wait complete. 2026-03-10T14:05:27.438 INFO:teuthology.orchestra.run.vm03.stderr:inline data disabled 2026-03-10T14:05:27.505 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T14:05:27.508 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:27.508 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph fs dump' 2026-03-10T14:05:27.553 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.553 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.553 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.553 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.553 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.553 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:27 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:27.553 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:27 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/2366990349' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-10T14:05:27.553 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:27 vm03 ceph-mon[49718]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-10T14:05:27.688 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:27.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:27.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:27 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:27.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:27 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/2366990349' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-10T14:05:27.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:27 vm04 ceph-mon[55966]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 -- 192.168.123.103:0/377592909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8071db0 msgr2=0x7f57f80721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 --2- 192.168.123.103:0/377592909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8071db0 0x7f57f80721c0 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7f57e8009b00 tx=0x7f57e8009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 -- 192.168.123.103:0/377592909 shutdown_connections 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 --2- 192.168.123.103:0/377592909 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f57f8107d50 0x7f57f81081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 --2- 192.168.123.103:0/377592909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8071db0 0x7f57f80721c0 unknown :-1 s=CLOSED pgs=253 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 -- 192.168.123.103:0/377592909 >> 192.168.123.103:0/377592909 conn(0x7f57f806d3e0 msgr2=0x7f57f806f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 -- 192.168.123.103:0/377592909 shutdown_connections 2026-03-10T14:05:27.999 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 -- 192.168.123.103:0/377592909 wait complete. 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.997+0000 7f57fd3f2700 1 Processor -- start 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57fd3f2700 1 -- start start 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57fd3f2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f57f8071db0 0x7f57f81a4bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57fd3f2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8107d50 0x7f57f81a5100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57fd3f2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57f81a5720 con 0x7f57f8107d50 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57fd3f2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57f81a5860 con 0x7f57f8071db0 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57f67fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8107d50 0x7f57f81a5100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57f67fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8107d50 0x7f57f81a5100 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57630/0 (socket says 192.168.123.103:57630) 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57f67fc700 1 -- 192.168.123.103:0/2426383652 learned_addr learned my addr 192.168.123.103:0/2426383652 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:27.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57f6ffd700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f57f8071db0 0x7f57f81a4bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57f67fc700 1 -- 192.168.123.103:0/2426383652 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f57f8071db0 msgr2=0x7f57f81a4bc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57f67fc700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f57f8071db0 0x7f57f81a4bc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.998+0000 7f57f67fc700 1 -- 192.168.123.103:0/2426383652 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57e80097e0 con 0x7f57f8107d50 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.999+0000 7f57f67fc700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8107d50 0x7f57f81a5100 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto 
rx=0x7f57ec00d8d0 tx=0x7f57ec00dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.999+0000 7f57dffff700 1 -- 192.168.123.103:0/2426383652 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f57ec009880 con 0x7f57f8107d50 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.999+0000 7f57dffff700 1 -- 192.168.123.103:0/2426383652 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f57ec010460 con 0x7f57f8107d50 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.999+0000 7f57dffff700 1 -- 192.168.123.103:0/2426383652 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f57ec00f5d0 con 0x7f57f8107d50 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.999+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f57f810f4e0 con 0x7f57f8107d50 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:27.999+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57f810f9b0 con 0x7f57f8107d50 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.000+0000 7f57dffff700 1 -- 192.168.123.103:0/2426383652 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f57ec0099e0 con 0x7f57f8107d50 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.000+0000 7f57dffff700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57e007d5b0 0x7f57e007fa60 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:28.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.000+0000 7f57dffff700 1 -- 192.168.123.103:0/2426383652 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f57ec08b380 con 0x7f57f8107d50 2026-03-10T14:05:28.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.001+0000 7f57f6ffd700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57e007d5b0 0x7f57e007fa60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:28.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.001+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f57f8066e40 con 0x7f57f8107d50 2026-03-10T14:05:28.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.004+0000 7f57f6ffd700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57e007d5b0 0x7f57e007fa60 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f57e800b5c0 tx=0x7f57e8005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:28.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.004+0000 7f57dffff700 1 -- 192.168.123.103:0/2426383652 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f57ec059610 con 0x7f57f8107d50 2026-03-10T14:05:28.167 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.168+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f57f8110290 con 
0x7f57f8107d50 2026-03-10T14:05:28.167 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.169+0000 7f57dffff700 1 -- 192.168.123.103:0/2426383652 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 10 v10) v1 ==== 76+0+1796 (secure 0 0 0) 0x7f57ec016070 con 0x7f57f8107d50 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:e10 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:epoch 10 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:05:27.426630+0000 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 
1099511627776 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 2 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 1 addr 
[v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 2 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:28.168 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57e007d5b0 msgr2=0x7f57e007fa60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57e007d5b0 0x7f57e007fa60 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f57e800b5c0 tx=0x7f57e8005fb0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8107d50 msgr2=0x7f57f81a5100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8107d50 0x7f57f81a5100 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7f57ec00d8d0 tx=0x7f57ec00dbe0 comp rx=0 tx=0).stop 
2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 shutdown_connections 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57e007d5b0 0x7f57e007fa60 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f57f8071db0 0x7f57f81a4bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 --2- 192.168.123.103:0/2426383652 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57f8107d50 0x7f57f81a5100 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.171+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 >> 192.168.123.103:0/2426383652 conn(0x7f57f806d3e0 msgr2=0x7f57f810af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.172+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 shutdown_connections 2026-03-10T14:05:28.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.172+0000 7f57fd3f2700 1 -- 192.168.123.103:0/2426383652 wait complete. 
2026-03-10T14:05:28.171 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 10 2026-03-10T14:05:28.231 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"' 2026-03-10T14:05:28.420 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:28.436 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: pgmap v78: 65 pgs: 65 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s wr, 2 op/s 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] up:boot 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 1 up:standby-replay 1 up:standby 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/2426383652' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:28 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: pgmap v78: 65 pgs: 65 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.4 KiB/s wr, 2 op/s 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: mds.? [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] up:boot 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 1 up:standby-replay 1 up:standby 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='client.? 
192.168.123.103:0/2426383652' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:28.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:28 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:28.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.741+0000 7faba12c5700 1 -- 192.168.123.103:0/721286048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c071950 msgr2=0x7fab9c071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.741+0000 7faba12c5700 1 --2- 192.168.123.103:0/721286048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c071950 0x7fab9c071d60 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7fab8c0099c0 tx=0x7fab8c009cd0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.741 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.741+0000 7faba12c5700 1 -- 192.168.123.103:0/721286048 shutdown_connections 2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.741+0000 7faba12c5700 1 --2- 192.168.123.103:0/721286048 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab9c072330 0x7fab9c0770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.741+0000 7faba12c5700 1 --2- 192.168.123.103:0/721286048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c071950 0x7fab9c071d60 unknown :-1 s=CLOSED pgs=255 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.741+0000 7faba12c5700 1 -- 192.168.123.103:0/721286048 >> 192.168.123.103:0/721286048 conn(0x7fab9c06d1a0 msgr2=0x7fab9c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.741+0000 7faba12c5700 1 -- 192.168.123.103:0/721286048 shutdown_connections 2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.741+0000 7faba12c5700 1 -- 192.168.123.103:0/721286048 wait complete. 
2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7faba12c5700 1 Processor -- start 2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7faba12c5700 1 -- start start 2026-03-10T14:05:28.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7faba12c5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab9c072330 0x7fab9c0824e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7faba12c5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c082a20 0x7fab9c082e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7faba12c5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab9c1b2a90 con 0x7fab9c082a20 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7faba12c5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab9c1b2bd0 con 0x7fab9c072330 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7fab9b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c082a20 0x7fab9c082e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7fab9b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c082a20 0x7fab9c082e90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:57636/0 (socket says 192.168.123.103:57636) 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7fab9b7fe700 1 -- 192.168.123.103:0/3016323206 learned_addr learned my addr 192.168.123.103:0/3016323206 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7fab9b7fe700 1 -- 192.168.123.103:0/3016323206 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab9c072330 msgr2=0x7fab9c0824e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7fab9b7fe700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab9c072330 0x7fab9c0824e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.742+0000 7fab9b7fe700 1 -- 192.168.123.103:0/3016323206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab8c0096b0 con 0x7fab9c082a20 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.743+0000 7fab9b7fe700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c082a20 0x7fab9c082e90 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7fab9400bf40 tx=0x7fab9400bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:28.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.743+0000 7fab997fa700 1 -- 192.168.123.103:0/3016323206 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab9400cb40 con 0x7fab9c082a20 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.743+0000 7faba12c5700 1 -- 
192.168.123.103:0/3016323206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fab9c1b2d10 con 0x7fab9c082a20 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.743+0000 7faba12c5700 1 -- 192.168.123.103:0/3016323206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fab9c1b3230 con 0x7fab9c082a20 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.744+0000 7fab997fa700 1 -- 192.168.123.103:0/3016323206 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fab9400cca0 con 0x7fab9c082a20 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.744+0000 7fab997fa700 1 -- 192.168.123.103:0/3016323206 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab94012720 con 0x7fab9c082a20 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.744+0000 7faba12c5700 1 -- 192.168.123.103:0/3016323206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fab88005320 con 0x7fab9c082a20 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.745+0000 7fab997fa700 1 -- 192.168.123.103:0/3016323206 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fab940129a0 con 0x7fab9c082a20 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.745+0000 7fab997fa700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fab8406e9b0 0x7fab84070e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.746+0000 7fab997fa700 1 -- 192.168.123.103:0/3016323206 <== mon.0 
v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fab9408b400 con 0x7fab9c082a20 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.747+0000 7fab9bfff700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fab8406e9b0 0x7fab84070e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.748+0000 7fab9bfff700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fab8406e9b0 0x7fab84070e60 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fab8c009910 tx=0x7fab8c009e50 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:28.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.748+0000 7fab997fa700 1 -- 192.168.123.103:0/3016323206 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fab94059740 con 0x7fab9c082a20 2026-03-10T14:05:28.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.894+0000 7faba12c5700 1 -- 192.168.123.103:0/3016323206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7fab88005cc0 con 0x7fab9c082a20 2026-03-10T14:05:28.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.894+0000 7fab997fa700 1 -- 192.168.123.103:0/3016323206 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 10 v10) v1 ==== 94+0+4785 (secure 0 0 0) 0x7fab940592d0 con 0x7fab9c082a20 2026-03-10T14:05:28.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 -- 
192.168.123.103:0/3016323206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fab8406e9b0 msgr2=0x7fab84070e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:28.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fab8406e9b0 0x7fab84070e60 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fab8c009910 tx=0x7fab8c009e50 comp rx=0 tx=0).stop 2026-03-10T14:05:28.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 -- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c082a20 msgr2=0x7fab9c082e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:28.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c082a20 0x7fab9c082e90 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7fab9400bf40 tx=0x7fab9400bf70 comp rx=0 tx=0).stop 2026-03-10T14:05:28.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 -- 192.168.123.103:0/3016323206 shutdown_connections 2026-03-10T14:05:28.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fab8406e9b0 0x7fab84070e60 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab9c072330 0x7fab9c0824e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.901 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 --2- 192.168.123.103:0/3016323206 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab9c082a20 0x7fab9c082e90 unknown :-1 s=CLOSED pgs=256 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:28.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.898+0000 7faba12c5700 1 -- 192.168.123.103:0/3016323206 >> 192.168.123.103:0/3016323206 conn(0x7fab9c06d1a0 msgr2=0x7fab9c076490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:28.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.905+0000 7faba12c5700 1 -- 192.168.123.103:0/3016323206 shutdown_connections 2026-03-10T14:05:28.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:28.905+0000 7faba12c5700 1 -- 192.168.123.103:0/3016323206 wait complete. 2026-03-10T14:05:28.905 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 10 2026-03-10T14:05:28.918 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:05:28.965 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done' 2026-03-10T14:05:29.192 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:29.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:29 vm03 ceph-mon[49718]: pgmap v79: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3.0 KiB/s wr, 14 op/s 2026-03-10T14:05:29.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:29 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/3016323206' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T14:05:29.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:29 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:29.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:29 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:29.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:29 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:29.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:29 vm03 ceph-mon[49718]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON) 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.535+0000 7f4a43d95700 1 -- 192.168.123.103:0/4291811041 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 msgr2=0x7f4a3c103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.535+0000 7f4a43d95700 1 --2- 192.168.123.103:0/4291811041 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 0x7f4a3c103dd0 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7f4a38009b00 tx=0x7f4a38009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.536+0000 7f4a43d95700 1 -- 192.168.123.103:0/4291811041 shutdown_connections 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.536+0000 7f4a43d95700 1 --2- 192.168.123.103:0/4291811041 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 0x7f4a3c103dd0 unknown :-1 s=CLOSED pgs=257 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.536+0000 
7f4a43d95700 1 --2- 192.168.123.103:0/4291811041 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4a3c102780 0x7f4a3c102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.536+0000 7f4a43d95700 1 -- 192.168.123.103:0/4291811041 >> 192.168.123.103:0/4291811041 conn(0x7f4a3c0fdd50 msgr2=0x7f4a3c100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.537+0000 7f4a43d95700 1 -- 192.168.123.103:0/4291811041 shutdown_connections 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.537+0000 7f4a43d95700 1 -- 192.168.123.103:0/4291811041 wait complete. 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.537+0000 7f4a43d95700 1 Processor -- start 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.537+0000 7f4a43d95700 1 -- start start 2026-03-10T14:05:29.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.538+0000 7f4a43d95700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4a3c102780 0x7f4a3c197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:29.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.538+0000 7f4a43d95700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 0x7f4a3c198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:29.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.538+0000 7f4a41330700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 0x7f4a3c198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:29.537 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.538+0000 7f4a41330700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 0x7f4a3c198530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57656/0 (socket says 192.168.123.103:57656) 2026-03-10T14:05:29.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.538+0000 7f4a41330700 1 -- 192.168.123.103:0/741570783 learned_addr learned my addr 192.168.123.103:0/741570783 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:29.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.538+0000 7f4a43d95700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a3c198b50 con 0x7f4a3c103980 2026-03-10T14:05:29.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.538+0000 7f4a43d95700 1 -- 192.168.123.103:0/741570783 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a3c198c90 con 0x7f4a3c102780 2026-03-10T14:05:29.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a41b31700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4a3c102780 0x7f4a3c197ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:29.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a41330700 1 -- 192.168.123.103:0/741570783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4a3c102780 msgr2=0x7f4a3c197ff0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:29.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a41330700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f4a3c102780 0x7f4a3c197ff0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:29.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a41330700 1 -- 192.168.123.103:0/741570783 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a380097e0 con 0x7f4a3c103980 2026-03-10T14:05:29.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a41b31700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4a3c102780 0x7f4a3c197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:05:29.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a41330700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 0x7f4a3c198530 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7f4a380048c0 tx=0x7f4a380049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:29.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a32ffd700 1 -- 192.168.123.103:0/741570783 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a3801d070 con 0x7f4a3c103980 2026-03-10T14:05:29.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a32ffd700 1 -- 192.168.123.103:0/741570783 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4a3800bc50 con 0x7f4a3c103980 2026-03-10T14:05:29.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a32ffd700 1 -- 192.168.123.103:0/741570783 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a3800f830 con 0x7f4a3c103980 2026-03-10T14:05:29.539 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a43d95700 1 -- 192.168.123.103:0/741570783 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a3c19d6e0 con 0x7f4a3c103980 2026-03-10T14:05:29.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.539+0000 7f4a43d95700 1 -- 192.168.123.103:0/741570783 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a3c19dbd0 con 0x7f4a3c103980 2026-03-10T14:05:29.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.541+0000 7f4a43d95700 1 -- 192.168.123.103:0/741570783 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a3c066e40 con 0x7f4a3c103980 2026-03-10T14:05:29.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.541+0000 7f4a32ffd700 1 -- 192.168.123.103:0/741570783 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4a38022b50 con 0x7f4a3c103980 2026-03-10T14:05:29.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.541+0000 7f4a32ffd700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a2806c750 0x7f4a2806ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:29.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.542+0000 7f4a32ffd700 1 -- 192.168.123.103:0/741570783 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f4a3808cfc0 con 0x7f4a3c103980 2026-03-10T14:05:29.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.545+0000 7f4a41b31700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a2806c750 0x7f4a2806ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:29.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.545+0000 7f4a41b31700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a2806c750 0x7f4a2806ec00 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f4a2c005fd0 tx=0x7f4a2c005f00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:29.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.547+0000 7f4a32ffd700 1 -- 192.168.123.103:0/741570783 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4a3805b250 con 0x7f4a3c103980 2026-03-10T14:05:29.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.748+0000 7f4a43d95700 1 -- 192.168.123.103:0/741570783 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7f4a3c19dfa0 con 0x7f4a3c103980 2026-03-10T14:05:29.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.749+0000 7f4a32ffd700 1 -- 192.168.123.103:0/741570783 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v11) v1 ==== 78+0+83 (secure 0 0 0) 0x7f4a38027070 con 0x7f4a3c103980 2026-03-10T14:05:29.750 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 -- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a2806c750 msgr2=0x7f4a2806ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a2806c750 0x7f4a2806ec00 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto 
rx=0x7f4a2c005fd0 tx=0x7f4a2c005f00 comp rx=0 tx=0).stop 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 -- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 msgr2=0x7f4a3c198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 0x7f4a3c198530 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7f4a380048c0 tx=0x7f4a380049a0 comp rx=0 tx=0).stop 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 -- 192.168.123.103:0/741570783 shutdown_connections 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a2806c750 0x7f4a2806ec00 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4a3c102780 0x7f4a3c197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 --2- 192.168.123.103:0/741570783 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a3c103980 0x7f4a3c198530 unknown :-1 s=CLOSED pgs=258 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 -- 192.168.123.103:0/741570783 >> 192.168.123.103:0/741570783 
conn(0x7f4a3c0fdd50 msgr2=0x7f4a3c106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 -- 192.168.123.103:0/741570783 shutdown_connections 2026-03-10T14:05:29.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:29.752+0000 7f4a30ff9700 1 -- 192.168.123.103:0/741570783 wait complete. 2026-03-10T14:05:29.763 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:05:29.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:29 vm04 ceph-mon[55966]: pgmap v79: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3.0 KiB/s wr, 14 op/s 2026-03-10T14:05:29.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:29 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/3016323206' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T14:05:29.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:29 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:29.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:29 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:29.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:29 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:29.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:29 vm04 ceph-mon[55966]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON) 2026-03-10T14:05:29.827 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 
2026-03-10T14:05:29.831 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 2026-03-10T14:05:30.075 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:30.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.389+0000 7f5c07d63700 1 -- 192.168.123.103:0/1064764091 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 msgr2=0x7f5c00068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.389+0000 7f5c07d63700 1 --2- 192.168.123.103:0/1064764091 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 0x7f5c00068ac0 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7f5bf800b3a0 tx=0x7f5bf800b6b0 comp rx=0 tx=0).stop 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.391+0000 7f5c07d63700 1 -- 192.168.123.103:0/1064764091 shutdown_connections 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.391+0000 7f5c07d63700 1 --2- 192.168.123.103:0/1064764091 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c00069000 0x7f5c001051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.391+0000 7f5c07d63700 1 --2- 192.168.123.103:0/1064764091 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 0x7f5c00068ac0 unknown :-1 s=CLOSED pgs=259 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.391+0000 7f5c07d63700 1 -- 192.168.123.103:0/1064764091 >> 192.168.123.103:0/1064764091 conn(0x7f5c000754a0 
msgr2=0x7f5c000758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.391+0000 7f5c07d63700 1 -- 192.168.123.103:0/1064764091 shutdown_connections 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.391+0000 7f5c07d63700 1 -- 192.168.123.103:0/1064764091 wait complete. 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c07d63700 1 Processor -- start 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c07d63700 1 -- start start 2026-03-10T14:05:30.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c07d63700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 0x7f5c001a2640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:30.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c07d63700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c00069000 0x7f5c001a2b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:30.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c07d63700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c001a3210 con 0x7f5c000686f0 2026-03-10T14:05:30.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c07d63700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c0019c6c0 con 0x7f5c00069000 2026-03-10T14:05:30.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c052fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c00069000 0x7f5c001a2b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:30.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c052fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c00069000 0x7f5c001a2b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:59580/0 (socket says 192.168.123.103:59580) 2026-03-10T14:05:30.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c052fe700 1 -- 192.168.123.103:0/4144809186 learned_addr learned my addr 192.168.123.103:0/4144809186 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:30.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c052fe700 1 -- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 msgr2=0x7f5c001a2640 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.392+0000 7f5c05aff700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 0x7f5c001a2640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.393+0000 7f5c052fe700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 0x7f5c001a2640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.393+0000 7f5c052fe700 1 -- 192.168.123.103:0/4144809186 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5bf800b050 con 0x7f5c00069000 
2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.393+0000 7f5c052fe700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c00069000 0x7f5c001a2b80 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f5bfc00d8d0 tx=0x7f5bfc00dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.393+0000 7f5c05aff700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 0x7f5c001a2640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.394+0000 7f5bf6ffd700 1 -- 192.168.123.103:0/4144809186 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5bfc009940 con 0x7f5c00069000 2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.394+0000 7f5bf6ffd700 1 -- 192.168.123.103:0/4144809186 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5bfc010460 con 0x7f5c00069000 2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.394+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5c0019c9a0 con 0x7f5c00069000 2026-03-10T14:05:30.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.394+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5c0019cef0 con 0x7f5c00069000 2026-03-10T14:05:30.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.395+0000 7f5bf6ffd700 1 -- 192.168.123.103:0/4144809186 <== mon.1 
v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5bfc00f5d0 con 0x7f5c00069000 2026-03-10T14:05:30.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.395+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5be4005320 con 0x7f5c00069000 2026-03-10T14:05:30.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.395+0000 7f5bf6ffd700 1 -- 192.168.123.103:0/4144809186 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5bfc00f7b0 con 0x7f5c00069000 2026-03-10T14:05:30.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.396+0000 7f5bf6ffd700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5bec06c520 0x7f5bec06e9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:30.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.396+0000 7f5c05aff700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5bec06c520 0x7f5bec06e9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:30.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.396+0000 7f5c05aff700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5bec06c520 0x7f5bec06e9d0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f5bf8009250 tx=0x7f5bf800bf90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:30.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.397+0000 7f5bf6ffd700 1 -- 192.168.123.103:0/4144809186 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) 
v4 ==== 5276+0+0 (secure 0 0 0) 0x7f5bfc08b4e0 con 0x7f5c00069000 2026-03-10T14:05:30.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.398+0000 7f5bf6ffd700 1 -- 192.168.123.103:0/4144809186 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5bfc05a420 con 0x7f5c00069000 2026-03-10T14:05:30.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.533+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f5be4005f70 con 0x7f5c00069000 2026-03-10T14:05:30.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.534+0000 7f5bf6ffd700 1 -- 192.168.123.103:0/4144809186 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 12 v12) v1 ==== 94+0+4781 (secure 0 0 0) 0x7f5bfc05a240 con 0x7f5c00069000 2026-03-10T14:05:30.533 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:30.533 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":12,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[],"filesystems":[{"mdsmap":{"epoch":12,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:05:30.451438+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_14486":{"gid":14486,"name":"cephfs.vm03.itwezo","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.103:6829/1684840896","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1684840896},{"type":"v1","addr":"192.168.123.103:6829","nonce":1684840896}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24273":{"gid":24273,"name":"cephfs.vm04.puavjd","rank":1,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.104:6827/3934202413","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3934202413},{"type":"v1","addr":"192.168.123.104:6827","nonce":3934202413}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-10T14:05:30.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5bec06c520 msgr2=0x7f5bec06e9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:30.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5bec06c520 0x7f5bec06e9d0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f5bf8009250 tx=0x7f5bf800bf90 comp rx=0 tx=0).stop 2026-03-10T14:05:30.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c00069000 msgr2=0x7f5c001a2b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:30.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 --2- 
192.168.123.103:0/4144809186 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c00069000 0x7f5c001a2b80 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f5bfc00d8d0 tx=0x7f5bfc00dc90 comp rx=0 tx=0).stop 2026-03-10T14:05:30.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 shutdown_connections 2026-03-10T14:05:30.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c000686f0 0x7f5c001a2640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:30.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5bec06c520 0x7f5bec06e9d0 secure :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f5bf8009250 tx=0x7f5bf800bf90 comp rx=0 tx=0).stop 2026-03-10T14:05:30.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 --2- 192.168.123.103:0/4144809186 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5c00069000 0x7f5c001a2b80 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:30.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.536+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 >> 192.168.123.103:0/4144809186 conn(0x7f5c000754a0 msgr2=0x7f5c000fe9b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:30.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.537+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 shutdown_connections 2026-03-10T14:05:30.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:30.537+0000 7f5c07d63700 1 -- 192.168.123.103:0/4144809186 wait complete. 
2026-03-10T14:05:30.537 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-10T14:05:30.599 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 12, 'max_mds': 2, 'flags': 50} 2026-03-10T14:05:30.599 INFO:teuthology.run_tasks:Running task ceph-fuse... 2026-03-10T14:05:30.609 INFO:tasks.ceph_fuse:Running ceph_fuse task... 2026-03-10T14:05:30.609 INFO:tasks.ceph_fuse:config is {'client.0': {}, 'client.1': {}} 2026-03-10T14:05:30.609 INFO:tasks.ceph_fuse:client.0 config is {} 2026-03-10T14:05:30.610 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T14:05:30.610 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T14:05:30.610 INFO:tasks.ceph_fuse:client.1 config is {} 2026-03-10T14:05:30.610 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T14:05:30.610 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:30.610 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:30.610 DEBUG:teuthology.orchestra.run.vm04:> ip netns list 2026-03-10T14:05:30.628 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:30.628 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link delete ceph-brx 2026-03-10T14:05:30.699 INFO:teuthology.orchestra.run.vm04.stderr:Cannot find device "ceph-brx" 2026-03-10T14:05:30.700 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T14:05:30.700 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:30.700 DEBUG:teuthology.orchestra.run.vm03:> ip netns list 2026-03-10T14:05:30.717 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:30.717 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link delete ceph-brx 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: mds.? 
[v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] up:active 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] up:active 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/741570783' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:30.781 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:30 vm03 ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:30.785 INFO:teuthology.orchestra.run.vm03.stderr:Cannot find device "ceph-brx" 2026-03-10T14:05:30.787 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T14:05:30.787 INFO:tasks.ceph_fuse:Mounting ceph-fuse clients... 
2026-03-10T14:05:30.787 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-10T14:05:30.787 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs ls 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: mds.? [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] up:active 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: mds.? [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] up:active 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/741570783' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:30.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:30 vm04 ceph-mon[55966]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:30.978 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.221+0000 7f81cac10700 1 -- 192.168.123.103:0/804361964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4108780 msgr2=0x7f81c4108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.221+0000 7f81cac10700 1 --2- 192.168.123.103:0/804361964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4108780 0x7f81c4108b50 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7f81b4009b00 tx=0x7f81b4009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.222+0000 7f81cac10700 1 -- 192.168.123.103:0/804361964 shutdown_connections 2026-03-10T14:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.222+0000 7f81cac10700 1 --2- 192.168.123.103:0/804361964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f81c4102780 0x7f81c4102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.222+0000 7f81cac10700 1 --2- 192.168.123.103:0/804361964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4108780 0x7f81c4108b50 unknown :-1 s=CLOSED pgs=260 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.222+0000 7f81cac10700 1 -- 192.168.123.103:0/804361964 >> 192.168.123.103:0/804361964 conn(0x7f81c40fe280 msgr2=0x7f81c4100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.223+0000 7f81cac10700 1 -- 
192.168.123.103:0/804361964 shutdown_connections 2026-03-10T14:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.223+0000 7f81cac10700 1 -- 192.168.123.103:0/804361964 wait complete. 2026-03-10T14:05:31.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.223+0000 7f81cac10700 1 Processor -- start 2026-03-10T14:05:31.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.223+0000 7f81cac10700 1 -- start start 2026-03-10T14:05:31.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81cac10700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f81c4102780 0x7f81c41985f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81cac10700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4198b30 0x7f81c419cf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81cac10700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81c4199140 con 0x7f81c4198b30 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81cac10700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f81c41992b0 con 0x7f81c4102780 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81c3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4198b30 0x7f81c419cf50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81c3fff700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4198b30 0x7f81c419cf50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57686/0 (socket says 192.168.123.103:57686) 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81c3fff700 1 -- 192.168.123.103:0/3608658295 learned_addr learned my addr 192.168.123.103:0/3608658295 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81c3fff700 1 -- 192.168.123.103:0/3608658295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f81c4102780 msgr2=0x7f81c41985f0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81c89ac700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f81c4102780 0x7f81c41985f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81c3fff700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f81c4102780 0x7f81c41985f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.224+0000 7f81c3fff700 1 -- 192.168.123.103:0/3608658295 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f81b40097e0 con 0x7f81c4198b30 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.225+0000 7f81c3fff700 1 --2- 192.168.123.103:0/3608658295 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4198b30 0x7f81c419cf50 secure :-1 s=READY pgs=261 cs=0 l=1 rev1=1 crypto rx=0x7f81b800d900 tx=0x7f81b800dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:31.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.225+0000 7f81c1ffb700 1 -- 192.168.123.103:0/3608658295 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f81b8009af0 con 0x7f81c4198b30 2026-03-10T14:05:31.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.225+0000 7f81c1ffb700 1 -- 192.168.123.103:0/3608658295 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f81b800dea0 con 0x7f81c4198b30 2026-03-10T14:05:31.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.225+0000 7f81c1ffb700 1 -- 192.168.123.103:0/3608658295 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f81b800f660 con 0x7f81c4198b30 2026-03-10T14:05:31.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.225+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f81c419d550 con 0x7f81c4198b30 2026-03-10T14:05:31.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.225+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f81c419daa0 con 0x7f81c4198b30 2026-03-10T14:05:31.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.225+0000 7f81c89ac700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f81c4102780 0x7f81c41985f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:05:31.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.226+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f81c410aca0 con 0x7f81c4198b30 2026-03-10T14:05:31.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.227+0000 7f81c1ffb700 1 -- 192.168.123.103:0/3608658295 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f81b8009c50 con 0x7f81c4198b30 2026-03-10T14:05:31.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.227+0000 7f81c1ffb700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f81ac06c680 0x7f81ac06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:31.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.227+0000 7f81c1ffb700 1 -- 192.168.123.103:0/3608658295 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f81b8020030 con 0x7f81c4198b30 2026-03-10T14:05:31.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.228+0000 7f81c89ac700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f81ac06c680 0x7f81ac06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:31.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.228+0000 7f81c89ac700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f81ac06c680 0x7f81ac06eb30 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f81b4006010 tx=0x7f81b40058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:31.229 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.230+0000 7f81c1ffb700 1 -- 192.168.123.103:0/3608658295 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f81b8055e40 con 0x7f81c4198b30 2026-03-10T14:05:31.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.357+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f81c419dd80 con 0x7f81c4198b30 2026-03-10T14:05:31.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.358+0000 7f81c1ffb700 1 -- 192.168.123.103:0/3608658295 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v12) v1 ==== 53+0+83 (secure 0 0 0) 0x7f81b8059460 con 0x7f81c4198b30 2026-03-10T14:05:31.358 INFO:teuthology.orchestra.run.vm03.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T14:05:31.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.362+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f81ac06c680 msgr2=0x7f81ac06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:31.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.362+0000 7f81cac10700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f81ac06c680 0x7f81ac06eb30 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f81b4006010 tx=0x7f81b40058e0 comp rx=0 tx=0).stop 2026-03-10T14:05:31.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.362+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4198b30 msgr2=0x7f81c419cf50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:31.361 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.362+0000 7f81cac10700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4198b30 0x7f81c419cf50 secure :-1 s=READY pgs=261 cs=0 l=1 rev1=1 crypto rx=0x7f81b800d900 tx=0x7f81b800dcc0 comp rx=0 tx=0).stop 2026-03-10T14:05:31.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.362+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 shutdown_connections 2026-03-10T14:05:31.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.363+0000 7f81cac10700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f81ac06c680 0x7f81ac06eb30 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:31.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.363+0000 7f81cac10700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f81c4102780 0x7f81c41985f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:31.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.363+0000 7f81cac10700 1 --2- 192.168.123.103:0/3608658295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f81c4198b30 0x7f81c419cf50 unknown :-1 s=CLOSED pgs=261 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:31.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.363+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 >> 192.168.123.103:0/3608658295 conn(0x7f81c40fe280 msgr2=0x7f81c40ffde0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:31.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.363+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 shutdown_connections 2026-03-10T14:05:31.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31.363+0000 7f81cac10700 1 -- 192.168.123.103:0/3608658295 
wait complete. 2026-03-10T14:05:31.424 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T14:05:31.425 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-10T14:05:31.425 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm03.local 2026-03-10T14:05:31.425 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-10T14:05:31.425 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T14:05:31.425 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T14:05:31.425 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T14:05:31.425 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T14:05:31.425 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-10T14:05:31.425 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:31.425 DEBUG:teuthology.orchestra.run.vm03:> ip addr 2026-03-10T14:05:31.439 INFO:teuthology.orchestra.run.vm03.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T14:05:31.439 INFO:teuthology.orchestra.run.vm03.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T14:05:31.439 INFO:teuthology.orchestra.run.vm03.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T14:05:31.439 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-10T14:05:31.439 INFO:teuthology.orchestra.run.vm03.stdout: inet6 ::1/128 scope host 2026-03-10T14:05:31.439 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-10T14:05:31.440 INFO:teuthology.orchestra.run.vm03.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T14:05:31.440 INFO:teuthology.orchestra.run.vm03.stdout: link/ether 52:55:00:00:00:03 brd ff:ff:ff:ff:ff:ff 2026-03-10T14:05:31.440 INFO:teuthology.orchestra.run.vm03.stdout: 
altname enp0s3 2026-03-10T14:05:31.440 INFO:teuthology.orchestra.run.vm03.stdout: altname ens3 2026-03-10T14:05:31.440 INFO:teuthology.orchestra.run.vm03.stdout: inet 192.168.123.103/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T14:05:31.440 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft 3164sec preferred_lft 3164sec 2026-03-10T14:05:31.440 INFO:teuthology.orchestra.run.vm03.stdout: inet6 fe80::5055:ff:fe00:3/64 scope link noprefixroute 2026-03-10T14:05:31.440 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-10T14:05:31.440 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T14:05:31.440 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T14:05:31.440 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-10T14:05:31.440 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link add name ceph-brx type bridge 2026-03-10T14:05:31.440 DEBUG:teuthology.orchestra.run.vm03:> sudo ip addr flush dev ceph-brx 2026-03-10T14:05:31.440 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set ceph-brx up 2026-03-10T14:05:31.440 DEBUG:teuthology.orchestra.run.vm03:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T14:05:31.440 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-10T14:05:31.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T14:05:31.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T14:05:31.597 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:31.597 DEBUG:teuthology.orchestra.run.vm03:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T14:05:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:31 vm03 ceph-mon[49718]: pgmap v80: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3.6 KiB/s wr, 14 op/s 2026-03-10T14:05:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:31 vm03 ceph-mon[49718]: pgmap v81: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3.7 KiB/s wr, 14 op/s 2026-03-10T14:05:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:31 vm03 ceph-mon[49718]: pgmap v82: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.5 KiB/s rd, 4.3 KiB/s wr, 16 op/s 2026-03-10T14:05:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:31 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] up:standby-replay 2026-03-10T14:05:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:31 vm03 ceph-mon[49718]: mds.? [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] up:standby-replay 2026-03-10T14:05:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:31 vm03 ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:05:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:31 vm03 ceph-mon[49718]: from='client.? 192.168.123.103:0/4144809186' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T14:05:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:31 vm03 ceph-mon[49718]: from='client.? 
192.168.123.103:0/3608658295' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T14:05:31.650 INFO:teuthology.orchestra.run.vm03.stdout:1 2026-03-10T14:05:31.650 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:31.650 DEBUG:teuthology.orchestra.run.vm03:> ip r 2026-03-10T14:05:31.706 INFO:teuthology.orchestra.run.vm03.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.103 metric 100 2026-03-10T14:05:31.706 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.103 metric 100 2026-03-10T14:05:31.707 INFO:teuthology.orchestra.run.vm03.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T14:05:31.707 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T14:05:31.707 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-10T14:05:31.707 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T14:05:31.707 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T14:05:31.707 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T14:05:31.707 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-10T14:05:31.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T14:05:31.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:31 vm04 ceph-mon[55966]: pgmap v80: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3.6 KiB/s wr, 14 op/s 2026-03-10T14:05:31.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:31 vm04 ceph-mon[55966]: pgmap v81: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3.7 KiB/s wr, 14 op/s 
2026-03-10T14:05:31.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:31 vm04 ceph-mon[55966]: pgmap v82: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.5 KiB/s rd, 4.3 KiB/s wr, 16 op/s 2026-03-10T14:05:31.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:31 vm04 ceph-mon[55966]: mds.? [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] up:standby-replay 2026-03-10T14:05:31.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:31 vm04 ceph-mon[55966]: mds.? [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] up:standby-replay 2026-03-10T14:05:31.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:31 vm04 ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:05:31.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:31 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/4144809186' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T14:05:31.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:31 vm04 ceph-mon[55966]: from='client.? 192.168.123.103:0/3608658295' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T14:05:31.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:31 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T14:05:31.853 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:31.853 DEBUG:teuthology.orchestra.run.vm03:> ip netns list 2026-03-10T14:05:31.915 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:31.915 DEBUG:teuthology.orchestra.run.vm03:> ip netns list-id 2026-03-10T14:05:31.974 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T14:05:31.974 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-10T14:05:31.974 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T14:05:31.974 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-10T14:05:31.974 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-10T14:05:32.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:32 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T14:05:32.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:32 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T14:05:32.086 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-10T14:05:32.086 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T14:05:32.086 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-10T14:05:32.086 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-10T14:05:32.086 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T14:05:32.086 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-10T14:05:32.086 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-10T14:05:32.087 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-10T14:05:32.087 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-10T14:05:32.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:32 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T14:05:32.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:32 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T14:05:32.235 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-10T14:05:32.236 DEBUG:teuthology.orchestra.run.vm03:> set -e
2026-03-10T14:05:32.236 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set brx.0 up
2026-03-10T14:05:32.236 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set dev brx.0 master ceph-brx
2026-03-10T14:05:32.236 DEBUG:teuthology.orchestra.run.vm03:> ')
2026-03-10T14:05:32.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:32 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-10T14:05:32.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:32 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-10T14:05:32.344 INFO:tasks.cephfs.fuse_mount:Client client.0 config is {}
2026-03-10T14:05:32.344 INFO:teuthology.orchestra.run:Running command with timeout 60
2026-03-10T14:05:32.344 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -v /home/ubuntu/cephtest/mnt.0
2026-03-10T14:05:32.404 INFO:teuthology.orchestra.run.vm03.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0'
2026-03-10T14:05:32.404 INFO:teuthology.orchestra.run:Running command with timeout 60
2026-03-10T14:05:32.404 DEBUG:teuthology.orchestra.run.vm03:> chmod 0000 /home/ubuntu/cephtest/mnt.0
2026-03-10T14:05:32.462 DEBUG:teuthology.orchestra.run.vm03:> sudo modprobe fuse
2026-03-10T14:05:32.529 DEBUG:teuthology.orchestra.run.vm03:> cat /proc/self/mounts | awk '{print $2}'
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/proc
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/dev
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/security
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/dev/shm
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/dev/pts
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/cgroup
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/pstore
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/bpf
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/config
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/nfs/rpc_pipefs
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/selinux
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/proc/sys/fs/binfmt_misc
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/dev/hugepages
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/dev/mqueue
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/debug
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/tracing
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-sysctl.service
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/fuse/connections
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-sysusers.service
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-tmpfiles-setup.service
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/user/1000
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/950bbe4b784535a88e623c668fc55cacc712b6363cb28b3e2633a55b09f8bd07/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/25fe7e320eb45c3a0f44e9f2a98e94d19d6d5fb847439d1cf5e74eeb8ee43448/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/user/0
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/proc/sys/fs/binfmt_misc
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/de1478d67532ac550c6223a711f421a717131b285a3e802036d00162da0d2a78/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/26ea6cd1072f95103dae444f037d3f8a910f574601ce02c8ec93e94854859436/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/0bc911bba918342865f59e27c6b801b3b503f76a540879b670a74c73a0b67546/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/694122104dae69d66088bc02ca576f0baf55f408e433a1f792e7ed6c507569ab/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/db0d46ae2e48ff24ae4a937ff3c92e5c444195adf32d984f07239e1cdf1be539/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/455d6f1b993f78844ade87b30f011ecb52ff488867a09825b4be67816318f79e/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/396853c69066ad3a1491e7bb8f2f8ce19b797902a8dd57d5a9eaaeb967b3a629/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/c87272cdf873aef2edf469aa045a1d0555087ceb15f190a7c0186ed384e11ea1/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/26c3fd7d9193f86a1f32785f7989b8d86be2ed83c9a8483b10abc9efb0b76551/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/3800d6ec288a862a41c792c242b94d974abd3bde6aeacd62ead193b4281b8ab3/merged
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0
2026-03-10T14:05:32.586 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0
2026-03-10T14:05:32.587 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-10T14:05:32.587 DEBUG:teuthology.orchestra.run.vm03:> ls /sys/fs/fuse/connections
2026-03-10T14:05:32.642 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: []
2026-03-10T14:05:32.642 DEBUG:teuthology.orchestra.run.vm03:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.0 --id 0)
2026-03-10T14:05:32.685 DEBUG:teuthology.orchestra.run.vm03:> sudo modprobe fuse
2026-03-10T14:05:32.709 DEBUG:teuthology.orchestra.run.vm03:> cat /proc/self/mounts | awk '{print $2}'
2026-03-10T14:05:32.757 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm03.stderr:ceph-fuse[91444]: starting ceph client
2026-03-10T14:05:32.757 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm03.stderr:2026-03-10T14:05:32.758+0000 7fc019bc8480 -1 init, newargv = 0x56264fe8f4e0 newargc=15
2026-03-10T14:05:32.769 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm03.stderr:ceph-fuse[91444]: starting fuse
2026-03-10T14:05:32.779 INFO:teuthology.orchestra.run.vm03.stdout:/proc
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/dev
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/security
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/dev/shm
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/dev/pts
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/cgroup
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/pstore
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/bpf
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/config
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/nfs/rpc_pipefs
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/selinux
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/proc/sys/fs/binfmt_misc
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/dev/hugepages
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/dev/mqueue
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/debug
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/kernel/tracing
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-sysctl.service
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/sys/fs/fuse/connections
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-sysusers.service
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/credentials/systemd-tmpfiles-setup.service
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/user/1000
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/950bbe4b784535a88e623c668fc55cacc712b6363cb28b3e2633a55b09f8bd07/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/25fe7e320eb45c3a0f44e9f2a98e94d19d6d5fb847439d1cf5e74eeb8ee43448/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/user/0
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/proc/sys/fs/binfmt_misc
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/de1478d67532ac550c6223a711f421a717131b285a3e802036d00162da0d2a78/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/26ea6cd1072f95103dae444f037d3f8a910f574601ce02c8ec93e94854859436/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/0bc911bba918342865f59e27c6b801b3b503f76a540879b670a74c73a0b67546/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/694122104dae69d66088bc02ca576f0baf55f408e433a1f792e7ed6c507569ab/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/db0d46ae2e48ff24ae4a937ff3c92e5c444195adf32d984f07239e1cdf1be539/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/455d6f1b993f78844ade87b30f011ecb52ff488867a09825b4be67816318f79e/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/396853c69066ad3a1491e7bb8f2f8ce19b797902a8dd57d5a9eaaeb967b3a629/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/c87272cdf873aef2edf469aa045a1d0555087ceb15f190a7c0186ed384e11ea1/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/26c3fd7d9193f86a1f32785f7989b8d86be2ed83c9a8483b10abc9efb0b76551/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/var/lib/containers/storage/overlay/3800d6ec288a862a41c792c242b94d974abd3bde6aeacd62ead193b4281b8ab3/merged
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0
2026-03-10T14:05:32.780 INFO:teuthology.orchestra.run.vm03.stdout:/home/ubuntu/cephtest/mnt.0
2026-03-10T14:05:32.781 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-10T14:05:32.781 DEBUG:teuthology.orchestra.run.vm03:> ls /sys/fs/fuse/connections
2026-03-10T14:05:32.839 INFO:teuthology.orchestra.run.vm03.stdout:59
2026-03-10T14:05:32.839 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [59]
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:> sudo stdin-killer -- python3 -c '
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:> import glob
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:> import re
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:> import os
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:> import subprocess
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:> def _find_admin_socket(client_name):
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>     asok_path = "/var/run/ceph/ceph-client.0.*.asok"
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>     files = glob.glob(asok_path)
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>     mountpoint = "/home/ubuntu/cephtest/mnt.0"
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>     # Given a non-glob path, it better be there
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>     if "*" not in asok_path:
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>         assert(len(files) == 1)
2026-03-10T14:05:32.839 DEBUG:teuthology.orchestra.run.vm03:>         return files[0]
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>     for f in files:
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>         pid = re.match(".*\.(\d+)\.asok$", f).group(1)
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>         if os.path.exists("/proc/{0}".format(pid)):
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>             with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f:
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>                 contents = proc_f.read()
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>                 if mountpoint in contents:
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>                     return f
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>     raise RuntimeError("Client socket {0} not found".format(client_name))
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:>
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:> print(_find_admin_socket("client.0"))
2026-03-10T14:05:32.840 DEBUG:teuthology.orchestra.run.vm03:> '
2026-03-10T14:05:32.941 INFO:teuthology.orchestra.run.vm03.stdout:/var/run/ceph/ceph-client.0.91444.asok
2026-03-10T14:05:32.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:32 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-10T14:05:32.950 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.0.91444.asok
2026-03-10T14:05:32.950 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-10T14:05:32.950 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.0.91444.asok status
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "metadata": {
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph_sha1": "5dd24139a1eada541a3bc16b6941c5dde975e26d",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "entity_id": "0",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "hostname": "vm03.local",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "mount_point": "/home/ubuntu/cephtest/mnt.0",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "pid": "91444",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "root": "/"
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "dentry_count": 0,
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "dentry_pinned_count": 0,
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "id": 14514,
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "inst": {
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "name": {
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:            "type": "client",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:            "num": 14514
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        },
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "addr": {
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:            "type": "v1",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:            "addr": "192.168.144.1:0",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:            "nonce": 4117503891
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        }
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "addr": {
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "type": "v1",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "addr": "192.168.144.1:0",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:        "nonce": 4117503891
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "inst_str": "client.14514 192.168.144.1:0/4117503891",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "addr_str": "192.168.144.1:0/4117503891",
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "inode_count": 1,
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "mds_epoch": 12,
2026-03-10T14:05:33.065 INFO:teuthology.orchestra.run.vm03.stdout:    "osd_epoch": 38,
2026-03-10T14:05:33.066 INFO:teuthology.orchestra.run.vm03.stdout:    "osd_epoch_barrier": 0,
2026-03-10T14:05:33.066 INFO:teuthology.orchestra.run.vm03.stdout:    "blocklisted": false,
2026-03-10T14:05:33.066 INFO:teuthology.orchestra.run.vm03.stdout:    "fs_name": "cephfs"
2026-03-10T14:05:33.066 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-10T14:05:33.072 DEBUG:tasks.ceph_fuse:passing mntargs=[]
2026-03-10T14:05:33.072 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs ls
2026-03-10T14:05:33.235 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.512+0000 7f3dc9439700 1 -- 192.168.123.103:0/728854241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4108790 msgr2=0x7f3dc4108b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.512+0000 7f3dc9439700 1 --2- 192.168.123.103:0/728854241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4108790 0x7f3dc4108b60 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7f3dac009b00 tx=0x7f3dac009e10 comp rx=0 tx=0).stop
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.513+0000 7f3dc9439700 1 -- 192.168.123.103:0/728854241 shutdown_connections
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.513+0000 7f3dc9439700 1 --2- 192.168.123.103:0/728854241 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dc4102790 0x7f3dc4102c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.513+0000 7f3dc9439700 1 --2- 192.168.123.103:0/728854241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4108790 0x7f3dc4108b60 unknown :-1 s=CLOSED pgs=264 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.513+0000 7f3dc9439700 1 -- 192.168.123.103:0/728854241 >> 192.168.123.103:0/728854241 conn(0x7f3dc40fe2b0 msgr2=0x7f3dc41006c0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.513+0000 7f3dc9439700 1 -- 192.168.123.103:0/728854241 shutdown_connections
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.513+0000 7f3dc9439700 1 -- 192.168.123.103:0/728854241 wait complete.
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.514+0000 7f3dc9439700 1 Processor -- start
2026-03-10T14:05:33.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.514+0000 7f3dc9439700 1 -- start start
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.514+0000 7f3dc9439700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4102790 0x7f3dc41983c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.514+0000 7f3dc9439700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dc4108790 0x7f3dc4198900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.514+0000 7f3dc9439700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dc4198fe0 con 0x7f3dc4102790
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.514+0000 7f3dc9439700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dc419cd70 con 0x7f3dc4108790
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dc2ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4102790 0x7f3dc41983c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dc27fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dc4108790 0x7f3dc4198900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dc2ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4102790 0x7f3dc41983c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57698/0 (socket says 192.168.123.103:57698)
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dc2ffd700 1 -- 192.168.123.103:0/199309586 learned_addr learned my addr 192.168.123.103:0/199309586 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dc2ffd700 1 -- 192.168.123.103:0/199309586 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dc4108790 msgr2=0x7f3dc4198900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:33.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dc2ffd700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dc4108790 0x7f3dc4198900 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:33.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dc2ffd700 1 -- 192.168.123.103:0/199309586 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3dac0097e0 con 0x7f3dc4102790
2026-03-10T14:05:33.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dc2ffd700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4102790 0x7f3dc41983c0 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7f3dac009fd0 tx=0x7f3dac004c90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:33.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.515+0000 7f3dbbfff700 1 -- 192.168.123.103:0/199309586 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dac01d070 con 0x7f3dc4102790
2026-03-10T14:05:33.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.516+0000 7f3dbbfff700 1 -- 192.168.123.103:0/199309586 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3dac004500 con 0x7f3dc4102790
2026-03-10T14:05:33.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.516+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3dc419cff0 con 0x7f3dc4102790
2026-03-10T14:05:33.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.516+0000 7f3dbbfff700 1 -- 192.168.123.103:0/199309586 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dac003d70 con 0x7f3dc4102790
2026-03-10T14:05:33.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.516+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3dc419d480 con 0x7f3dc4102790
2026-03-10T14:05:33.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.517+0000 7f3dbbfff700 1 -- 192.168.123.103:0/199309586 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f3dac022470 con 0x7f3dc4102790
2026-03-10T14:05:33.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.517+0000 7f3dbbfff700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3db006c6d0 0x7f3db006eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:05:33.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.517+0000 7f3dbbfff700 1 -- 192.168.123.103:0/199309586 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f3dac08e3f0 con 0x7f3dc4102790
2026-03-10T14:05:33.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.518+0000 7f3dc27fc700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3db006c6d0 0x7f3db006eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:05:33.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.518+0000 7f3dc27fc700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3db006c6d0 0x7f3db006eb80 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f3dc41999e0 tx=0x7f3db4009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:05:33.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.518+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3dc404ea50 con 0x7f3dc4102790
2026-03-10T14:05:33.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.521+0000 7f3dbbfff700 1 -- 192.168.123.103:0/199309586 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3dac058af0 con 0x7f3dc4102790
2026-03-10T14:05:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:33 vm04 ceph-mon[55966]: pgmap v83: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.9 KiB/s rd, 3.6 KiB/s wr, 14 op/s
2026-03-10T14:05:33.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:33 vm03.local ceph-mon[49718]: pgmap v83: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.9 KiB/s rd, 3.6 KiB/s wr, 14 op/s
2026-03-10T14:05:33.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.646+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f3dc4066e40 con 0x7f3dc4102790
2026-03-10T14:05:33.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.647+0000 7f3dbbfff700 1 -- 192.168.123.103:0/199309586 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v12) v1 ==== 53+0+83 (secure 0 0 0) 0x7f3dac05c110 con 0x7f3dc4102790
2026-03-10T14:05:33.646 INFO:teuthology.orchestra.run.vm03.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ]
2026-03-10T14:05:33.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.649+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3db006c6d0 msgr2=0x7f3db006eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:33.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.649+0000 7f3dc9439700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3db006c6d0 0x7f3db006eb80 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f3dc41999e0 tx=0x7f3db4009380 comp rx=0 tx=0).stop
2026-03-10T14:05:33.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.649+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4102790 msgr2=0x7f3dc41983c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:05:33.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.649+0000 7f3dc9439700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4102790 0x7f3dc41983c0 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7f3dac009fd0 tx=0x7f3dac004c90 comp rx=0 tx=0).stop
2026-03-10T14:05:33.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.650+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 shutdown_connections
2026-03-10T14:05:33.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.650+0000 7f3dc9439700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dc4102790 0x7f3dc41983c0 unknown :-1 s=CLOSED pgs=265 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:33.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.650+0000 7f3dc9439700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3db006c6d0 0x7f3db006eb80 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:33.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.650+0000 7f3dc9439700 1 --2- 192.168.123.103:0/199309586 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3dc4108790 0x7f3dc4198900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:05:33.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.650+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 >> 192.168.123.103:0/199309586 conn(0x7f3dc40fe2b0 msgr2=0x7f3dc40ffad0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:05:33.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.650+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 shutdown_connections
2026-03-10T14:05:33.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:33.650+0000 7f3dc9439700 1 -- 192.168.123.103:0/199309586 wait complete.
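Editor's note: the `ceph fs ls` step above confirmed the filesystem via a single plain-format stdout line. A small helper for pulling the pool names out of that line is sketched below; the pattern is inferred from the one line visible in this log, and `ceph fs ls --format json` is the robust alternative when machine-readable output is actually needed.

```python
import re

def parse_fs_ls_line(line):
    """Parse one line of plain-format `ceph fs ls` output.

    Expected shape (taken from the log):
    name: <fs>, metadata pool: <pool>, data pools: [<pool> ...]
    """
    m = re.match(r"name: (?P<name>[^,]+), metadata pool: (?P<meta>[^,]+), "
                 r"data pools: \[(?P<data>[^\]]*)\]", line)
    if m is None:
        raise ValueError("unrecognized fs ls line: %r" % line)
    return {
        "name": m.group("name"),
        "metadata_pool": m.group("meta"),
        # split() tolerates the trailing space ceph prints inside the brackets
        "data_pools": m.group("data").split(),
    }
```

Applied to the line above it yields `{'name': 'cephfs', 'metadata_pool': 'cephfs.cephfs.meta', 'data_pools': ['cephfs.cephfs.data']}`.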
2026-03-10T14:05:33.717 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T14:05:33.718 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-10T14:05:33.718 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm04.local 2026-03-10T14:05:33.718 INFO:tasks.cephfs.mount:self.client.name = client.1 2026-03-10T14:05:33.718 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:33.718 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T14:05:33.718 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T14:05:33.718 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T14:05:33.718 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1' 2026-03-10T14:05:33.718 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:33.718 DEBUG:teuthology.orchestra.run.vm04:> ip addr 2026-03-10T14:05:33.734 INFO:teuthology.orchestra.run.vm04.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T14:05:33.734 INFO:teuthology.orchestra.run.vm04.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: valid_lft forever preferred_lft forever 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: inet6 ::1/128 scope host 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: valid_lft forever preferred_lft forever 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: link/ether 52:55:00:00:00:04 brd ff:ff:ff:ff:ff:ff 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: altname enp0s3 
2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: altname ens3 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: inet 192.168.123.104/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: valid_lft 3131sec preferred_lft 3131sec 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: inet6 fe80::5055:ff:fe00:4/64 scope link noprefixroute 2026-03-10T14:05:33.735 INFO:teuthology.orchestra.run.vm04.stdout: valid_lft forever preferred_lft forever 2026-03-10T14:05:33.735 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T14:05:33.735 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T14:05:33.735 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T14:05:33.735 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link add name ceph-brx type bridge 2026-03-10T14:05:33.735 DEBUG:teuthology.orchestra.run.vm04:> sudo ip addr flush dev ceph-brx 2026-03-10T14:05:33.735 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link set ceph-brx up 2026-03-10T14:05:33.735 DEBUG:teuthology.orchestra.run.vm04:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T14:05:33.735 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T14:05:33.811 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:33 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T14:05:33.883 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:33 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
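Editor's note: the /20 assigned to ceph-brx above ties together several addresses seen elsewhere in this run: the `brd 192.168.159.255` argument, the `192.168.144.0/20` kernel route that later appears in `ip r`, and the fuse client's reported address 192.168.144.1 on vm03. A quick sanity check with Python's stdlib `ipaddress` module:

```python
import ipaddress

# The address the mount task assigns to the ceph-brx bridge (from the log).
brx = ipaddress.ip_interface("192.168.159.254/20")

# Its enclosing network and broadcast address; these should match the
# `192.168.144.0/20` route and the `brd 192.168.159.255` argument.
network = brx.network
broadcast = network.broadcast_address

# The ceph-fuse client on vm03 reported addr 192.168.144.1:0, i.e. a host
# inside the same bridge subnet.
client_addr = ipaddress.ip_address("192.168.144.1")
```

This kind of check is handy when a mount in a private netns cannot reach the monitors: if the client address is outside the bridge network, the NAT rules added below cannot match it.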
2026-03-10T14:05:33.888 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-10T14:05:33.888 DEBUG:teuthology.orchestra.run.vm04:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward
2026-03-10T14:05:33.961 INFO:teuthology.orchestra.run.vm04.stdout:1
2026-03-10T14:05:33.963 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-10T14:05:33.963 DEBUG:teuthology.orchestra.run.vm04:> ip r
2026-03-10T14:05:34.021 INFO:teuthology.orchestra.run.vm04.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.104 metric 100
2026-03-10T14:05:34.021 INFO:teuthology.orchestra.run.vm04.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.104 metric 100
2026-03-10T14:05:34.021 INFO:teuthology.orchestra.run.vm04.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254
2026-03-10T14:05:34.022 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-10T14:05:34.022 DEBUG:teuthology.orchestra.run.vm04:> set -e
2026-03-10T14:05:34.022 DEBUG:teuthology.orchestra.run.vm04:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT
2026-03-10T14:05:34.022 DEBUG:teuthology.orchestra.run.vm04:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT
2026-03-10T14:05:34.022 DEBUG:teuthology.orchestra.run.vm04:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE
2026-03-10T14:05:34.022 DEBUG:teuthology.orchestra.run.vm04:> ')
2026-03-10T14:05:34.101 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:34 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-10T14:05:34.168 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:34 stdin-killer INFO: command exited with status 0: exiting normally with same code!
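[editor's note] The bridge address and the connected route in the `ip r` output above are consistent: a /20 containing 192.168.159.254 starts at 192.168.144.0. A minimal sketch using Python's `ipaddress` module to check that arithmetic (addresses are taken from the log; nothing here touches the network):

```python
import ipaddress

# Address the harness put on ceph-brx ("ip addr add 192.168.159.254/20 ...").
brx = ipaddress.ip_interface("192.168.159.254/20")

# The containing network is the connected route the kernel installs,
# matching "192.168.144.0/20 dev ceph-brx ... src 192.168.159.254" above.
print(brx.network)                    # 192.168.144.0/20

# The broadcast address matches the "brd 192.168.159.255" argument.
print(brx.network.broadcast_address)  # 192.168.159.255
```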
2026-03-10T14:05:34.173 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:34.173 DEBUG:teuthology.orchestra.run.vm04:> ip netns list 2026-03-10T14:05:34.229 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:34.229 DEBUG:teuthology.orchestra.run.vm04:> ip netns list-id 2026-03-10T14:05:34.284 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T14:05:34.284 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T14:05:34.284 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T14:05:34.284 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0 2026-03-10T14:05:34.284 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T14:05:34.359 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:34 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T14:05:34.385 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:34 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T14:05:34.389 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20 2026-03-10T14:05:34.389 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T14:05:34.389 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T14:05:34.389 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0 2026-03-10T14:05:34.389 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T14:05:34.389 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up 2026-03-10T14:05:34.389 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up 2026-03-10T14:05:34.389 DEBUG:teuthology.orchestra.run.vm04:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254 2026-03-10T14:05:34.389 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T14:05:34.465 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:34 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T14:05:34.527 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:34 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T14:05:34.531 DEBUG:teuthology.orchestra.run.vm04:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T14:05:34.531 DEBUG:teuthology.orchestra.run.vm04:> set -e 2026-03-10T14:05:34.531 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link set brx.0 up 2026-03-10T14:05:34.531 DEBUG:teuthology.orchestra.run.vm04:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T14:05:34.531 DEBUG:teuthology.orchestra.run.vm04:> ') 2026-03-10T14:05:34.605 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:34 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T14:05:34.622 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:34 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/199309586' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T14:05:34.622 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:34 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:34.628 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:34 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
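[editor's note] The namespace name `ceph-ns--home-ubuntu-cephtest-mnt.1` created above is consistent with deriving it from the mount path by replacing every "/" with "-" under a fixed prefix. A hypothetical helper sketching that mapping (`netns_name` is not a teuthology function, only an illustration of the pattern visible in the log):

```python
def netns_name(mountpoint: str) -> str:
    # Illustrative only: the logged name matches "ceph-ns-" plus the
    # mount path with each "/" turned into "-".
    return "ceph-ns-" + mountpoint.replace("/", "-")

print(netns_name("/home/ubuntu/cephtest/mnt.1"))
# ceph-ns--home-ubuntu-cephtest-mnt.1
```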
2026-03-10T14:05:34.644 INFO:tasks.cephfs.fuse_mount:Client client.1 config is {} 2026-03-10T14:05:34.644 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T14:05:34.644 DEBUG:teuthology.orchestra.run.vm04:> mkdir -p -v /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:34.698 INFO:teuthology.orchestra.run.vm04.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1' 2026-03-10T14:05:34.698 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T14:05:34.698 DEBUG:teuthology.orchestra.run.vm04:> chmod 0000 /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:34.752 DEBUG:teuthology.orchestra.run.vm04:> sudo modprobe fuse 2026-03-10T14:05:34.818 DEBUG:teuthology.orchestra.run.vm04:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T14:05:34.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:34 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/199309586' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T14:05:34.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:34 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/proc 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/dev 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/security 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/dev/shm 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/dev/pts 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/run 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/cgroup 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/pstore 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/bpf 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/config 
2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/ 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/selinux 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/dev/hugepages 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/debug 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/dev/mqueue 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/tracing 2026-03-10T14:05:34.873 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/fuse/connections 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/run/user/1000 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/run/user/0 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/dca654992eb44bf029fb999a696c780e942cf2585099369b704615dc253ad353/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/ed40113b1f88ca30a73c7a40c79fbe87c511b151dfe80da1b3b42ab5e09581a5/merged 2026-03-10T14:05:34.874 
INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/71c6181bed3457b7a8cef81f116d182f596817d354b9ac050f1ef49c13f95cd3/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/1d92e3618214306aa4b575d57dc9df33fdcb412486c8ada23ae451353b4a6019/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/ccb5a97605c21311efe54e64489e44bbfaf754ee7df710e3425150b67e39ee7e/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/20c3c87832424b80a2ae7d32a095eb09f043cff76a38d029004139ba9e6e4d9f/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/3beafd8d94b2fc6861e2ba0932883814b13a77198a5730df9d7f07e2fe8e1732/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/eb38c6f3d57f886325cb1281d63bd0857441777e45abcd052f007fef8c484b41/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/5c6ab39d12143c94090fdbde9b25c1cebeea3f7c75786089996d645bb5079c0f/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/6bf47f882850e4d3b9a9b9683f167b422ba316bd85f209c1d210facdc106a349/merged 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T14:05:34.874 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:34.874 DEBUG:teuthology.orchestra.run.vm04:> ls /sys/fs/fuse/connections 2026-03-10T14:05:34.932 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T14:05:34.932 DEBUG:teuthology.orchestra.run.vm04:> (cd 
/home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.1 --id 1) 2026-03-10T14:05:34.973 DEBUG:teuthology.orchestra.run.vm04:> sudo modprobe fuse 2026-03-10T14:05:35.003 DEBUG:teuthology.orchestra.run.vm04:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T14:05:35.046 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm04.stderr:2026-03-10T14:05:35.046+0000 7f8f479cf480 -1 init, newargv = 0x55d755d09c60 newargc=15 2026-03-10T14:05:35.046 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm04.stderr:ceph-fuse[78367]: starting ceph client 2026-03-10T14:05:35.057 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm04.stderr:ceph-fuse[78367]: starting fuse 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/proc 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/sys 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/dev 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/security 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/dev/shm 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/dev/pts 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/run 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/cgroup 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/pstore 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/bpf 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/config 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/ 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T14:05:35.067 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/selinux 2026-03-10T14:05:35.068 
INFO:teuthology.orchestra.run.vm04.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/dev/hugepages 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/debug 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/dev/mqueue 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/sys/kernel/tracing 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/sys/fs/fuse/connections 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/user/1000 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/user/0 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/dca654992eb44bf029fb999a696c780e942cf2585099369b704615dc253ad353/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/ed40113b1f88ca30a73c7a40c79fbe87c511b151dfe80da1b3b42ab5e09581a5/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/71c6181bed3457b7a8cef81f116d182f596817d354b9ac050f1ef49c13f95cd3/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/1d92e3618214306aa4b575d57dc9df33fdcb412486c8ada23ae451353b4a6019/merged 
2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/ccb5a97605c21311efe54e64489e44bbfaf754ee7df710e3425150b67e39ee7e/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/20c3c87832424b80a2ae7d32a095eb09f043cff76a38d029004139ba9e6e4d9f/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/3beafd8d94b2fc6861e2ba0932883814b13a77198a5730df9d7f07e2fe8e1732/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/eb38c6f3d57f886325cb1281d63bd0857441777e45abcd052f007fef8c484b41/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/5c6ab39d12143c94090fdbde9b25c1cebeea3f7c75786089996d645bb5079c0f/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/var/lib/containers/storage/overlay/6bf47f882850e4d3b9a9b9683f167b422ba316bd85f209c1d210facdc106a349/merged 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run.vm04.stdout:/home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:35.068 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:35.068 DEBUG:teuthology.orchestra.run.vm04:> ls /sys/fs/fuse/connections 2026-03-10T14:05:35.126 INFO:teuthology.orchestra.run.vm04.stdout:90 2026-03-10T14:05:35.126 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [90] 2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> sudo stdin-killer -- python3 -c ' 2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> import glob 2026-03-10T14:05:35.126 
DEBUG:teuthology.orchestra.run.vm04:> import re
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> import os
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> import subprocess
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:>
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> def _find_admin_socket(client_name):
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> asok_path = "/var/run/ceph/ceph-client.1.*.asok"
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> files = glob.glob(asok_path)
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> mountpoint = "/home/ubuntu/cephtest/mnt.1"
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:>
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> # Given a non-glob path, it better be there
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> if "*" not in asok_path:
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> assert(len(files) == 1)
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> return files[0]
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:>
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> for f in files:
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> pid = re.match(".*\.(\d+)\.asok$", f).group(1)
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> if os.path.exists("/proc/{0}".format(pid)):
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f:
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> contents = proc_f.read()
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> if mountpoint in contents:
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> return f
2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-10T14:05:35.126
DEBUG:teuthology.orchestra.run.vm04:> 2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> print(_find_admin_socket("client.1")) 2026-03-10T14:05:35.126 DEBUG:teuthology.orchestra.run.vm04:> ' 2026-03-10T14:05:35.224 INFO:teuthology.orchestra.run.vm04.stdout:/var/run/ceph/ceph-client.1.78367.asok 2026-03-10T14:05:35.226 INFO:teuthology.orchestra.run.vm04.stderr:2026-03-10T14:05:35 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T14:05:35.231 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.1.78367.asok 2026-03-10T14:05:35.231 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:35.231 DEBUG:teuthology.orchestra.run.vm04:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.1.78367.asok status 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout:{ 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "metadata": { 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "ceph_sha1": "5dd24139a1eada541a3bc16b6941c5dde975e26d", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "entity_id": "1", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "hostname": "vm04.local", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.1", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "pid": "78367", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "root": "/" 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "dentry_count": 0, 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "dentry_pinned_count": 0, 2026-03-10T14:05:35.342 
INFO:teuthology.orchestra.run.vm04.stdout: "id": 24297, 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "inst": { 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "name": { 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "type": "client", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "num": 24297 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "addr": { 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "type": "v1", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "addr": "192.168.144.1:0", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "nonce": 448103355 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: } 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "addr": { 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "type": "v1", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "addr": "192.168.144.1:0", 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "nonce": 448103355 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: }, 2026-03-10T14:05:35.342 INFO:teuthology.orchestra.run.vm04.stdout: "inst_str": "client.24297 192.168.144.1:0/448103355", 2026-03-10T14:05:35.343 INFO:teuthology.orchestra.run.vm04.stdout: "addr_str": "192.168.144.1:0/448103355", 2026-03-10T14:05:35.343 INFO:teuthology.orchestra.run.vm04.stdout: "inode_count": 1, 2026-03-10T14:05:35.343 INFO:teuthology.orchestra.run.vm04.stdout: "mds_epoch": 12, 2026-03-10T14:05:35.343 INFO:teuthology.orchestra.run.vm04.stdout: "osd_epoch": 38, 2026-03-10T14:05:35.343 INFO:teuthology.orchestra.run.vm04.stdout: "osd_epoch_barrier": 0, 2026-03-10T14:05:35.343 INFO:teuthology.orchestra.run.vm04.stdout: "blocklisted": false, 
2026-03-10T14:05:35.343 INFO:teuthology.orchestra.run.vm04.stdout: "fs_name": "cephfs" 2026-03-10T14:05:35.343 INFO:teuthology.orchestra.run.vm04.stdout:} 2026-03-10T14:05:35.349 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:35.349 DEBUG:teuthology.orchestra.run.vm03:> stat --file-system '--printf=%T 2026-03-10T14:05:35.349 DEBUG:teuthology.orchestra.run.vm03:> ' -- /home/ubuntu/cephtest/mnt.0 2026-03-10T14:05:35.370 INFO:teuthology.orchestra.run.vm03.stdout:fuseblk 2026-03-10T14:05:35.371 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.0 2026-03-10T14:05:35.371 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:35.371 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-10T14:05:35.446 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:35.446 DEBUG:teuthology.orchestra.run.vm04:> stat --file-system '--printf=%T 2026-03-10T14:05:35.446 DEBUG:teuthology.orchestra.run.vm04:> ' -- /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:35.464 INFO:teuthology.orchestra.run.vm04.stdout:fuseblk 2026-03-10T14:05:35.464 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:35.464 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:05:35.464 DEBUG:teuthology.orchestra.run.vm04:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:35.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:35 vm04.local ceph-mon[55966]: pgmap v84: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.7 KiB/s rd, 2.9 KiB/s wr, 16 op/s 2026-03-10T14:05:35.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:35 vm03.local ceph-mon[49718]: pgmap v84: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.7 KiB/s rd, 2.9 KiB/s wr, 16 op/s 2026-03-10T14:05:37.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
14:05:37 vm04.local ceph-mon[55966]: pgmap v85: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 1.2 KiB/s wr, 4 op/s 2026-03-10T14:05:37.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:37 vm03.local ceph-mon[49718]: pgmap v85: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 1.2 KiB/s wr, 4 op/s 2026-03-10T14:05:39.325 INFO:teuthology.run_tasks:Running task print... 2026-03-10T14:05:39.328 INFO:teuthology.task.print:**** done client 2026-03-10T14:05:39.328 INFO:teuthology.run_tasks:Running task parallel... 2026-03-10T14:05:39.331 INFO:teuthology.task.parallel:starting parallel... 2026-03-10T14:05:39.331 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T14:05:39.331 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-10T14:05:39.331 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:39.331 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs false || true' 2026-03-10T14:05:39.331 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T14:05:39.331 INFO:teuthology.task.sequential:In sequential, running task workunit... 2026-03-10T14:05:39.333 INFO:tasks.workunit:Pulling workunits from ref 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T14:05:39.333 INFO:tasks.workunit:Making a separate scratch dir for every client... 
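[editor's note] The admin-socket discovery script logged earlier globs `/var/run/ceph/ceph-client.1.*.asok` and extracts the PID from each filename with `re.match(".*\.(\d+)\.asok$", f)`. A raw-string sketch of that extraction step (the non-raw pattern works, but on newer Python releases it draws an invalid-escape warning; `pid_from_asok` is an illustrative name, not a teuthology helper):

```python
import re

def pid_from_asok(path: str) -> str:
    # Pull the PID embedded in an admin-socket filename such as
    # /var/run/ceph/ceph-client.1.78367.asok.
    m = re.match(r".*\.(\d+)\.asok$", path)
    if m is None:
        raise RuntimeError("not an admin-socket path: " + path)
    return m.group(1)

print(pid_from_asok("/var/run/ceph/ceph-client.1.78367.asok"))  # 78367
```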
2026-03-10T14:05:39.333 INFO:tasks.workunit:timeout=3h 2026-03-10T14:05:39.333 INFO:tasks.workunit:cleanup=True 2026-03-10T14:05:39.333 DEBUG:teuthology.orchestra.run.vm03:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout: File: /home/ubuntu/cephtest/mnt.0 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout:Device: 3bh/59d Inode: 1 Links: 2 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-10 14:05:24.426627627 +0000 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-10 14:05:35.444943714 +0000 2026-03-10T14:05:39.353 INFO:teuthology.orchestra.run.vm03.stdout: Birth: - 2026-03-10T14:05:39.353 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0 2026-03-10T14:05:39.354 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0 2026-03-10T14:05:39.427 DEBUG:teuthology.orchestra.run.vm04:> stat -- /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:39.449 INFO:teuthology.orchestra.run.vm04.stdout: File: /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:39.450 INFO:teuthology.orchestra.run.vm04.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T14:05:39.450 INFO:teuthology.orchestra.run.vm04.stdout:Device: 5ah/90d Inode: 1 Links: 3 2026-03-10T14:05:39.450 INFO:teuthology.orchestra.run.vm04.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T14:05:39.450 
INFO:teuthology.orchestra.run.vm04.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T14:05:39.450 INFO:teuthology.orchestra.run.vm04.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T14:05:39.450 INFO:teuthology.orchestra.run.vm04.stdout:Modify: 2026-03-10 14:05:39.421508385 +0000 2026-03-10T14:05:39.450 INFO:teuthology.orchestra.run.vm04.stdout:Change: 2026-03-10 14:05:39.421508385 +0000 2026-03-10T14:05:39.450 INFO:teuthology.orchestra.run.vm04.stdout: Birth: - 2026-03-10T14:05:39.450 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1 2026-03-10T14:05:39.450 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1 2026-03-10T14:05:39.503 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:39.523 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T14:05:39.523 DEBUG:teuthology.orchestra.run.vm04:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T14:05:39.561 INFO:tasks.workunit.client.0.vm03.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-10T14:05:39.580 INFO:tasks.workunit.client.1.vm04.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 
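[editor's note] Earlier in this run the harness listed `/sys/fs/fuse/connections` before mounting ("Pre-mount connections: []") and after ("Post-mount connections: [90]"); any identifier that appears only afterwards belongs to the new ceph-fuse instance. A minimal sketch of that check, assuming a simple set difference (the function name is illustrative):

```python
def new_fuse_connections(pre, post):
    # Connection IDs present only after mounting identify the new
    # ceph-fuse mount's entry under /sys/fs/fuse/connections.
    return sorted(set(post) - set(pre))

print(new_fuse_connections([], [90]))  # [90]
```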
2026-03-10T14:05:39.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:39 vm03.local ceph-mon[49718]: pgmap v86: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.1 KiB/s rd, 509 B/s wr, 6 op/s 2026-03-10T14:05:39.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.770+0000 7fb798e3a700 1 -- 192.168.123.103:0/2682187100 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb794102760 msgr2=0x7fb794102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:39.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.770+0000 7fb798e3a700 1 --2- 192.168.123.103:0/2682187100 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb794102760 0x7fb794102b70 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb784009b00 tx=0x7fb784009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:39.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.771+0000 7fb798e3a700 1 -- 192.168.123.103:0/2682187100 shutdown_connections 2026-03-10T14:05:39.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.771+0000 7fb798e3a700 1 --2- 192.168.123.103:0/2682187100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb794103a00 0x7fb794103e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:39.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.771+0000 7fb798e3a700 1 --2- 192.168.123.103:0/2682187100 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb794102760 0x7fb794102b70 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:39.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.771+0000 7fb798e3a700 1 -- 192.168.123.103:0/2682187100 >> 192.168.123.103:0/2682187100 conn(0x7fb7940fddb0 msgr2=0x7fb7941001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:39.770 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.771+0000 7fb798e3a700 1 -- 192.168.123.103:0/2682187100 shutdown_connections 2026-03-10T14:05:39.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.771+0000 7fb798e3a700 1 -- 192.168.123.103:0/2682187100 wait complete. 2026-03-10T14:05:39.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.772+0000 7fb798e3a700 1 Processor -- start 2026-03-10T14:05:39.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.772+0000 7fb798e3a700 1 -- start start 2026-03-10T14:05:39.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.772+0000 7fb798e3a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb794103a00 0x7fb794198250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:39.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb798e3a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb794198790 0x7fb79419d800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:39.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb798e3a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb794198c90 con 0x7fb794198790 2026-03-10T14:05:39.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb798e3a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb794198e00 con 0x7fb794103a00 2026-03-10T14:05:39.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb791d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb794198790 0x7fb79419d800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:39.772 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb791d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb794198790 0x7fb79419d800 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42556/0 (socket says 192.168.123.103:42556) 2026-03-10T14:05:39.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb791d9b700 1 -- 192.168.123.103:0/798114801 learned_addr learned my addr 192.168.123.103:0/798114801 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:39.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb791d9b700 1 -- 192.168.123.103:0/798114801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb794103a00 msgr2=0x7fb794198250 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:39.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb791d9b700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb794103a00 0x7fb794198250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:39.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb791d9b700 1 -- 192.168.123.103:0/798114801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb77c009710 con 0x7fb794198790 2026-03-10T14:05:39.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb791d9b700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb794198790 0x7fb79419d800 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7fb77c00eee0 tx=0x7fb77c00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:39.773 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb78b7fe700 1 -- 192.168.123.103:0/798114801 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb77c00ce10 con 0x7fb794198790 2026-03-10T14:05:39.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb78b7fe700 1 -- 192.168.123.103:0/798114801 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb77c004500 con 0x7fb794198790 2026-03-10T14:05:39.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb78b7fe700 1 -- 192.168.123.103:0/798114801 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb77c005490 con 0x7fb794198790 2026-03-10T14:05:39.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.773+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb7840097e0 con 0x7fb794198790 2026-03-10T14:05:39.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.774+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb79419e130 con 0x7fb794198790 2026-03-10T14:05:39.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.774+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb794066e40 con 0x7fb794198790 2026-03-10T14:05:39.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.778+0000 7fb78b7fe700 1 -- 192.168.123.103:0/798114801 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb77c01e030 con 0x7fb794198790 2026-03-10T14:05:39.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.778+0000 7fb78b7fe700 1 --2- 
192.168.123.103:0/798114801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb78006c750 0x7fb78006ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:39.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.778+0000 7fb79259c700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb78006c750 0x7fb78006ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:39.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.778+0000 7fb78b7fe700 1 -- 192.168.123.103:0/798114801 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb77c014070 con 0x7fb794198790 2026-03-10T14:05:39.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.779+0000 7fb78b7fe700 1 -- 192.168.123.103:0/798114801 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb77c08c200 con 0x7fb794198790 2026-03-10T14:05:39.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.779+0000 7fb79259c700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb78006c750 0x7fb78006ec00 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fb78400b5c0 tx=0x7fb784005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:39.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:39 vm04.local ceph-mon[55966]: pgmap v86: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.1 KiB/s rd, 509 B/s wr, 6 op/s 2026-03-10T14:05:39.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.886+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7fb79419e870 con 0x7fb794198790 2026-03-10T14:05:39.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.891+0000 7fb78b7fe700 1 -- 192.168.123.103:0/798114801 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v16)=0 v16) v1 ==== 126+0+0 (secure 0 0 0) 0x7fb77c05a1d0 con 0x7fb794198790 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb78006c750 msgr2=0x7fb78006ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb78006c750 0x7fb78006ec00 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fb78400b5c0 tx=0x7fb784005fb0 comp rx=0 tx=0).stop 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb794198790 msgr2=0x7fb79419d800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb794198790 0x7fb79419d800 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7fb77c00eee0 tx=0x7fb77c00c5b0 comp rx=0 tx=0).stop 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 shutdown_connections 2026-03-10T14:05:39.897 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb78006c750 0x7fb78006ec00 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb794103a00 0x7fb794198250 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 --2- 192.168.123.103:0/798114801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb794198790 0x7fb79419d800 unknown :-1 s=CLOSED pgs=267 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.898+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 >> 192.168.123.103:0/798114801 conn(0x7fb7940fddb0 msgr2=0x7fb794106c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.899+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 shutdown_connections 2026-03-10T14:05:39.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:39.899+0000 7fb798e3a700 1 -- 192.168.123.103:0/798114801 wait complete. 2026-03-10T14:05:39.954 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-10T14:05:39.954 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:39.954 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-10T14:05:40.163 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:40.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.413+0000 7f3048f98700 1 -- 192.168.123.103:0/3672403962 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044102760 msgr2=0x7f3044102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:40.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.413+0000 7f3048f98700 1 --2- 192.168.123.103:0/3672403962 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044102760 0x7f3044102b70 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f302c009b00 tx=0x7f302c009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:40.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.413+0000 7f3048f98700 1 -- 192.168.123.103:0/3672403962 shutdown_connections 2026-03-10T14:05:40.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.414+0000 7f3048f98700 1 --2- 192.168.123.103:0/3672403962 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3044103960 0x7f3044103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:40.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.414+0000 7f3048f98700 1 --2- 192.168.123.103:0/3672403962 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044102760 0x7f3044102b70 unknown :-1 
s=CLOSED pgs=268 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:40.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.414+0000 7f3048f98700 1 -- 192.168.123.103:0/3672403962 >> 192.168.123.103:0/3672403962 conn(0x7f30440fdcf0 msgr2=0x7f3044100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:40.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.414+0000 7f3048f98700 1 -- 192.168.123.103:0/3672403962 shutdown_connections 2026-03-10T14:05:40.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.414+0000 7f3048f98700 1 -- 192.168.123.103:0/3672403962 wait complete. 2026-03-10T14:05:40.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.415+0000 7f3048f98700 1 Processor -- start 2026-03-10T14:05:40.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.415+0000 7f3048f98700 1 -- start start 2026-03-10T14:05:40.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.415+0000 7f3048f98700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044103960 0x7f30441982d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:40.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3048f98700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3044198810 0x7f304419d880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:40.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3048f98700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3044198d10 con 0x7f3044103960 2026-03-10T14:05:40.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3048f98700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3044198e80 con 0x7f3044198810 2026-03-10T14:05:40.414 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3041d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3044198810 0x7f304419d880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:40.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3041d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3044198810 0x7f304419d880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:42188/0 (socket says 192.168.123.103:42188) 2026-03-10T14:05:40.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3041d9b700 1 -- 192.168.123.103:0/3491095492 learned_addr learned my addr 192.168.123.103:0/3491095492 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:40.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f304259c700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044103960 0x7f30441982d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:40.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3041d9b700 1 -- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044103960 msgr2=0x7f30441982d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:40.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3041d9b700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044103960 0x7f30441982d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:40.415 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f3041d9b700 1 -- 192.168.123.103:0/3491095492 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f302c0097e0 con 0x7f3044198810 2026-03-10T14:05:40.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.416+0000 7f304259c700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044103960 0x7f30441982d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T14:05:40.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.417+0000 7f3041d9b700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3044198810 0x7f304419d880 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f303400b700 tx=0x7f303400bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:40.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.417+0000 7f303b7fe700 1 -- 192.168.123.103:0/3491095492 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3034010840 con 0x7f3044198810 2026-03-10T14:05:40.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.417+0000 7f303b7fe700 1 -- 192.168.123.103:0/3491095492 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3034010e80 con 0x7f3044198810 2026-03-10T14:05:40.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.417+0000 7f303b7fe700 1 -- 192.168.123.103:0/3491095492 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f303400d590 con 0x7f3044198810 2026-03-10T14:05:40.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.417+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f304419de20 con 0x7f3044198810 2026-03-10T14:05:40.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.417+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f304419e3c0 con 0x7f3044198810 2026-03-10T14:05:40.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.418+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3044066e40 con 0x7f3044198810 2026-03-10T14:05:40.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.419+0000 7f303b7fe700 1 -- 192.168.123.103:0/3491095492 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f303400f3e0 con 0x7f3044198810 2026-03-10T14:05:40.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.420+0000 7f303b7fe700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f303006c630 0x7f303006eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:40.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.420+0000 7f303b7fe700 1 -- 192.168.123.103:0/3491095492 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f303408ba40 con 0x7f3044198810 2026-03-10T14:05:40.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.420+0000 7f304259c700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f303006c630 0x7f303006eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:40.419 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.420+0000 7f304259c700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f303006c630 0x7f303006eae0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f302c009fd0 tx=0x7f302c00b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:40.420 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.422+0000 7f303b7fe700 1 -- 192.168.123.103:0/3491095492 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3034059cc0 con 0x7f3044198810 2026-03-10T14:05:40.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.525+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f304419e670 con 0x7f3044198810 2026-03-10T14:05:40.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.528+0000 7f303b7fe700 1 -- 192.168.123.103:0/3491095492 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v16)=0 v16) v1 ==== 155+0+0 (secure 0 0 0) 0x7f303400f690 con 0x7f3044198810 2026-03-10T14:05:40.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.530+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f303006c630 msgr2=0x7f303006eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:40.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.530+0000 7f3048f98700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f303006c630 0x7f303006eae0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f302c009fd0 tx=0x7f302c00b540 
comp rx=0 tx=0).stop 2026-03-10T14:05:40.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.530+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3044198810 msgr2=0x7f304419d880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:40.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.530+0000 7f3048f98700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3044198810 0x7f304419d880 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f303400b700 tx=0x7f303400bac0 comp rx=0 tx=0).stop 2026-03-10T14:05:40.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.531+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 shutdown_connections 2026-03-10T14:05:40.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.531+0000 7f3048f98700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3044103960 0x7f30441982d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:40.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.531+0000 7f3048f98700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f303006c630 0x7f303006eae0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:40.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.531+0000 7f3048f98700 1 --2- 192.168.123.103:0/3491095492 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3044198810 0x7f304419d880 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:40.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.531+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 >> 192.168.123.103:0/3491095492 conn(0x7f30440fdcf0 msgr2=0x7f3044106b90 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:40.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.531+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 shutdown_connections 2026-03-10T14:05:40.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:40.531+0000 7f3048f98700 1 -- 192.168.123.103:0/3491095492 wait complete. 2026-03-10T14:05:40.588 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-10T14:05:40.741 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:41.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.007+0000 7f9a24705700 1 -- 192.168.123.103:0/2870115539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 msgr2=0x7f9a1c102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:41.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.007+0000 7f9a24705700 1 --2- 192.168.123.103:0/2870115539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 0x7f9a1c102b70 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f9a0c009b50 tx=0x7f9a0c009e60 comp rx=0 tx=0).stop 2026-03-10T14:05:41.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.008+0000 7f9a24705700 1 -- 192.168.123.103:0/2870115539 shutdown_connections 2026-03-10T14:05:41.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.008+0000 7f9a24705700 1 --2- 192.168.123.103:0/2870115539 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a1c103960 0x7f9a1c103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:05:41.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.008+0000 7f9a24705700 1 --2- 192.168.123.103:0/2870115539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 0x7f9a1c102b70 unknown :-1 s=CLOSED pgs=269 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.008+0000 7f9a24705700 1 -- 192.168.123.103:0/2870115539 >> 192.168.123.103:0/2870115539 conn(0x7f9a1c0fdcf0 msgr2=0x7f9a1c100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:41.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.008+0000 7f9a24705700 1 -- 192.168.123.103:0/2870115539 shutdown_connections 2026-03-10T14:05:41.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.009+0000 7f9a24705700 1 -- 192.168.123.103:0/2870115539 wait complete. 2026-03-10T14:05:41.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.009+0000 7f9a24705700 1 Processor -- start 2026-03-10T14:05:41.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.009+0000 7f9a24705700 1 -- start start 2026-03-10T14:05:41.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.009+0000 7f9a24705700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 0x7f9a1c198050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:41.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.009+0000 7f9a24705700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a1c103960 0x7f9a1c198590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:41.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a24705700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a1c198bb0 con 0x7f9a1c102760 2026-03-10T14:05:41.008 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a24705700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a1c198cf0 con 0x7f9a1c103960 2026-03-10T14:05:41.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a224a1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 0x7f9a1c198050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:41.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a224a1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 0x7f9a1c198050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42604/0 (socket says 192.168.123.103:42604) 2026-03-10T14:05:41.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a224a1700 1 -- 192.168.123.103:0/2730327710 learned_addr learned my addr 192.168.123.103:0/2730327710 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:41.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a224a1700 1 -- 192.168.123.103:0/2730327710 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a1c103960 msgr2=0x7f9a1c198590 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:41.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a21ca0700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a1c103960 0x7f9a1c198590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:41.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a224a1700 1 --2- 
192.168.123.103:0/2730327710 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a1c103960 0x7f9a1c198590 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a224a1700 1 -- 192.168.123.103:0/2730327710 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9a0c0097e0 con 0x7f9a1c102760 2026-03-10T14:05:41.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a224a1700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 0x7f9a1c198050 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f9a0c004ce0 tx=0x7f9a0c005f00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:41.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.011+0000 7f9a137fe700 1 -- 192.168.123.103:0/2730327710 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9a0c01d070 con 0x7f9a1c102760 2026-03-10T14:05:41.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.010+0000 7f9a21ca0700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a1c103960 0x7f9a1c198590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:05:41.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.011+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9a1c19d740 con 0x7f9a1c102760 2026-03-10T14:05:41.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.011+0000 7f9a137fe700 1 -- 192.168.123.103:0/2730327710 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9a0c00bc00 con 0x7f9a1c102760 2026-03-10T14:05:41.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.011+0000 7f9a137fe700 1 -- 192.168.123.103:0/2730327710 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9a0c00f850 con 0x7f9a1c102760 2026-03-10T14:05:41.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.011+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9a1c19dc30 con 0x7f9a1c102760 2026-03-10T14:05:41.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.012+0000 7f9a137fe700 1 -- 192.168.123.103:0/2730327710 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9a0c00f9b0 con 0x7f9a1c102760 2026-03-10T14:05:41.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.012+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9a1c066e40 con 0x7f9a1c102760 2026-03-10T14:05:41.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.014+0000 7f9a137fe700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9a0806c680 0x7f9a0806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:41.013 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.014+0000 7f9a137fe700 1 -- 192.168.123.103:0/2730327710 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f9a0c00bd70 con 0x7f9a1c102760 2026-03-10T14:05:41.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.015+0000 7f9a137fe700 1 -- 192.168.123.103:0/2730327710 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9a0c05c250 con 0x7f9a1c102760 2026-03-10T14:05:41.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.015+0000 7f9a21ca0700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9a0806c680 0x7f9a0806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:41.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.015+0000 7f9a21ca0700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9a0806c680 0x7f9a0806eb30 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f9a18005fd0 tx=0x7f9a18005ee0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:41.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:40 vm03.local ceph-mon[49718]: from='client.? 
192.168.123.103:0/798114801' entity='client.admin' 2026-03-10T14:05:41.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:40 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:41.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:40 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:41.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:40 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:41.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:40 vm03.local ceph-mon[49718]: pgmap v87: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.6 KiB/s rd, 426 B/s wr, 5 op/s 2026-03-10T14:05:41.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:40 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:41.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.119+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f9a1c19df10 con 0x7f9a1c102760 2026-03-10T14:05:41.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.119+0000 7f9a137fe700 1 -- 192.168.123.103:0/2730327710 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v16)=0 v16) v1 ==== 163+0+0 (secure 0 0 0) 0x7f9a0c027020 con 0x7f9a1c102760 2026-03-10T14:05:41.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.121+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9a0806c680 msgr2=0x7f9a0806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:41.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.121+0000 7f9a24705700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9a0806c680 0x7f9a0806eb30 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f9a18005fd0 tx=0x7f9a18005ee0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.121+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 msgr2=0x7f9a1c198050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:41.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.121+0000 7f9a24705700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 0x7f9a1c198050 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f9a0c004ce0 tx=0x7f9a0c005f00 comp rx=0 tx=0).stop 2026-03-10T14:05:41.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.122+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 shutdown_connections 2026-03-10T14:05:41.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.122+0000 7f9a24705700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a1c102760 0x7f9a1c198050 unknown :-1 s=CLOSED pgs=270 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.122+0000 7f9a24705700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9a0806c680 0x7f9a0806eb30 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.121 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.122+0000 7f9a24705700 1 --2- 192.168.123.103:0/2730327710 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a1c103960 0x7f9a1c198590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.122+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 >> 192.168.123.103:0/2730327710 conn(0x7f9a1c0fdcf0 msgr2=0x7f9a1c106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:41.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.122+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 shutdown_connections 2026-03-10T14:05:41.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.122+0000 7f9a24705700 1 -- 192.168.123.103:0/2730327710 wait complete. 2026-03-10T14:05:41.186 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-10T14:05:41.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:40 vm04.local ceph-mon[55966]: from='client.? 
192.168.123.103:0/798114801' entity='client.admin' 2026-03-10T14:05:41.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:40 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:41.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:40 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:41.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:40 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:41.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:40 vm04.local ceph-mon[55966]: pgmap v87: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.6 KiB/s rd, 426 B/s wr, 5 op/s 2026-03-10T14:05:41.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:40 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:41.337 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:41.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.597+0000 7efd72a4a700 1 -- 192.168.123.103:0/2038903852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c103a00 msgr2=0x7efd6c103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:41.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.597+0000 7efd72a4a700 1 --2- 192.168.123.103:0/2038903852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c103a00 0x7efd6c103e70 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7efd5c009b00 tx=0x7efd5c009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:41.597 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.598+0000 7efd72a4a700 1 -- 192.168.123.103:0/2038903852 shutdown_connections 2026-03-10T14:05:41.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.598+0000 7efd72a4a700 1 --2- 192.168.123.103:0/2038903852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c103a00 0x7efd6c103e70 unknown :-1 s=CLOSED pgs=271 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.598+0000 7efd72a4a700 1 --2- 192.168.123.103:0/2038903852 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd6c102760 0x7efd6c102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.598+0000 7efd72a4a700 1 -- 192.168.123.103:0/2038903852 >> 192.168.123.103:0/2038903852 conn(0x7efd6c0fddb0 msgr2=0x7efd6c1001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:41.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.598+0000 7efd72a4a700 1 -- 192.168.123.103:0/2038903852 shutdown_connections 2026-03-10T14:05:41.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.598+0000 7efd72a4a700 1 -- 192.168.123.103:0/2038903852 wait complete. 
2026-03-10T14:05:41.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.599+0000 7efd72a4a700 1 Processor -- start 2026-03-10T14:05:41.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.599+0000 7efd72a4a700 1 -- start start 2026-03-10T14:05:41.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.599+0000 7efd72a4a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c102760 0x7efd6c197fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:41.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.599+0000 7efd72a4a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd6c103a00 0x7efd6c198500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:41.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.599+0000 7efd72a4a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd6c198b20 con 0x7efd6c102760 2026-03-10T14:05:41.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.599+0000 7efd72a4a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd6c198c60 con 0x7efd6c103a00 2026-03-10T14:05:41.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.599+0000 7efd6bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c102760 0x7efd6c197fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:41.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.600+0000 7efd6bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c102760 0x7efd6c197fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:42620/0 (socket says 192.168.123.103:42620) 2026-03-10T14:05:41.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.600+0000 7efd6bfff700 1 -- 192.168.123.103:0/3658229380 learned_addr learned my addr 192.168.123.103:0/3658229380 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:41.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.600+0000 7efd6b7fe700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd6c103a00 0x7efd6c198500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:41.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.600+0000 7efd6bfff700 1 -- 192.168.123.103:0/3658229380 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd6c103a00 msgr2=0x7efd6c198500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:41.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.600+0000 7efd6bfff700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd6c103a00 0x7efd6c198500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.600+0000 7efd6bfff700 1 -- 192.168.123.103:0/3658229380 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd5c0097e0 con 0x7efd6c102760 2026-03-10T14:05:41.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.601+0000 7efd6bfff700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c102760 0x7efd6c197fc0 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7efd6000d8d0 tx=0x7efd6000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:05:41.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.601+0000 7efd697fa700 1 -- 192.168.123.103:0/3658229380 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd60009880 con 0x7efd6c102760 2026-03-10T14:05:41.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.601+0000 7efd697fa700 1 -- 192.168.123.103:0/3658229380 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efd60010460 con 0x7efd6c102760 2026-03-10T14:05:41.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.601+0000 7efd697fa700 1 -- 192.168.123.103:0/3658229380 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd6000f5d0 con 0x7efd6c102760 2026-03-10T14:05:41.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.601+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efd6c19d710 con 0x7efd6c102760 2026-03-10T14:05:41.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.601+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efd6c19dc60 con 0x7efd6c102760 2026-03-10T14:05:41.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.602+0000 7efd697fa700 1 -- 192.168.123.103:0/3658229380 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7efd6000f730 con 0x7efd6c102760 2026-03-10T14:05:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.603+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efd6c10b3e0 con 0x7efd6c102760 2026-03-10T14:05:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.603+0000 
7efd697fa700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efd5406c7a0 0x7efd5406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:41.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.606+0000 7efd697fa700 1 -- 192.168.123.103:0/3658229380 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7efd6008c5b0 con 0x7efd6c102760 2026-03-10T14:05:41.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.606+0000 7efd6b7fe700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efd5406c7a0 0x7efd5406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:41.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.606+0000 7efd697fa700 1 -- 192.168.123.103:0/3658229380 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7efd6005a840 con 0x7efd6c102760 2026-03-10T14:05:41.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.606+0000 7efd6b7fe700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efd5406c7a0 0x7efd5406ec50 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7efd5c00b5c0 tx=0x7efd5c005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:41.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.716+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7efd6c10b600 con 0x7efd6c102760 2026-03-10T14:05:41.715 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.716+0000 7efd697fa700 1 -- 192.168.123.103:0/3658229380 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v16)=0 v16) v1 ==== 135+0+0 (secure 0 0 0) 0x7efd6005a3d0 con 0x7efd6c102760 2026-03-10T14:05:41.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.718+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efd5406c7a0 msgr2=0x7efd5406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:41.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efd5406c7a0 0x7efd5406ec50 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7efd5c00b5c0 tx=0x7efd5c005fb0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c102760 msgr2=0x7efd6c197fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:41.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c102760 0x7efd6c197fc0 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7efd6000d8d0 tx=0x7efd6000dbe0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 shutdown_connections 2026-03-10T14:05:41.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd6c102760 0x7efd6c197fc0 unknown :-1 s=CLOSED 
pgs=272 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efd5406c7a0 0x7efd5406ec50 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 --2- 192.168.123.103:0/3658229380 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd6c103a00 0x7efd6c198500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:41.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 >> 192.168.123.103:0/3658229380 conn(0x7efd6c0fddb0 msgr2=0x7efd6c100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:41.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.719+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 shutdown_connections 2026-03-10T14:05:41.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:41.720+0000 7efd72a4a700 1 -- 192.168.123.103:0/3658229380 wait complete. 
2026-03-10T14:05:41.780 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-10T14:05:41.957 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:42.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.236+0000 7fd33d626700 1 -- 192.168.123.103:0/2770638855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 msgr2=0x7fd338102060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:42.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.236+0000 7fd33d626700 1 --2- 192.168.123.103:0/2770638855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 0x7fd338102060 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7fd328009b00 tx=0x7fd328009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:42.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.238+0000 7fd33d626700 1 -- 192.168.123.103:0/2770638855 shutdown_connections 2026-03-10T14:05:42.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.238+0000 7fd33d626700 1 --2- 192.168.123.103:0/2770638855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3381025a0 0x7fd338104980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:42.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.238+0000 7fd33d626700 1 --2- 192.168.123.103:0/2770638855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 0x7fd338102060 unknown :-1 s=CLOSED pgs=273 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:42.237 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.238+0000 7fd33d626700 1 -- 192.168.123.103:0/2770638855 >> 192.168.123.103:0/2770638855 conn(0x7fd3380f9ce0 msgr2=0x7fd3380fc130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:42.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.238+0000 7fd33d626700 1 -- 192.168.123.103:0/2770638855 shutdown_connections 2026-03-10T14:05:42.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.238+0000 7fd33d626700 1 -- 192.168.123.103:0/2770638855 wait complete. 2026-03-10T14:05:42.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd33d626700 1 Processor -- start 2026-03-10T14:05:42.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd33d626700 1 -- start start 2026-03-10T14:05:42.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd33d626700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 0x7fd338198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:42.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd33d626700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3381025a0 0x7fd338198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:42.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd33d626700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd338198af0 con 0x7fd3380ffc80 2026-03-10T14:05:42.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd33d626700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd338198c30 con 0x7fd3381025a0 2026-03-10T14:05:42.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd336ffd700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 0x7fd338198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:42.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd336ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 0x7fd338198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42642/0 (socket says 192.168.123.103:42642) 2026-03-10T14:05:42.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.239+0000 7fd336ffd700 1 -- 192.168.123.103:0/444834718 learned_addr learned my addr 192.168.123.103:0/444834718 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:42.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd336ffd700 1 -- 192.168.123.103:0/444834718 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3381025a0 msgr2=0x7fd338198560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:42.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd336ffd700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3381025a0 0x7fd338198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:42.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd336ffd700 1 -- 192.168.123.103:0/444834718 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd3280097e0 con 0x7fd3380ffc80 2026-03-10T14:05:42.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd336ffd700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 
0x7fd338198020 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7fd32800b5c0 tx=0x7fd328004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:42.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd32ffff700 1 -- 192.168.123.103:0/444834718 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd32801d070 con 0x7fd3380ffc80 2026-03-10T14:05:42.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd33d626700 1 -- 192.168.123.103:0/444834718 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd33818e7a0 con 0x7fd3380ffc80 2026-03-10T14:05:42.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd33d626700 1 -- 192.168.123.103:0/444834718 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd33818ec60 con 0x7fd3380ffc80 2026-03-10T14:05:42.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd32ffff700 1 -- 192.168.123.103:0/444834718 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd32800bc50 con 0x7fd3380ffc80 2026-03-10T14:05:42.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.240+0000 7fd32ffff700 1 -- 192.168.123.103:0/444834718 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd32800f780 con 0x7fd3380ffc80 2026-03-10T14:05:42.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.242+0000 7fd32ffff700 1 -- 192.168.123.103:0/444834718 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd32800f8e0 con 0x7fd3380ffc80 2026-03-10T14:05:42.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.242+0000 7fd32ffff700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7fd324074ed0 0x7fd324077380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:42.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.242+0000 7fd32ffff700 1 -- 192.168.123.103:0/444834718 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fd32808d590 con 0x7fd3380ffc80 2026-03-10T14:05:42.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.242+0000 7fd3367fc700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd324074ed0 0x7fd324077380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:42.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.243+0000 7fd33d626700 1 -- 192.168.123.103:0/444834718 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd318005320 con 0x7fd3380ffc80 2026-03-10T14:05:42.249 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.245+0000 7fd32ffff700 1 -- 192.168.123.103:0/444834718 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd32805c2b0 con 0x7fd3380ffc80 2026-03-10T14:05:42.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.250+0000 7fd3367fc700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd324074ed0 0x7fd324077380 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fd320005fd0 tx=0x7fd320005d00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:42.400 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.401+0000 7fd33d626700 1 -- 192.168.123.103:0/444834718 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- 
mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7fd318000c90 con 0x7fd324074ed0 2026-03-10T14:05:42.405 INFO:teuthology.orchestra.run.vm03.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:05:42.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.406+0000 7fd32ffff700 1 -- 192.168.123.103:0/444834718 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7fd318000c90 con 0x7fd324074ed0 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 -- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd324074ed0 msgr2=0x7fd324077380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd324074ed0 0x7fd324077380 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fd320005fd0 tx=0x7fd320005d00 comp rx=0 tx=0).stop 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 -- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 msgr2=0x7fd338198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 0x7fd338198020 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7fd32800b5c0 tx=0x7fd328004ab0 comp rx=0 tx=0).stop 2026-03-10T14:05:42.412 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 -- 192.168.123.103:0/444834718 shutdown_connections 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3380ffc80 0x7fd338198020 unknown :-1 s=CLOSED pgs=274 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd324074ed0 0x7fd324077380 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 --2- 192.168.123.103:0/444834718 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3381025a0 0x7fd338198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.412+0000 7fd32dffb700 1 -- 192.168.123.103:0/444834718 >> 192.168.123.103:0/444834718 conn(0x7fd3380f9ce0 msgr2=0x7fd338106ff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.413+0000 7fd32dffb700 1 -- 192.168.123.103:0/444834718 shutdown_connections 2026-03-10T14:05:42.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:42.413+0000 7fd32dffb700 1 -- 192.168.123.103:0/444834718 wait complete. 2026-03-10T14:05:42.484 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-10T14:05:42.484 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:05:42.484 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done' 2026-03-10T14:05:42.728 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:05:43.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.169+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/705976072 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18072470 msgr2=0x7f9b1810beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.169+0000 7f9b1f6b3700 1 --2- 192.168.123.103:0/705976072 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18072470 0x7f9b1810beb0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f9b1000b3a0 tx=0x7f9b1000b6b0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.170+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/705976072 shutdown_connections 2026-03-10T14:05:43.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.170+0000 7f9b1f6b3700 1 --2- 192.168.123.103:0/705976072 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18072470 0x7f9b1810beb0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.170 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.170+0000 7f9b1f6b3700 1 --2- 192.168.123.103:0/705976072 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b18071a90 0x7f9b18071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.170+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/705976072 >> 192.168.123.103:0/705976072 conn(0x7f9b1806d1a0 msgr2=0x7f9b1806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:43.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.170+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/705976072 shutdown_connections 2026-03-10T14:05:43.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.170+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/705976072 wait complete. 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.171+0000 7f9b1f6b3700 1 Processor -- start 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.171+0000 7f9b1f6b3700 1 -- start start 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.171+0000 7f9b1f6b3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18071a90 0x7f9b18116ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.171+0000 7f9b1f6b3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b18117010 0x7f9b181b2800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.171+0000 7f9b1f6b3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b18117510 con 0x7f9b18117010 2026-03-10T14:05:43.171 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.171+0000 7f9b1f6b3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b18117680 con 0x7f9b18071a90 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.172+0000 7f9b1cc4e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b18117010 0x7f9b181b2800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.172+0000 7f9b1d44f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18071a90 0x7f9b18116ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.172+0000 7f9b1d44f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18071a90 0x7f9b18116ad0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:42242/0 (socket says 192.168.123.103:42242) 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.172+0000 7f9b1d44f700 1 -- 192.168.123.103:0/1122435563 learned_addr learned my addr 192.168.123.103:0/1122435563 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.173+0000 7f9b1d44f700 1 -- 192.168.123.103:0/1122435563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b18117010 msgr2=0x7f9b181b2800 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.173+0000 7f9b1d44f700 1 --2- 192.168.123.103:0/1122435563 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b18117010 0x7f9b181b2800 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.173+0000 7f9b1d44f700 1 -- 192.168.123.103:0/1122435563 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9b1000b050 con 0x7f9b18071a90 2026-03-10T14:05:43.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.173+0000 7f9b1d44f700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18071a90 0x7f9b18116ad0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f9b1400eb10 tx=0x7f9b1400ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:43.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.173+0000 7f9b0e7fc700 1 -- 192.168.123.103:0/1122435563 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b1400cc40 con 0x7f9b18071a90 2026-03-10T14:05:43.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.173+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9b181b2da0 con 0x7f9b18071a90 2026-03-10T14:05:43.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.173+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9b181b3260 con 0x7f9b18071a90 2026-03-10T14:05:43.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.174+0000 7f9b0e7fc700 1 -- 192.168.123.103:0/1122435563 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9b1400cda0 con 0x7f9b18071a90 2026-03-10T14:05:43.173 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.174+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9b18110c20 con 0x7f9b18071a90 2026-03-10T14:05:43.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.174+0000 7f9b0e7fc700 1 -- 192.168.123.103:0/1122435563 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b14018810 con 0x7f9b18071a90 2026-03-10T14:05:43.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.176+0000 7f9b0e7fc700 1 -- 192.168.123.103:0/1122435563 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9b14018aa0 con 0x7f9b18071a90 2026-03-10T14:05:43.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.176+0000 7f9b0e7fc700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9b0406c6d0 0x7f9b0406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.176+0000 7f9b1cc4e700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9b0406c6d0 0x7f9b0406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.176+0000 7f9b0e7fc700 1 -- 192.168.123.103:0/1122435563 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f9b14014070 con 0x7f9b18071a90 2026-03-10T14:05:43.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.177+0000 7f9b1cc4e700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7f9b0406c6d0 0x7f9b0406eb80 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f9b1000ba80 tx=0x7f9b1000bee0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:43.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.179+0000 7f9b0e7fc700 1 -- 192.168.123.103:0/1122435563 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9b1405afa0 con 0x7f9b18071a90 2026-03-10T14:05:43.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.313+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9b18061190 con 0x7f9b0406c6d0 2026-03-10T14:05:43.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.314+0000 7f9b0e7fc700 1 -- 192.168.123.103:0/1122435563 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f9b18061190 con 0x7f9b0406c6d0 2026-03-10T14:05:43.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9b0406c6d0 msgr2=0x7f9b0406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9b0406c6d0 0x7f9b0406eb80 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f9b1000ba80 tx=0x7f9b1000bee0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18071a90 
msgr2=0x7f9b18116ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18071a90 0x7f9b18116ad0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f9b1400eb10 tx=0x7f9b1400ee20 comp rx=0 tx=0).stop 2026-03-10T14:05:43.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 shutdown_connections 2026-03-10T14:05:43.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9b0406c6d0 0x7f9b0406eb80 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9b18071a90 0x7f9b18116ad0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 --2- 192.168.123.103:0/1122435563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b18117010 0x7f9b181b2800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.318+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 >> 192.168.123.103:0/1122435563 conn(0x7f9b1806d1a0 msgr2=0x7f9b1810b260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:43.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.319+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 shutdown_connections 2026-03-10T14:05:43.318 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.319+0000 7f9b1f6b3700 1 -- 192.168.123.103:0/1122435563 wait complete. 2026-03-10T14:05:43.332 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: pgmap v88: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.6 KiB/s rd, 353 B/s wr, 4 op/s 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: from='client.14544 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: pgmap v89: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 
120 GiB / 120 GiB avail; 2.4 KiB/s rd, 404 B/s wr, 4 op/s 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: pgmap v90: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 252 B/s wr, 3 op/s 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:43.447 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:43 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:05:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.447+0000 7fae8c8ea700 1 -- 192.168.123.103:0/1599614998 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae88072330 msgr2=0x7fae880770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.447+0000 7fae8c8ea700 1 --2- 192.168.123.103:0/1599614998 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae88072330 0x7fae880770b0 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7fae8000b3a0 tx=0x7fae8000b6b0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.448+0000 7fae8c8ea700 1 -- 192.168.123.103:0/1599614998 shutdown_connections 2026-03-10T14:05:43.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.448+0000 7fae8c8ea700 1 --2- 
192.168.123.103:0/1599614998 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae88072330 0x7fae880770b0 unknown :-1 s=CLOSED pgs=275 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.448+0000 7fae8c8ea700 1 --2- 192.168.123.103:0/1599614998 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae88071950 0x7fae88071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.448+0000 7fae8c8ea700 1 -- 192.168.123.103:0/1599614998 >> 192.168.123.103:0/1599614998 conn(0x7fae8806d1a0 msgr2=0x7fae8806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:43.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.450+0000 7fae8c8ea700 1 -- 192.168.123.103:0/1599614998 shutdown_connections 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.450+0000 7fae8c8ea700 1 -- 192.168.123.103:0/1599614998 wait complete. 
2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.450+0000 7fae8c8ea700 1 Processor -- start 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.450+0000 7fae8c8ea700 1 -- start start 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8c8ea700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae88071950 0x7fae881b60d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8c8ea700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae881b6610 0x7fae8807f4e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8c8ea700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae881b6b10 con 0x7fae881b6610 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8c8ea700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae881b6c80 con 0x7fae88071950 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae881b6610 0x7fae8807f4e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae881b6610 0x7fae8807f4e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:42684/0 (socket says 192.168.123.103:42684) 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8659c700 1 -- 192.168.123.103:0/373170009 learned_addr learned my addr 192.168.123.103:0/373170009 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae86d9d700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae88071950 0x7fae881b60d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8659c700 1 -- 192.168.123.103:0/373170009 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae88071950 msgr2=0x7fae881b60d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8659c700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae88071950 0x7fae881b60d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.451+0000 7fae8659c700 1 -- 192.168.123.103:0/373170009 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fae8000b050 con 0x7fae881b6610 2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.452+0000 7fae8659c700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae881b6610 0x7fae8807f4e0 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7fae80009da0 tx=0x7fae80007b60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:05:43.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.452+0000 7fae6ffff700 1 -- 192.168.123.103:0/373170009 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fae8000e050 con 0x7fae881b6610 2026-03-10T14:05:43.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.452+0000 7fae8c8ea700 1 -- 192.168.123.103:0/373170009 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fae8807fa20 con 0x7fae881b6610 2026-03-10T14:05:43.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.452+0000 7fae8c8ea700 1 -- 192.168.123.103:0/373170009 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fae8807ff40 con 0x7fae881b6610 2026-03-10T14:05:43.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.453+0000 7fae6ffff700 1 -- 192.168.123.103:0/373170009 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fae80003dc0 con 0x7fae881b6610 2026-03-10T14:05:43.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.453+0000 7fae6ffff700 1 -- 192.168.123.103:0/373170009 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fae8001bb40 con 0x7fae881b6610 2026-03-10T14:05:43.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.454+0000 7fae8c8ea700 1 -- 192.168.123.103:0/373170009 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fae74005320 con 0x7fae881b6610 2026-03-10T14:05:43.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.454+0000 7fae6ffff700 1 -- 192.168.123.103:0/373170009 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fae80019040 con 0x7fae881b6610 2026-03-10T14:05:43.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.454+0000 
7fae6ffff700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae7006c7a0 0x7fae7006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.455+0000 7fae86d9d700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae7006c7a0 0x7fae7006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.455+0000 7fae6ffff700 1 -- 192.168.123.103:0/373170009 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fae8008cb50 con 0x7fae881b6610 2026-03-10T14:05:43.457 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.458+0000 7fae86d9d700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae7006c7a0 0x7fae7006ec50 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fae78005fd0 tx=0x7fae7800a560 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:43.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.462+0000 7fae6ffff700 1 -- 192.168.123.103:0/373170009 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fae8005ade0 con 0x7fae881b6610 2026-03-10T14:05:43.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.593+0000 7fae8c8ea700 1 -- 192.168.123.103:0/373170009 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fae74000bf0 con 0x7fae7006c7a0 2026-03-10T14:05:43.593 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.594+0000 7fae6ffff700 1 -- 192.168.123.103:0/373170009 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fae74000bf0 con 0x7fae7006c7a0 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.597+0000 7fae6dffb700 1 -- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae7006c7a0 msgr2=0x7fae7006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.597+0000 7fae6dffb700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae7006c7a0 0x7fae7006ec50 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fae78005fd0 tx=0x7fae7800a560 comp rx=0 tx=0).stop 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.597+0000 7fae6dffb700 1 -- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae881b6610 msgr2=0x7fae8807f4e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.597+0000 7fae6dffb700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae881b6610 0x7fae8807f4e0 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7fae80009da0 tx=0x7fae80007b60 comp rx=0 tx=0).stop 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.597+0000 7fae6dffb700 1 -- 192.168.123.103:0/373170009 shutdown_connections 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.597+0000 7fae6dffb700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae7006c7a0 0x7fae7006ec50 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.597+0000 7fae6dffb700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fae88071950 0x7fae881b60d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.597+0000 7fae6dffb700 1 --2- 192.168.123.103:0/373170009 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae881b6610 0x7fae8807f4e0 unknown :-1 s=CLOSED pgs=276 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.598+0000 7fae6dffb700 1 -- 192.168.123.103:0/373170009 >> 192.168.123.103:0/373170009 conn(0x7fae8806d1a0 msgr2=0x7fae88076450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.598+0000 7fae6dffb700 1 -- 192.168.123.103:0/373170009 shutdown_connections 2026-03-10T14:05:43.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.598+0000 7fae6dffb700 1 -- 192.168.123.103:0/373170009 wait complete. 
2026-03-10T14:05:43.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 -- 192.168.123.103:0/1424385203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20072360 msgr2=0x7f0e200770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 --2- 192.168.123.103:0/1424385203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20072360 0x7f0e200770e0 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7f0e18009230 tx=0x7f0e18009260 comp rx=0 tx=0).stop 2026-03-10T14:05:43.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 -- 192.168.123.103:0/1424385203 shutdown_connections 2026-03-10T14:05:43.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 --2- 192.168.123.103:0/1424385203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20072360 0x7f0e200770e0 unknown :-1 s=CLOSED pgs=277 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 --2- 192.168.123.103:0/1424385203 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e20071980 0x7f0e20071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 -- 192.168.123.103:0/1424385203 >> 192.168.123.103:0/1424385203 conn(0x7f0e2006d1a0 msgr2=0x7f0e2006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 -- 192.168.123.103:0/1424385203 shutdown_connections 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 -- 192.168.123.103:0/1424385203 
wait complete. 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 Processor -- start 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 -- start start 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20071980 0x7f0e20082530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e20082a70 0x7f0e20082ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e201b2a90 con 0x7f0e20071980 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.689+0000 7f0e2663b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e201b2bd0 con 0x7f0e20082a70 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.690+0000 7f0e1ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20071980 0x7f0e20082530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.690+0000 7f0e1ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20071980 0x7f0e20082530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:40060/0 (socket says 192.168.123.103:40060) 2026-03-10T14:05:43.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.690+0000 7f0e1ffff700 1 -- 192.168.123.103:0/2410848220 learned_addr learned my addr 192.168.123.103:0/2410848220 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:43.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.690+0000 7f0e1f7fe700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e20082a70 0x7f0e20082ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.690+0000 7f0e1ffff700 1 -- 192.168.123.103:0/2410848220 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e20082a70 msgr2=0x7f0e20082ee0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.690+0000 7f0e1ffff700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e20082a70 0x7f0e20082ee0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.690+0000 7f0e1ffff700 1 -- 192.168.123.103:0/2410848220 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e18008ee0 con 0x7f0e20071980 2026-03-10T14:05:43.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.690+0000 7f0e1ffff700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20071980 0x7f0e20082530 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f0e10011f90 tx=0x7f0e1000f330 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T14:05:43.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.691+0000 7f0e1d7fa700 1 -- 192.168.123.103:0/2410848220 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e1000faf0 con 0x7f0e20071980 2026-03-10T14:05:43.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.691+0000 7f0e2663b700 1 -- 192.168.123.103:0/2410848220 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0e201b2d70 con 0x7f0e20071980 2026-03-10T14:05:43.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.691+0000 7f0e2663b700 1 -- 192.168.123.103:0/2410848220 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0e201b3230 con 0x7f0e20071980 2026-03-10T14:05:43.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.691+0000 7f0e1d7fa700 1 -- 192.168.123.103:0/2410848220 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0e10013040 con 0x7f0e20071980 2026-03-10T14:05:43.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.692+0000 7f0e1d7fa700 1 -- 192.168.123.103:0/2410848220 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e100185b0 con 0x7f0e20071980 2026-03-10T14:05:43.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.693+0000 7f0e1d7fa700 1 -- 192.168.123.103:0/2410848220 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f0e1000fc50 con 0x7f0e20071980 2026-03-10T14:05:43.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.693+0000 7f0e1d7fa700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0e0806c7a0 0x7f0e0806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.692 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.693+0000 7f0e1f7fe700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0e0806c7a0 0x7f0e0806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.693+0000 7f0e1d7fa700 1 -- 192.168.123.103:0/2410848220 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f0e1008ee40 con 0x7f0e20071980 2026-03-10T14:05:43.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.694+0000 7f0e1f7fe700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0e0806c7a0 0x7f0e0806ec50 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f0e18007530 tx=0x7f0e18007480 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:43.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.694+0000 7f0e2663b700 1 -- 192.168.123.103:0/2410848220 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0e0c005320 con 0x7f0e20071980 2026-03-10T14:05:43.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.697+0000 7f0e1d7fa700 1 -- 192.168.123.103:0/2410848220 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0e10053720 con 0x7f0e20071980 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: pgmap v88: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.6 KiB/s rd, 353 B/s wr, 4 op/s 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local 
ceph-mon[55966]: from='client.14544 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: pgmap v89: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 404 B/s wr, 4 op/s 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: pgmap v90: 65 pgs: 65 active+clean; 454 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.2 KiB/s rd, 252 B/s wr, 3 op/s 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 14:05:43 vm04.local ceph-mon[55966]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:05:43.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:43 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:05:43.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.822+0000 7f0e2663b700 1 -- 192.168.123.103:0/2410848220 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f0e0c000bf0 con 0x7f0e0806c7a0 2026-03-10T14:05:43.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.830+0000 7f0e1d7fa700 1 -- 192.168.123.103:0/2410848220 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3336 (secure 0 0 0) 0x7f0e0c000bf0 con 0x7f0e0806c7a0 2026-03-10T14:05:43.829 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (100s) 13s ago 2m 21.4M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 error 13s ago 2m - - 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (115s) 14s ago 115s 8342k - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 13s ago 2m 7419k - 18.2.0 dc2bc1663786 57962aef7443 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (114s) 14s ago 114s 7407k - 18.2.0 
dc2bc1663786 0918365fa827 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (99s) 13s ago 2m 82.1M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (20s) 13s ago 20s 16.4M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (18s) 13s ago 18s 16.1M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (17s) 14s ago 17s 16.8M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (19s) 14s ago 19s 18.5M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:9283,8765,8443 running (3m) 13s ago 3m 499M - 18.2.0 dc2bc1663786 378306a7bb3c 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (111s) 14s ago 111s 446M - 18.2.0 dc2bc1663786 f2d79432e040 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 13s ago 3m 54.7M 2048M 18.2.0 dc2bc1663786 f59cc7d5bdfd 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (109s) 14s ago 109s 44.2M 2048M 18.2.0 dc2bc1663786 4113774b34c7 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (2m) 13s ago 2m 14.2M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (112s) 14s ago 112s 14.1M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (91s) 13s ago 90s 48.8M 4096M 18.2.0 dc2bc1663786 5a222b855ee3 2026-03-10T14:05:43.830 
INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (80s) 13s ago 80s 45.7M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (70s) 13s ago 70s 45.3M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (61s) 14s ago 60s 47.2M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (50s) 14s ago 50s 45.6M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (41s) 14s ago 41s 44.0M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:05:43.830 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (93s) 13s ago 2m 36.2M - 2.43.0 a07b618ecd1d fcef697ff8c4 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 -- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0e0806c7a0 msgr2=0x7f0e0806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0e0806c7a0 0x7f0e0806ec50 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f0e18007530 tx=0x7f0e18007480 comp rx=0 tx=0).stop 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 -- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20071980 msgr2=0x7f0e20082530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 --2- 192.168.123.103:0/2410848220 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20071980 0x7f0e20082530 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f0e10011f90 tx=0x7f0e1000f330 comp rx=0 tx=0).stop 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 -- 192.168.123.103:0/2410848220 shutdown_connections 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20071980 0x7f0e20082530 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0e0806c7a0 0x7f0e0806ec50 secure :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f0e18007530 tx=0x7f0e18007480 comp rx=0 tx=0).stop 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 --2- 192.168.123.103:0/2410848220 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e20082a70 0x7f0e20082ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.838+0000 7f0e06ffd700 1 -- 192.168.123.103:0/2410848220 >> 192.168.123.103:0/2410848220 conn(0x7f0e2006d1a0 msgr2=0x7f0e200764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.839+0000 7f0e06ffd700 1 -- 192.168.123.103:0/2410848220 shutdown_connections 2026-03-10T14:05:43.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.839+0000 7f0e06ffd700 1 -- 192.168.123.103:0/2410848220 wait complete. 
2026-03-10T14:05:43.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.939+0000 7f5d22e5d700 1 -- 192.168.123.103:0/977759876 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 msgr2=0x7f5d1c105260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.939+0000 7f5d22e5d700 1 --2- 192.168.123.103:0/977759876 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 0x7f5d1c105260 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f5d18009b00 tx=0x7f5d18009e10 comp rx=0 tx=0).stop 2026-03-10T14:05:43.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.940+0000 7f5d22e5d700 1 -- 192.168.123.103:0/977759876 shutdown_connections 2026-03-10T14:05:43.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.940+0000 7f5d22e5d700 1 --2- 192.168.123.103:0/977759876 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d1c1057a0 0x7f5d1c107b80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.940+0000 7f5d22e5d700 1 --2- 192.168.123.103:0/977759876 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 0x7f5d1c105260 unknown :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.940+0000 7f5d22e5d700 1 -- 192.168.123.103:0/977759876 >> 192.168.123.103:0/977759876 conn(0x7f5d1c0fa7b0 msgr2=0x7f5d1c0fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:43.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.940+0000 7f5d22e5d700 1 -- 192.168.123.103:0/977759876 shutdown_connections 2026-03-10T14:05:43.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.940+0000 7f5d22e5d700 1 -- 192.168.123.103:0/977759876 wait 
complete. 2026-03-10T14:05:43.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.941+0000 7f5d22e5d700 1 Processor -- start 2026-03-10T14:05:43.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.941+0000 7f5d22e5d700 1 -- start start 2026-03-10T14:05:43.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.941+0000 7f5d22e5d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 0x7f5d1c197c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.942+0000 7f5d21e5b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 0x7f5d1c197c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.942+0000 7f5d21e5b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 0x7f5d1c197c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40076/0 (socket says 192.168.123.103:40076) 2026-03-10T14:05:43.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.942+0000 7f5d22e5d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d1c1057a0 0x7f5d1c198180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.942+0000 7f5d21e5b700 1 -- 192.168.123.103:0/1241895201 learned_addr learned my addr 192.168.123.103:0/1241895201 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:43.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.942+0000 7f5d22e5d700 1 -- 192.168.123.103:0/1241895201 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d1c1987a0 con 0x7f5d1c0691a0 2026-03-10T14:05:43.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.942+0000 7f5d22e5d700 1 -- 192.168.123.103:0/1241895201 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d1c19d1b0 con 0x7f5d1c1057a0 2026-03-10T14:05:43.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.942+0000 7f5d2165a700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d1c1057a0 0x7f5d1c198180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.943+0000 7f5d21e5b700 1 -- 192.168.123.103:0/1241895201 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d1c1057a0 msgr2=0x7f5d1c198180 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:43.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.943+0000 7f5d21e5b700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d1c1057a0 0x7f5d1c198180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:43.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.943+0000 7f5d21e5b700 1 -- 192.168.123.103:0/1241895201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d180097e0 con 0x7f5d1c0691a0 2026-03-10T14:05:43.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.943+0000 7f5d21e5b700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 0x7f5d1c197c40 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7f5d18006010 tx=0x7f5d18004c30 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:43.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.944+0000 7f5d12ffd700 1 -- 192.168.123.103:0/1241895201 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d1801d070 con 0x7f5d1c0691a0 2026-03-10T14:05:43.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.944+0000 7f5d12ffd700 1 -- 192.168.123.103:0/1241895201 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5d1800bd90 con 0x7f5d1c0691a0 2026-03-10T14:05:43.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.944+0000 7f5d22e5d700 1 -- 192.168.123.103:0/1241895201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d1c19d350 con 0x7f5d1c0691a0 2026-03-10T14:05:43.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.944+0000 7f5d22e5d700 1 -- 192.168.123.103:0/1241895201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d1c19d810 con 0x7f5d1c0691a0 2026-03-10T14:05:43.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.944+0000 7f5d12ffd700 1 -- 192.168.123.103:0/1241895201 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d180229c0 con 0x7f5d1c0691a0 2026-03-10T14:05:43.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.946+0000 7f5d22e5d700 1 -- 192.168.123.103:0/1241895201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d1c04ea50 con 0x7f5d1c0691a0 2026-03-10T14:05:43.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.948+0000 7f5d12ffd700 1 -- 192.168.123.103:0/1241895201 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5d1800f530 con 0x7f5d1c0691a0 
2026-03-10T14:05:43.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.948+0000 7f5d12ffd700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d0806c480 0x7f5d0806e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:43.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.948+0000 7f5d12ffd700 1 -- 192.168.123.103:0/1241895201 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f5d1808d790 con 0x7f5d1c0691a0 2026-03-10T14:05:43.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.949+0000 7f5d2165a700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d0806c480 0x7f5d0806e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:43.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.949+0000 7f5d12ffd700 1 -- 192.168.123.103:0/1241895201 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5d18057f90 con 0x7f5d1c0691a0 2026-03-10T14:05:43.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:43.950+0000 7f5d2165a700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d0806c480 0x7f5d0806e930 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f5d0c009c00 tx=0x7f5d0c009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.112+0000 7f5d22e5d700 1 -- 192.168.123.103:0/1241895201 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f5d1c19dac0 con 0x7f5d1c0691a0 
2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.112+0000 7f5d12ffd700 1 -- 192.168.123.103:0/1241895201 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f5d1805b5b0 con 0x7f5d1c0691a0 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:05:44.111 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T14:05:44.112 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:05:44.112 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:05:44.112 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:05:44.112 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:05:44.112 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:05:44.112 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T14:05:44.112 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:05:44.112 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:05:44.115 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 -- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d0806c480 msgr2=0x7f5d0806e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d0806c480 0x7f5d0806e930 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f5d0c009c00 tx=0x7f5d0c009380 comp rx=0 tx=0).stop 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 -- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 msgr2=0x7f5d1c197c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 0x7f5d1c197c40 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7f5d18006010 tx=0x7f5d18004c30 comp rx=0 tx=0).stop 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 -- 192.168.123.103:0/1241895201 shutdown_connections 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c0691a0 0x7f5d1c197c40 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d0806c480 0x7f5d0806e930 unknown :-1 s=CLOSED 
pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 --2- 192.168.123.103:0/1241895201 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d1c1057a0 0x7f5d1c198180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.117+0000 7f5d10ff9700 1 -- 192.168.123.103:0/1241895201 >> 192.168.123.103:0/1241895201 conn(0x7f5d1c0fa7b0 msgr2=0x7f5d1c0fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.118+0000 7f5d10ff9700 1 -- 192.168.123.103:0/1241895201 shutdown_connections 2026-03-10T14:05:44.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.118+0000 7f5d10ff9700 1 -- 192.168.123.103:0/1241895201 wait complete. 2026-03-10T14:05:44.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.208+0000 7f14759f8700 1 -- 192.168.123.103:0/3916043708 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f14680a42e0 msgr2=0x7f14680a46f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.208+0000 7f14759f8700 1 --2- 192.168.123.103:0/3916043708 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f14680a42e0 0x7f14680a46f0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f1464009a60 tx=0x7f1464009d70 comp rx=0 tx=0).stop 2026-03-10T14:05:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.209+0000 7f14759f8700 1 -- 192.168.123.103:0/3916043708 shutdown_connections 2026-03-10T14:05:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.209+0000 7f14759f8700 1 --2- 192.168.123.103:0/3916043708 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14680a5420 0x7f14680a5890 unknown 
:-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.209+0000 7f14759f8700 1 --2- 192.168.123.103:0/3916043708 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f14680a42e0 0x7f14680a46f0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.209+0000 7f14759f8700 1 -- 192.168.123.103:0/3916043708 >> 192.168.123.103:0/3916043708 conn(0x7f146809f7b0 msgr2=0x7f14680a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.209+0000 7f14759f8700 1 -- 192.168.123.103:0/3916043708 shutdown_connections 2026-03-10T14:05:44.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.209+0000 7f14759f8700 1 -- 192.168.123.103:0/3916043708 wait complete. 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14759f8700 1 Processor -- start 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14759f8700 1 -- start start 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14759f8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14680a42e0 0x7f14680b3470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14759f8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f14680a5420 0x7f14680b39b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14759f8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f14680b3fd0 con 0x7f14680a42e0 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14759f8700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14680b4140 con 0x7f14680a5420 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14749f6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14680a42e0 0x7f14680b3470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14749f6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14680a42e0 0x7f14680b3470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40088/0 (socket says 192.168.123.103:40088) 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f14749f6700 1 -- 192.168.123.103:0/814095562 learned_addr learned my addr 192.168.123.103:0/814095562 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.210+0000 7f146ffff700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f14680a5420 0x7f14680b39b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.211+0000 7f14749f6700 1 -- 192.168.123.103:0/814095562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f14680a5420 msgr2=0x7f14680b39b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.210 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.211+0000 7f14749f6700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f14680a5420 0x7f14680b39b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.211+0000 7f14749f6700 1 -- 192.168.123.103:0/814095562 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1464009710 con 0x7f14680a42e0 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.211+0000 7f14749f6700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14680a42e0 0x7f14680b3470 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f1464009a30 tx=0x7f1464005280 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:44.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.212+0000 7f146dffb700 1 -- 192.168.123.103:0/814095562 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f146401d070 con 0x7f14680a42e0 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.212+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f146813ee20 con 0x7f14680a42e0 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.212+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f146813f310 con 0x7f14680a42e0 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.213+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1468004f40 con 0x7f14680a42e0 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.216+0000 7f146dffb700 1 -- 192.168.123.103:0/814095562 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f146400bb40 con 0x7f14680a42e0 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.216+0000 7f146dffb700 1 -- 192.168.123.103:0/814095562 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1464021640 con 0x7f14680a42e0 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.217+0000 7f146dffb700 1 -- 192.168.123.103:0/814095562 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f146400f460 con 0x7f14680a42e0 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.218+0000 7f146dffb700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f146006c4d0 0x7f146006e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.218+0000 7f146dffb700 1 -- 192.168.123.103:0/814095562 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f146408d1e0 con 0x7f14680a42e0 2026-03-10T14:05:44.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.218+0000 7f146dffb700 1 -- 192.168.123.103:0/814095562 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f146408d5f0 con 0x7f14680a42e0 2026-03-10T14:05:44.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.218+0000 7f146ffff700 1 --2- 192.168.123.103:0/814095562 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f146006c4d0 0x7f146006e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.218+0000 7f146ffff700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f146006c4d0 0x7f146006e980 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f145c005950 tx=0x7f145c00b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:44.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.402+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f146813f5f0 con 0x7f14680a42e0 2026-03-10T14:05:44.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.404+0000 7f146dffb700 1 -- 192.168.123.103:0/814095562 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1848 (secure 0 0 0) 0x7f14640579e0 con 0x7f14680a42e0 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:e12 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 
2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:epoch 12 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:05:30.451438+0000 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:stopped 
2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:44.404 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:05:44.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.407+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f146006c4d0 msgr2=0x7f146006e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.407+0000 
7f14759f8700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f146006c4d0 0x7f146006e980 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f145c005950 tx=0x7f145c00b410 comp rx=0 tx=0).stop 2026-03-10T14:05:44.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14680a42e0 msgr2=0x7f14680b3470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14680a42e0 0x7f14680b3470 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f1464009a30 tx=0x7f1464005280 comp rx=0 tx=0).stop 2026-03-10T14:05:44.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 shutdown_connections 2026-03-10T14:05:44.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14680a42e0 0x7f14680b3470 unknown :-1 s=CLOSED pgs=281 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f146006c4d0 0x7f146006e980 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 --2- 192.168.123.103:0/814095562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f14680a5420 0x7f14680b39b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:05:44.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 >> 192.168.123.103:0/814095562 conn(0x7f146809f7b0 msgr2=0x7f14680a8650 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:44.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 shutdown_connections 2026-03-10T14:05:44.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.408+0000 7f14759f8700 1 -- 192.168.123.103:0/814095562 wait complete. 2026-03-10T14:05:44.408 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-10T14:05:44.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.511+0000 7fc00cdc9700 1 -- 192.168.123.103:0/3849924865 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008107d50 msgr2=0x7fc0081081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.511+0000 7fc00cdc9700 1 --2- 192.168.123.103:0/3849924865 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008107d50 0x7fc0081081c0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fc00000b3a0 tx=0x7fc00000b6b0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.511+0000 7fc00cdc9700 1 -- 192.168.123.103:0/3849924865 shutdown_connections 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.511+0000 7fc00cdc9700 1 --2- 192.168.123.103:0/3849924865 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008107d50 0x7fc0081081c0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.511+0000 7fc00cdc9700 1 --2- 192.168.123.103:0/3849924865 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc008071db0 
0x7fc0080721c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.511+0000 7fc00cdc9700 1 -- 192.168.123.103:0/3849924865 >> 192.168.123.103:0/3849924865 conn(0x7fc00806d3e0 msgr2=0x7fc00806f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.513+0000 7fc00cdc9700 1 -- 192.168.123.103:0/3849924865 shutdown_connections 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.513+0000 7fc00cdc9700 1 -- 192.168.123.103:0/3849924865 wait complete. 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.513+0000 7fc00cdc9700 1 Processor -- start 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.513+0000 7fc00cdc9700 1 -- start start 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.513+0000 7fc00cdc9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008071db0 0x7fc00819c490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.513+0000 7fc00cdc9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc00819c9d0 0x7fc0081a1a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.513+0000 7fc00cdc9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc00819ced0 con 0x7fc00819c9d0 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.513+0000 7fc00cdc9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc00819d040 con 0x7fc008071db0 2026-03-10T14:05:44.513 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.514+0000 7fc005d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc00819c9d0 0x7fc0081a1a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.514+0000 7fc005d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc00819c9d0 0x7fc0081a1a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40106/0 (socket says 192.168.123.103:40106) 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.514+0000 7fc005d9b700 1 -- 192.168.123.103:0/1098143410 learned_addr learned my addr 192.168.123.103:0/1098143410 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.514+0000 7fc00659c700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008071db0 0x7fc00819c490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.514+0000 7fc00659c700 1 -- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc00819c9d0 msgr2=0x7fc0081a1a40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.514+0000 7fc00659c700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc00819c9d0 0x7fc0081a1a40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.513 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.514+0000 7fc00659c700 1 -- 192.168.123.103:0/1098143410 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc00000b050 con 0x7fc008071db0 2026-03-10T14:05:44.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.515+0000 7fc00659c700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008071db0 0x7fc00819c490 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fbff801fc80 tx=0x7fbff801ff90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:44.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.516+0000 7fbff77fe700 1 -- 192.168.123.103:0/1098143410 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbff801dd60 con 0x7fc008071db0 2026-03-10T14:05:44.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.516+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0081a1fe0 con 0x7fc008071db0 2026-03-10T14:05:44.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.516+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0081a24a0 con 0x7fc008071db0 2026-03-10T14:05:44.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.518+0000 7fbff77fe700 1 -- 192.168.123.103:0/1098143410 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbff8021920 con 0x7fc008071db0 2026-03-10T14:05:44.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.518+0000 7fbff77fe700 1 -- 192.168.123.103:0/1098143410 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbff8029ab0 con 
0x7fc008071db0 2026-03-10T14:05:44.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.518+0000 7fbff77fe700 1 -- 192.168.123.103:0/1098143410 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fbff8029cd0 con 0x7fc008071db0 2026-03-10T14:05:44.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.518+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbfe8005320 con 0x7fc008071db0 2026-03-10T14:05:44.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.518+0000 7fbff77fe700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbff006c870 0x7fbff006ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.519+0000 7fc005d9b700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbff006c870 0x7fbff006ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.519+0000 7fc005d9b700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbff006c870 0x7fbff006ed20 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc000000f80 tx=0x7fc00000ba00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:44.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.519+0000 7fbff77fe700 1 -- 192.168.123.103:0/1098143410 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fbff8025070 con 0x7fc008071db0 2026-03-10T14:05:44.520 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.521+0000 7fbff77fe700 1 -- 192.168.123.103:0/1098143410 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fbff806c7b0 con 0x7fc008071db0 2026-03-10T14:05:44.681 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:44 vm03.local ceph-mon[49718]: from='client.24313 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:44.682 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:44 vm03.local ceph-mon[49718]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:44.682 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:44 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/1241895201' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:05:44.682 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:44 vm03.local ceph-mon[49718]: from='client.? 
192.168.123.103:0/814095562' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.680+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbfe8000bf0 con 0x7fbff006c870 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.681+0000 7fbff77fe700 1 -- 192.168.123.103:0/1098143410 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fbfe8000bf0 con 0x7fbff006c870 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "", 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:05:44.682 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:05:44.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbff006c870 msgr2=0x7fbff006ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.685 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbff006c870 0x7fbff006ed20 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc000000f80 tx=0x7fc00000ba00 comp rx=0 tx=0).stop 2026-03-10T14:05:44.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008071db0 msgr2=0x7fc00819c490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008071db0 0x7fc00819c490 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fbff801fc80 tx=0x7fbff801ff90 comp rx=0 tx=0).stop 2026-03-10T14:05:44.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 shutdown_connections 2026-03-10T14:05:44.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbff006c870 0x7fbff006ed20 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc008071db0 0x7fc00819c490 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 --2- 192.168.123.103:0/1098143410 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc00819c9d0 0x7fc0081a1a40 
unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.686+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 >> 192.168.123.103:0/1098143410 conn(0x7fc00806d3e0 msgr2=0x7fc008070700 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:44.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.687+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 shutdown_connections 2026-03-10T14:05:44.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.687+0000 7fc00cdc9700 1 -- 192.168.123.103:0/1098143410 wait complete. 2026-03-10T14:05:44.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.768+0000 7f6c4b60a700 1 -- 192.168.123.103:0/549910562 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 msgr2=0x7f6c4410be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.768+0000 7f6c4b60a700 1 --2- 192.168.123.103:0/549910562 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 0x7f6c4410be90 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f6c4000b600 tx=0x7f6c4000b910 comp rx=0 tx=0).stop 2026-03-10T14:05:44.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.768+0000 7f6c4b60a700 1 -- 192.168.123.103:0/549910562 shutdown_connections 2026-03-10T14:05:44.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.768+0000 7f6c4b60a700 1 --2- 192.168.123.103:0/549910562 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 0x7f6c4410be90 unknown :-1 s=CLOSED pgs=282 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.768+0000 7f6c4b60a700 1 --2- 192.168.123.103:0/549910562 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c44071a60 
0x7f6c44071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.768+0000 7f6c4b60a700 1 -- 192.168.123.103:0/549910562 >> 192.168.123.103:0/549910562 conn(0x7f6c4406d1a0 msgr2=0x7f6c4406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:44.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.768+0000 7f6c4b60a700 1 -- 192.168.123.103:0/549910562 shutdown_connections 2026-03-10T14:05:44.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.769+0000 7f6c4b60a700 1 -- 192.168.123.103:0/549910562 wait complete. 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.769+0000 7f6c4b60a700 1 Processor -- start 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.769+0000 7f6c4b60a700 1 -- start start 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.769+0000 7f6c4b60a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c44071a60 0x7f6c4419c190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.769+0000 7f6c4b60a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 0x7f6c4419c6d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.769+0000 7f6c4b60a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c4419ccf0 con 0x7f6c44072440 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.769+0000 7f6c4b60a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c4419ce30 con 0x7f6c44071a60 2026-03-10T14:05:44.768 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c4a608700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c44071a60 0x7f6c4419c190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c4a608700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c44071a60 0x7f6c4419c190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49392/0 (socket says 192.168.123.103:49392) 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c4a608700 1 -- 192.168.123.103:0/1289424750 learned_addr learned my addr 192.168.123.103:0/1289424750 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:05:44.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c49e07700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 0x7f6c4419c6d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c4a608700 1 -- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 msgr2=0x7f6c4419c6d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:44.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c4a608700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 0x7f6c4419c6d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:44.769 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c4a608700 1 -- 192.168.123.103:0/1289424750 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6c4000b050 con 0x7f6c44071a60 2026-03-10T14:05:44.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c49e07700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 0x7f6c4419c6d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T14:05:44.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c4a608700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c44071a60 0x7f6c4419c190 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f6c3c01fd10 tx=0x7f6c3c01d5c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:05:44.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.770+0000 7f6c377fe700 1 -- 192.168.123.103:0/1289424750 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c3c01a990 con 0x7f6c44071a60 2026-03-10T14:05:44.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.771+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6c441a18e0 con 0x7f6c44071a60 2026-03-10T14:05:44.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.771+0000 7f6c377fe700 1 -- 192.168.123.103:0/1289424750 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6c3c004500 con 0x7f6c44071a60 2026-03-10T14:05:44.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.771+0000 7f6c377fe700 1 -- 192.168.123.103:0/1289424750 <== mon.1 v2:192.168.123.104:3300/0 3 
==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c3c021690 con 0x7f6c44071a60 2026-03-10T14:05:44.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.771+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6c441a1e30 con 0x7f6c44071a60 2026-03-10T14:05:44.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.773+0000 7f6c377fe700 1 -- 192.168.123.103:0/1289424750 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6c3c003680 con 0x7f6c44071a60 2026-03-10T14:05:44.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.773+0000 7f6c377fe700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c3006c630 0x7f6c3006eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:05:44.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.773+0000 7f6c377fe700 1 -- 192.168.123.103:0/1289424750 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f6c3c025070 con 0x7f6c44071a60 2026-03-10T14:05:44.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.773+0000 7f6c49e07700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c3006c630 0x7f6c3006eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:05:44.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.773+0000 7f6c49e07700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c3006c630 0x7f6c3006eae0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f6c400062a0 tx=0x7f6c400061f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T14:05:44.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.774+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6c44196370 con 0x7f6c44071a60 2026-03-10T14:05:44.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:44.778+0000 7f6c377fe700 1 -- 192.168.123.103:0/1289424750 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6c3c06b2b0 con 0x7f6c44071a60 2026-03-10T14:05:44.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:44 vm04.local ceph-mon[55966]: from='client.24313 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:44.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:44 vm04.local ceph-mon[55966]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:44.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:44 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/1241895201' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:05:44.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:44 vm04.local ceph-mon[55966]: from='client.? 
192.168.123.103:0/814095562' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:05:45.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.001+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f6c4404f2a0 con 0x7f6c44071a60 2026-03-10T14:05:45.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.001+0000 7f6c377fe700 1 -- 192.168.123.103:0/1289424750 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+151 (secure 0 0 0) 0x7f6c3c06ae40 con 0x7f6c44071a60 2026-03-10T14:05:45.002 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 failed cephadm daemon(s) 2026-03-10T14:05:45.002 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s) 2026-03-10T14:05:45.002 INFO:teuthology.orchestra.run.vm03.stdout: daemon ceph-exporter.vm03 on vm03 is in error state 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c3006c630 msgr2=0x7f6c3006eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c3006c630 0x7f6c3006eae0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f6c400062a0 tx=0x7f6c400061f0 comp rx=0 tx=0).stop 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c44071a60 msgr2=0x7f6c4419c190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c44071a60 0x7f6c4419c190 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f6c3c01fd10 tx=0x7f6c3c01d5c0 comp rx=0 tx=0).stop 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 shutdown_connections 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c3006c630 0x7f6c3006eae0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c44071a60 0x7f6c4419c190 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 --2- 192.168.123.103:0/1289424750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c44072440 0x7f6c4419c6d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.008+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 >> 192.168.123.103:0/1289424750 conn(0x7f6c4406d1a0 msgr2=0x7f6c4410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.009+0000 7f6c4b60a700 1 -- 192.168.123.103:0/1289424750 shutdown_connections 2026-03-10T14:05:45.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:05:45.009+0000 7f6c4b60a700 1 -- 
192.168.123.103:0/1289424750 wait complete. 2026-03-10T14:05:45.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:45 vm04.local ceph-mon[55966]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:45.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:45 vm04.local ceph-mon[55966]: pgmap v91: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.4 KiB/s rd, 1.2 KiB/s wr, 6 op/s 2026-03-10T14:05:45.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:45 vm04.local ceph-mon[55966]: from='client.24329 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:45.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:45 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/1289424750' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:05:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:45 vm03.local ceph-mon[49718]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:45 vm03.local ceph-mon[49718]: pgmap v91: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.4 KiB/s rd, 1.2 KiB/s wr, 6 op/s 2026-03-10T14:05:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:45 vm03.local ceph-mon[49718]: from='client.24329 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:05:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:45 vm03.local ceph-mon[49718]: from='client.? 
192.168.123.103:0/1289424750' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:05:47.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:47 vm04.local ceph-mon[55966]: pgmap v92: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.5 KiB/s rd, 1009 B/s wr, 4 op/s 2026-03-10T14:05:47.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:47 vm03.local ceph-mon[49718]: pgmap v92: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.5 KiB/s rd, 1009 B/s wr, 4 op/s 2026-03-10T14:05:49.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:49 vm03.local ceph-mon[49718]: pgmap v93: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.9 KiB/s rd, 1.1 KiB/s wr, 6 op/s 2026-03-10T14:05:50.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:49 vm04.local ceph-mon[55966]: pgmap v93: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.9 KiB/s rd, 1.1 KiB/s wr, 6 op/s 2026-03-10T14:05:51.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:51 vm03.local ceph-mon[49718]: pgmap v94: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.3 KiB/s rd, 1.1 KiB/s wr, 5 op/s 2026-03-10T14:05:51.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:51 vm04.local ceph-mon[55966]: pgmap v94: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.3 KiB/s rd, 1.1 KiB/s wr, 5 op/s 2026-03-10T14:05:53.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:53 vm04.local ceph-mon[55966]: pgmap v95: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.5 KiB/s rd, 921 B/s wr, 5 op/s 2026-03-10T14:05:53.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:53 vm03.local ceph-mon[49718]: pgmap v95: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.5 KiB/s rd, 921 B/s wr, 5 op/s 2026-03-10T14:05:56.063 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:55 vm04.local ceph-mon[55966]: pgmap v96: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.8 KiB/s rd, 1.2 KiB/s wr, 5 op/s 2026-03-10T14:05:56.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:55 vm03.local ceph-mon[49718]: pgmap v96: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.8 KiB/s rd, 1.2 KiB/s wr, 5 op/s 2026-03-10T14:05:57.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:57 vm03.local ceph-mon[49718]: pgmap v97: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 511 B/s wr, 4 op/s 2026-03-10T14:05:58.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:57 vm04.local ceph-mon[55966]: pgmap v97: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 511 B/s wr, 4 op/s 2026-03-10T14:05:58.709 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:58 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:05:59.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:58 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:06:00.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:05:59 vm04.local ceph-mon[55966]: pgmap v98: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 511 B/s wr, 5 op/s 2026-03-10T14:06:00.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:05:59 vm03.local ceph-mon[49718]: pgmap v98: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 511 B/s wr, 5 op/s 2026-03-10T14:06:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:01 vm04.local ceph-mon[55966]: pgmap v99: 65 pgs: 65 active+clean; 462 KiB 
data, 160 MiB used, 120 GiB / 120 GiB avail; 2.0 KiB/s rd, 426 B/s wr, 3 op/s 2026-03-10T14:06:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:01 vm03.local ceph-mon[49718]: pgmap v99: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.0 KiB/s rd, 426 B/s wr, 3 op/s 2026-03-10T14:06:03.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:03 vm04.local ceph-mon[55966]: pgmap v100: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 426 B/s wr, 4 op/s 2026-03-10T14:06:03.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:03 vm03.local ceph-mon[49718]: pgmap v100: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 426 B/s wr, 4 op/s 2026-03-10T14:06:05.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:05 vm03.local ceph-mon[49718]: pgmap v101: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 426 B/s wr, 4 op/s 2026-03-10T14:06:06.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:05 vm04.local ceph-mon[55966]: pgmap v101: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s rd, 426 B/s wr, 4 op/s 2026-03-10T14:06:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:07 vm03.local ceph-mon[49718]: pgmap v102: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:08.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:07 vm04.local ceph-mon[55966]: pgmap v102: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:09.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:09 vm03.local ceph-mon[49718]: pgmap v103: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:10.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:09 vm04.local 
ceph-mon[55966]: pgmap v103: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:12.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:11 vm04.local ceph-mon[55966]: pgmap v104: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:06:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:11 vm03.local ceph-mon[49718]: pgmap v104: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:06:13.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:12 vm04.local ceph-mon[55966]: pgmap v105: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:12 vm03.local ceph-mon[49718]: pgmap v105: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:13.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:13 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:06:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:13 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.092+0000 7efdc891f700 1 -- 192.168.123.103:0/525854654 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc096a30 msgr2=0x7efdbc096e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.092+0000 7efdc891f700 1 --2- 192.168.123.103:0/525854654 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7efdbc096a30 0x7efdbc096e40 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7efdb4009b00 tx=0x7efdb4009e10 comp rx=0 tx=0).stop 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.092+0000 7efdc891f700 1 -- 192.168.123.103:0/525854654 shutdown_connections 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.092+0000 7efdc891f700 1 --2- 192.168.123.103:0/525854654 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdbc097410 0x7efdbc0934b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.092+0000 7efdc891f700 1 --2- 192.168.123.103:0/525854654 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc096a30 0x7efdbc096e40 secure :-1 s=CLOSED pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7efdb4009b00 tx=0x7efdb4009e10 comp rx=0 tx=0).stop 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.092+0000 7efdc891f700 1 -- 192.168.123.103:0/525854654 >> 192.168.123.103:0/525854654 conn(0x7efdbc08f020 msgr2=0x7efdbc091470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.093+0000 7efdc891f700 1 -- 192.168.123.103:0/525854654 shutdown_connections 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.093+0000 7efdc891f700 1 -- 192.168.123.103:0/525854654 wait complete. 
2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc891f700 1 Processor -- start 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc891f700 1 -- start start 2026-03-10T14:06:15.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc891f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdbc097410 0x7efdbc12aed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc891f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc12b410 0x7efdbc130480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc891f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efdbc12b910 con 0x7efdbc12b410 2026-03-10T14:06:15.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc891f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efdbc12ba80 con 0x7efdbc097410 2026-03-10T14:06:15.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc2d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdbc097410 0x7efdbc12aed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:15.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc12b410 0x7efdbc130480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:06:15.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc12b410 0x7efdbc130480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58132/0 (socket says 192.168.123.103:58132) 2026-03-10T14:06:15.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.094+0000 7efdc259c700 1 -- 192.168.123.103:0/3762857524 learned_addr learned my addr 192.168.123.103:0/3762857524 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:15.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.095+0000 7efdc259c700 1 -- 192.168.123.103:0/3762857524 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdbc097410 msgr2=0x7efdbc12aed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.095+0000 7efdc259c700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdbc097410 0x7efdbc12aed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.095+0000 7efdc259c700 1 -- 192.168.123.103:0/3762857524 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efdb40097e0 con 0x7efdbc12b410 2026-03-10T14:06:15.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.095+0000 7efdc259c700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc12b410 0x7efdbc130480 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7efdb8009fd0 tx=0x7efdb800edf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:15.094 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.095+0000 7efdabfff700 1 -- 192.168.123.103:0/3762857524 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efdb8009980 con 0x7efdbc12b410 2026-03-10T14:06:15.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.095+0000 7efdc891f700 1 -- 192.168.123.103:0/3762857524 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efdbc130a20 con 0x7efdbc12b410 2026-03-10T14:06:15.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.096+0000 7efdc891f700 1 -- 192.168.123.103:0/3762857524 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efdbc130f40 con 0x7efdbc12b410 2026-03-10T14:06:15.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.097+0000 7efdabfff700 1 -- 192.168.123.103:0/3762857524 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efdb8004d10 con 0x7efdbc12b410 2026-03-10T14:06:15.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.097+0000 7efdabfff700 1 -- 192.168.123.103:0/3762857524 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efdb8010430 con 0x7efdbc12b410 2026-03-10T14:06:15.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.097+0000 7efdabfff700 1 -- 192.168.123.103:0/3762857524 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7efdb8010650 con 0x7efdbc12b410 2026-03-10T14:06:15.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.097+0000 7efdabfff700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efdac06c680 0x7efdac06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.098+0000 
7efdabfff700 1 -- 192.168.123.103:0/3762857524 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7efdb8014070 con 0x7efdbc12b410 2026-03-10T14:06:15.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.098+0000 7efdc2d9d700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efdac06c680 0x7efdac06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:15.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.099+0000 7efdc2d9d700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efdac06c680 0x7efdac06eb30 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7efdb4005200 tx=0x7efdb401a040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:15.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.099+0000 7efda9ffb700 1 -- 192.168.123.103:0/3762857524 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efdb0005320 con 0x7efdbc12b410 2026-03-10T14:06:15.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.102+0000 7efdabfff700 1 -- 192.168.123.103:0/3762857524 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7efdb805b000 con 0x7efdbc12b410 2026-03-10T14:06:15.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:14 vm03.local ceph-mon[49718]: pgmap v106: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:15.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.230+0000 7efda9ffb700 1 -- 192.168.123.103:0/3762857524 --> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7efdb0000bf0 con 0x7efdac06c680 2026-03-10T14:06:15.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.232+0000 7efdabfff700 1 -- 192.168.123.103:0/3762857524 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7efdb0000bf0 con 0x7efdac06c680 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 -- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efdac06c680 msgr2=0x7efdac06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efdac06c680 0x7efdac06eb30 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7efdb4005200 tx=0x7efdb401a040 comp rx=0 tx=0).stop 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 -- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc12b410 msgr2=0x7efdbc130480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc12b410 0x7efdbc130480 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7efdb8009fd0 tx=0x7efdb800edf0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 -- 192.168.123.103:0/3762857524 shutdown_connections 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 
7efda9ffb700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efdac06c680 0x7efdac06eb30 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efdbc097410 0x7efdbc12aed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 --2- 192.168.123.103:0/3762857524 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdbc12b410 0x7efdbc130480 unknown :-1 s=CLOSED pgs=284 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 -- 192.168.123.103:0/3762857524 >> 192.168.123.103:0/3762857524 conn(0x7efdbc08f020 msgr2=0x7efdbc099a00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 -- 192.168.123.103:0/3762857524 shutdown_connections 2026-03-10T14:06:15.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.234+0000 7efda9ffb700 1 -- 192.168.123.103:0/3762857524 wait complete. 
2026-03-10T14:06:15.244 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:06:15.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.313+0000 7f80aa52a700 1 -- 192.168.123.103:0/4036192853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a5450 msgr2=0x7f809c0a58c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.313+0000 7f80aa52a700 1 --2- 192.168.123.103:0/4036192853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a5450 0x7f809c0a58c0 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7f80a0009b00 tx=0x7f80a0009e10 comp rx=0 tx=0).stop 2026-03-10T14:06:15.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.313+0000 7f80aa52a700 1 -- 192.168.123.103:0/4036192853 shutdown_connections 2026-03-10T14:06:15.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.313+0000 7f80aa52a700 1 --2- 192.168.123.103:0/4036192853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a5450 0x7f809c0a58c0 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.313+0000 7f80aa52a700 1 --2- 192.168.123.103:0/4036192853 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f809c0a4310 0x7f809c0a4720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.313+0000 7f80aa52a700 1 -- 192.168.123.103:0/4036192853 >> 192.168.123.103:0/4036192853 conn(0x7f809c09f7e0 msgr2=0x7f809c0a1c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:15.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.313+0000 7f80aa52a700 1 -- 192.168.123.103:0/4036192853 shutdown_connections 2026-03-10T14:06:15.312 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.313+0000 7f80aa52a700 1 -- 192.168.123.103:0/4036192853 wait complete. 2026-03-10T14:06:15.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80aa52a700 1 Processor -- start 2026-03-10T14:06:15.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80aa52a700 1 -- start start 2026-03-10T14:06:15.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:14 vm04.local ceph-mon[55966]: pgmap v106: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80aa52a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a4310 0x7f809c1421e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80aa52a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f809c0a5450 0x7f809c142720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80aa52a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f809c142d40 con 0x7f809c0a4310 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80aa52a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f809c142e80 con 0x7f809c0a5450 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80a9528700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a4310 0x7f809c1421e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80a9528700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a4310 0x7f809c1421e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58136/0 (socket says 192.168.123.103:58136) 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80a9528700 1 -- 192.168.123.103:0/2914671999 learned_addr learned my addr 192.168.123.103:0/2914671999 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.314+0000 7f80a8d27700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f809c0a5450 0x7f809c142720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80a9528700 1 -- 192.168.123.103:0/2914671999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f809c0a5450 msgr2=0x7f809c142720 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80a9528700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f809c0a5450 0x7f809c142720 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80a9528700 1 -- 192.168.123.103:0/2914671999 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80a00097e0 con 0x7f809c0a4310 2026-03-10T14:06:15.314 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80a8d27700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f809c0a5450 0x7f809c142720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T14:06:15.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80a9528700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a4310 0x7f809c1421e0 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f809800cc60 tx=0x7f80980074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80967fc700 1 -- 192.168.123.103:0/2914671999 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8098007af0 con 0x7f809c0a4310 2026-03-10T14:06:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80967fc700 1 -- 192.168.123.103:0/2914671999 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8098004d10 con 0x7f809c0a4310 2026-03-10T14:06:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f809c147930 con 0x7f809c0a4310 2026-03-10T14:06:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f809c147e80 con 0x7f809c0a4310 2026-03-10T14:06:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.315+0000 7f80967fc700 1 -- 192.168.123.103:0/2914671999 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80980056e0 con 0x7f809c0a4310 2026-03-10T14:06:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.317+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8088005320 con 0x7f809c0a4310 2026-03-10T14:06:15.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.318+0000 7f80967fc700 1 -- 192.168.123.103:0/2914671999 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8098004750 con 0x7f809c0a4310 2026-03-10T14:06:15.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.318+0000 7f80967fc700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f809006c740 0x7f809006ebf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.318+0000 7f80967fc700 1 -- 192.168.123.103:0/2914671999 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f809808b660 con 0x7f809c0a4310 2026-03-10T14:06:15.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.319+0000 7f80a8d27700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f809006c740 0x7f809006ebf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:15.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.319+0000 7f80a8d27700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f809006c740 0x7f809006ebf0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f80a0006010 tx=0x7f80a000b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:15.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.321+0000 7f80967fc700 1 -- 192.168.123.103:0/2914671999 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f80980598f0 con 0x7f809c0a4310 2026-03-10T14:06:15.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.455+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8088000bf0 con 0x7f809006c740 2026-03-10T14:06:15.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.457+0000 7f80967fc700 1 -- 192.168.123.103:0/2914671999 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f8088000bf0 con 0x7f809006c740 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f809006c740 msgr2=0x7f809006ebf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f809006c740 0x7f809006ebf0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f80a0006010 tx=0x7f80a000b540 comp rx=0 tx=0).stop 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a4310 msgr2=0x7f809c1421e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 
7f80aa52a700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a4310 0x7f809c1421e0 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f809800cc60 tx=0x7f80980074a0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 shutdown_connections 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f809c0a4310 0x7f809c1421e0 unknown :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f809006c740 0x7f809006ebf0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 --2- 192.168.123.103:0/2914671999 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f809c0a5450 0x7f809c142720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 >> 192.168.123.103:0/2914671999 conn(0x7f809c09f7e0 msgr2=0x7f809c0a1af0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.464+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 shutdown_connections 2026-03-10T14:06:15.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.465+0000 7f80aa52a700 1 -- 192.168.123.103:0/2914671999 wait complete. 
2026-03-10T14:06:15.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.543+0000 7fb9391f1700 1 -- 192.168.123.103:0/2928081672 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c009160 msgr2=0x7fb92c0075c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.543+0000 7fb9391f1700 1 --2- 192.168.123.103:0/2928081672 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c009160 0x7fb92c0075c0 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7fb928009b50 tx=0x7fb928009e60 comp rx=0 tx=0).stop 2026-03-10T14:06:15.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.544+0000 7fb9391f1700 1 -- 192.168.123.103:0/2928081672 shutdown_connections 2026-03-10T14:06:15.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.544+0000 7fb9391f1700 1 --2- 192.168.123.103:0/2928081672 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb92c007b90 0x7fb92c008000 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.544+0000 7fb9391f1700 1 --2- 192.168.123.103:0/2928081672 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c009160 0x7fb92c0075c0 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.544+0000 7fb9391f1700 1 -- 192.168.123.103:0/2928081672 >> 192.168.123.103:0/2928081672 conn(0x7fb92c0909d0 msgr2=0x7fb92c092e00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:15.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.546+0000 7fb9391f1700 1 -- 192.168.123.103:0/2928081672 shutdown_connections 2026-03-10T14:06:15.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.546+0000 7fb9391f1700 1 -- 192.168.123.103:0/2928081672 
wait complete. 2026-03-10T14:06:15.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.547+0000 7fb9391f1700 1 Processor -- start 2026-03-10T14:06:15.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.547+0000 7fb9391f1700 1 -- start start 2026-03-10T14:06:15.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.547+0000 7fb9391f1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c007b90 0x7fb92c131250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.547+0000 7fb9391f1700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb92c009160 0x7fb92c131790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.547+0000 7fb9391f1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb92c131db0 con 0x7fb92c007b90 2026-03-10T14:06:15.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.547+0000 7fb9391f1700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb92c131ef0 con 0x7fb92c009160 2026-03-10T14:06:15.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.547+0000 7fb933fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c007b90 0x7fb92c131250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:15.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.548+0000 7fb9337fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb92c009160 0x7fb92c131790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T14:06:15.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.548+0000 7fb9337fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb92c009160 0x7fb92c131790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:37722/0 (socket says 192.168.123.103:37722) 2026-03-10T14:06:15.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.548+0000 7fb9337fe700 1 -- 192.168.123.103:0/4022823269 learned_addr learned my addr 192.168.123.103:0/4022823269 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:15.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.548+0000 7fb933fff700 1 -- 192.168.123.103:0/4022823269 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb92c009160 msgr2=0x7fb92c131790 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.548+0000 7fb933fff700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb92c009160 0x7fb92c131790 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.548+0000 7fb933fff700 1 -- 192.168.123.103:0/4022823269 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9280097e0 con 0x7fb92c007b90 2026-03-10T14:06:15.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.548+0000 7fb933fff700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c007b90 0x7fb92c131250 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7fb928009b20 tx=0x7fb928005740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:15.548 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.549+0000 7fb9317fa700 1 -- 192.168.123.103:0/4022823269 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb92801d070 con 0x7fb92c007b90 2026-03-10T14:06:15.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.549+0000 7fb9317fa700 1 -- 192.168.123.103:0/4022823269 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb92800bae0 con 0x7fb92c007b90 2026-03-10T14:06:15.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.549+0000 7fb9317fa700 1 -- 192.168.123.103:0/4022823269 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb92800f670 con 0x7fb92c007b90 2026-03-10T14:06:15.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.549+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb92c136940 con 0x7fb92c007b90 2026-03-10T14:06:15.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.549+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb92c136e30 con 0x7fb92c007b90 2026-03-10T14:06:15.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.550+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb92c005750 con 0x7fb92c007b90 2026-03-10T14:06:15.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.552+0000 7fb9317fa700 1 -- 192.168.123.103:0/4022823269 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb92800bc50 con 0x7fb92c007b90 2026-03-10T14:06:15.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.552+0000 7fb9317fa700 1 --2- 
192.168.123.103:0/4022823269 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb92406c7a0 0x7fb92406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.552+0000 7fb9317fa700 1 -- 192.168.123.103:0/4022823269 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb92808d8d0 con 0x7fb92c007b90 2026-03-10T14:06:15.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.555+0000 7fb9337fe700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb92406c7a0 0x7fb92406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:15.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.555+0000 7fb9337fe700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb92406c7a0 0x7fb92406ec50 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fb920005950 tx=0x7fb92000b500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:15.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.555+0000 7fb9317fa700 1 -- 192.168.123.103:0/4022823269 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb92805bb60 con 0x7fb92c007b90 2026-03-10T14:06:15.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.671+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fb92c0984d0 con 0x7fb92406c7a0 2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.677+0000 
7fb9317fa700 1 -- 192.168.123.103:0/4022823269 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3336 (secure 0 0 0) 0x7fb92c0984d0 con 0x7fb92406c7a0
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:NAME                    HOST  PORTS              STATUS         REFRESHED  AGE   MEM USE  MEM LIM  VERSION  IMAGE ID      CONTAINER ID
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03       vm03  *:9093,9094        running (2m)   45s ago    2m    21.4M    -        0.25.0   c8568f914cd2  0b9f79a1191a
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03      vm03                     error          45s ago    3m    -        -
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04      vm04                     running (2m)   46s ago    2m    8342k    -        18.2.0   dc2bc1663786  8b6b949a2f58
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03              vm03                     running (3m)   45s ago    3m    7419k    -        18.2.0   dc2bc1663786  57962aef7443
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04              vm04                     running (2m)   46s ago    2m    7407k    -        18.2.0   dc2bc1663786  0918365fa827
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03            vm03  *:3000             running (2m)   45s ago    2m    82.1M    -        9.4.7    954c08fa6188  1fb5820b250e
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa  vm03                     running (52s)  45s ago    52s   16.4M    -        18.2.0   dc2bc1663786  db33bf4450b8
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo  vm03                     running (50s)  45s ago    50s   16.1M    -        18.2.0   dc2bc1663786  5d05b227aa40
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd  vm04                     running (49s)  46s ago    49s   16.8M    -        18.2.0   dc2bc1663786  2494aff9d6c9
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq  vm04                     running (51s)  46s ago    51s   18.5M    -        18.2.0   dc2bc1663786  e286b66e6f5a
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep         vm03  *:9283,8765,8443   running (3m)   45s ago    3m    499M     -        18.2.0   dc2bc1663786  378306a7bb3c
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto         vm04  *:8443,9283,8765   running (2m)   46s ago    2m    446M     -        18.2.0   dc2bc1663786  f2d79432e040
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03                vm03                     running (3m)   45s ago    3m    54.7M    2048M    18.2.0   dc2bc1663786  f59cc7d5bdfd
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04                vm04                     running (2m)   46s ago    2m    44.2M    2048M    18.2.0   dc2bc1663786  4113774b34c7
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03      vm03  *:9100             running (2m)   45s ago    2m    14.2M    -        1.5.0    0da6a335fe13  ea3faf07c01f
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04      vm04  *:9100             running (2m)   46s ago    2m    14.1M    -        1.5.0    0da6a335fe13  d2c83044f057
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:osd.0                   vm03                     running (2m)   45s ago    2m    48.8M    4096M    18.2.0   dc2bc1663786  5a222b855ee3
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:osd.1                   vm03                     running (112s) 45s ago    112s  45.7M    4096M    18.2.0   dc2bc1663786  ba323e54dbc0
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:osd.2                   vm03                     running (102s) 45s ago    102s  45.3M    4096M    18.2.0   dc2bc1663786  7c08a01b8fe1
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:osd.3                   vm04                     running (92s)  46s ago    92s   47.2M    4096M    18.2.0   dc2bc1663786  99f4c3155942
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:osd.4                   vm04                     running (82s)  46s ago    82s   45.6M    4096M    18.2.0   dc2bc1663786  127d95fabe23
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:osd.5                   vm04                     running (73s)  46s ago    73s   44.0M    4096M    18.2.0   dc2bc1663786  63def67884f8
2026-03-10T14:06:15.676 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03         vm03  *:9095             running (2m)   45s ago    2m    36.2M    -        2.43.0   a07b618ecd1d  fcef697ff8c4
2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.679+0000 7fb9391f1700 1 --
192.168.123.103:0/4022823269 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb92406c7a0 msgr2=0x7fb92406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.679+0000 7fb9391f1700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb92406c7a0 0x7fb92406ec50 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fb920005950 tx=0x7fb92000b500 comp rx=0 tx=0).stop 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.679+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c007b90 msgr2=0x7fb92c131250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.679+0000 7fb9391f1700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c007b90 0x7fb92c131250 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7fb928009b20 tx=0x7fb928005740 comp rx=0 tx=0).stop 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.680+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 shutdown_connections 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.680+0000 7fb9391f1700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb92c007b90 0x7fb92c131250 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.680+0000 7fb9391f1700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb92406c7a0 0x7fb92406ec50 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.678 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.680+0000 7fb9391f1700 1 --2- 192.168.123.103:0/4022823269 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb92c009160 0x7fb92c131790 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.680+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 >> 192.168.123.103:0/4022823269 conn(0x7fb92c0909d0 msgr2=0x7fb92c096db0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.680+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 shutdown_connections 2026-03-10T14:06:15.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.680+0000 7fb9391f1700 1 -- 192.168.123.103:0/4022823269 wait complete. 2026-03-10T14:06:15.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.761+0000 7fd66ece3700 1 -- 192.168.123.103:0/1671630261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a5b50 msgr2=0x7fd6600a7f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.761+0000 7fd66ece3700 1 --2- 192.168.123.103:0/1671630261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a5b50 0x7fd6600a7f30 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7fd658009b00 tx=0x7fd658009e10 comp rx=0 tx=0).stop 2026-03-10T14:06:15.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.761+0000 7fd66ece3700 1 -- 192.168.123.103:0/1671630261 shutdown_connections 2026-03-10T14:06:15.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.761+0000 7fd66ece3700 1 --2- 192.168.123.103:0/1671630261 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6600a8500 0x7fd6600aa920 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:06:15.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.761+0000 7fd66ece3700 1 --2- 192.168.123.103:0/1671630261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a5b50 0x7fd6600a7f30 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.761+0000 7fd66ece3700 1 -- 192.168.123.103:0/1671630261 >> 192.168.123.103:0/1671630261 conn(0x7fd66009f7b0 msgr2=0x7fd6600a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:15.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.762+0000 7fd66ece3700 1 -- 192.168.123.103:0/1671630261 shutdown_connections 2026-03-10T14:06:15.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.762+0000 7fd66ece3700 1 -- 192.168.123.103:0/1671630261 wait complete. 2026-03-10T14:06:15.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66ece3700 1 Processor -- start 2026-03-10T14:06:15.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66ece3700 1 -- start start 2026-03-10T14:06:15.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66ece3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6600a5b50 0x7fd6600ace00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66ece3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a8500 0x7fd6600ab450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66ece3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd66013f920 con 0x7fd6600a8500 2026-03-10T14:06:15.762 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66ece3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd66013fa60 con 0x7fd6600a5b50 2026-03-10T14:06:15.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66d4e0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a8500 0x7fd6600ab450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:15.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66d4e0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a8500 0x7fd6600ab450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58160/0 (socket says 192.168.123.103:58160) 2026-03-10T14:06:15.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.763+0000 7fd66d4e0700 1 -- 192.168.123.103:0/1717731430 learned_addr learned my addr 192.168.123.103:0/1717731430 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:15.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.764+0000 7fd66d4e0700 1 -- 192.168.123.103:0/1717731430 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6600a5b50 msgr2=0x7fd6600ace00 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:06:15.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.764+0000 7fd66d4e0700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6600a5b50 0x7fd6600ace00 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.764+0000 7fd66d4e0700 1 -- 192.168.123.103:0/1717731430 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6580097e0 con 0x7fd6600a8500 2026-03-10T14:06:15.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.764+0000 7fd66d4e0700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a8500 0x7fd6600ab450 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7fd6680502d0 tx=0x7fd6680709d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:15.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.764+0000 7fd65effd700 1 -- 192.168.123.103:0/1717731430 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd668069340 con 0x7fd6600a8500 2026-03-10T14:06:15.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.764+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6600ab990 con 0x7fd6600a8500 2026-03-10T14:06:15.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.764+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6600abeb0 con 0x7fd6600a8500 2026-03-10T14:06:15.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.767+0000 7fd65effd700 1 -- 192.168.123.103:0/1717731430 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd668067a90 con 0x7fd6600a8500 2026-03-10T14:06:15.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.767+0000 7fd65effd700 1 -- 192.168.123.103:0/1717731430 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd668072c20 con 0x7fd6600a8500 2026-03-10T14:06:15.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.767+0000 7fd65effd700 1 -- 
192.168.123.103:0/1717731430 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd66807c450 con 0x7fd6600a8500 2026-03-10T14:06:15.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.767+0000 7fd65effd700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd65406c820 0x7fd65406ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:15.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.767+0000 7fd65effd700 1 -- 192.168.123.103:0/1717731430 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fd6680eef40 con 0x7fd6600a8500 2026-03-10T14:06:15.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.767+0000 7fd66dce1700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd65406c820 0x7fd65406ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:15.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.768+0000 7fd66dce1700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd65406c820 0x7fd65406ecd0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fd6580052d0 tx=0x7fd6580058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:15.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.769+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd6600044c0 con 0x7fd6600a8500 2026-03-10T14:06:15.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.772+0000 7fd65effd700 1 -- 192.168.123.103:0/1717731430 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd6680bd1d0 con 0x7fd6600a8500 2026-03-10T14:06:15.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.948+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fd660004f40 con 0x7fd6600a8500 2026-03-10T14:06:15.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.949+0000 7fd65effd700 1 -- 192.168.123.103:0/1717731430 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7fd668077070 con 0x7fd6600a8500 2026-03-10T14:06:15.948 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: }, 
2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:06:15.949 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:06:15.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.952+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd65406c820 msgr2=0x7fd65406ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.952+0000 7fd66ece3700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd65406c820 0x7fd65406ecd0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fd6580052d0 tx=0x7fd6580058e0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.952+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a8500 msgr2=0x7fd6600ab450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.952+0000 7fd66ece3700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a8500 0x7fd6600ab450 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7fd6680502d0 tx=0x7fd6680709d0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.953+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 shutdown_connections 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.953+0000 7fd66ece3700 1 --2- 192.168.123.103:0/1717731430 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd65406c820 0x7fd65406ecd0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.953+0000 7fd66ece3700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6600a5b50 0x7fd6600ace00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.953+0000 7fd66ece3700 1 --2- 192.168.123.103:0/1717731430 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6600a8500 0x7fd6600ab450 unknown :-1 s=CLOSED pgs=290 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.953+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 >> 192.168.123.103:0/1717731430 conn(0x7fd66009f7b0 msgr2=0x7fd6600a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.953+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 shutdown_connections 2026-03-10T14:06:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:15.954+0000 7fd66ece3700 1 -- 192.168.123.103:0/1717731430 wait complete. 
2026-03-10T14:06:16.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.029+0000 7feb8c963700 1 -- 192.168.123.103:0/3448644098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88071a60 msgr2=0x7feb88071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.029+0000 7feb8c963700 1 --2- 192.168.123.103:0/3448644098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88071a60 0x7feb88071e70 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7feb78009b00 tx=0x7feb78009e10 comp rx=0 tx=0).stop 2026-03-10T14:06:16.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.029+0000 7feb8c963700 1 -- 192.168.123.103:0/3448644098 shutdown_connections 2026-03-10T14:06:16.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.029+0000 7feb8c963700 1 --2- 192.168.123.103:0/3448644098 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feb88072440 0x7feb8810be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.029+0000 7feb8c963700 1 --2- 192.168.123.103:0/3448644098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88071a60 0x7feb88071e70 unknown :-1 s=CLOSED pgs=291 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.029+0000 7feb8c963700 1 -- 192.168.123.103:0/3448644098 >> 192.168.123.103:0/3448644098 conn(0x7feb8806d1a0 msgr2=0x7feb8806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:16.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.029+0000 7feb8c963700 1 -- 192.168.123.103:0/3448644098 shutdown_connections 2026-03-10T14:06:16.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.029+0000 7feb8c963700 1 -- 192.168.123.103:0/3448644098 
wait complete. 2026-03-10T14:06:16.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8c963700 1 Processor -- start 2026-03-10T14:06:16.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8c963700 1 -- start start 2026-03-10T14:06:16.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8c963700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feb88071a60 0x7feb88116a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8c963700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88072440 0x7feb88116f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8c963700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb881174f0 con 0x7feb88072440 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8c963700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb88117660 con 0x7feb88071a60 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88072440 0x7feb88116f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb86d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feb88071a60 0x7feb88116a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88072440 0x7feb88116f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58180/0 (socket says 192.168.123.103:58180) 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8659c700 1 -- 192.168.123.103:0/1584898510 learned_addr learned my addr 192.168.123.103:0/1584898510 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8659c700 1 -- 192.168.123.103:0/1584898510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feb88071a60 msgr2=0x7feb88116a20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8659c700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feb88071a60 0x7feb88116a20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb8659c700 1 -- 192.168.123.103:0/1584898510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feb780097e0 con 0x7feb88072440 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.030+0000 7feb86d9d700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feb88071a60 0x7feb88116a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.031+0000 7feb8659c700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88072440 0x7feb88116f60 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7feb7c00cc60 tx=0x7feb7c0074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.031+0000 7feb6ffff700 1 -- 192.168.123.103:0/1584898510 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb7c007af0 con 0x7feb88072440 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.031+0000 7feb6ffff700 1 -- 192.168.123.103:0/1584898510 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feb7c007c50 con 0x7feb88072440 2026-03-10T14:06:16.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.031+0000 7feb6ffff700 1 -- 192.168.123.103:0/1584898510 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb7c018700 con 0x7feb88072440 2026-03-10T14:06:16.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.031+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feb881b2870 con 0x7feb88072440 2026-03-10T14:06:16.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.031+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feb881b2d90 con 0x7feb88072440 2026-03-10T14:06:16.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.032+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7feb88110c20 con 0x7feb88072440 2026-03-10T14:06:16.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.036+0000 7feb6ffff700 1 -- 192.168.123.103:0/1584898510 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7feb7c01f030 con 0x7feb88072440 2026-03-10T14:06:16.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.036+0000 7feb6ffff700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feb7006c7a0 0x7feb7006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.036+0000 7feb6ffff700 1 -- 192.168.123.103:0/1584898510 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7feb7c08bbe0 con 0x7feb88072440 2026-03-10T14:06:16.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.036+0000 7feb6ffff700 1 -- 192.168.123.103:0/1584898510 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7feb7c0b77c0 con 0x7feb88072440 2026-03-10T14:06:16.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.037+0000 7feb86d9d700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feb7006c7a0 0x7feb7006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:16.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.037+0000 7feb86d9d700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feb7006c7a0 0x7feb7006ec50 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7feb881ae720 tx=0x7feb78005fb0 comp rx=0 tx=0).ready entity=mgr.14223 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:16.180 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.181+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7feb8804ea50 con 0x7feb88072440 2026-03-10T14:06:16.180 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.181+0000 7feb6ffff700 1 -- 192.168.123.103:0/1584898510 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1848 (secure 0 0 0) 0x7feb7c0563e0 con 0x7feb88072440 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:e12 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:epoch 12 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:05:30.451438+0000 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 
2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:06:16.181 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:06:16.182 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} 
state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:06:16.182 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:06:16.182 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:06:16.182 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:06:16.182 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:06:16.182 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:06:16.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.184+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feb7006c7a0 msgr2=0x7feb7006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.184+0000 7feb8c963700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feb7006c7a0 0x7feb7006ec50 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7feb881ae720 tx=0x7feb78005fb0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.184+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88072440 msgr2=0x7feb88116f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.183 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.184+0000 7feb8c963700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88072440 0x7feb88116f60 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7feb7c00cc60 tx=0x7feb7c0074a0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.185+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 shutdown_connections 2026-03-10T14:06:16.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.185+0000 7feb8c963700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feb7006c7a0 0x7feb7006ec50 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.185+0000 7feb8c963700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feb88071a60 0x7feb88116a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.185+0000 7feb8c963700 1 --2- 192.168.123.103:0/1584898510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feb88072440 0x7feb88116f60 unknown :-1 s=CLOSED pgs=292 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.185+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 >> 192.168.123.103:0/1584898510 conn(0x7feb8806d1a0 msgr2=0x7feb8810a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:16.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.185+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 shutdown_connections 2026-03-10T14:06:16.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.185+0000 7feb8c963700 1 -- 192.168.123.103:0/1584898510 
wait complete. 2026-03-10T14:06:16.185 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-10T14:06:16.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.258+0000 7f06bd2a7700 1 -- 192.168.123.103:0/4251601422 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 msgr2=0x7f06b00a5c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.258+0000 7f06bd2a7700 1 --2- 192.168.123.103:0/4251601422 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 0x7f06b00a5c00 secure :-1 s=READY pgs=293 cs=0 l=1 rev1=1 crypto rx=0x7f06a8007780 tx=0x7f06a800c050 comp rx=0 tx=0).stop 2026-03-10T14:06:16.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.258+0000 7f06bd2a7700 1 -- 192.168.123.103:0/4251601422 shutdown_connections 2026-03-10T14:06:16.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.258+0000 7f06bd2a7700 1 --2- 192.168.123.103:0/4251601422 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f06b00a3e30 0x7f06b00a4280 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.258+0000 7f06bd2a7700 1 --2- 192.168.123.103:0/4251601422 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 0x7f06b00a5c00 unknown :-1 s=CLOSED pgs=293 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.258+0000 7f06bd2a7700 1 -- 192.168.123.103:0/4251601422 >> 192.168.123.103:0/4251601422 conn(0x7f06b009f7a0 msgr2=0x7f06b00a1bf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:16.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.258+0000 7f06bd2a7700 1 -- 192.168.123.103:0/4251601422 shutdown_connections 2026-03-10T14:06:16.258 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.259+0000 7f06bd2a7700 1 -- 192.168.123.103:0/4251601422 wait complete. 2026-03-10T14:06:16.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.260+0000 7f06bd2a7700 1 Processor -- start 2026-03-10T14:06:16.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06bd2a7700 1 -- start start 2026-03-10T14:06:16.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06bd2a7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f06b00a3e30 0x7f06b0014160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06bd2a7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 0x7f06b00146a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06bd2a7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f06b0014cc0 con 0x7f06b00a57f0 2026-03-10T14:06:16.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06bd2a7700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f06b0014e00 con 0x7f06b00a3e30 2026-03-10T14:06:16.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06b77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 0x7f06b00146a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:16.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06b77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 0x7f06b00146a0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58202/0 (socket says 192.168.123.103:58202) 2026-03-10T14:06:16.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06b77fe700 1 -- 192.168.123.103:0/2776159802 learned_addr learned my addr 192.168.123.103:0/2776159802 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:16.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.261+0000 7f06b7fff700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f06b00a3e30 0x7f06b0014160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:16.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.262+0000 7f06b77fe700 1 -- 192.168.123.103:0/2776159802 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f06b00a3e30 msgr2=0x7f06b0014160 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.262+0000 7f06b77fe700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f06b00a3e30 0x7f06b0014160 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.262+0000 7f06b77fe700 1 -- 192.168.123.103:0/2776159802 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f06a8007430 con 0x7f06b00a57f0 2026-03-10T14:06:16.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.262+0000 7f06b77fe700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 0x7f06b00146a0 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto 
rx=0x7f06ac00ba70 tx=0x7f06ac00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:16.261 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.262+0000 7f06b7fff700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f06b00a3e30 0x7f06b0014160 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T14:06:16.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.263+0000 7f06b57fa700 1 -- 192.168.123.103:0/2776159802 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f06ac00c780 con 0x7f06b00a57f0 2026-03-10T14:06:16.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.263+0000 7f06b57fa700 1 -- 192.168.123.103:0/2776159802 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f06ac00cdc0 con 0x7f06b00a57f0 2026-03-10T14:06:16.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.263+0000 7f06b57fa700 1 -- 192.168.123.103:0/2776159802 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f06ac012550 con 0x7f06b00a57f0 2026-03-10T14:06:16.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.263+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f06b0015890 con 0x7f06b00a57f0 2026-03-10T14:06:16.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.263+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f06b0015de0 con 0x7f06b00a57f0 2026-03-10T14:06:16.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.264+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f06b0004eb0 con 0x7f06b00a57f0 2026-03-10T14:06:16.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.268+0000 7f06b57fa700 1 -- 192.168.123.103:0/2776159802 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f06ac014440 con 0x7f06b00a57f0 2026-03-10T14:06:16.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.268+0000 7f06b57fa700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f06a006c7b0 0x7f06a006ec60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.268+0000 7f06b57fa700 1 -- 192.168.123.103:0/2776159802 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f06ac08b470 con 0x7f06b00a57f0 2026-03-10T14:06:16.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.269+0000 7f06b7fff700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f06a006c7b0 0x7f06a006ec60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:16.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.269+0000 7f06b7fff700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f06a006c7b0 0x7f06a006ec60 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f06a800e010 tx=0x7f06a800c450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:16.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.269+0000 7f06b57fa700 1 -- 192.168.123.103:0/2776159802 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f06ac055aa0 con 0x7f06b00a57f0 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.384+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f06b0015a20 con 0x7f06a006c7b0 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "", 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:06:16.384 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:06:16.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.385+0000 7f06b57fa700 1 -- 192.168.123.103:0/2776159802 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f06a8007e30 con 0x7f06a006c7b0 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.387+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f06a006c7b0 msgr2=0x7f06a006ec60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.387 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.387+0000 7f06bd2a7700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f06a006c7b0 0x7f06a006ec60 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f06a800e010 tx=0x7f06a800c450 comp rx=0 tx=0).stop 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.387+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 msgr2=0x7f06b00146a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.387+0000 7f06bd2a7700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 0x7f06b00146a0 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto rx=0x7f06ac00ba70 tx=0x7f06ac00be30 comp rx=0 tx=0).stop 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.387+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 shutdown_connections 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.387+0000 7f06bd2a7700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f06a006c7b0 0x7f06a006ec60 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.387+0000 7f06bd2a7700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f06b00a3e30 0x7f06b0014160 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.388+0000 7f06bd2a7700 1 --2- 192.168.123.103:0/2776159802 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f06b00a57f0 0x7f06b00146a0 
unknown :-1 s=CLOSED pgs=294 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.388+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 >> 192.168.123.103:0/2776159802 conn(0x7f06b009f7a0 msgr2=0x7f06b000c9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.388+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 shutdown_connections 2026-03-10T14:06:16.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.388+0000 7f06bd2a7700 1 -- 192.168.123.103:0/2776159802 wait complete. 2026-03-10T14:06:16.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.459+0000 7f194dc92700 1 -- 192.168.123.103:0/3721764145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 msgr2=0x7f1948100960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.459+0000 7f194dc92700 1 --2- 192.168.123.103:0/3721764145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 0x7f1948100960 secure :-1 s=READY pgs=295 cs=0 l=1 rev1=1 crypto rx=0x7f1930009b00 tx=0x7f1930009e10 comp rx=0 tx=0).stop 2026-03-10T14:06:16.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.460+0000 7f194dc92700 1 -- 192.168.123.103:0/3721764145 shutdown_connections 2026-03-10T14:06:16.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.460+0000 7f194dc92700 1 --2- 192.168.123.103:0/3721764145 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1948101750 0x7f1948101ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.460+0000 7f194dc92700 1 --2- 192.168.123.103:0/3721764145 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 
0x7f1948100960 unknown :-1 s=CLOSED pgs=295 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.460+0000 7f194dc92700 1 -- 192.168.123.103:0/3721764145 >> 192.168.123.103:0/3721764145 conn(0x7f19480fbb00 msgr2=0x7f19480fdf30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:16.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.464+0000 7f194dc92700 1 -- 192.168.123.103:0/3721764145 shutdown_connections 2026-03-10T14:06:16.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.464+0000 7f194dc92700 1 -- 192.168.123.103:0/3721764145 wait complete. 2026-03-10T14:06:16.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.464+0000 7f194dc92700 1 Processor -- start 2026-03-10T14:06:16.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.464+0000 7f194dc92700 1 -- start start 2026-03-10T14:06:16.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.465+0000 7f194dc92700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 0x7f1948193c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.465+0000 7f194dc92700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1948101750 0x7f1948194180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.465+0000 7f19477fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 0x7f1948193c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:16.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.465+0000 7f19477fe700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 0x7f1948193c40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58234/0 (socket says 192.168.123.103:58234) 2026-03-10T14:06:16.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.465+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1948194710 con 0x7f1948100550 2026-03-10T14:06:16.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.465+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1948194880 con 0x7f1948101750 2026-03-10T14:06:16.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.465+0000 7f19477fe700 1 -- 192.168.123.103:0/3446687248 learned_addr learned my addr 192.168.123.103:0/3446687248 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:16.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.466+0000 7f19477fe700 1 -- 192.168.123.103:0/3446687248 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1948101750 msgr2=0x7f1948194180 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.466+0000 7f1946ffd700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1948101750 0x7f1948194180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:16.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.466+0000 7f19477fe700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1948101750 0x7f1948194180 unknown :-1 s=AUTH_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.466+0000 7f19477fe700 1 -- 192.168.123.103:0/3446687248 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f19300097e0 con 0x7f1948100550 2026-03-10T14:06:16.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.466+0000 7f1946ffd700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1948101750 0x7f1948194180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:06:16.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.466+0000 7f19477fe700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 0x7f1948193c40 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7f1930004a00 tx=0x7f1930004ae0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:16.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.467+0000 7f1944ff9700 1 -- 192.168.123.103:0/3446687248 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f193001d070 con 0x7f1948100550 2026-03-10T14:06:16.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.467+0000 7f1944ff9700 1 -- 192.168.123.103:0/3446687248 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f193000bcd0 con 0x7f1948100550 2026-03-10T14:06:16.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.467+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1948199270 con 0x7f1948100550 2026-03-10T14:06:16.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.467+0000 
7f194dc92700 1 -- 192.168.123.103:0/3446687248 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1948199760 con 0x7f1948100550 2026-03-10T14:06:16.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.468+0000 7f1944ff9700 1 -- 192.168.123.103:0/3446687248 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f193000f610 con 0x7f1948100550 2026-03-10T14:06:16.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.468+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1948066e40 con 0x7f1948100550 2026-03-10T14:06:16.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.469+0000 7f1944ff9700 1 -- 192.168.123.103:0/3446687248 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f193000f890 con 0x7f1948100550 2026-03-10T14:06:16.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.469+0000 7f1944ff9700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1934070a90 0x7f1934072f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:16.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.469+0000 7f1944ff9700 1 -- 192.168.123.103:0/3446687248 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f1930030080 con 0x7f1948100550 2026-03-10T14:06:16.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.472+0000 7f1944ff9700 1 -- 192.168.123.103:0/3446687248 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1930057890 con 0x7f1948100550 2026-03-10T14:06:16.472 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.473+0000 7f1946ffd700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1934070a90 0x7f1934072f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:16.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.473+0000 7f1946ffd700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1934070a90 0x7f1934072f40 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f1938005fd0 tx=0x7f1938005e20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:16.619 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 failed cephadm daemon(s) 2026-03-10T14:06:16.619 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s) 2026-03-10T14:06:16.619 INFO:teuthology.orchestra.run.vm03.stdout: daemon ceph-exporter.vm03 on vm03 is in error state 2026-03-10T14:06:16.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.619+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f1948199a40 con 0x7f1948100550 2026-03-10T14:06:16.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.620+0000 7f1944ff9700 1 -- 192.168.123.103:0/3446687248 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+151 (secure 0 0 0) 0x7f193005aeb0 con 0x7f1948100550 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.622+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1934070a90 msgr2=0x7f1934072f40 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.622+0000 7f194dc92700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1934070a90 0x7f1934072f40 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f1938005fd0 tx=0x7f1938005e20 comp rx=0 tx=0).stop 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.622+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 msgr2=0x7f1948193c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.622+0000 7f194dc92700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 0x7f1948193c40 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7f1930004a00 tx=0x7f1930004ae0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.623+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 shutdown_connections 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.623+0000 7f194dc92700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1948100550 0x7f1948193c40 unknown :-1 s=CLOSED pgs=296 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.623+0000 7f194dc92700 1 --2- 192.168.123.103:0/3446687248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1934070a90 0x7f1934072f40 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.623+0000 7f194dc92700 1 --2- 192.168.123.103:0/3446687248 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1948101750 0x7f1948194180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:16.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.623+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 >> 192.168.123.103:0/3446687248 conn(0x7f19480fbb00 msgr2=0x7f1948104980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:16.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.623+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 shutdown_connections 2026-03-10T14:06:16.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:16.623+0000 7f194dc92700 1 -- 192.168.123.103:0/3446687248 wait complete. 2026-03-10T14:06:16.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:16 vm03.local ceph-mon[49718]: from='client.14576 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:16.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:16 vm03.local ceph-mon[49718]: from='client.14580 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:16.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:16 vm03.local ceph-mon[49718]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:16.859 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:16 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/1717731430' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:06:16.859 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:16 vm03.local ceph-mon[49718]: from='client.? 
192.168.123.103:0/1584898510' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:06:17.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:16 vm04.local ceph-mon[55966]: from='client.14576 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:17.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:16 vm04.local ceph-mon[55966]: from='client.14580 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:17.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:16 vm04.local ceph-mon[55966]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:17.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:16 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/1717731430' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:06:17.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:16 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/1584898510' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:06:17.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:17 vm03.local ceph-mon[49718]: from='client.14596 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:17.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:17 vm03.local ceph-mon[49718]: pgmap v107: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:17.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:17 vm03.local ceph-mon[49718]: from='client.? 
192.168.123.103:0/3446687248' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:06:18.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:17 vm04.local ceph-mon[55966]: from='client.14596 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:18.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:17 vm04.local ceph-mon[55966]: pgmap v107: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:18.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:17 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/3446687248' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:06:20.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:19 vm04.local ceph-mon[55966]: pgmap v108: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:20.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:19 vm03.local ceph-mon[49718]: pgmap v108: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:22.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:22 vm04.local ceph-mon[55966]: pgmap v109: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:06:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:21 vm03.local ceph-mon[49718]: pgmap v109: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:06:23.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:23 vm04.local ceph-mon[55966]: pgmap v110: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:23 vm03.local 
ceph-mon[49718]: pgmap v110: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:25.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:25 vm04.local ceph-mon[55966]: pgmap v111: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:25 vm03.local ceph-mon[49718]: pgmap v111: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:27.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:27 vm03.local ceph-mon[49718]: pgmap v112: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:27 vm04.local ceph-mon[55966]: pgmap v112: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:29.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:28 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:06:29.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:28 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:06:30.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:29 vm04.local ceph-mon[55966]: pgmap v113: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:30.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:29 vm03.local ceph-mon[49718]: pgmap v113: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 14:06:31 vm04.local ceph-mon[55966]: pgmap v114: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:06:32.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:31 vm03.local ceph-mon[49718]: pgmap v114: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: git switch -c <new-branch-name> 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:Or undo this operation with: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: git switch - 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr: 2026-03-10T14:06:33.637 INFO:tasks.workunit.client.1.vm04.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T14:06:33.642 DEBUG:teuthology.orchestra.run.vm04:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1 2026-03-10T14:06:33.700 INFO:tasks.workunit.client.1.vm04.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T14:06:33.703 INFO:tasks.workunit.client.1.vm04.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T14:06:33.703 INFO:tasks.workunit.client.1.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T14:06:33.756 INFO:tasks.workunit.client.1.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T14:06:33.793 INFO:tasks.workunit.client.1.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T14:06:33.821 INFO:tasks.workunit.client.1.vm04.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T14:06:33.823 
INFO:tasks.workunit.client.1.vm04.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T14:06:33.823 INFO:tasks.workunit.client.1.vm04.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T14:06:33.853 INFO:tasks.workunit.client.1.vm04.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T14:06:33.856 DEBUG:teuthology.orchestra.run.vm04:> set -ex 2026-03-10T14:06:33.856 DEBUG:teuthology.orchestra.run.vm04:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout 2026-03-10T14:06:33.914 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1... 2026-03-10T14:06:33.915 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 2026-03-10T14:06:33.915 DEBUG:teuthology.orchestra.run.vm04:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh 2026-03-10T14:06:33.979 INFO:tasks.workunit.client.1.vm04.stderr:+ mkdir -p fsstress 2026-03-10T14:06:33.981 INFO:tasks.workunit.client.1.vm04.stderr:+ pushd fsstress 2026-03-10T14:06:33.982 INFO:tasks.workunit.client.1.vm04.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T14:06:33.982 INFO:tasks.workunit.client.1.vm04.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T14:06:34.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:33 vm04.local ceph-mon[55966]: pgmap v115: 
65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:34.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:33 vm03.local ceph-mon[49718]: pgmap v115: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:06:35.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:34 vm04.local ceph-mon[55966]: pgmap v116: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:35.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:34 vm03.local ceph-mon[49718]: pgmap v116: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:06:35.505 INFO:tasks.workunit.client.1.vm04.stderr:+ tar xzf ltp-full.tgz 2026-03-10T14:06:37.479 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:37 vm03.local ceph-mon[49718]: pgmap v117: 65 pgs: 65 active+clean; 4.9 MiB data, 167 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 382 KiB/s wr, 12 op/s 2026-03-10T14:06:37.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:37 vm04.local ceph-mon[55966]: pgmap v117: 65 pgs: 65 active+clean; 4.9 MiB data, 167 MiB used, 120 GiB / 120 GiB avail; 2.1 KiB/s rd, 382 KiB/s wr, 12 op/s 2026-03-10T14:06:40.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:39 vm04.local ceph-mon[55966]: pgmap v118: 65 pgs: 65 active+clean; 15 MiB data, 200 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.3 MiB/s wr, 67 op/s 2026-03-10T14:06:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:39 vm03.local ceph-mon[49718]: pgmap v118: 65 pgs: 65 active+clean; 15 MiB data, 200 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.3 MiB/s wr, 67 op/s 2026-03-10T14:06:41.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:41 vm04.local ceph-mon[55966]: pgmap v119: 65 pgs: 65 active+clean; 16 MiB data, 208 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s 
rd, 1.4 MiB/s wr, 88 op/s 2026-03-10T14:06:41.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:41 vm03.local ceph-mon[49718]: pgmap v119: 65 pgs: 65 active+clean; 16 MiB data, 208 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.4 MiB/s wr, 88 op/s 2026-03-10T14:06:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:43 vm04.local ceph-mon[55966]: pgmap v120: 65 pgs: 65 active+clean; 19 MiB data, 237 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.7 MiB/s wr, 151 op/s 2026-03-10T14:06:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:43 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:06:44.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:43 vm03.local ceph-mon[49718]: pgmap v120: 65 pgs: 65 active+clean; 19 MiB data, 237 MiB used, 120 GiB / 120 GiB avail; 84 KiB/s rd, 1.7 MiB/s wr, 151 op/s 2026-03-10T14:06:44.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:43 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:06:45.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:44 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:06:45.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:44 vm04.local ceph-mon[55966]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T14:06:45.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:44 vm04.local ceph-mon[55966]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T14:06:45.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:44 vm04.local 
ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:06:45.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:44 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:06:45.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:44 vm04.local ceph-mon[55966]: Upgrade: Need to upgrade myself (mgr.vm03.rwbbep) 2026-03-10T14:06:45.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:44 vm04.local ceph-mon[55966]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm04 2026-03-10T14:06:45.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:44 vm04.local ceph-mon[55966]: pgmap v121: 65 pgs: 65 active+clean; 32 MiB data, 288 MiB used, 120 GiB / 120 GiB avail; 684 KiB/s rd, 2.8 MiB/s wr, 267 op/s 2026-03-10T14:06:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:44 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:06:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:44 vm03.local ceph-mon[49718]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T14:06:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:44 vm03.local ceph-mon[49718]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T14:06:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:44 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:06:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:44 vm03.local ceph-mon[49718]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:06:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:44 vm03.local ceph-mon[49718]: Upgrade: Need to upgrade myself (mgr.vm03.rwbbep) 2026-03-10T14:06:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:44 vm03.local ceph-mon[49718]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm04 2026-03-10T14:06:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:44 vm03.local ceph-mon[49718]: pgmap v121: 65 pgs: 65 active+clean; 32 MiB data, 288 MiB used, 120 GiB / 120 GiB avail; 684 KiB/s rd, 2.8 MiB/s wr, 267 op/s 2026-03-10T14:06:46.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:45 vm03.local ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:06:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:45 vm04.local ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.798+0000 7f85fc11f700 1 -- 192.168.123.103:0/2716597424 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4072360 msgr2=0x7f85f40770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.798+0000 7f85fc11f700 1 --2- 192.168.123.103:0/2716597424 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4072360 0x7f85f40770e0 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto rx=0x7f85ec00b600 tx=0x7f85ec00b910 comp rx=0 tx=0).stop 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.799+0000 7f85fc11f700 1 -- 192.168.123.103:0/2716597424 shutdown_connections 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.799+0000 
7f85fc11f700 1 --2- 192.168.123.103:0/2716597424 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4072360 0x7f85f40770e0 unknown :-1 s=CLOSED pgs=297 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.799+0000 7f85fc11f700 1 --2- 192.168.123.103:0/2716597424 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85f4071980 0x7f85f4071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.799+0000 7f85fc11f700 1 -- 192.168.123.103:0/2716597424 >> 192.168.123.103:0/2716597424 conn(0x7f85f406d1a0 msgr2=0x7f85f406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.799+0000 7f85fc11f700 1 -- 192.168.123.103:0/2716597424 shutdown_connections 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.799+0000 7f85fc11f700 1 -- 192.168.123.103:0/2716597424 wait complete. 
2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.802+0000 7f85fc11f700 1 Processor -- start 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.802+0000 7f85fc11f700 1 -- start start 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.802+0000 7f85fc11f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4071980 0x7f85f4082620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.802+0000 7f85fc11f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85f4082b60 0x7f85f4082fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:46.801 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.802+0000 7f85fc11f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85f41b2a90 con 0x7f85f4071980 2026-03-10T14:06:46.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.802+0000 7f85fc11f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85f41b2bd0 con 0x7f85f4082b60 2026-03-10T14:06:46.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.803+0000 7f85f9ebb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4071980 0x7f85f4082620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:46.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.803+0000 7f85f9ebb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4071980 0x7f85f4082620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:33358/0 (socket says 192.168.123.103:33358) 2026-03-10T14:06:46.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.803+0000 7f85f9ebb700 1 -- 192.168.123.103:0/552424678 learned_addr learned my addr 192.168.123.103:0/552424678 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:46.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.803+0000 7f85f96ba700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85f4082b60 0x7f85f4082fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:46.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.804+0000 7f85f9ebb700 1 -- 192.168.123.103:0/552424678 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85f4082b60 msgr2=0x7f85f4082fd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:46.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.804+0000 7f85f9ebb700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85f4082b60 0x7f85f4082fd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:46.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.804+0000 7f85f9ebb700 1 -- 192.168.123.103:0/552424678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85ec00b050 con 0x7f85f4071980 2026-03-10T14:06:46.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.804+0000 7f85f9ebb700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4071980 0x7f85f4082620 secure :-1 s=READY pgs=298 cs=0 l=1 rev1=1 crypto rx=0x7f85f000b730 tx=0x7f85f000ba40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:06:46.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.804+0000 7f85eaffd700 1 -- 192.168.123.103:0/552424678 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85f0011840 con 0x7f85f4071980 2026-03-10T14:06:46.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.805+0000 7f85fc11f700 1 -- 192.168.123.103:0/552424678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85f41b2e30 con 0x7f85f4071980 2026-03-10T14:06:46.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.805+0000 7f85fc11f700 1 -- 192.168.123.103:0/552424678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85f41b32f0 con 0x7f85f4071980 2026-03-10T14:06:46.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.805+0000 7f85eaffd700 1 -- 192.168.123.103:0/552424678 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f85f0011e80 con 0x7f85f4071980 2026-03-10T14:06:46.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.805+0000 7f85eaffd700 1 -- 192.168.123.103:0/552424678 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85f000f550 con 0x7f85f4071980 2026-03-10T14:06:46.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.806+0000 7f85fc11f700 1 -- 192.168.123.103:0/552424678 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f85d8005320 con 0x7f85f4071980 2026-03-10T14:06:46.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.808+0000 7f85eaffd700 1 -- 192.168.123.103:0/552424678 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f85f00119a0 con 0x7f85f4071980 2026-03-10T14:06:46.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.808+0000 
7f85eaffd700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85e006c6d0 0x7f85e006eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:46.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.809+0000 7f85f96ba700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85e006c6d0 0x7f85e006eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:46.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.809+0000 7f85eaffd700 1 -- 192.168.123.103:0/552424678 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f85f0059da0 con 0x7f85f4071980 2026-03-10T14:06:46.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.809+0000 7f85f96ba700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85e006c6d0 0x7f85e006eb80 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f85ec00bd90 tx=0x7f85ec0096e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:46.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.812+0000 7f85eaffd700 1 -- 192.168.123.103:0/552424678 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f85f0053720 con 0x7f85f4071980 2026-03-10T14:06:46.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.976+0000 7f85fc11f700 1 -- 192.168.123.103:0/552424678 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f85d8000bf0 con 0x7f85e006c6d0 2026-03-10T14:06:46.979 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.979+0000 7f85eaffd700 1 -- 192.168.123.103:0/552424678 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f85d8000bf0 con 0x7f85e006c6d0 2026-03-10T14:06:46.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.981+0000 7f85e8ff9700 1 -- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85e006c6d0 msgr2=0x7f85e006eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:46.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.981+0000 7f85e8ff9700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85e006c6d0 0x7f85e006eb80 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f85ec00bd90 tx=0x7f85ec0096e0 comp rx=0 tx=0).stop 2026-03-10T14:06:46.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.981+0000 7f85e8ff9700 1 -- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4071980 msgr2=0x7f85f4082620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:46.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.981+0000 7f85e8ff9700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4071980 0x7f85f4082620 secure :-1 s=READY pgs=298 cs=0 l=1 rev1=1 crypto rx=0x7f85f000b730 tx=0x7f85f000ba40 comp rx=0 tx=0).stop 2026-03-10T14:06:46.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.981+0000 7f85e8ff9700 1 -- 192.168.123.103:0/552424678 shutdown_connections 2026-03-10T14:06:46.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.981+0000 7f85e8ff9700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f4071980 0x7f85f4082620 unknown :-1 s=CLOSED pgs=298 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T14:06:46.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.982+0000 7f85e8ff9700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85e006c6d0 0x7f85e006eb80 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:46.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.982+0000 7f85e8ff9700 1 --2- 192.168.123.103:0/552424678 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85f4082b60 0x7f85f4082fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:46.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.983+0000 7f85e8ff9700 1 -- 192.168.123.103:0/552424678 >> 192.168.123.103:0/552424678 conn(0x7f85f406d1a0 msgr2=0x7f85f40764e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:46.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.983+0000 7f85e8ff9700 1 -- 192.168.123.103:0/552424678 shutdown_connections 2026-03-10T14:06:46.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:46.983+0000 7f85e8ff9700 1 -- 192.168.123.103:0/552424678 wait complete. 
2026-03-10T14:06:47.000 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:06:47.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:46 vm03.local ceph-mon[49718]: pgmap v122: 65 pgs: 65 active+clean; 35 MiB data, 303 MiB used, 120 GiB / 120 GiB avail; 684 KiB/s rd, 3.0 MiB/s wr, 297 op/s 2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.145+0000 7fca46e0d700 1 -- 192.168.123.103:0/391811304 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38095f30 msgr2=0x7fca38098310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.145+0000 7fca46e0d700 1 --2- 192.168.123.103:0/391811304 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38095f30 0x7fca38098310 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fca34009b00 tx=0x7fca34009e10 comp rx=0 tx=0).stop 2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.151+0000 7fca46e0d700 1 -- 192.168.123.103:0/391811304 shutdown_connections 2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.151+0000 7fca46e0d700 1 --2- 192.168.123.103:0/391811304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca38098850 0x7fca3809ac30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.151+0000 7fca46e0d700 1 --2- 192.168.123.103:0/391811304 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38095f30 0x7fca38098310 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.151+0000 7fca46e0d700 1 -- 192.168.123.103:0/391811304 >> 192.168.123.103:0/391811304 conn(0x7fca3808f8b0 msgr2=0x7fca38091d00 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.152+0000 7fca46e0d700 1 -- 192.168.123.103:0/391811304 shutdown_connections
2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.152+0000 7fca46e0d700 1 -- 192.168.123.103:0/391811304 wait complete.
2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.152+0000 7fca46e0d700 1 Processor -- start
2026-03-10T14:06:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.153+0000 7fca46e0d700 1 -- start start
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.153+0000 7fca46e0d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca38095f30 0x7fca3813de80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.153+0000 7fca46e0d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38098850 0x7fca3813e3c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.153+0000 7fca46e0d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca3813e9e0 con 0x7fca38095f30
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.153+0000 7fca46e0d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca3813eb20 con 0x7fca38098850
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca3ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38098850 0x7fca3813e3c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca3ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38098850 0x7fca3813e3c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:58448/0 (socket says 192.168.123.103:58448)
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca3ffff700 1 -- 192.168.123.103:0/2811783796 learned_addr learned my addr 192.168.123.103:0/2811783796 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca44ba9700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca38095f30 0x7fca3813de80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca3ffff700 1 -- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca38095f30 msgr2=0x7fca3813de80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca3ffff700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca38095f30 0x7fca3813de80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca3ffff700 1 -- 192.168.123.103:0/2811783796 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca340097e0 con 0x7fca38098850
2026-03-10T14:06:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca3ffff700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38098850 0x7fca3813e3c0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fca2c00d8d0 tx=0x7fca2c00dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:06:47.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.154+0000 7fca3dffb700 1 -- 192.168.123.103:0/2811783796 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca2c009940 con 0x7fca38098850
2026-03-10T14:06:47.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.155+0000 7fca3dffb700 1 -- 192.168.123.103:0/2811783796 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fca2c010460 con 0x7fca38098850
2026-03-10T14:06:47.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.155+0000 7fca3dffb700 1 -- 192.168.123.103:0/2811783796 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca2c00f5d0 con 0x7fca38098850
2026-03-10T14:06:47.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.156+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca381435d0 con 0x7fca38098850
2026-03-10T14:06:47.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.156+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca38143aa0 con 0x7fca38098850
2026-03-10T14:06:47.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.156+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fca38091420 con 0x7fca38098850
2026-03-10T14:06:47.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.157+0000 7fca3dffb700 1 -- 192.168.123.103:0/2811783796 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fca2c009aa0 con 0x7fca38098850
2026-03-10T14:06:47.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.157+0000 7fca3dffb700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca3006c480 0x7fca3006e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.157+0000 7fca3dffb700 1 -- 192.168.123.103:0/2811783796 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fca2c08b650 con 0x7fca38098850
2026-03-10T14:06:47.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.157+0000 7fca44ba9700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca3006c480 0x7fca3006e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:06:47.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.158+0000 7fca44ba9700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca3006c480 0x7fca3006e930 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fca3400b5c0 tx=0x7fca34005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:06:47.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.160+0000 7fca3dffb700 1 -- 192.168.123.103:0/2811783796 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fca2c0595d0 con 0x7fca38098850
2026-03-10T14:06:47.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:46 vm04.local ceph-mon[55966]: pgmap v122: 65 pgs: 65 active+clean; 35 MiB data, 303 MiB used, 120 GiB / 120 GiB avail; 684 KiB/s rd, 3.0 MiB/s wr, 297 op/s
2026-03-10T14:06:47.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.329+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fca38002900 con 0x7fca3006c480
2026-03-10T14:06:47.333 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.333+0000 7fca3dffb700 1 -- 192.168.123.103:0/2811783796 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fca38002900 con 0x7fca3006c480
2026-03-10T14:06:47.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca3006c480 msgr2=0x7fca3006e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca3006c480 0x7fca3006e930 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fca3400b5c0 tx=0x7fca34005fb0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38098850 msgr2=0x7fca3813e3c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38098850 0x7fca3813e3c0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fca2c00d8d0 tx=0x7fca2c00dc90 comp rx=0 tx=0).stop
2026-03-10T14:06:47.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 shutdown_connections
2026-03-10T14:06:47.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca38095f30 0x7fca3813de80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fca3006c480 0x7fca3006e930 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 --2- 192.168.123.103:0/2811783796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca38098850 0x7fca3813e3c0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.335+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 >> 192.168.123.103:0/2811783796 conn(0x7fca3808f8b0 msgr2=0x7fca380994b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:06:47.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.336+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 shutdown_connections
2026-03-10T14:06:47.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.336+0000 7fca46e0d700 1 -- 192.168.123.103:0/2811783796 wait complete.
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.427+0000 7f38b958a700 1 -- 192.168.123.103:0/3800953063 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4072360 msgr2=0x7f38b40770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.427+0000 7f38b958a700 1 --2- 192.168.123.103:0/3800953063 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4072360 0x7f38b40770e0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f38ac009230 tx=0x7f38ac009260 comp rx=0 tx=0).stop
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.430+0000 7f38b958a700 1 -- 192.168.123.103:0/3800953063 shutdown_connections
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.430+0000 7f38b958a700 1 --2- 192.168.123.103:0/3800953063 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4072360 0x7f38b40770e0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.430+0000 7f38b958a700 1 --2- 192.168.123.103:0/3800953063 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b4071980 0x7f38b4071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.430+0000 7f38b958a700 1 -- 192.168.123.103:0/3800953063 >> 192.168.123.103:0/3800953063 conn(0x7f38b406d1a0 msgr2=0x7f38b406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.432+0000 7f38b958a700 1 -- 192.168.123.103:0/3800953063 shutdown_connections
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.432+0000 7f38b958a700 1 -- 192.168.123.103:0/3800953063 wait complete.
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b958a700 1 Processor -- start
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b958a700 1 -- start start
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b958a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4071980 0x7f38b40824a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b958a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b40829e0 0x7f38b4082e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b958a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38b4083e50 con 0x7f38b40829e0
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b958a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38b412dd80 con 0x7f38b4071980
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b27fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b40829e0 0x7f38b4082e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b27fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b40829e0 0x7f38b4082e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33400/0 (socket says 192.168.123.103:33400)
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b27fc700 1 -- 192.168.123.103:0/3756055340 learned_addr learned my addr 192.168.123.103:0/3756055340 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.433+0000 7f38b2ffd700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4071980 0x7f38b40824a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.434+0000 7f38b2ffd700 1 -- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b40829e0 msgr2=0x7f38b4082e50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.434+0000 7f38b2ffd700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b40829e0 0x7f38b4082e50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.434+0000 7f38b2ffd700 1 -- 192.168.123.103:0/3756055340 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38ac008ee0 con 0x7f38b4071980
2026-03-10T14:06:47.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.435+0000 7f38b2ffd700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4071980 0x7f38b40824a0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f38a4009940 tx=0x7f38a4009c50 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:06:47.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.435+0000 7f389bfff700 1 -- 192.168.123.103:0/3756055340 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38a4010040 con 0x7f38b4071980
2026-03-10T14:06:47.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.435+0000 7f389bfff700 1 -- 192.168.123.103:0/3756055340 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f38a4009e90 con 0x7f38b4071980
2026-03-10T14:06:47.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.435+0000 7f389bfff700 1 -- 192.168.123.103:0/3756055340 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38a4014d90 con 0x7f38b4071980
2026-03-10T14:06:47.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.436+0000 7f38b958a700 1 -- 192.168.123.103:0/3756055340 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38b412e000 con 0x7f38b4071980
2026-03-10T14:06:47.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.436+0000 7f38b958a700 1 -- 192.168.123.103:0/3756055340 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38b412e520 con 0x7f38b4071980
2026-03-10T14:06:47.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.437+0000 7f38b958a700 1 -- 192.168.123.103:0/3756055340 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38b4079e10 con 0x7f38b4071980
2026-03-10T14:06:47.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.439+0000 7f389bfff700 1 -- 192.168.123.103:0/3756055340 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f38a4005030 con 0x7f38b4071980
2026-03-10T14:06:47.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.439+0000 7f389bfff700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f389c06c6d0 0x7f389c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.440+0000 7f389bfff700 1 -- 192.168.123.103:0/3756055340 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f38a408b190 con 0x7f38b4071980
2026-03-10T14:06:47.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.440+0000 7f38b27fc700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f389c06c6d0 0x7f389c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:06:47.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.440+0000 7f38b27fc700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f389c06c6d0 0x7f389c06eb80 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f38ac009200 tx=0x7f38ac01a040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:06:47.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.442+0000 7f389bfff700 1 -- 192.168.123.103:0/3756055340 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f38a4059420 con 0x7f38b4071980
2026-03-10T14:06:47.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.639+0000 7f38b958a700 1 -- 192.168.123.103:0/3756055340 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f38b407c760 con 0x7f389c06c6d0
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (2m) 77s ago 3m 21.4M - 0.25.0 c8568f914cd2 0b9f79a1191a
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 error 77s ago 3m - -
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (2m) 78s ago 2m 8342k - 18.2.0 dc2bc1663786 8b6b949a2f58
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 77s ago 3m 7419k - 18.2.0 dc2bc1663786 57962aef7443
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (2m) 78s ago 2m 7407k - 18.2.0 dc2bc1663786 0918365fa827
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (2m) 77s ago 3m 82.1M - 9.4.7 954c08fa6188 1fb5820b250e
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (84s) 77s ago 84s 16.4M - 18.2.0 dc2bc1663786 db33bf4450b8
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (82s) 77s ago 82s 16.1M - 18.2.0 dc2bc1663786 5d05b227aa40
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (81s) 78s ago 81s 16.8M - 18.2.0 dc2bc1663786 2494aff9d6c9
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (83s) 78s ago 83s 18.5M - 18.2.0 dc2bc1663786 e286b66e6f5a
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:9283,8765,8443 running (4m) 77s ago 4m 499M - 18.2.0 dc2bc1663786 378306a7bb3c
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (2m) 78s ago 2m 446M - 18.2.0 dc2bc1663786 f2d79432e040
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 77s ago 4m 54.7M 2048M 18.2.0 dc2bc1663786 f59cc7d5bdfd
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (2m) 78s ago 2m 44.2M 2048M 18.2.0 dc2bc1663786 4113774b34c7
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (3m) 77s ago 3m 14.2M - 1.5.0 0da6a335fe13 ea3faf07c01f
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (2m) 78s ago 2m 14.1M - 1.5.0 0da6a335fe13 d2c83044f057
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 77s ago 2m 48.8M 4096M 18.2.0 dc2bc1663786 5a222b855ee3
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 77s ago 2m 45.7M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 77s ago 2m 45.3M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (2m) 78s ago 2m 47.2M 4096M 18.2.0 dc2bc1663786 99f4c3155942
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (114s) 78s ago 114s 45.6M 4096M 18.2.0 dc2bc1663786 127d95fabe23
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (105s) 78s ago 105s 44.0M 4096M 18.2.0 dc2bc1663786 63def67884f8
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 77s ago 3m 36.2M - 2.43.0 a07b618ecd1d fcef697ff8c4
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.670+0000 7f389bfff700 1 -- 192.168.123.103:0/3756055340 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3336 (secure 0 0 0) 0x7f38b407c760 con 0x7f389c06c6d0
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 -- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f389c06c6d0 msgr2=0x7f389c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f389c06c6d0 0x7f389c06eb80 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f38ac009200 tx=0x7f38ac01a040 comp rx=0 tx=0).stop
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 -- 192.168.123.103:0/3756055340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4071980 msgr2=0x7f38b40824a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4071980 0x7f38b40824a0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f38a4009940 tx=0x7f38a4009c50 comp rx=0 tx=0).stop
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 -- 192.168.123.103:0/3756055340 shutdown_connections
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f389c06c6d0 0x7f389c06eb80 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38b4071980 0x7f38b40824a0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 --2- 192.168.123.103:0/3756055340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38b40829e0 0x7f38b4082e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 -- 192.168.123.103:0/3756055340 >> 192.168.123.103:0/3756055340 conn(0x7f38b406d1a0 msgr2=0x7f38b4076470 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 -- 192.168.123.103:0/3756055340 shutdown_connections
2026-03-10T14:06:47.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.673+0000 7f3899ffb700 1 -- 192.168.123.103:0/3756055340 wait complete.
2026-03-10T14:06:47.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.825+0000 7f34b5c33700 1 -- 192.168.123.103:0/2579862759 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34b0071980 msgr2=0x7f34b0071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.825+0000 7f34b5c33700 1 --2- 192.168.123.103:0/2579862759 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34b0071980 0x7f34b0071d90 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f34a0007780 tx=0x7f34a000c050 comp rx=0 tx=0).stop
2026-03-10T14:06:47.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.825+0000 7f34b5c33700 1 -- 192.168.123.103:0/2579862759 shutdown_connections
2026-03-10T14:06:47.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.825+0000 7f34b5c33700 1 --2- 192.168.123.103:0/2579862759 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34b0072360 0x7f34b00770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.825+0000 7f34b5c33700 1 --2- 192.168.123.103:0/2579862759 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34b0071980 0x7f34b0071d90 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.825+0000 7f34b5c33700 1 -- 192.168.123.103:0/2579862759 >> 192.168.123.103:0/2579862759 conn(0x7f34b006d1a0 msgr2=0x7f34b006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.826+0000 7f34b5c33700 1 -- 192.168.123.103:0/2579862759 shutdown_connections
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.826+0000 7f34b5c33700 1 -- 192.168.123.103:0/2579862759 wait complete.
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34b5c33700 1 Processor -- start
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34b5c33700 1 -- start start
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34b5c33700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34b0072360 0x7f34b0131330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34b5c33700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34b0131870 0x7f34b007f520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34b5c33700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34b0131d70 con 0x7f34b0131870
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34b5c33700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34b0131ee0 con 0x7f34b0072360
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34aeffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34b0131870 0x7f34b007f520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34aeffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34b0131870 0x7f34b007f520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33418/0 (socket says 192.168.123.103:33418)
2026-03-10T14:06:47.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34aeffd700 1 -- 192.168.123.103:0/356264299 learned_addr learned my addr 192.168.123.103:0/356264299 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:06:47.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34aeffd700 1 -- 192.168.123.103:0/356264299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34b0072360 msgr2=0x7f34b0131330 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:06:47.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34aeffd700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34b0072360 0x7f34b0131330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:06:47.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.827+0000 7f34aeffd700 1 -- 192.168.123.103:0/356264299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f34a0007430 con 0x7f34b0131870
2026-03-10T14:06:47.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.828+0000 7f34aeffd700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34b0131870 0x7f34b007f520 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f34a800bf40 tx=0x7f34a800bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:06:47.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.828+0000 7f34acff9700 1 -- 192.168.123.103:0/356264299 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34a800cb40 con 0x7f34b0131870
2026-03-10T14:06:47.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.828+0000 7f34acff9700 1 -- 192.168.123.103:0/356264299 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f34a800cca0 con 0x7f34b0131870
2026-03-10T14:06:47.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.828+0000 7f34acff9700 1 -- 192.168.123.103:0/356264299 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34a8012740 con 0x7f34b0131870
2026-03-10T14:06:47.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.830+0000 7f34b5c33700 1 -- 192.168.123.103:0/356264299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f34b007fac0 con 0x7f34b0131870
2026-03-10T14:06:47.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.830+0000 7f34b5c33700 1 -- 192.168.123.103:0/356264299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f34b007ffe0 con 0x7f34b0131870
2026-03-10T14:06:47.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.834+0000 7f34b5c33700 1 -- 192.168.123.103:0/356264299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f34b012b500 con 0x7f34b0131870
2026-03-10T14:06:47.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.835+0000 7f34acff9700 1 -- 192.168.123.103:0/356264299 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f34a8014bb0 con 0x7f34b0131870
2026-03-10T14:06:47.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.835+0000 7f34acff9700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f349806c6d0 0x7f349806eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:06:47.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.835+0000 7f34acff9700 1 -- 192.168.123.103:0/356264299 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f34a808bc60 con 0x7f34b0131870
2026-03-10T14:06:47.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.835+0000 7f34af7fe700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f349806c6d0 0x7f349806eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:06:47.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.836+0000 7f34af7fe700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f349806c6d0 0x7f349806eb80 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f34a000c4d0 tx=0x7f34a000af60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:06:47.839
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:47.840+0000 7f34acff9700 1 -- 192.168.123.103:0/356264299 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f34a8059ef0 con 0x7f34b0131870 2026-03-10T14:06:48.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:48 vm03.local ceph-mon[49718]: from='client.14604 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:48.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:48 vm03.local ceph-mon[49718]: from='client.24353 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:48.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:48 vm03.local ceph-mon[49718]: from='client.24357 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:48.128 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:06:48.128 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "mds": 
{ 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.127+0000 7f34b5c33700 1 -- 192.168.123.103:0/356264299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f34b004ea50 con 0x7f34b0131870 2026-03-10T14:06:48.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.128+0000 7f34acff9700 1 -- 192.168.123.103:0/356264299 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f34a8059a80 con 0x7f34b0131870 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 -- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f349806c6d0 msgr2=0x7f349806eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f349806c6d0 0x7f349806eb80 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f34a000c4d0 tx=0x7f34a000af60 comp rx=0 tx=0).stop 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 -- 192.168.123.103:0/356264299 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34b0131870 msgr2=0x7f34b007f520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34b0131870 0x7f34b007f520 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f34a800bf40 tx=0x7f34a800bf70 comp rx=0 tx=0).stop 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 -- 192.168.123.103:0/356264299 shutdown_connections 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f349806c6d0 0x7f349806eb80 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f34b0072360 0x7f34b0131330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 --2- 192.168.123.103:0/356264299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f34b0131870 0x7f34b007f520 unknown :-1 s=CLOSED pgs=299 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 -- 192.168.123.103:0/356264299 >> 192.168.123.103:0/356264299 conn(0x7f34b006d1a0 msgr2=0x7f34b0076490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 -- 
192.168.123.103:0/356264299 shutdown_connections 2026-03-10T14:06:48.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.140+0000 7f34967fc700 1 -- 192.168.123.103:0/356264299 wait complete. 2026-03-10T14:06:48.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.230+0000 7fb3a300a700 1 -- 192.168.123.103:0/1310983324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c072360 msgr2=0x7fb39c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.230+0000 7fb3a300a700 1 --2- 192.168.123.103:0/1310983324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c072360 0x7fb39c0770e0 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7fb39400b600 tx=0x7fb39400b910 comp rx=0 tx=0).stop 2026-03-10T14:06:48.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.231+0000 7fb3a300a700 1 -- 192.168.123.103:0/1310983324 shutdown_connections 2026-03-10T14:06:48.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.231+0000 7fb3a300a700 1 --2- 192.168.123.103:0/1310983324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c072360 0x7fb39c0770e0 unknown :-1 s=CLOSED pgs=300 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.231+0000 7fb3a300a700 1 --2- 192.168.123.103:0/1310983324 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb39c071980 0x7fb39c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.231+0000 7fb3a300a700 1 -- 192.168.123.103:0/1310983324 >> 192.168.123.103:0/1310983324 conn(0x7fb39c06d1a0 msgr2=0x7fb39c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.231+0000 
7fb3a300a700 1 -- 192.168.123.103:0/1310983324 shutdown_connections 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.231+0000 7fb3a300a700 1 -- 192.168.123.103:0/1310983324 wait complete. 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb3a300a700 1 Processor -- start 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb3a300a700 1 -- start start 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb3a300a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb39c071980 0x7fb39c082620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb3a300a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c082b60 0x7fb39c082fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb3a300a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb39c1b2a90 con 0x7fb39c082b60 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb3a300a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb39c1b2bd0 con 0x7fb39c071980 2026-03-10T14:06:48.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb39bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c082b60 0x7fb39c082fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:48.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb39bfff700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c082b60 0x7fb39c082fd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33438/0 (socket says 192.168.123.103:33438) 2026-03-10T14:06:48.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb39bfff700 1 -- 192.168.123.103:0/2344047340 learned_addr learned my addr 192.168.123.103:0/2344047340 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:48.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb39bfff700 1 -- 192.168.123.103:0/2344047340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb39c071980 msgr2=0x7fb39c082620 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:06:48.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb39bfff700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb39c071980 0x7fb39c082620 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb39bfff700 1 -- 192.168.123.103:0/2344047340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb39400b050 con 0x7fb39c082b60 2026-03-10T14:06:48.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.232+0000 7fb39bfff700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c082b60 0x7fb39c082fd0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7fb394003c30 tx=0x7fb394003d10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:48.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.233+0000 7fb399ffb700 1 -- 192.168.123.103:0/2344047340 <== mon.0 
v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb39400e030 con 0x7fb39c082b60 2026-03-10T14:06:48.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.233+0000 7fb3a300a700 1 -- 192.168.123.103:0/2344047340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb39c1b2dd0 con 0x7fb39c082b60 2026-03-10T14:06:48.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.234+0000 7fb3a300a700 1 -- 192.168.123.103:0/2344047340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb39c1b3320 con 0x7fb39c082b60 2026-03-10T14:06:48.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.235+0000 7fb399ffb700 1 -- 192.168.123.103:0/2344047340 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb3940048e0 con 0x7fb39c082b60 2026-03-10T14:06:48.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.235+0000 7fb399ffb700 1 -- 192.168.123.103:0/2344047340 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb39401ce20 con 0x7fb39c082b60 2026-03-10T14:06:48.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.236+0000 7fb399ffb700 1 -- 192.168.123.103:0/2344047340 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb394012430 con 0x7fb39c082b60 2026-03-10T14:06:48.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.236+0000 7fb399ffb700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb38406ea40 0x7fb384070ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.237+0000 7fb399ffb700 1 -- 192.168.123.103:0/2344047340 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 
5276+0+0 (secure 0 0 0) 0x7fb39402a030 con 0x7fb39c082b60 2026-03-10T14:06:48.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.237+0000 7fb3a0da6700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb38406ea40 0x7fb384070ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:48.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.237+0000 7fb3a0da6700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb38406ea40 0x7fb384070ef0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fb39c083d50 tx=0x7fb38c006c60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:48.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.237+0000 7fb3a300a700 1 -- 192.168.123.103:0/2344047340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb388005320 con 0x7fb39c082b60 2026-03-10T14:06:48.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.240+0000 7fb399ffb700 1 -- 192.168.123.103:0/2344047340 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb394058810 con 0x7fb39c082b60 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:e13 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:06:48.372 
INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:epoch 13 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:06:44.825228+0000 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 
2026-03-10T14:06:48.372 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.369+0000 7fb3a300a700 1 -- 192.168.123.103:0/2344047340 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb388006200 con 0x7fb39c082b60 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.370+0000 7fb399ffb700 1 -- 192.168.123.103:0/2344047340 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1865 (secure 0 0 0) 0x7fb394017760 con 0x7fb39c082b60 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 7fb3837fe700 1 -- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb38406ea40 msgr2=0x7fb384070ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 7fb3837fe700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb38406ea40 0x7fb384070ef0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fb39c083d50 tx=0x7fb38c006c60 comp rx=0 tx=0).stop 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 7fb3837fe700 1 -- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c082b60 msgr2=0x7fb39c082fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 7fb3837fe700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c082b60 0x7fb39c082fd0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7fb394003c30 tx=0x7fb394003d10 comp rx=0 tx=0).stop 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 7fb3837fe700 1 -- 192.168.123.103:0/2344047340 shutdown_connections 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 
7fb3837fe700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb38406ea40 0x7fb384070ef0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 7fb3837fe700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb39c071980 0x7fb39c082620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 7fb3837fe700 1 --2- 192.168.123.103:0/2344047340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb39c082b60 0x7fb39c082fd0 unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.374+0000 7fb3837fe700 1 -- 192.168.123.103:0/2344047340 >> 192.168.123.103:0/2344047340 conn(0x7fb39c06d1a0 msgr2=0x7fb39c0764e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.375+0000 7fb3837fe700 1 -- 192.168.123.103:0/2344047340 shutdown_connections 2026-03-10T14:06:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.375+0000 7fb3837fe700 1 -- 192.168.123.103:0/2344047340 wait complete. 
2026-03-10T14:06:48.375 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 13 2026-03-10T14:06:48.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.493+0000 7fd7ea720700 1 -- 192.168.123.103:0/1947344642 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4072360 msgr2=0x7fd7e40770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.493+0000 7fd7ea720700 1 --2- 192.168.123.103:0/1947344642 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4072360 0x7fd7e40770e0 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7fd7dc00d3f0 tx=0x7fd7dc00d700 comp rx=0 tx=0).stop 2026-03-10T14:06:48.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.493+0000 7fd7ea720700 1 -- 192.168.123.103:0/1947344642 shutdown_connections 2026-03-10T14:06:48.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.493+0000 7fd7ea720700 1 --2- 192.168.123.103:0/1947344642 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4072360 0x7fd7e40770e0 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.493+0000 7fd7ea720700 1 --2- 192.168.123.103:0/1947344642 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd7e4071980 0x7fd7e4071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 -- 192.168.123.103:0/1947344642 >> 192.168.123.103:0/1947344642 conn(0x7fd7e406d1a0 msgr2=0x7fd7e406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:48.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 -- 192.168.123.103:0/1947344642 shutdown_connections 2026-03-10T14:06:48.499 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 -- 192.168.123.103:0/1947344642 wait complete. 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 Processor -- start 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 -- start start 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4071980 0x7fd7e4131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd7e4131890 0x7fd7e407f520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7e4131d90 con 0x7fd7e4071980 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.494+0000 7fd7ea720700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7e4131ed0 con 0x7fd7e4131890 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.495+0000 7fd7e3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4071980 0x7fd7e4131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.495+0000 7fd7e3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4071980 0x7fd7e4131350 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33454/0 (socket says 192.168.123.103:33454) 2026-03-10T14:06:48.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.495+0000 7fd7e3fff700 1 -- 192.168.123.103:0/2430889203 learned_addr learned my addr 192.168.123.103:0/2430889203 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.495+0000 7fd7e3fff700 1 -- 192.168.123.103:0/2430889203 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd7e4131890 msgr2=0x7fd7e407f520 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.495+0000 7fd7e3fff700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd7e4131890 0x7fd7e407f520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.495+0000 7fd7e3fff700 1 -- 192.168.123.103:0/2430889203 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd7dc007ed0 con 0x7fd7e4071980 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.495+0000 7fd7e3fff700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4071980 0x7fd7e4131350 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fd7d400b770 tx=0x7fd7d400bb30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.496+0000 7fd7e17fa700 1 -- 192.168.123.103:0/2430889203 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd7d400f820 con 
0x7fd7e4071980 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.496+0000 7fd7ea720700 1 -- 192.168.123.103:0/2430889203 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd7e407fac0 con 0x7fd7e4071980 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.496+0000 7fd7ea720700 1 -- 192.168.123.103:0/2430889203 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd7e407ffc0 con 0x7fd7e4071980 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.499+0000 7fd7e17fa700 1 -- 192.168.123.103:0/2430889203 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd7d400fe60 con 0x7fd7e4071980 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.499+0000 7fd7e17fa700 1 -- 192.168.123.103:0/2430889203 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd7d400d610 con 0x7fd7e4071980 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.499+0000 7fd7e17fa700 1 -- 192.168.123.103:0/2430889203 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd7d400d830 con 0x7fd7e4071980 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.499+0000 7fd7e17fa700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd7cc06c870 0x7fd7cc06ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.500+0000 7fd7e17fa700 1 -- 192.168.123.103:0/2430889203 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fd7d408c7c0 con 0x7fd7e4071980 2026-03-10T14:06:48.501 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.501+0000 7fd7e37fe700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd7cc06c870 0x7fd7cc06ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:48.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.501+0000 7fd7e37fe700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd7cc06c870 0x7fd7cc06ed20 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fd7e4072ff0 tx=0x7fd7dc00db00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:48.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.501+0000 7fd7ea720700 1 -- 192.168.123.103:0/2430889203 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd7d0005320 con 0x7fd7e4071980 2026-03-10T14:06:48.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.507+0000 7fd7e17fa700 1 -- 192.168.123.103:0/2430889203 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd7d405aa50 con 0x7fd7e4071980 2026-03-10T14:06:48.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:48 vm04.local ceph-mon[55966]: from='client.14604 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:48.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:48 vm04.local ceph-mon[55966]: from='client.24353 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:48.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:48 vm04.local ceph-mon[55966]: from='client.24357 -' entity='client.admin' 
cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:48.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.700+0000 7fd7ea720700 1 -- 192.168.123.103:0/2430889203 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd7d0000bf0 con 0x7fd7cc06c870 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "0/23 daemons upgraded", 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm04", 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:06:48.709 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.704+0000 7fd7e17fa700 1 -- 192.168.123.103:0/2430889203 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fd7d0000bf0 con 0x7fd7cc06c870 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.706+0000 7fd7caffd700 1 -- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd7cc06c870 msgr2=0x7fd7cc06ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd7cc06c870 0x7fd7cc06ed20 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fd7e4072ff0 tx=0x7fd7dc00db00 comp rx=0 tx=0).stop 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 -- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4071980 msgr2=0x7fd7e4131350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4071980 0x7fd7e4131350 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7fd7d400b770 tx=0x7fd7d400bb30 comp rx=0 tx=0).stop 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 -- 192.168.123.103:0/2430889203 shutdown_connections 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd7e4071980 0x7fd7e4131350 unknown :-1 s=CLOSED pgs=303 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd7cc06c870 0x7fd7cc06ed20 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 --2- 192.168.123.103:0/2430889203 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fd7e4131890 0x7fd7e407f520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 -- 192.168.123.103:0/2430889203 >> 192.168.123.103:0/2430889203 conn(0x7fd7e406d1a0 msgr2=0x7fd7e4076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 -- 192.168.123.103:0/2430889203 shutdown_connections 2026-03-10T14:06:48.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.707+0000 7fd7caffd700 1 -- 192.168.123.103:0/2430889203 wait complete. 2026-03-10T14:06:48.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.827+0000 7f7f2755c700 1 -- 192.168.123.103:0/32617367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20072360 msgr2=0x7f7f200770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.827+0000 7f7f2755c700 1 --2- 192.168.123.103:0/32617367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20072360 0x7f7f200770e0 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7f7f1800a390 tx=0x7f7f1800a6a0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.828+0000 7f7f2755c700 1 -- 192.168.123.103:0/32617367 shutdown_connections 2026-03-10T14:06:48.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.828+0000 7f7f2755c700 1 --2- 192.168.123.103:0/32617367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20072360 0x7f7f200770e0 unknown :-1 s=CLOSED pgs=304 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.828+0000 7f7f2755c700 1 --2- 192.168.123.103:0/32617367 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f20071980 0x7f7f20071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:48.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.828+0000 7f7f2755c700 1 -- 192.168.123.103:0/32617367 >> 192.168.123.103:0/32617367 conn(0x7f7f2006d1a0 msgr2=0x7f7f2006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:06:48.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f2755c700 1 -- 192.168.123.103:0/32617367 shutdown_connections 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f2755c700 1 -- 192.168.123.103:0/32617367 wait complete. 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f2755c700 1 Processor -- start 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f2755c700 1 -- start start 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f2755c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f20071980 0x7f7f20082560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f2755c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20082aa0 0x7f7f20082f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f2755c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f201b2a90 con 0x7f7f20082aa0 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f2755c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 
-- 0x7f7f201b2bd0 con 0x7f7f20071980 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f252f8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f20071980 0x7f7f20082560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f24af7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20082aa0 0x7f7f20082f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:48.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.830+0000 7f7f24af7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20082aa0 0x7f7f20082f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:33482/0 (socket says 192.168.123.103:33482) 2026-03-10T14:06:48.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.831+0000 7f7f24af7700 1 -- 192.168.123.103:0/2745170142 learned_addr learned my addr 192.168.123.103:0/2745170142 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:06:48.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.832+0000 7f7f24af7700 1 -- 192.168.123.103:0/2745170142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f20071980 msgr2=0x7f7f20082560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:48.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.832+0000 7f7f24af7700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f20071980 0x7f7f20082560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:06:48.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.832+0000 7f7f24af7700 1 -- 192.168.123.103:0/2745170142 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f1800a040 con 0x7f7f20082aa0 2026-03-10T14:06:48.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.833+0000 7f7f24af7700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20082aa0 0x7f7f20082f10 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7f7f1800aa80 tx=0x7f7f1800aab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:06:48.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.833+0000 7f7f167fc700 1 -- 192.168.123.103:0/2745170142 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f1800ae90 con 0x7f7f20082aa0 2026-03-10T14:06:48.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.833+0000 7f7f2755c700 1 -- 192.168.123.103:0/2745170142 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f201b2d70 con 0x7f7f20082aa0 2026-03-10T14:06:48.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.833+0000 7f7f2755c700 1 -- 192.168.123.103:0/2745170142 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f201b3260 con 0x7f7f20082aa0 2026-03-10T14:06:48.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.834+0000 7f7f167fc700 1 -- 192.168.123.103:0/2745170142 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7f1800b050 con 0x7f7f20082aa0 2026-03-10T14:06:48.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.834+0000 7f7f167fc700 1 -- 192.168.123.103:0/2745170142 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 
0 0) 0x7f7f18004050 con 0x7f7f20082aa0 2026-03-10T14:06:48.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.838+0000 7f7f167fc700 1 -- 192.168.123.103:0/2745170142 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7f18022070 con 0x7f7f20082aa0 2026-03-10T14:06:48.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.839+0000 7f7f167fc700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f0c06c7a0 0x7f7f0c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:06:48.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.839+0000 7f7f167fc700 1 -- 192.168.123.103:0/2745170142 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7f180949e0 con 0x7f7f20082aa0 2026-03-10T14:06:48.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.839+0000 7f7f2755c700 1 -- 192.168.123.103:0/2745170142 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f04005320 con 0x7f7f20082aa0 2026-03-10T14:06:48.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.845+0000 7f7f252f8700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f0c06c7a0 0x7f7f0c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:06:48.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.845+0000 7f7f252f8700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f0c06c7a0 0x7f7f0c06ec50 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f7f1c005950 tx=0x7f7f1c00b7f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:06:48.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:48.849+0000 7f7f167fc700 1 -- 192.168.123.103:0/2745170142 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7f18062c70 con 0x7f7f20082aa0 2026-03-10T14:06:49.088 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 failed cephadm daemon(s) 2026-03-10T14:06:49.089 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s) 2026-03-10T14:06:49.089 INFO:teuthology.orchestra.run.vm03.stdout: daemon ceph-exporter.vm03 on vm03 is in error state 2026-03-10T14:06:49.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.087+0000 7f7f2755c700 1 -- 192.168.123.103:0/2745170142 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f7f04005190 con 0x7f7f20082aa0 2026-03-10T14:06:49.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.088+0000 7f7f167fc700 1 -- 192.168.123.103:0/2745170142 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+151 (secure 0 0 0) 0x7f7f18062800 con 0x7f7f20082aa0 2026-03-10T14:06:49.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 -- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f0c06c7a0 msgr2=0x7f7f0c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:49.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f0c06c7a0 0x7f7f0c06ec50 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f7f1c005950 tx=0x7f7f1c00b7f0 comp rx=0 tx=0).stop 2026-03-10T14:06:49.095 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 -- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20082aa0 msgr2=0x7f7f20082f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:06:49.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20082aa0 0x7f7f20082f10 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7f7f1800aa80 tx=0x7f7f1800aab0 comp rx=0 tx=0).stop 2026-03-10T14:06:49.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 -- 192.168.123.103:0/2745170142 shutdown_connections 2026-03-10T14:06:49.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f0c06c7a0 0x7f7f0c06ec50 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:49.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f20071980 0x7f7f20082560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:49.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 --2- 192.168.123.103:0/2745170142 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f20082aa0 0x7f7f20082f10 unknown :-1 s=CLOSED pgs=305 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:06:49.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 -- 192.168.123.103:0/2745170142 >> 192.168.123.103:0/2745170142 conn(0x7f7f2006d1a0 msgr2=0x7f7f200764c0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T14:06:49.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 -- 192.168.123.103:0/2745170142 shutdown_connections 2026-03-10T14:06:49.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:06:49.095+0000 7f7f0bfff700 1 -- 192.168.123.103:0/2745170142 wait complete. 2026-03-10T14:06:49.124 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:49 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/356264299' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:06:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:49 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/356264299' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:06:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:49 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/2344047340' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:06:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:49 vm03.local ceph-mon[49718]: pgmap v123: 65 pgs: 65 active+clean; 46 MiB data, 449 MiB used, 120 GiB / 120 GiB avail; 1.1 MiB/s rd, 3.6 MiB/s wr, 367 op/s 2026-03-10T14:06:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:49 vm03.local ceph-mon[49718]: from='client.14622 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:49 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/2745170142' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:06:49.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:49 vm04.local ceph-mon[55966]: from='client.? 
192.168.123.103:0/2344047340' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:06:49.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:49 vm04.local ceph-mon[55966]: pgmap v123: 65 pgs: 65 active+clean; 46 MiB data, 449 MiB used, 120 GiB / 120 GiB avail; 1.1 MiB/s rd, 3.6 MiB/s wr, 367 op/s 2026-03-10T14:06:49.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:49 vm04.local ceph-mon[55966]: from='client.14622 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:06:49.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:49 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/2745170142' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:06:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:51 vm04.local ceph-mon[55966]: pgmap v124: 65 pgs: 65 active+clean; 51 MiB data, 460 MiB used, 120 GiB / 120 GiB avail; 1.2 MiB/s rd, 3.1 MiB/s wr, 335 op/s 2026-03-10T14:06:52.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:51 vm03.local ceph-mon[49718]: pgmap v124: 65 pgs: 65 active+clean; 51 MiB data, 460 MiB used, 120 GiB / 120 GiB avail; 1.2 MiB/s rd, 3.1 MiB/s wr, 335 op/s 2026-03-10T14:06:53.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:53 vm04.local ceph-mon[55966]: pgmap v125: 65 pgs: 65 active+clean; 51 MiB data, 496 MiB used, 119 GiB / 120 GiB avail; 1.2 MiB/s rd, 3.1 MiB/s wr, 343 op/s 2026-03-10T14:06:53.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:53 vm03.local ceph-mon[49718]: pgmap v125: 65 pgs: 65 active+clean; 51 MiB data, 496 MiB used, 119 GiB / 120 GiB avail; 1.2 MiB/s rd, 3.1 MiB/s wr, 343 op/s 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 
2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr: git switch -c 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr:Or undo this operation with: 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr: git switch - 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-10T14:06:53.793 INFO:tasks.workunit.client.0.vm03.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T14:06:53.794 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-10T14:06:53.794 INFO:tasks.workunit.client.0.vm03.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T14:06:53.799 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-10T14:06:53.861 INFO:tasks.workunit.client.0.vm03.stdout:for d in 
direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T14:06:53.863 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T14:06:53.863 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T14:06:53.951 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T14:06:54.008 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T14:06:54.053 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T14:06:54.056 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T14:06:54.056 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T14:06:54.105 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T14:06:54.110 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:06:54.110 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-10T14:06:54.175 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0... 2026-03-10T14:06:54.176 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 
2026-03-10T14:06:54.176 DEBUG:teuthology.orchestra.run.vm03:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh
2026-03-10T14:06:54.259 INFO:tasks.workunit.client.0.vm03.stderr:+ mkdir -p fsstress
2026-03-10T14:06:54.267 INFO:tasks.workunit.client.0.vm03.stderr:+ pushd fsstress
2026-03-10T14:06:54.268 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp
2026-03-10T14:06:54.269 INFO:tasks.workunit.client.0.vm03.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz
2026-03-10T14:06:55.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:54 vm03.local ceph-mon[49718]: pgmap v126: 65 pgs: 65 active+clean; 57 MiB data, 540 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 3.4 MiB/s wr, 343 op/s
2026-03-10T14:06:55.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:54 vm04.local ceph-mon[55966]: pgmap v126: 65 pgs: 65 active+clean; 57 MiB data, 540 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 3.4 MiB/s wr, 343 op/s
2026-03-10T14:06:55.644 INFO:tasks.workunit.client.0.vm03.stderr:+ tar xzf ltp-full.tgz
2026-03-10T14:06:57.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:56 vm04.local ceph-mon[55966]: pgmap v127: 65 pgs: 65 active+clean; 62 MiB data, 590 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.6 MiB/s wr, 260 op/s
2026-03-10T14:06:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:56 vm03.local ceph-mon[49718]: pgmap v127: 65 pgs: 65 active+clean; 62 MiB data, 590 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 2.6 MiB/s wr, 260 op/s
2026-03-10T14:06:58.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:58 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:06:58.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:58 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:06:59.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:06:59 vm04.local ceph-mon[55966]: pgmap v128: 65 pgs: 65 active+clean; 76 MiB data, 682 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 3.6 MiB/s wr, 320 op/s
2026-03-10T14:06:59.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:06:59 vm03.local ceph-mon[49718]: pgmap v128: 65 pgs: 65 active+clean; 76 MiB data, 682 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 3.6 MiB/s wr, 320 op/s
2026-03-10T14:07:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:01 vm04.local ceph-mon[55966]: pgmap v129: 65 pgs: 65 active+clean; 82 MiB data, 715 MiB used, 119 GiB / 120 GiB avail; 1.3 MiB/s rd, 3.2 MiB/s wr, 273 op/s
2026-03-10T14:07:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:01 vm03.local ceph-mon[49718]: pgmap v129: 65 pgs: 65 active+clean; 82 MiB data, 715 MiB used, 119 GiB / 120 GiB avail; 1.3 MiB/s rd, 3.2 MiB/s wr, 273 op/s
2026-03-10T14:07:04.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:03 vm04.local ceph-mon[55966]: pgmap v130: 65 pgs: 65 active+clean; 87 MiB data, 767 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 3.2 MiB/s wr, 317 op/s
2026-03-10T14:07:04.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:03 vm03.local ceph-mon[49718]: pgmap v130: 65 pgs: 65 active+clean; 87 MiB data, 767 MiB used, 119 GiB / 120 GiB avail; 1.1 MiB/s rd, 3.2 MiB/s wr, 317 op/s
2026-03-10T14:07:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:05 vm03.local ceph-mon[49718]: pgmap v131: 65 pgs: 65 active+clean; 100 MiB data, 898 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 4.3 MiB/s wr, 409 op/s
2026-03-10T14:07:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:05 vm04.local ceph-mon[55966]: pgmap v131: 65 pgs: 65 active+clean; 100 MiB data, 898 MiB used, 119 GiB / 120 GiB avail; 1.7 MiB/s rd, 4.3 MiB/s wr, 409 op/s
2026-03-10T14:07:07.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:06 vm04.local ceph-mon[55966]: pgmap v132: 65 pgs: 65 active+clean; 101 MiB data, 910 MiB used, 119 GiB / 120 GiB avail; 1.2 MiB/s rd, 3.9 MiB/s wr, 379 op/s
2026-03-10T14:07:07.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:06 vm03.local ceph-mon[49718]: pgmap v132: 65 pgs: 65 active+clean; 101 MiB data, 910 MiB used, 119 GiB / 120 GiB avail; 1.2 MiB/s rd, 3.9 MiB/s wr, 379 op/s
2026-03-10T14:07:09.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:09 vm03.local ceph-mon[49718]: pgmap v133: 65 pgs: 65 active+clean; 113 MiB data, 913 MiB used, 119 GiB / 120 GiB avail; 2.0 MiB/s rd, 4.5 MiB/s wr, 429 op/s
2026-03-10T14:07:09.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:09 vm04.local ceph-mon[55966]: pgmap v133: 65 pgs: 65 active+clean; 113 MiB data, 913 MiB used, 119 GiB / 120 GiB avail; 2.0 MiB/s rd, 4.5 MiB/s wr, 429 op/s
2026-03-10T14:07:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:11 vm04.local ceph-mon[55966]: pgmap v134: 65 pgs: 65 active+clean; 115 MiB data, 966 MiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.4 MiB/s wr, 359 op/s
2026-03-10T14:07:11.859 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:11 vm03.local ceph-mon[49718]: pgmap v134: 65 pgs: 65 active+clean; 115 MiB data, 966 MiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.4 MiB/s wr, 359 op/s
2026-03-10T14:07:14.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:13 vm04.local ceph-mon[55966]: pgmap v135: 65 pgs: 65 active+clean; 118 MiB data, 1009 MiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.2 MiB/s wr, 378 op/s
2026-03-10T14:07:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:13 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:07:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:13 vm03.local ceph-mon[49718]: pgmap v135: 65 pgs: 65 active+clean; 118 MiB data, 1009 MiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 3.2 MiB/s wr, 378 op/s
2026-03-10T14:07:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:13 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:07:15.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:15 vm04.local ceph-mon[55966]: pgmap v136: 65 pgs: 65 active+clean; 135 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.1 MiB/s rd, 4.3 MiB/s wr, 425 op/s
2026-03-10T14:07:15.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:15 vm03.local ceph-mon[49718]: pgmap v136: 65 pgs: 65 active+clean; 135 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.1 MiB/s rd, 4.3 MiB/s wr, 425 op/s
2026-03-10T14:07:17.140 INFO:tasks.workunit.client.1.vm04.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress
2026-03-10T14:07:17.146 INFO:tasks.workunit.client.1.vm04.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-10T14:07:17.146 INFO:tasks.workunit.client.1.vm04.stderr:+ make
2026-03-10T14:07:17.729 INFO:tasks.workunit.client.1.vm04.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress
2026-03-10T14:07:17.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:17 vm03.local ceph-mon[49718]: pgmap v137: 65 pgs: 65 active+clean; 143 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 1.8 MiB/s rd, 3.8 MiB/s wr, 339 op/s
2026-03-10T14:07:18.041 INFO:tasks.workunit.client.1.vm04.stderr:++ readlink -f fsstress
2026-03-10T14:07:18.043 INFO:tasks.workunit.client.1.vm04.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress
2026-03-10T14:07:18.043 INFO:tasks.workunit.client.1.vm04.stderr:+ popd
2026-03-10T14:07:18.044 INFO:tasks.workunit.client.1.vm04.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-10T14:07:18.044 INFO:tasks.workunit.client.1.vm04.stderr:+ popd
2026-03-10T14:07:18.045 INFO:tasks.workunit.client.1.vm04.stdout:~/cephtest/mnt.1/client.1/tmp
2026-03-10T14:07:18.045 INFO:tasks.workunit.client.1.vm04.stderr:++ mktemp -d -p .
2026-03-10T14:07:18.050 INFO:tasks.workunit.client.1.vm04.stderr:+ T=./tmp.oa6XzvJ2NI
2026-03-10T14:07:18.050 INFO:tasks.workunit.client.1.vm04.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.oa6XzvJ2NI -l 1 -n 1000 -p 10 -v
2026-03-10T14:07:18.062 INFO:tasks.workunit.client.1.vm04.stdout:seed = 1773198286
2026-03-10T14:07:18.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:17 vm04.local ceph-mon[55966]: pgmap v137: 65 pgs: 65 active+clean; 143 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 1.8 MiB/s rd, 3.8 MiB/s wr, 339 op/s
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/0: link - no file
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/1: rename - no filename
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/2: write - no filename
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/3: chown . 0 1
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/4: fsync - no filename
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/5: write - no filename
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/6: truncate - no filename
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/7: dread - no filename
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/8: rename - no filename
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/9: chown . 0 1
2026-03-10T14:07:18.086 INFO:tasks.workunit.client.1.vm04.stdout:9/10: dread - no filename
2026-03-10T14:07:18.108 INFO:tasks.workunit.client.1.vm04.stdout:7/0: link - no file
2026-03-10T14:07:18.130 INFO:tasks.workunit.client.1.vm04.stdout:7/1: creat f0 x:0 0 0
2026-03-10T14:07:18.130 INFO:tasks.workunit.client.1.vm04.stdout:7/2: stat f0 0
2026-03-10T14:07:18.134 INFO:tasks.workunit.client.1.vm04.stdout:6/0: dread - no filename
2026-03-10T14:07:18.134 INFO:tasks.workunit.client.1.vm04.stdout:6/1: readlink - no filename
2026-03-10T14:07:18.134 INFO:tasks.workunit.client.1.vm04.stdout:6/2: dread - no filename
2026-03-10T14:07:18.134 INFO:tasks.workunit.client.1.vm04.stdout:6/3: readlink - no filename
2026-03-10T14:07:18.134 INFO:tasks.workunit.client.1.vm04.stdout:6/4: write - no filename
2026-03-10T14:07:18.134 INFO:tasks.workunit.client.1.vm04.stdout:6/5: write - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/0: link - no file
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/1: fdatasync - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/2: rename - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/3: rmdir - no directory
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/4: truncate - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/5: fdatasync - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/6: rename - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/7: read - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/8: dread - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/9: dread - no filename
2026-03-10T14:07:18.140 INFO:tasks.workunit.client.1.vm04.stdout:5/10: link - no file
2026-03-10T14:07:18.152 INFO:tasks.workunit.client.1.vm04.stdout:8/0: mkdir d0 0
2026-03-10T14:07:18.152 INFO:tasks.workunit.client.1.vm04.stdout:8/1: write - no filename
2026-03-10T14:07:18.152 INFO:tasks.workunit.client.1.vm04.stdout:8/2: dread - no filename
2026-03-10T14:07:18.152 INFO:tasks.workunit.client.1.vm04.stdout:8/3: dwrite - no filename
2026-03-10T14:07:18.152 INFO:tasks.workunit.client.1.vm04.stdout:7/3: link f0 f1 0
2026-03-10T14:07:18.152 INFO:tasks.workunit.client.1.vm04.stdout:7/4: read - f0 zero size
2026-03-10T14:07:18.152 INFO:tasks.workunit.client.1.vm04.stdout:7/5: dread - f0 zero size
2026-03-10T14:07:18.153 INFO:tasks.workunit.client.1.vm04.stdout:7/6: chown f0 827956 1
2026-03-10T14:07:18.153 INFO:tasks.workunit.client.1.vm04.stdout:4/0: write - no filename
2026-03-10T14:07:18.153 INFO:tasks.workunit.client.1.vm04.stdout:4/1: dread - no filename
2026-03-10T14:07:18.155 INFO:tasks.workunit.client.1.vm04.stdout:6/6: mknod c0 0
2026-03-10T14:07:18.162 INFO:tasks.workunit.client.1.vm04.stdout:2/0: stat - no entries
2026-03-10T14:07:18.166 INFO:tasks.workunit.client.1.vm04.stdout:2/1: write - no filename
2026-03-10T14:07:18.166 INFO:tasks.workunit.client.1.vm04.stdout:2/2: readlink - no filename
2026-03-10T14:07:18.167 INFO:tasks.workunit.client.1.vm04.stdout:7/7: mkdir d2 0
2026-03-10T14:07:18.167 INFO:tasks.workunit.client.1.vm04.stdout:7/8: read - f1 zero size
2026-03-10T14:07:18.167 INFO:tasks.workunit.client.1.vm04.stdout:5/11: creat f0 x:0 0 0
2026-03-10T14:07:18.167 INFO:tasks.workunit.client.1.vm04.stdout:5/12: readlink - no filename
2026-03-10T14:07:18.167 INFO:tasks.workunit.client.1.vm04.stdout:5/13: write f0 [934197,10519] 0
2026-03-10T14:07:18.170 INFO:tasks.workunit.client.1.vm04.stdout:8/4: mknod d0/c1 0
2026-03-10T14:07:18.170 INFO:tasks.workunit.client.1.vm04.stdout:8/5: read - no filename
2026-03-10T14:07:18.170 INFO:tasks.workunit.client.1.vm04.stdout:8/6: dread - no filename
2026-03-10T14:07:18.170 INFO:tasks.workunit.client.1.vm04.stdout:6/7: creat f1 x:0 0 0
2026-03-10T14:07:18.177 INFO:tasks.workunit.client.1.vm04.stdout:4/2: creat f0 x:0 0 0
2026-03-10T14:07:18.178 INFO:tasks.workunit.client.1.vm04.stdout:8/7: creat d0/f2 x:0 0 0
2026-03-10T14:07:18.178 INFO:tasks.workunit.client.1.vm04.stdout:4/3: dread - f0 zero size
2026-03-10T14:07:18.178 INFO:tasks.workunit.client.1.vm04.stdout:2/3: mkdir d0 0
2026-03-10T14:07:18.178 INFO:tasks.workunit.client.1.vm04.stdout:2/4: write - no filename
2026-03-10T14:07:18.178 INFO:tasks.workunit.client.1.vm04.stdout:2/5: dwrite - no filename
2026-03-10T14:07:18.179 INFO:tasks.workunit.client.1.vm04.stdout:4/4: write f0 [400660,55741] 0
2026-03-10T14:07:18.191 INFO:tasks.workunit.client.1.vm04.stdout:3/0: chown . 59660 1
2026-03-10T14:07:18.211 INFO:tasks.workunit.client.1.vm04.stdout:5/14: dwrite f0 [0,4194304] 0
2026-03-10T14:07:18.211 INFO:tasks.workunit.client.1.vm04.stdout:4/5: write f0 [849670,18202] 0
2026-03-10T14:07:18.211 INFO:tasks.workunit.client.1.vm04.stdout:3/1: symlink l0 0
2026-03-10T14:07:18.211 INFO:tasks.workunit.client.1.vm04.stdout:8/8: dwrite d0/f2 [0,4194304] 0
2026-03-10T14:07:18.211 INFO:tasks.workunit.client.1.vm04.stdout:8/9: chown d0 114474 1
2026-03-10T14:07:18.211 INFO:tasks.workunit.client.1.vm04.stdout:6/8: dwrite f1 [0,4194304] 0
2026-03-10T14:07:18.211 INFO:tasks.workunit.client.1.vm04.stdout:2/6: mknod d0/c1 0
2026-03-10T14:07:18.211 INFO:tasks.workunit.client.1.vm04.stdout:6/9: chown c0 6718 1
2026-03-10T14:07:18.214 INFO:tasks.workunit.client.1.vm04.stdout:5/15: dread f0 [0,4194304] 0
2026-03-10T14:07:18.214 INFO:tasks.workunit.client.1.vm04.stdout:5/16: chown f0 8297185 1
2026-03-10T14:07:18.214 INFO:tasks.workunit.client.1.vm04.stdout:5/17: rmdir - no directory
2026-03-10T14:07:18.215 INFO:tasks.workunit.client.1.vm04.stdout:5/18: write f0 [746044,2300] 0
2026-03-10T14:07:18.216 INFO:tasks.workunit.client.1.vm04.stdout:6/10: dread f1 [0,4194304] 0
2026-03-10T14:07:18.220 INFO:tasks.workunit.client.1.vm04.stdout:1/0: readlink - no filename
2026-03-10T14:07:18.220 INFO:tasks.workunit.client.1.vm04.stdout:1/1: dread - no filename
2026-03-10T14:07:18.220 INFO:tasks.workunit.client.1.vm04.stdout:1/2: write - no filename
2026-03-10T14:07:18.220 INFO:tasks.workunit.client.1.vm04.stdout:1/3: write - no filename
2026-03-10T14:07:18.221 INFO:tasks.workunit.client.1.vm04.stdout:1/4: chown . 648567 1
2026-03-10T14:07:18.221 INFO:tasks.workunit.client.1.vm04.stdout:1/5: dwrite - no filename
2026-03-10T14:07:18.221 INFO:tasks.workunit.client.1.vm04.stdout:1/6: unlink - no file
2026-03-10T14:07:18.221 INFO:tasks.workunit.client.1.vm04.stdout:1/7: dwrite - no filename
2026-03-10T14:07:18.221 INFO:tasks.workunit.client.1.vm04.stdout:1/8: rename - no filename
2026-03-10T14:07:18.221 INFO:tasks.workunit.client.1.vm04.stdout:1/9: write - no filename
2026-03-10T14:07:18.221 INFO:tasks.workunit.client.1.vm04.stdout:1/10: rename - no filename
2026-03-10T14:07:18.240 INFO:tasks.workunit.client.1.vm04.stdout:6/11: unlink f1 0
2026-03-10T14:07:18.255 INFO:tasks.workunit.client.1.vm04.stdout:6/12: dwrite - no filename
2026-03-10T14:07:18.255 INFO:tasks.workunit.client.1.vm04.stdout:6/13: write - no filename
2026-03-10T14:07:18.255 INFO:tasks.workunit.client.1.vm04.stdout:6/14: write - no filename
2026-03-10T14:07:18.256 INFO:tasks.workunit.client.1.vm04.stdout:2/7: link d0/c1 d0/c2 0
2026-03-10T14:07:18.256 INFO:tasks.workunit.client.1.vm04.stdout:2/8: dwrite - no filename
2026-03-10T14:07:18.256 INFO:tasks.workunit.client.1.vm04.stdout:6/15: creat f2 x:0 0 0
2026-03-10T14:07:18.256 INFO:tasks.workunit.client.1.vm04.stdout:6/16: write f2 [881689,129875] 0
2026-03-10T14:07:18.256 INFO:tasks.workunit.client.1.vm04.stdout:6/17: dwrite f2 [0,4194304] 0
2026-03-10T14:07:18.257 INFO:tasks.workunit.client.1.vm04.stdout:2/9: mkdir d0/d3 0
2026-03-10T14:07:18.257 INFO:tasks.workunit.client.1.vm04.stdout:2/10: write - no filename
2026-03-10T14:07:18.257 INFO:tasks.workunit.client.1.vm04.stdout:2/11: truncate - no filename
2026-03-10T14:07:18.257 INFO:tasks.workunit.client.1.vm04.stdout:0/0: mkdir d0 0
2026-03-10T14:07:18.257 INFO:tasks.workunit.client.1.vm04.stdout:0/1: write - no filename
2026-03-10T14:07:18.257 INFO:tasks.workunit.client.1.vm04.stdout:0/2: dwrite - no filename
2026-03-10T14:07:18.257 INFO:tasks.workunit.client.1.vm04.stdout:6/18: read f2 [3327402,90897] 0
2026-03-10T14:07:18.264 INFO:tasks.workunit.client.1.vm04.stdout:0/3: symlink d0/l1 0
2026-03-10T14:07:18.264 INFO:tasks.workunit.client.1.vm04.stdout:0/4: dread - no filename
2026-03-10T14:07:18.266 INFO:tasks.workunit.client.1.vm04.stdout:2/12: unlink d0/c2 0
2026-03-10T14:07:18.267 INFO:tasks.workunit.client.1.vm04.stdout:0/5: mkdir d0/d2 0
2026-03-10T14:07:18.270 INFO:tasks.workunit.client.1.vm04.stdout:2/13: mkdir d0/d4 0
2026-03-10T14:07:18.272 INFO:tasks.workunit.client.1.vm04.stdout:0/6: creat d0/f3 x:0 0 0
2026-03-10T14:07:18.273 INFO:tasks.workunit.client.1.vm04.stdout:0/7: write d0/f3 [385152,113155] 0
2026-03-10T14:07:18.276 INFO:tasks.workunit.client.1.vm04.stdout:0/8: creat d0/f4 x:0 0 0
2026-03-10T14:07:18.430 INFO:tasks.workunit.client.1.vm04.stdout:4/6: dread f0 [0,4194304] 0
2026-03-10T14:07:18.432 INFO:tasks.workunit.client.1.vm04.stdout:4/7: creat f1 x:0 0 0
2026-03-10T14:07:18.433 INFO:tasks.workunit.client.1.vm04.stdout:4/8: creat f2 x:0 0 0
2026-03-10T14:07:18.433 INFO:tasks.workunit.client.1.vm04.stdout:4/9: creat f3 x:0 0 0
2026-03-10T14:07:18.434 INFO:tasks.workunit.client.1.vm04.stdout:4/10: read - f3 zero size
2026-03-10T14:07:18.434 INFO:tasks.workunit.client.1.vm04.stdout:4/11: mkdir d4 0
2026-03-10T14:07:18.435 INFO:tasks.workunit.client.1.vm04.stdout:4/12: mknod d4/c5 0
2026-03-10T14:07:18.435 INFO:tasks.workunit.client.1.vm04.stdout:4/13: truncate f0 1588235 0
2026-03-10T14:07:18.436 INFO:tasks.workunit.client.1.vm04.stdout:4/14: creat d4/f6 x:0 0 0
2026-03-10T14:07:18.437 INFO:tasks.workunit.client.1.vm04.stdout:4/15: symlink d4/l7 0
2026-03-10T14:07:18.438 INFO:tasks.workunit.client.1.vm04.stdout:4/16: symlink d4/l8 0
2026-03-10T14:07:18.445 INFO:tasks.workunit.client.1.vm04.stdout:4/17: dwrite f0 [0,4194304] 0
2026-03-10T14:07:18.449 INFO:tasks.workunit.client.1.vm04.stdout:4/18: link d4/f6 d4/f9 0
2026-03-10T14:07:18.451 INFO:tasks.workunit.client.1.vm04.stdout:4/19: creat d4/fa x:0 0 0
2026-03-10T14:07:18.454 INFO:tasks.workunit.client.1.vm04.stdout:4/20: dread f0 [0,4194304] 0
2026-03-10T14:07:18.551 INFO:tasks.workunit.client.1.vm04.stdout:7/9: truncate f0 607936 0
2026-03-10T14:07:18.557 INFO:tasks.workunit.client.1.vm04.stdout:5/19: truncate f0 2384 0
2026-03-10T14:07:18.560 INFO:tasks.workunit.client.1.vm04.stdout:8/10: truncate d0/f2 4102152 0
2026-03-10T14:07:18.564 INFO:tasks.workunit.client.1.vm04.stdout:6/19: getdents . 0
2026-03-10T14:07:18.564 INFO:tasks.workunit.client.1.vm04.stdout:6/20: mkdir d3 0
2026-03-10T14:07:18.567 INFO:tasks.workunit.client.1.vm04.stdout:6/21: dread f2 [0,4194304] 0
2026-03-10T14:07:18.568 INFO:tasks.workunit.client.1.vm04.stdout:6/22: fdatasync f2 0
2026-03-10T14:07:18.569 INFO:tasks.workunit.client.1.vm04.stdout:6/23: dread f2 [0,4194304] 0
2026-03-10T14:07:18.586 INFO:tasks.workunit.client.1.vm04.stdout:2/14: getdents d0 0
2026-03-10T14:07:18.588 INFO:tasks.workunit.client.1.vm04.stdout:2/15: chown d0 109 1
2026-03-10T14:07:18.588 INFO:tasks.workunit.client.1.vm04.stdout:4/21: rmdir d4 39
2026-03-10T14:07:18.589 INFO:tasks.workunit.client.1.vm04.stdout:6/24: dwrite f2 [4194304,4194304] 0
2026-03-10T14:07:18.594 INFO:tasks.workunit.client.1.vm04.stdout:6/25: write f2 [8003828,69430] 0
2026-03-10T14:07:18.595 INFO:tasks.workunit.client.1.vm04.stdout:6/26: write f2 [6574370,3287] 0
2026-03-10T14:07:18.649 INFO:tasks.workunit.client.1.vm04.stdout:2/16: chown d0/c1 4 1
2026-03-10T14:07:18.667 INFO:tasks.workunit.client.1.vm04.stdout:2/17: dwrite - no filename
2026-03-10T14:07:18.667 INFO:tasks.workunit.client.1.vm04.stdout:4/22: symlink d4/lb 0
2026-03-10T14:07:18.667 INFO:tasks.workunit.client.1.vm04.stdout:4/23: write f1 [863017,123996] 0
2026-03-10T14:07:18.667 INFO:tasks.workunit.client.1.vm04.stdout:2/18: creat d0/d3/f5 x:0 0 0
2026-03-10T14:07:18.668 INFO:tasks.workunit.client.1.vm04.stdout:2/19: fdatasync d0/d3/f5 0
2026-03-10T14:07:18.668 INFO:tasks.workunit.client.1.vm04.stdout:2/20: unlink d0/d3/f5 0
2026-03-10T14:07:18.668 INFO:tasks.workunit.client.1.vm04.stdout:2/21: fdatasync - no filename
2026-03-10T14:07:18.668 INFO:tasks.workunit.client.1.vm04.stdout:2/22: write - no filename
2026-03-10T14:07:18.668 INFO:tasks.workunit.client.1.vm04.stdout:2/23: fsync - no filename
2026-03-10T14:07:18.668 INFO:tasks.workunit.client.1.vm04.stdout:2/24: dread - no filename
2026-03-10T14:07:18.668 INFO:tasks.workunit.client.1.vm04.stdout:2/25: dread - no filename
2026-03-10T14:07:18.668 INFO:tasks.workunit.client.1.vm04.stdout:2/26: creat d0/d4/f6 x:0 0 0
2026-03-10T14:07:18.747 INFO:tasks.workunit.client.1.vm04.stdout:7/10: rename f1 to d2/f3 0
2026-03-10T14:07:18.748 INFO:tasks.workunit.client.1.vm04.stdout:5/20: fsync f0 0
2026-03-10T14:07:18.748 INFO:tasks.workunit.client.1.vm04.stdout:5/21: read f0 [469,89189] 0
2026-03-10T14:07:18.748 INFO:tasks.workunit.client.1.vm04.stdout:5/22: rmdir - no directory
2026-03-10T14:07:18.749 INFO:tasks.workunit.client.1.vm04.stdout:5/23: write f0 [967370,56447] 0
2026-03-10T14:07:18.751 INFO:tasks.workunit.client.1.vm04.stdout:6/27: rename f2 to d3/f4 0
2026-03-10T14:07:18.755 INFO:tasks.workunit.client.1.vm04.stdout:5/24: unlink f0 0
2026-03-10T14:07:18.755 INFO:tasks.workunit.client.1.vm04.stdout:7/11: fdatasync f0 0
2026-03-10T14:07:18.755 INFO:tasks.workunit.client.1.vm04.stdout:6/28: write d3/f4 [2013351,15784] 0
2026-03-10T14:07:18.758 INFO:tasks.workunit.client.1.vm04.stdout:7/12: dread f0 [0,4194304] 0
2026-03-10T14:07:18.758 INFO:tasks.workunit.client.1.vm04.stdout:6/29: dread d3/f4 [4194304,4194304] 0
2026-03-10T14:07:18.759 INFO:tasks.workunit.client.1.vm04.stdout:5/25: symlink l1 0
2026-03-10T14:07:18.762 INFO:tasks.workunit.client.1.vm04.stdout:7/13: creat d2/f4 x:0 0 0
2026-03-10T14:07:18.763 INFO:tasks.workunit.client.1.vm04.stdout:7/14: dread f0 [0,4194304] 0
2026-03-10T14:07:18.763 INFO:tasks.workunit.client.1.vm04.stdout:7/15: write d2/f4 [315872,99072] 0
2026-03-10T14:07:18.764 INFO:tasks.workunit.client.1.vm04.stdout:7/16: truncate d2/f4 1352667 0
2026-03-10T14:07:18.764 INFO:tasks.workunit.client.1.vm04.stdout:7/17: rename d2 to d2/d5 22
2026-03-10T14:07:18.765 INFO:tasks.workunit.client.1.vm04.stdout:7/18: dread f0 [0,4194304] 0
2026-03-10T14:07:18.770 INFO:tasks.workunit.client.1.vm04.stdout:6/30: link d3/f4 d3/f5 0
2026-03-10T14:07:18.784 INFO:tasks.workunit.client.1.vm04.stdout:7/19: symlink d2/l6 0
2026-03-10T14:07:18.784 INFO:tasks.workunit.client.1.vm04.stdout:7/20: chown d2/f4 166679 1
2026-03-10T14:07:18.784 INFO:tasks.workunit.client.1.vm04.stdout:6/31: link c0 d3/c6 0
2026-03-10T14:07:18.784 INFO:tasks.workunit.client.1.vm04.stdout:6/32: write d3/f4 [1643345,44408] 0
2026-03-10T14:07:18.786 INFO:tasks.workunit.client.1.vm04.stdout:6/33: dwrite d3/f4 [8388608,4194304] 0
2026-03-10T14:07:18.794 INFO:tasks.workunit.client.1.vm04.stdout:6/34: mknod d3/c7 0
2026-03-10T14:07:18.802 INFO:tasks.workunit.client.1.vm04.stdout:6/35: mkdir d3/d8 0
2026-03-10T14:07:18.802 INFO:tasks.workunit.client.1.vm04.stdout:6/36: unlink d3/c6 0
2026-03-10T14:07:18.805 INFO:tasks.workunit.client.1.vm04.stdout:6/37: creat d3/f9 x:0 0 0
2026-03-10T14:07:18.805 INFO:tasks.workunit.client.1.vm04.stdout:6/38: chown d3/c7 187692444 1
2026-03-10T14:07:18.811 INFO:tasks.workunit.client.1.vm04.stdout:6/39: dwrite d3/f5 [0,4194304] 0
2026-03-10T14:07:18.923 INFO:tasks.workunit.client.1.vm04.stdout:8/11: dwrite d0/f2 [0,4194304] 0
2026-03-10T14:07:18.924 INFO:tasks.workunit.client.1.vm04.stdout:8/12: chown d0 14 1
2026-03-10T14:07:18.925 INFO:tasks.workunit.client.1.vm04.stdout:8/13: mkdir d0/d3 0
2026-03-10T14:07:18.931 INFO:tasks.workunit.client.1.vm04.stdout:8/14: dwrite d0/f2 [0,4194304] 0
2026-03-10T14:07:18.932 INFO:tasks.workunit.client.1.vm04.stdout:8/15: write d0/f2 [3913962,80683] 0
2026-03-10T14:07:18.933 INFO:tasks.workunit.client.1.vm04.stdout:8/16: chown d0 428 1
2026-03-10T14:07:18.933 INFO:tasks.workunit.client.1.vm04.stdout:8/17: write d0/f2 [812269,118817] 0
2026-03-10T14:07:18.934 INFO:tasks.workunit.client.1.vm04.stdout:8/18: stat d0/f2 0
2026-03-10T14:07:18.956 INFO:tasks.workunit.client.1.vm04.stdout:9/11: sync
2026-03-10T14:07:18.956 INFO:tasks.workunit.client.1.vm04.stdout:3/2: sync
2026-03-10T14:07:18.956 INFO:tasks.workunit.client.1.vm04.stdout:3/3: dread - no filename
2026-03-10T14:07:18.960 INFO:tasks.workunit.client.1.vm04.stdout:9/12: creat f0 x:0 0 0
2026-03-10T14:07:18.970 INFO:tasks.workunit.client.1.vm04.stdout:9/13: chown f0 8 1
2026-03-10T14:07:18.970 INFO:tasks.workunit.client.1.vm04.stdout:9/14: dwrite f0 [0,4194304] 0
2026-03-10T14:07:18.970 INFO:tasks.workunit.client.1.vm04.stdout:9/15: mknod c1 0
2026-03-10T14:07:18.977 INFO:tasks.workunit.client.1.vm04.stdout:9/16: rename f0 to f2 0
2026-03-10T14:07:18.984 INFO:tasks.workunit.client.1.vm04.stdout:9/17: creat f3 x:0 0 0
2026-03-10T14:07:18.990 INFO:tasks.workunit.client.1.vm04.stdout:9/18: symlink l4 0
2026-03-10T14:07:19.004 INFO:tasks.workunit.client.1.vm04.stdout:1/11: sync
2026-03-10T14:07:19.004 INFO:tasks.workunit.client.1.vm04.stdout:0/9: sync
2026-03-10T14:07:19.004 INFO:tasks.workunit.client.1.vm04.stdout:7/21: sync
2026-03-10T14:07:19.004 INFO:tasks.workunit.client.1.vm04.stdout:3/4: sync
2026-03-10T14:07:19.005 INFO:tasks.workunit.client.1.vm04.stdout:0/10: stat d0/d2 0
2026-03-10T14:07:19.005 INFO:tasks.workunit.client.1.vm04.stdout:0/11: readlink d0/l1 0
2026-03-10T14:07:19.009 INFO:tasks.workunit.client.1.vm04.stdout:3/5: rename l0 to l1 0
2026-03-10T14:07:19.009 INFO:tasks.workunit.client.1.vm04.stdout:3/6: dwrite - no filename
2026-03-10T14:07:19.010 INFO:tasks.workunit.client.1.vm04.stdout:1/12: creat f0 x:0 0 0
2026-03-10T14:07:19.012 INFO:tasks.workunit.client.1.vm04.stdout:1/13: rename f0 to f1 0
2026-03-10T14:07:19.013 INFO:tasks.workunit.client.1.vm04.stdout:1/14: chown f1 1984 1
2026-03-10T14:07:19.013 INFO:tasks.workunit.client.1.vm04.stdout:1/15: chown f1 264919793 1
2026-03-10T14:07:19.016 INFO:tasks.workunit.client.1.vm04.stdout:1/16: creat f2 x:0 0 0
2026-03-10T14:07:19.031 INFO:tasks.workunit.client.1.vm04.stdout:1/17: chown f1 571 1
2026-03-10T14:07:19.031 INFO:tasks.workunit.client.1.vm04.stdout:1/18: truncate f2 737658 0
2026-03-10T14:07:19.031 INFO:tasks.workunit.client.1.vm04.stdout:1/19: mkdir d3 0
2026-03-10T14:07:19.031 INFO:tasks.workunit.client.1.vm04.stdout:1/20: symlink d3/l4 0
2026-03-10T14:07:19.035 INFO:tasks.workunit.client.1.vm04.stdout:0/12: sync
2026-03-10T14:07:19.035 INFO:tasks.workunit.client.1.vm04.stdout:1/21: sync
2026-03-10T14:07:19.044 INFO:tasks.workunit.client.1.vm04.stdout:0/13: rename d0/l1 to d0/l5 0
2026-03-10T14:07:19.048 INFO:tasks.workunit.client.1.vm04.stdout:1/22: dread f2 [0,4194304] 0
2026-03-10T14:07:19.049 INFO:tasks.workunit.client.1.vm04.stdout:1/23: mkdir d3/d5 0
2026-03-10T14:07:19.051 INFO:tasks.workunit.client.1.vm04.stdout:1/24: write f2 [637942,22291] 0
2026-03-10T14:07:19.052 INFO:tasks.workunit.client.1.vm04.stdout:1/25: write f2 [321738,106777] 0
2026-03-10T14:07:19.053 INFO:tasks.workunit.client.1.vm04.stdout:1/26: symlink d3/d5/l6 0
2026-03-10T14:07:19.056 INFO:tasks.workunit.client.1.vm04.stdout:1/27: mknod d3/c7 0
2026-03-10T14:07:19.058 INFO:tasks.workunit.client.1.vm04.stdout:1/28: creat d3/f8 x:0 0 0
2026-03-10T14:07:19.063 INFO:tasks.workunit.client.1.vm04.stdout:1/29: symlink d3/l9 0
2026-03-10T14:07:19.063 INFO:tasks.workunit.client.1.vm04.stdout:1/30: write f2 [1124943,36602] 0
2026-03-10T14:07:19.064 INFO:tasks.workunit.client.1.vm04.stdout:1/31: chown d3/c7 786 1
2026-03-10T14:07:19.065 INFO:tasks.workunit.client.1.vm04.stdout:1/32: read f2 [764187,10821] 0
2026-03-10T14:07:19.066 INFO:tasks.workunit.client.1.vm04.stdout:1/33: dread f2 [0,4194304] 0
2026-03-10T14:07:19.068 INFO:tasks.workunit.client.1.vm04.stdout:1/34: dread f2 [0,4194304] 0
2026-03-10T14:07:19.068 INFO:tasks.workunit.client.1.vm04.stdout:1/35: readlink d3/l4 0
2026-03-10T14:07:19.092 INFO:tasks.workunit.client.1.vm04.stdout:1/36: dwrite d3/f8 [0,4194304] 0
2026-03-10T14:07:19.101 INFO:tasks.workunit.client.1.vm04.stdout:1/37: creat d3/fa x:0 0 0
2026-03-10T14:07:19.103 INFO:tasks.workunit.client.1.vm04.stdout:1/38: write f1 [771374,23012] 0
2026-03-10T14:07:19.107 INFO:tasks.workunit.client.1.vm04.stdout:1/39: creat d3/fb x:0 0 0
2026-03-10T14:07:19.179 INFO:tasks.workunit.client.1.vm04.stdout:4/24: truncate f1 258298 0
2026-03-10T14:07:19.180 INFO:tasks.workunit.client.1.vm04.stdout:4/25: symlink d4/lc 0
2026-03-10T14:07:19.182 INFO:tasks.workunit.client.1.vm04.stdout:4/26: link d4/f6 d4/fd 0
2026-03-10T14:07:19.183 INFO:tasks.workunit.client.1.vm04.stdout:4/27: dread - d4/f6 zero size
2026-03-10T14:07:19.185 INFO:tasks.workunit.client.1.vm04.stdout:4/28: creat d4/fe x:0 0 0
2026-03-10T14:07:19.185 INFO:tasks.workunit.client.1.vm04.stdout:4/29: write f2 [632632,99515] 0
2026-03-10T14:07:19.189 INFO:tasks.workunit.client.1.vm04.stdout:4/30: dwrite f3 [0,4194304] 0
2026-03-10T14:07:19.190 INFO:tasks.workunit.client.1.vm04.stdout:4/31: chown f0 4090 1
2026-03-10T14:07:19.199 INFO:tasks.workunit.client.1.vm04.stdout:4/32: mkdir d4/df 0
2026-03-10T14:07:19.199 INFO:tasks.workunit.client.1.vm04.stdout:4/33: readlink d4/lc 0
2026-03-10T14:07:19.207 INFO:tasks.workunit.client.1.vm04.stdout:4/34: mkdir d4/df/d10 0
2026-03-10T14:07:19.209 INFO:tasks.workunit.client.1.vm04.stdout:2/27: fsync d0/d4/f6 0
2026-03-10T14:07:19.209 INFO:tasks.workunit.client.1.vm04.stdout:2/28: truncate d0/d4/f6 828110 0
2026-03-10T14:07:19.210 INFO:tasks.workunit.client.1.vm04.stdout:2/29: fdatasync d0/d4/f6 0
2026-03-10T14:07:19.214 INFO:tasks.workunit.client.1.vm04.stdout:2/30: creat d0/d4/f7 x:0 0 0
2026-03-10T14:07:19.214 INFO:tasks.workunit.client.1.vm04.stdout:2/31: dread - d0/d4/f7 zero size
2026-03-10T14:07:19.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.220+0000 7f6af1182700 1 -- 192.168.123.103:0/420381249 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec072330 msgr2=0x7f6aec0770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:19.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.220+0000 7f6af1182700 1 --2- 192.168.123.103:0/420381249 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec072330 0x7f6aec0770b0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f6ae400d3f0 tx=0x7f6ae400d700 comp rx=0 tx=0).stop
2026-03-10T14:07:19.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.220+0000 7f6af1182700 1 -- 192.168.123.103:0/420381249 shutdown_connections
2026-03-10T14:07:19.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.220+0000 7f6af1182700 1 --2- 192.168.123.103:0/420381249 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec072330 0x7f6aec0770b0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:19.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.220+0000 7f6af1182700 1 --2- 192.168.123.103:0/420381249 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6aec071950 0x7f6aec071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:19.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.220+0000 7f6af1182700 1 -- 192.168.123.103:0/420381249 >> 192.168.123.103:0/420381249 conn(0x7f6aec06d1a0 msgr2=0x7f6aec06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.220+0000 7f6af1182700 1 -- 192.168.123.103:0/420381249 shutdown_connections
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.220+0000 7f6af1182700 1 -- 192.168.123.103:0/420381249 wait complete.
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6af1182700 1 Processor -- start
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6af1182700 1 -- start start
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6af1182700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6aec071950 0x7f6aec1312a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6af1182700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec1317e0 0x7f6aec07f440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6af1182700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6aec131ce0 con 0x7f6aec071950
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6af1182700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6aec131e50 con 0x7f6aec1317e0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6aebfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6aec071950 0x7f6aec1312a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6aeb7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec1317e0 0x7f6aec07f440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6aeb7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec1317e0 0x7f6aec07f440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:36238/0 (socket says 192.168.123.103:36238)
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.221+0000 7f6aeb7fe700 1 -- 192.168.123.103:0/2643264408 learned_addr learned my addr 192.168.123.103:0/2643264408 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.222+0000 7f6aeb7fe700 1 -- 192.168.123.103:0/2643264408 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6aec071950 msgr2=0x7f6aec1312a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.222+0000 7f6aeb7fe700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6aec071950 0x7f6aec1312a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.222+0000 7f6aeb7fe700 1 -- 192.168.123.103:0/2643264408 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ae4007ed0 con 0x7f6aec1317e0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.223+0000 7f6aeb7fe700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec1317e0 0x7f6aec07f440 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f6ae4003c30 tx=0x7f6ae4003c60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.223+0000 7f6ae97fa700 1 -- 192.168.123.103:0/2643264408 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ae401c070 con 0x7f6aec1317e0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.223+0000 7f6af1182700 1 -- 192.168.123.103:0/2643264408 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6aec07f980 con 0x7f6aec1317e0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.223+0000 7f6af1182700 1 -- 192.168.123.103:0/2643264408 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6aec07fe40 con 0x7f6aec1317e0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.224+0000 7f6ae97fa700 1 -- 192.168.123.103:0/2643264408 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6ae4004100 con 0x7f6aec1317e0
2026-03-10T14:07:19.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.224+0000 7f6ae97fa700 1 -- 192.168.123.103:0/2643264408 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ae40218c0 con 0x7f6aec1317e0
2026-03-10T14:07:19.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.225+0000 7f6af1182700 1 -- 192.168.123.103:0/2643264408 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ad8005320 con 0x7f6aec1317e0
2026-03-10T14:07:19.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.226+0000 7f6ae97fa700 1 -- 192.168.123.103:0/2643264408 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6ae400f810 con 0x7f6aec1317e0
2026-03-10T14:07:19.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.227+0000 7f6ae97fa700 1 --2-
192.168.123.103:0/2643264408 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ad406c6d0 0x7f6ad406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:19.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.227+0000 7f6ae97fa700 1 -- 192.168.123.103:0/2643264408 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f6ae4013070 con 0x7f6aec1317e0 2026-03-10T14:07:19.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.227+0000 7f6aebfff700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ad406c6d0 0x7f6ad406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:19.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.228+0000 7f6aebfff700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ad406c6d0 0x7f6ad406eb80 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f6adc009c80 tx=0x7f6adc009400 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:19.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.236+0000 7f6ae97fa700 1 -- 192.168.123.103:0/2643264408 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6ae405b980 con 0x7f6aec1317e0 2026-03-10T14:07:19.259 INFO:tasks.workunit.client.1.vm04.stdout:5/26: getdents . 
0 2026-03-10T14:07:19.261 INFO:tasks.workunit.client.1.vm04.stdout:5/27: creat f2 x:0 0 0 2026-03-10T14:07:19.262 INFO:tasks.workunit.client.1.vm04.stdout:7/22: truncate f0 467379 0 2026-03-10T14:07:19.262 INFO:tasks.workunit.client.1.vm04.stdout:5/28: mknod c3 0 2026-03-10T14:07:19.265 INFO:tasks.workunit.client.1.vm04.stdout:5/29: creat f4 x:0 0 0 2026-03-10T14:07:19.266 INFO:tasks.workunit.client.1.vm04.stdout:5/30: creat f5 x:0 0 0 2026-03-10T14:07:19.267 INFO:tasks.workunit.client.1.vm04.stdout:5/31: unlink f5 0 2026-03-10T14:07:19.267 INFO:tasks.workunit.client.1.vm04.stdout:5/32: dread - f2 zero size 2026-03-10T14:07:19.268 INFO:tasks.workunit.client.1.vm04.stdout:5/33: unlink l1 0 2026-03-10T14:07:19.268 INFO:tasks.workunit.client.1.vm04.stdout:5/34: write f4 [98590,72126] 0 2026-03-10T14:07:19.269 INFO:tasks.workunit.client.1.vm04.stdout:5/35: write f4 [297259,66874] 0 2026-03-10T14:07:19.269 INFO:tasks.workunit.client.1.vm04.stdout:5/36: readlink - no filename 2026-03-10T14:07:19.269 INFO:tasks.workunit.client.1.vm04.stdout:5/37: write f4 [1123616,42678] 0 2026-03-10T14:07:19.271 INFO:tasks.workunit.client.1.vm04.stdout:5/38: symlink l6 0 2026-03-10T14:07:19.271 INFO:tasks.workunit.client.1.vm04.stdout:5/39: rmdir - no directory 2026-03-10T14:07:19.272 INFO:tasks.workunit.client.1.vm04.stdout:5/40: mkdir d7 0 2026-03-10T14:07:19.275 INFO:tasks.workunit.client.1.vm04.stdout:5/41: link f4 d7/f8 0 2026-03-10T14:07:19.276 INFO:tasks.workunit.client.1.vm04.stdout:6/40: fsync d3/f4 0 2026-03-10T14:07:19.277 INFO:tasks.workunit.client.1.vm04.stdout:5/42: rmdir d7 39 2026-03-10T14:07:19.278 INFO:tasks.workunit.client.1.vm04.stdout:5/43: readlink l6 0 2026-03-10T14:07:19.279 INFO:tasks.workunit.client.1.vm04.stdout:5/44: write f4 [1971245,85117] 0 2026-03-10T14:07:19.279 INFO:tasks.workunit.client.1.vm04.stdout:5/45: stat d7/f8 0 2026-03-10T14:07:19.280 INFO:tasks.workunit.client.1.vm04.stdout:6/41: creat d3/d8/fa x:0 0 0 2026-03-10T14:07:19.280 
INFO:tasks.workunit.client.1.vm04.stdout:6/42: write d3/f9 [690535,76898] 0 2026-03-10T14:07:19.280 INFO:tasks.workunit.client.1.vm04.stdout:6/43: readlink - no filename 2026-03-10T14:07:19.282 INFO:tasks.workunit.client.1.vm04.stdout:6/44: dread d3/f4 [0,4194304] 0 2026-03-10T14:07:19.284 INFO:tasks.workunit.client.1.vm04.stdout:6/45: dread d3/f4 [4194304,4194304] 0 2026-03-10T14:07:19.287 INFO:tasks.workunit.client.1.vm04.stdout:5/46: mkdir d7/d9 0 2026-03-10T14:07:19.287 INFO:tasks.workunit.client.1.vm04.stdout:5/47: write d7/f8 [353651,110403] 0 2026-03-10T14:07:19.295 INFO:tasks.workunit.client.1.vm04.stdout:6/46: getdents d3 0 2026-03-10T14:07:19.296 INFO:tasks.workunit.client.1.vm04.stdout:6/47: fdatasync d3/f4 0 2026-03-10T14:07:19.299 INFO:tasks.workunit.client.1.vm04.stdout:6/48: dwrite d3/f5 [0,4194304] 0 2026-03-10T14:07:19.301 INFO:tasks.workunit.client.1.vm04.stdout:6/49: write d3/f5 [6642846,42918] 0 2026-03-10T14:07:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:19 vm04.local ceph-mon[55966]: pgmap v138: 65 pgs: 65 active+clean; 158 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 2.2 MiB/s rd, 4.9 MiB/s wr, 379 op/s 2026-03-10T14:07:19.343 INFO:tasks.workunit.client.1.vm04.stdout:6/50: fdatasync d3/f4 0 2026-03-10T14:07:19.348 INFO:tasks.workunit.client.1.vm04.stdout:6/51: link d3/f9 d3/fb 0 2026-03-10T14:07:19.350 INFO:tasks.workunit.client.1.vm04.stdout:6/52: creat d3/d8/fc x:0 0 0 2026-03-10T14:07:19.352 INFO:tasks.workunit.client.1.vm04.stdout:6/53: unlink d3/f5 0 2026-03-10T14:07:19.354 INFO:tasks.workunit.client.1.vm04.stdout:6/54: creat d3/d8/fd x:0 0 0 2026-03-10T14:07:19.354 INFO:tasks.workunit.client.1.vm04.stdout:6/55: chown d3/f9 1 1 2026-03-10T14:07:19.354 INFO:tasks.workunit.client.1.vm04.stdout:6/56: chown d3/d8/fd 44743 1 2026-03-10T14:07:19.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:19 vm03.local ceph-mon[49718]: pgmap v138: 65 pgs: 65 active+clean; 158 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 
2.2 MiB/s rd, 4.9 MiB/s wr, 379 op/s 2026-03-10T14:07:19.359 INFO:tasks.workunit.client.1.vm04.stdout:6/57: dwrite d3/d8/fc [0,4194304] 0 2026-03-10T14:07:19.361 INFO:tasks.workunit.client.1.vm04.stdout:5/48: dread d7/f8 [0,4194304] 0 2026-03-10T14:07:19.363 INFO:tasks.workunit.client.1.vm04.stdout:5/49: creat d7/fa x:0 0 0 2026-03-10T14:07:19.364 INFO:tasks.workunit.client.1.vm04.stdout:5/50: getdents d7/d9 0 2026-03-10T14:07:19.364 INFO:tasks.workunit.client.1.vm04.stdout:5/51: truncate f4 2064705 0 2026-03-10T14:07:19.369 INFO:tasks.workunit.client.1.vm04.stdout:5/52: dwrite f4 [0,4194304] 0 2026-03-10T14:07:19.369 INFO:tasks.workunit.client.1.vm04.stdout:5/53: dread - d7/fa zero size 2026-03-10T14:07:19.372 INFO:tasks.workunit.client.1.vm04.stdout:5/54: dread d7/f8 [0,4194304] 0 2026-03-10T14:07:19.384 INFO:tasks.workunit.client.1.vm04.stdout:5/55: symlink d7/d9/lb 0 2026-03-10T14:07:19.385 INFO:tasks.workunit.client.1.vm04.stdout:5/56: creat d7/fc x:0 0 0 2026-03-10T14:07:19.387 INFO:tasks.workunit.client.1.vm04.stdout:5/57: dread f4 [0,4194304] 0 2026-03-10T14:07:19.387 INFO:tasks.workunit.client.1.vm04.stdout:5/58: chown c3 620970926 1 2026-03-10T14:07:19.392 INFO:tasks.workunit.client.1.vm04.stdout:5/59: dwrite d7/f8 [0,4194304] 0 2026-03-10T14:07:19.408 INFO:tasks.workunit.client.1.vm04.stdout:5/60: write f4 [3191244,25922] 0 2026-03-10T14:07:19.408 INFO:tasks.workunit.client.1.vm04.stdout:5/61: stat c3 0 2026-03-10T14:07:19.408 INFO:tasks.workunit.client.1.vm04.stdout:5/62: write d7/f8 [3955839,58714] 0 2026-03-10T14:07:19.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.411+0000 7f6af1182700 1 -- 192.168.123.103:0/2643264408 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6ad8000bf0 con 0x7f6ad406c6d0 2026-03-10T14:07:19.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.414+0000 7f6ae97fa700 1 -- 
192.168.123.103:0/2643264408 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f6ad8000bf0 con 0x7f6ad406c6d0 2026-03-10T14:07:19.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.418+0000 7f6ad2ffd700 1 -- 192.168.123.103:0/2643264408 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ad406c6d0 msgr2=0x7f6ad406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:19.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.418+0000 7f6ad2ffd700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ad406c6d0 0x7f6ad406eb80 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f6adc009c80 tx=0x7f6adc009400 comp rx=0 tx=0).stop 2026-03-10T14:07:19.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.418+0000 7f6ad2ffd700 1 -- 192.168.123.103:0/2643264408 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec1317e0 msgr2=0x7f6aec07f440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:19.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.418+0000 7f6ad2ffd700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec1317e0 0x7f6aec07f440 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f6ae4003c30 tx=0x7f6ae4003c60 comp rx=0 tx=0).stop 2026-03-10T14:07:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.419+0000 7f6ad2ffd700 1 -- 192.168.123.103:0/2643264408 shutdown_connections 2026-03-10T14:07:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.419+0000 7f6ad2ffd700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6aec071950 0x7f6aec1312a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.418 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.419+0000 7f6ad2ffd700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ad406c6d0 0x7f6ad406eb80 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.419+0000 7f6ad2ffd700 1 --2- 192.168.123.103:0/2643264408 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6aec1317e0 0x7f6aec07f440 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.419+0000 7f6ad2ffd700 1 -- 192.168.123.103:0/2643264408 >> 192.168.123.103:0/2643264408 conn(0x7f6aec06d1a0 msgr2=0x7f6aec076410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.419+0000 7f6ad2ffd700 1 -- 192.168.123.103:0/2643264408 shutdown_connections 2026-03-10T14:07:19.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.419+0000 7f6ad2ffd700 1 -- 192.168.123.103:0/2643264408 wait complete. 2026-03-10T14:07:19.431 INFO:tasks.workunit.client.1.vm04.stdout:8/19: truncate d0/f2 3372469 0 2026-03-10T14:07:19.436 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:07:19.517 INFO:tasks.workunit.client.1.vm04.stdout:9/19: getdents . 
0 2026-03-10T14:07:19.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.573+0000 7f1dde1e3700 1 -- 192.168.123.103:0/164678227 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1dd8072360 msgr2=0x7f1dd80770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:19.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.573+0000 7f1dde1e3700 1 --2- 192.168.123.103:0/164678227 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1dd8072360 0x7f1dd80770e0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f1dd000d3f0 tx=0x7f1dd000d700 comp rx=0 tx=0).stop 2026-03-10T14:07:19.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.573+0000 7f1dde1e3700 1 -- 192.168.123.103:0/164678227 shutdown_connections 2026-03-10T14:07:19.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.573+0000 7f1dde1e3700 1 --2- 192.168.123.103:0/164678227 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1dd8072360 0x7f1dd80770e0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.573+0000 7f1dde1e3700 1 --2- 192.168.123.103:0/164678227 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1dd8071980 0x7f1dd8071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.573+0000 7f1dde1e3700 1 -- 192.168.123.103:0/164678227 >> 192.168.123.103:0/164678227 conn(0x7f1dd806d1a0 msgr2=0x7f1dd806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:19.573 INFO:tasks.workunit.client.1.vm04.stdout:9/20: sync 2026-03-10T14:07:19.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.574+0000 7f1dde1e3700 1 -- 192.168.123.103:0/164678227 shutdown_connections 2026-03-10T14:07:19.574 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.574+0000 7f1dde1e3700 1 -- 192.168.123.103:0/164678227 wait complete. 2026-03-10T14:07:19.574 INFO:tasks.workunit.client.1.vm04.stdout:9/21: creat f5 x:0 0 0 2026-03-10T14:07:19.574 INFO:tasks.workunit.client.1.vm04.stdout:9/22: chown f3 0 1 2026-03-10T14:07:19.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.574+0000 7f1dde1e3700 1 Processor -- start 2026-03-10T14:07:19.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.574+0000 7f1dde1e3700 1 -- start start 2026-03-10T14:07:19.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.574+0000 7f1dde1e3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1dd8071980 0x7f1dd807f800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:19.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.575+0000 7f1dde1e3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1dd8072360 0x7f1dd807fd40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:19.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.575+0000 7f1dde1e3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1dd8080280 con 0x7f1dd8072360 2026-03-10T14:07:19.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.575+0000 7f1dde1e3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1dd80803f0 con 0x7f1dd8071980 2026-03-10T14:07:19.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.575+0000 7f1dd6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1dd8072360 0x7f1dd807fd40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:19.576 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.575+0000 7f1dd6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1dd8072360 0x7f1dd807fd40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:50892/0 (socket says 192.168.123.103:50892) 2026-03-10T14:07:19.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.575+0000 7f1dd6ffd700 1 -- 192.168.123.103:0/2035755739 learned_addr learned my addr 192.168.123.103:0/2035755739 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:19.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.575+0000 7f1dd77fe700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1dd8071980 0x7f1dd807f800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:19.576 INFO:tasks.workunit.client.1.vm04.stdout:9/23: symlink l6 0 2026-03-10T14:07:19.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.576+0000 7f1dd6ffd700 1 -- 192.168.123.103:0/2035755739 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1dd8071980 msgr2=0x7f1dd807f800 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:19.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.576+0000 7f1dd6ffd700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1dd8071980 0x7f1dd807f800 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.576+0000 7f1dd6ffd700 1 -- 192.168.123.103:0/2035755739 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1dd0007ed0 con 0x7f1dd8072360 
2026-03-10T14:07:19.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.576+0000 7f1dd6ffd700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1dd8072360 0x7f1dd807fd40 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7f1dd00062a0 tx=0x7f1dd0004020 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:19.576 INFO:tasks.workunit.client.1.vm04.stdout:9/24: chown l6 444 1 2026-03-10T14:07:19.576 INFO:tasks.workunit.client.1.vm04.stdout:9/25: stat c1 0 2026-03-10T14:07:19.576 INFO:tasks.workunit.client.1.vm04.stdout:9/26: dread - f5 zero size 2026-03-10T14:07:19.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.578+0000 7f1dd4ff9700 1 -- 192.168.123.103:0/2035755739 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1dd001d070 con 0x7f1dd8072360 2026-03-10T14:07:19.577 INFO:tasks.workunit.client.1.vm04.stdout:9/27: mknod c7 0 2026-03-10T14:07:19.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.578+0000 7f1dde1e3700 1 -- 192.168.123.103:0/2035755739 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1dd8080670 con 0x7f1dd8072360 2026-03-10T14:07:19.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.578+0000 7f1dde1e3700 1 -- 192.168.123.103:0/2035755739 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1dd812e0e0 con 0x7f1dd8072360 2026-03-10T14:07:19.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.580+0000 7f1dd4ff9700 1 -- 192.168.123.103:0/2035755739 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1dd0004840 con 0x7f1dd8072360 2026-03-10T14:07:19.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.581+0000 7f1dd4ff9700 1 -- 192.168.123.103:0/2035755739 <== mon.0 v2:192.168.123.103:3300/0 
3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1dd0017a60 con 0x7f1dd8072360 2026-03-10T14:07:19.580 INFO:tasks.workunit.client.1.vm04.stdout:9/28: rename f2 to f8 0 2026-03-10T14:07:19.581 INFO:tasks.workunit.client.1.vm04.stdout:9/29: mkdir d9 0 2026-03-10T14:07:19.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.583+0000 7f1dde1e3700 1 -- 192.168.123.103:0/2035755739 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1dc4005320 con 0x7f1dd8072360 2026-03-10T14:07:19.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.585+0000 7f1dd4ff9700 1 -- 192.168.123.103:0/2035755739 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f1dd0017bc0 con 0x7f1dd8072360 2026-03-10T14:07:19.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.586+0000 7f1dd4ff9700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1dc006c7a0 0x7f1dc006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:19.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.586+0000 7f1dd4ff9700 1 -- 192.168.123.103:0/2035755739 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f1dd0013070 con 0x7f1dd8072360 2026-03-10T14:07:19.585 INFO:tasks.workunit.client.1.vm04.stdout:9/30: mkdir d9/da 0 2026-03-10T14:07:19.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.587+0000 7f1dd77fe700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1dc006c7a0 0x7f1dc006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:19.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.587+0000 7f1dd77fe700 1 --2- 
192.168.123.103:0/2035755739 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1dc006c7a0 0x7f1dc006ec50 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f1dc8009de0 tx=0x7f1dc8009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:19.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.591+0000 7f1dd4ff9700 1 -- 192.168.123.103:0/2035755739 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1dd005b9d0 con 0x7f1dd8072360 2026-03-10T14:07:19.623 INFO:tasks.workunit.client.1.vm04.stdout:9/31: creat d9/da/fb x:0 0 0 2026-03-10T14:07:19.623 INFO:tasks.workunit.client.1.vm04.stdout:9/32: chown f3 2 1 2026-03-10T14:07:19.626 INFO:tasks.workunit.client.1.vm04.stdout:9/33: mknod d9/da/cc 0 2026-03-10T14:07:19.628 INFO:tasks.workunit.client.1.vm04.stdout:9/34: mkdir d9/da/dd 0 2026-03-10T14:07:19.630 INFO:tasks.workunit.client.1.vm04.stdout:9/35: unlink d9/da/cc 0 2026-03-10T14:07:19.633 INFO:tasks.workunit.client.1.vm04.stdout:9/36: creat d9/da/dd/fe x:0 0 0 2026-03-10T14:07:19.641 INFO:tasks.workunit.client.1.vm04.stdout:3/7: getdents . 
0 2026-03-10T14:07:19.641 INFO:tasks.workunit.client.1.vm04.stdout:3/8: fdatasync - no filename 2026-03-10T14:07:19.641 INFO:tasks.workunit.client.1.vm04.stdout:3/9: dread - no filename 2026-03-10T14:07:19.642 INFO:tasks.workunit.client.1.vm04.stdout:3/10: chown l1 349 1 2026-03-10T14:07:19.642 INFO:tasks.workunit.client.1.vm04.stdout:3/11: dwrite - no filename 2026-03-10T14:07:19.650 INFO:tasks.workunit.client.1.vm04.stdout:3/12: symlink l2 0 2026-03-10T14:07:19.650 INFO:tasks.workunit.client.1.vm04.stdout:3/13: fdatasync - no filename 2026-03-10T14:07:19.654 INFO:tasks.workunit.client.1.vm04.stdout:0/14: rename d0/l5 to d0/d2/l6 0 2026-03-10T14:07:19.657 INFO:tasks.workunit.client.1.vm04.stdout:0/15: dread d0/f3 [0,4194304] 0 2026-03-10T14:07:19.657 INFO:tasks.workunit.client.1.vm04.stdout:0/16: fdatasync d0/f4 0 2026-03-10T14:07:19.664 INFO:tasks.workunit.client.1.vm04.stdout:3/14: creat f3 x:0 0 0 2026-03-10T14:07:19.670 INFO:tasks.workunit.client.1.vm04.stdout:1/40: getdents d3 0 2026-03-10T14:07:19.670 INFO:tasks.workunit.client.1.vm04.stdout:1/41: write d3/fb [526225,125468] 0 2026-03-10T14:07:19.675 INFO:tasks.workunit.client.1.vm04.stdout:3/15: creat f4 x:0 0 0 2026-03-10T14:07:19.675 INFO:tasks.workunit.client.1.vm04.stdout:3/16: dread - f3 zero size 2026-03-10T14:07:19.677 INFO:tasks.workunit.client.1.vm04.stdout:1/42: creat d3/fc x:0 0 0 2026-03-10T14:07:19.681 INFO:tasks.workunit.client.1.vm04.stdout:1/43: creat d3/d5/fd x:0 0 0 2026-03-10T14:07:19.681 INFO:tasks.workunit.client.1.vm04.stdout:1/44: fdatasync d3/f8 0 2026-03-10T14:07:19.683 INFO:tasks.workunit.client.1.vm04.stdout:3/17: mknod c5 0 2026-03-10T14:07:19.686 INFO:tasks.workunit.client.1.vm04.stdout:1/45: mkdir d3/d5/de 0 2026-03-10T14:07:19.691 INFO:tasks.workunit.client.1.vm04.stdout:1/46: symlink d3/lf 0 2026-03-10T14:07:19.691 INFO:tasks.workunit.client.1.vm04.stdout:1/47: dread - d3/fc zero size 2026-03-10T14:07:19.692 INFO:tasks.workunit.client.1.vm04.stdout:1/48: truncate d3/fc 450645 
0 2026-03-10T14:07:19.694 INFO:tasks.workunit.client.1.vm04.stdout:3/18: mknod c6 0 2026-03-10T14:07:19.696 INFO:tasks.workunit.client.1.vm04.stdout:1/49: mknod d3/c10 0 2026-03-10T14:07:19.699 INFO:tasks.workunit.client.1.vm04.stdout:1/50: symlink d3/d5/l11 0 2026-03-10T14:07:19.701 INFO:tasks.workunit.client.1.vm04.stdout:3/19: mknod c7 0 2026-03-10T14:07:19.706 INFO:tasks.workunit.client.1.vm04.stdout:3/20: dwrite f4 [0,4194304] 0 2026-03-10T14:07:19.707 INFO:tasks.workunit.client.1.vm04.stdout:4/35: dwrite d4/f6 [0,4194304] 0 2026-03-10T14:07:19.728 INFO:tasks.workunit.client.1.vm04.stdout:4/36: creat d4/df/f11 x:0 0 0 2026-03-10T14:07:19.733 INFO:tasks.workunit.client.1.vm04.stdout:3/21: creat f8 x:0 0 0 2026-03-10T14:07:19.736 INFO:tasks.workunit.client.1.vm04.stdout:4/37: dwrite f1 [0,4194304] 0 2026-03-10T14:07:19.737 INFO:tasks.workunit.client.1.vm04.stdout:4/38: write d4/fd [1175502,78301] 0 2026-03-10T14:07:19.745 INFO:tasks.workunit.client.1.vm04.stdout:3/22: creat f9 x:0 0 0 2026-03-10T14:07:19.747 INFO:tasks.workunit.client.1.vm04.stdout:4/39: mknod d4/df/d10/c12 0 2026-03-10T14:07:19.747 INFO:tasks.workunit.client.1.vm04.stdout:4/40: truncate d4/df/f11 907156 0 2026-03-10T14:07:19.747 INFO:tasks.workunit.client.1.vm04.stdout:4/41: truncate f2 1450551 0 2026-03-10T14:07:19.747 INFO:tasks.workunit.client.1.vm04.stdout:4/42: chown f1 1 1 2026-03-10T14:07:19.747 INFO:tasks.workunit.client.1.vm04.stdout:4/43: write d4/fa [382725,61473] 0 2026-03-10T14:07:19.750 INFO:tasks.workunit.client.1.vm04.stdout:3/23: mkdir da 0 2026-03-10T14:07:19.751 INFO:tasks.workunit.client.1.vm04.stdout:3/24: readlink l1 0 2026-03-10T14:07:19.754 INFO:tasks.workunit.client.1.vm04.stdout:3/25: creat da/fb x:0 0 0 2026-03-10T14:07:19.754 INFO:tasks.workunit.client.1.vm04.stdout:3/26: readlink l1 0 2026-03-10T14:07:19.755 INFO:tasks.workunit.client.1.vm04.stdout:3/27: mkdir da/dc 0 2026-03-10T14:07:19.755 INFO:tasks.workunit.client.1.vm04.stdout:3/28: truncate f3 983725 0 
2026-03-10T14:07:19.756 INFO:tasks.workunit.client.1.vm04.stdout:3/29: truncate f3 1846453 0 2026-03-10T14:07:19.759 INFO:tasks.workunit.client.1.vm04.stdout:3/30: creat da/fd x:0 0 0 2026-03-10T14:07:19.764 INFO:tasks.workunit.client.1.vm04.stdout:3/31: truncate f9 498710 0 2026-03-10T14:07:19.764 INFO:tasks.workunit.client.1.vm04.stdout:3/32: chown c5 1292302 1 2026-03-10T14:07:19.764 INFO:tasks.workunit.client.1.vm04.stdout:3/33: dread - da/fd zero size 2026-03-10T14:07:19.785 INFO:tasks.workunit.client.1.vm04.stdout:2/32: rename d0/d4 to d0/d3/d8 0 2026-03-10T14:07:19.785 INFO:tasks.workunit.client.1.vm04.stdout:2/33: chown d0/d3/d8 40721244 1 2026-03-10T14:07:19.785 INFO:tasks.workunit.client.1.vm04.stdout:2/34: fdatasync d0/d3/d8/f7 0 2026-03-10T14:07:19.791 INFO:tasks.workunit.client.1.vm04.stdout:2/35: dwrite d0/d3/d8/f7 [0,4194304] 0 2026-03-10T14:07:19.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.794+0000 7f1dde1e3700 1 -- 192.168.123.103:0/2035755739 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1dc4000bf0 con 0x7f1dc006c7a0 2026-03-10T14:07:19.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.796+0000 7f1dd4ff9700 1 -- 192.168.123.103:0/2035755739 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f1dc4000bf0 con 0x7f1dc006c7a0 2026-03-10T14:07:19.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.800+0000 7f1dbe7fc700 1 -- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1dc006c7a0 msgr2=0x7f1dc006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:19.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.800+0000 7f1dbe7fc700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1dc006c7a0 0x7f1dc006ec50 
secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f1dc8009de0 tx=0x7f1dc8009450 comp rx=0 tx=0).stop 2026-03-10T14:07:19.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.801+0000 7f1dbe7fc700 1 -- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1dd8072360 msgr2=0x7f1dd807fd40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:19.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.801+0000 7f1dbe7fc700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1dd8072360 0x7f1dd807fd40 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7f1dd00062a0 tx=0x7f1dd0004020 comp rx=0 tx=0).stop 2026-03-10T14:07:19.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.801+0000 7f1dbe7fc700 1 -- 192.168.123.103:0/2035755739 shutdown_connections 2026-03-10T14:07:19.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.801+0000 7f1dbe7fc700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1dc006c7a0 0x7f1dc006ec50 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.801+0000 7f1dbe7fc700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1dd8071980 0x7f1dd807f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.801+0000 7f1dbe7fc700 1 --2- 192.168.123.103:0/2035755739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1dd8072360 0x7f1dd807fd40 unknown :-1 s=CLOSED pgs=306 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.801+0000 7f1dbe7fc700 1 -- 
192.168.123.103:0/2035755739 >> 192.168.123.103:0/2035755739 conn(0x7f1dd806d1a0 msgr2=0x7f1dd80757e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:19.800 INFO:tasks.workunit.client.1.vm04.stdout:1/51: rename d3/d5/l6 to d3/l12 0 2026-03-10T14:07:19.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.802+0000 7f1dbe7fc700 1 -- 192.168.123.103:0/2035755739 shutdown_connections 2026-03-10T14:07:19.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.802+0000 7f1dbe7fc700 1 -- 192.168.123.103:0/2035755739 wait complete. 2026-03-10T14:07:19.804 INFO:tasks.workunit.client.1.vm04.stdout:1/52: mkdir d3/d5/d13 0 2026-03-10T14:07:19.810 INFO:tasks.workunit.client.1.vm04.stdout:4/44: fdatasync d4/f9 0 2026-03-10T14:07:19.813 INFO:tasks.workunit.client.1.vm04.stdout:1/53: dwrite f1 [0,4194304] 0 2026-03-10T14:07:19.830 INFO:tasks.workunit.client.1.vm04.stdout:4/45: rename d4/df/d10/c12 to d4/df/c13 0 2026-03-10T14:07:19.832 INFO:tasks.workunit.client.1.vm04.stdout:4/46: mkdir d4/d14 0 2026-03-10T14:07:19.832 INFO:tasks.workunit.client.1.vm04.stdout:1/54: rmdir d3/d5/de 0 2026-03-10T14:07:19.834 INFO:tasks.workunit.client.1.vm04.stdout:4/47: mknod d4/d14/c15 0 2026-03-10T14:07:19.846 INFO:tasks.workunit.client.1.vm04.stdout:1/55: dread f2 [0,4194304] 0 2026-03-10T14:07:19.847 INFO:tasks.workunit.client.1.vm04.stdout:1/56: creat d3/f14 x:0 0 0 2026-03-10T14:07:19.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.908+0000 7f17b1990700 1 -- 192.168.123.103:0/190213925 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17ac071980 msgr2=0x7f17ac071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:19.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.908+0000 7f17b1990700 1 --2- 192.168.123.103:0/190213925 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17ac071980 0x7f17ac071d90 secure :-1 s=READY pgs=307 cs=0 l=1 rev1=1 crypto rx=0x7f179c007780 
tx=0x7f179c00c050 comp rx=0 tx=0).stop 2026-03-10T14:07:19.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.908+0000 7f17b1990700 1 -- 192.168.123.103:0/190213925 shutdown_connections 2026-03-10T14:07:19.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.908+0000 7f17b1990700 1 --2- 192.168.123.103:0/190213925 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17ac072360 0x7f17ac0770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.908+0000 7f17b1990700 1 --2- 192.168.123.103:0/190213925 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17ac071980 0x7f17ac071d90 unknown :-1 s=CLOSED pgs=307 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.908+0000 7f17b1990700 1 -- 192.168.123.103:0/190213925 >> 192.168.123.103:0/190213925 conn(0x7f17ac06d1a0 msgr2=0x7f17ac06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.908+0000 7f17b1990700 1 -- 192.168.123.103:0/190213925 shutdown_connections 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.908+0000 7f17b1990700 1 -- 192.168.123.103:0/190213925 wait complete. 
2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17b1990700 1 Processor -- start 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17b1990700 1 -- start start 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17b1990700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17ac072360 0x7f17ac082590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17b1990700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17ac082ad0 0x7f17ac082f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17b1990700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17ac1b2a90 con 0x7f17ac082ad0 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17b1990700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17ac1b2bd0 con 0x7f17ac072360 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17aa7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17ac082ad0 0x7f17ac082f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17aaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17ac072360 0x7f17ac082590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17aaffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17ac072360 0x7f17ac082590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:36262/0 (socket says 192.168.123.103:36262) 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17aaffd700 1 -- 192.168.123.103:0/2856156622 learned_addr learned my addr 192.168.123.103:0/2856156622 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17aaffd700 1 -- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17ac082ad0 msgr2=0x7f17ac082f40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.909+0000 7f17aaffd700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17ac082ad0 0x7f17ac082f40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.910+0000 7f17aaffd700 1 -- 192.168.123.103:0/2856156622 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f179c007430 con 0x7f17ac072360 2026-03-10T14:07:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.910+0000 7f17aaffd700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17ac072360 0x7f17ac082590 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f179c007750 tx=0x7f179c00a3b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:19.909 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.910+0000 7f17b098e700 1 -- 192.168.123.103:0/2856156622 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f179c00f040 con 0x7f17ac072360 2026-03-10T14:07:19.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.910+0000 7f17b1990700 1 -- 192.168.123.103:0/2856156622 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f17ac1b2d70 con 0x7f17ac072360 2026-03-10T14:07:19.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.910+0000 7f17b1990700 1 -- 192.168.123.103:0/2856156622 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17ac1b3230 con 0x7f17ac072360 2026-03-10T14:07:19.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.911+0000 7f17b098e700 1 -- 192.168.123.103:0/2856156622 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f179c0072b0 con 0x7f17ac072360 2026-03-10T14:07:19.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.911+0000 7f17b098e700 1 -- 192.168.123.103:0/2856156622 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f179c003890 con 0x7f17ac072360 2026-03-10T14:07:19.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.912+0000 7f17b098e700 1 -- 192.168.123.103:0/2856156622 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f179c007ca0 con 0x7f17ac072360 2026-03-10T14:07:19.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.914+0000 7f17b098e700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f179406e8e0 0x7f1794070d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:19.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.914+0000 
7f17b098e700 1 -- 192.168.123.103:0/2856156622 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f179c08b5e0 con 0x7f17ac072360 2026-03-10T14:07:19.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.914+0000 7f17aa7fc700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f179406e8e0 0x7f1794070d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:19.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.914+0000 7f17b1990700 1 -- 192.168.123.103:0/2856156622 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1798005320 con 0x7f17ac072360 2026-03-10T14:07:19.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.915+0000 7f17aa7fc700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f179406e8e0 0x7f1794070d90 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f17a4009fd0 tx=0x7f17a400b040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:19.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:19.918+0000 7f17b098e700 1 -- 192.168.123.103:0/2856156622 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f179c0597f0 con 0x7f17ac072360 2026-03-10T14:07:19.990 INFO:tasks.workunit.client.1.vm04.stdout:6/58: truncate d3/f4 3023441 0 2026-03-10T14:07:19.999 INFO:tasks.workunit.client.1.vm04.stdout:5/63: fsync d7/fc 0 2026-03-10T14:07:20.008 INFO:tasks.workunit.client.1.vm04.stdout:8/20: dread d0/f2 [0,4194304] 0 2026-03-10T14:07:20.009 INFO:tasks.workunit.client.1.vm04.stdout:8/21: rmdir d0 39 2026-03-10T14:07:20.044 
INFO:tasks.workunit.client.1.vm04.stdout:6/59: sync 2026-03-10T14:07:20.044 INFO:tasks.workunit.client.1.vm04.stdout:5/64: sync 2026-03-10T14:07:20.044 INFO:tasks.workunit.client.1.vm04.stdout:8/22: sync 2026-03-10T14:07:20.045 INFO:tasks.workunit.client.1.vm04.stdout:6/60: truncate d3/d8/fd 588194 0 2026-03-10T14:07:20.048 INFO:tasks.workunit.client.1.vm04.stdout:5/65: rmdir d7 39 2026-03-10T14:07:20.054 INFO:tasks.workunit.client.1.vm04.stdout:6/61: mkdir d3/de 0 2026-03-10T14:07:20.058 INFO:tasks.workunit.client.1.vm04.stdout:5/66: dread d7/f8 [0,4194304] 0 2026-03-10T14:07:20.060 INFO:tasks.workunit.client.1.vm04.stdout:5/67: dread d7/f8 [0,4194304] 0 2026-03-10T14:07:20.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.064+0000 7f17b1990700 1 -- 192.168.123.103:0/2856156622 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f1798000bf0 con 0x7f179406e8e0 2026-03-10T14:07:20.063 INFO:tasks.workunit.client.1.vm04.stdout:6/62: creat d3/ff x:0 0 0 2026-03-10T14:07:20.065 INFO:tasks.workunit.client.1.vm04.stdout:6/63: dread d3/f9 [0,4194304] 0 2026-03-10T14:07:20.067 INFO:tasks.workunit.client.1.vm04.stdout:5/68: creat d7/d9/fd x:0 0 0 2026-03-10T14:07:20.070 INFO:tasks.workunit.client.1.vm04.stdout:8/23: creat d0/d3/f4 x:0 0 0 2026-03-10T14:07:20.072 INFO:tasks.workunit.client.1.vm04.stdout:6/64: chown c0 2 1 2026-03-10T14:07:20.073 INFO:tasks.workunit.client.1.vm04.stdout:6/65: chown d3/f9 806167 1 2026-03-10T14:07:20.074 INFO:tasks.workunit.client.1.vm04.stdout:5/69: rmdir d7 39 2026-03-10T14:07:20.076 INFO:tasks.workunit.client.1.vm04.stdout:6/66: dread d3/d8/fc [0,4194304] 0 2026-03-10T14:07:20.077 INFO:tasks.workunit.client.1.vm04.stdout:8/24: mkdir d0/d3/d5 0 2026-03-10T14:07:20.081 INFO:tasks.workunit.client.1.vm04.stdout:5/70: dread - d7/d9/fd zero size 2026-03-10T14:07:20.088 INFO:tasks.workunit.client.1.vm04.stdout:6/67: unlink c0 0 
2026-03-10T14:07:20.088 INFO:tasks.workunit.client.1.vm04.stdout:6/68: readlink - no filename 2026-03-10T14:07:20.089 INFO:tasks.workunit.client.1.vm04.stdout:6/69: write d3/d8/fd [514991,45435] 0 2026-03-10T14:07:20.090 INFO:tasks.workunit.client.1.vm04.stdout:8/25: mknod d0/d3/c6 0 2026-03-10T14:07:20.091 INFO:tasks.workunit.client.1.vm04.stdout:8/26: chown d0/d3/c6 67 1 2026-03-10T14:07:20.098 INFO:tasks.workunit.client.1.vm04.stdout:5/71: fdatasync f4 0 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.101+0000 7f17b098e700 1 -- 192.168.123.103:0/2856156622 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3336 (secure 0 0 0) 0x7f1798000bf0 con 0x7f179406e8e0 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (3m) 109s ago 3m 21.4M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 error 109s ago 4m - - 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (3m) 110s ago 3m 8342k - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 109s ago 4m 7419k - 18.2.0 dc2bc1663786 57962aef7443 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (3m) 110s ago 3m 7407k - 18.2.0 dc2bc1663786 0918365fa827 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 109s ago 3m 82.1M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (116s) 109s ago 116s 16.4M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:07:20.100 
INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (114s) 109s ago 114s 16.1M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (113s) 110s ago 113s 16.8M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (115s) 110s ago 115s 18.5M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:07:20.100 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:9283,8765,8443 running (4m) 109s ago 4m 499M - 18.2.0 dc2bc1663786 378306a7bb3c 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (3m) 110s ago 3m 446M - 18.2.0 dc2bc1663786 f2d79432e040 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 109s ago 4m 54.7M 2048M 18.2.0 dc2bc1663786 f59cc7d5bdfd 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (3m) 110s ago 3m 44.2M 2048M 18.2.0 dc2bc1663786 4113774b34c7 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 109s ago 4m 14.2M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (3m) 110s ago 3m 14.1M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 109s ago 3m 48.8M 4096M 18.2.0 dc2bc1663786 5a222b855ee3 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 109s ago 2m 45.7M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 109s ago 2m 45.3M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (2m) 110s ago 2m 47.2M 4096M 18.2.0 
dc2bc1663786 99f4c3155942 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (2m) 110s ago 2m 45.6M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (2m) 110s ago 2m 44.0M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:07:20.101 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 109s ago 3m 36.2M - 2.43.0 a07b618ecd1d fcef697ff8c4 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 -- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f179406e8e0 msgr2=0x7f1794070d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f179406e8e0 0x7f1794070d90 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f17a4009fd0 tx=0x7f17a400b040 comp rx=0 tx=0).stop 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 -- 192.168.123.103:0/2856156622 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17ac072360 msgr2=0x7f17ac082590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17ac072360 0x7f17ac082590 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f179c007750 tx=0x7f179c00a3b0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 -- 192.168.123.103:0/2856156622 shutdown_connections 2026-03-10T14:07:20.104 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f179406e8e0 0x7f1794070d90 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17ac072360 0x7f17ac082590 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 --2- 192.168.123.103:0/2856156622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17ac082ad0 0x7f17ac082f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.105+0000 7f17927fc700 1 -- 192.168.123.103:0/2856156622 >> 192.168.123.103:0/2856156622 conn(0x7f17ac06d1a0 msgr2=0x7f17ac075900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.106+0000 7f17927fc700 1 -- 192.168.123.103:0/2856156622 shutdown_connections 2026-03-10T14:07:20.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.106+0000 7f17927fc700 1 -- 192.168.123.103:0/2856156622 wait complete. 
2026-03-10T14:07:20.105 INFO:tasks.workunit.client.1.vm04.stdout:8/27: unlink d0/d3/f4 0 2026-03-10T14:07:20.111 INFO:tasks.workunit.client.1.vm04.stdout:6/70: creat d3/de/f10 x:0 0 0 2026-03-10T14:07:20.112 INFO:tasks.workunit.client.1.vm04.stdout:6/71: read - d3/de/f10 zero size 2026-03-10T14:07:20.117 INFO:tasks.workunit.client.1.vm04.stdout:8/28: dwrite d0/f2 [0,4194304] 0 2026-03-10T14:07:20.124 INFO:tasks.workunit.client.1.vm04.stdout:9/37: rmdir d9/da/dd 39 2026-03-10T14:07:20.126 INFO:tasks.workunit.client.1.vm04.stdout:9/38: creat d9/ff x:0 0 0 2026-03-10T14:07:20.134 INFO:tasks.workunit.client.1.vm04.stdout:0/17: chown d0/d2/l6 147 1 2026-03-10T14:07:20.136 INFO:tasks.workunit.client.1.vm04.stdout:0/18: mkdir d0/d2/d7 0 2026-03-10T14:07:20.137 INFO:tasks.workunit.client.1.vm04.stdout:0/19: fsync d0/f3 0 2026-03-10T14:07:20.138 INFO:tasks.workunit.client.1.vm04.stdout:0/20: unlink d0/f4 0 2026-03-10T14:07:20.139 INFO:tasks.workunit.client.1.vm04.stdout:0/21: write d0/f3 [1099794,52950] 0 2026-03-10T14:07:20.154 INFO:tasks.workunit.client.1.vm04.stdout:9/39: sync 2026-03-10T14:07:20.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.211+0000 7fb992db0700 1 -- 192.168.123.103:0/3465294096 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c072360 msgr2=0x7fb98c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:20.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.211+0000 7fb992db0700 1 --2- 192.168.123.103:0/3465294096 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c072360 0x7fb98c0770e0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fb98400d3f0 tx=0x7fb98400d700 comp rx=0 tx=0).stop 2026-03-10T14:07:20.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.211+0000 7fb992db0700 1 -- 192.168.123.103:0/3465294096 shutdown_connections 2026-03-10T14:07:20.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.211+0000 7fb992db0700 1 
--2- 192.168.123.103:0/3465294096 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c072360 0x7fb98c0770e0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.211+0000 7fb992db0700 1 --2- 192.168.123.103:0/3465294096 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb98c071980 0x7fb98c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.211+0000 7fb992db0700 1 -- 192.168.123.103:0/3465294096 >> 192.168.123.103:0/3465294096 conn(0x7fb98c06d1a0 msgr2=0x7fb98c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.212+0000 7fb992db0700 1 -- 192.168.123.103:0/3465294096 shutdown_connections 2026-03-10T14:07:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.212+0000 7fb992db0700 1 -- 192.168.123.103:0/3465294096 wait complete. 
2026-03-10T14:07:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.212+0000 7fb992db0700 1 Processor -- start 2026-03-10T14:07:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.212+0000 7fb992db0700 1 -- start start 2026-03-10T14:07:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.212+0000 7fb992db0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb98c071980 0x7fb98c1313e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.212+0000 7fb992db0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c131920 0x7fb98c07f580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.212+0000 7fb992db0700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb98c131e20 con 0x7fb98c071980 2026-03-10T14:07:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.212+0000 7fb992db0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb98c131f90 con 0x7fb98c131920 2026-03-10T14:07:20.212 INFO:tasks.workunit.client.1.vm04.stdout:3/34: rmdir da 39 2026-03-10T14:07:20.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.215+0000 7fb98bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c131920 0x7fb98c07f580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:20.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.215+0000 7fb98bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c131920 0x7fb98c07f580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:36282/0 (socket says 192.168.123.103:36282) 2026-03-10T14:07:20.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.215+0000 7fb98bfff700 1 -- 192.168.123.103:0/1330128079 learned_addr learned my addr 192.168.123.103:0/1330128079 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:20.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.215+0000 7fb990b4c700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb98c071980 0x7fb98c1313e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:20.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.216+0000 7fb98bfff700 1 -- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb98c071980 msgr2=0x7fb98c1313e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:20.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.216+0000 7fb98bfff700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb98c071980 0x7fb98c1313e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.216+0000 7fb98bfff700 1 -- 192.168.123.103:0/1330128079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb984007ed0 con 0x7fb98c131920 2026-03-10T14:07:20.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.216+0000 7fb98bfff700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c131920 0x7fb98c07f580 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb9840060b0 tx=0x7fb984003c30 comp rx=0 
tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:20.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.216+0000 7fb989ffb700 1 -- 192.168.123.103:0/1330128079 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb98401c070 con 0x7fb98c131920 2026-03-10T14:07:20.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.216+0000 7fb992db0700 1 -- 192.168.123.103:0/1330128079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb98c07fac0 con 0x7fb98c131920 2026-03-10T14:07:20.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.217+0000 7fb992db0700 1 -- 192.168.123.103:0/1330128079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb98c07ff80 con 0x7fb98c131920 2026-03-10T14:07:20.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.217+0000 7fb989ffb700 1 -- 192.168.123.103:0/1330128079 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb98400deb0 con 0x7fb98c131920 2026-03-10T14:07:20.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.217+0000 7fb992db0700 1 -- 192.168.123.103:0/1330128079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb98c12b500 con 0x7fb98c131920 2026-03-10T14:07:20.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.217+0000 7fb989ffb700 1 -- 192.168.123.103:0/1330128079 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb984004580 con 0x7fb98c131920 2026-03-10T14:07:20.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.218+0000 7fb989ffb700 1 -- 192.168.123.103:0/1330128079 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb9840178e0 con 0x7fb98c131920 
2026-03-10T14:07:20.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.219+0000 7fb989ffb700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb97406c6d0 0x7fb97406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:20.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.219+0000 7fb989ffb700 1 -- 192.168.123.103:0/1330128079 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb984013070 con 0x7fb98c131920 2026-03-10T14:07:20.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.219+0000 7fb990b4c700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb97406c6d0 0x7fb97406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.221+0000 7fb990b4c700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb97406c6d0 0x7fb97406eb80 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fb98c07b4f0 tx=0x7fb97c00b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.221+0000 7fb989ffb700 1 -- 192.168.123.103:0/1330128079 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb984059200 con 0x7fb98c131920 2026-03-10T14:07:20.221 INFO:tasks.workunit.client.1.vm04.stdout:3/35: creat da/fe x:0 0 0 2026-03-10T14:07:20.221 INFO:tasks.workunit.client.1.vm04.stdout:3/36: chown da/fb 21535040 1 2026-03-10T14:07:20.222 INFO:tasks.workunit.client.1.vm04.stdout:3/37: symlink da/dc/lf 0 
2026-03-10T14:07:20.227 INFO:tasks.workunit.client.1.vm04.stdout:3/38: dwrite da/fe [0,4194304] 0 2026-03-10T14:07:20.230 INFO:tasks.workunit.client.1.vm04.stdout:7/23: dwrite f0 [0,4194304] 0 2026-03-10T14:07:20.231 INFO:tasks.workunit.client.1.vm04.stdout:7/24: stat d2/f4 0 2026-03-10T14:07:20.237 INFO:tasks.workunit.client.1.vm04.stdout:7/25: dwrite d2/f4 [0,4194304] 0 2026-03-10T14:07:20.241 INFO:tasks.workunit.client.1.vm04.stdout:7/26: dread f0 [0,4194304] 0 2026-03-10T14:07:20.242 INFO:tasks.workunit.client.1.vm04.stdout:7/27: chown f0 10727 1 2026-03-10T14:07:20.248 INFO:tasks.workunit.client.1.vm04.stdout:3/39: link f4 da/f10 0 2026-03-10T14:07:20.248 INFO:tasks.workunit.client.1.vm04.stdout:7/28: creat d2/f7 x:0 0 0 2026-03-10T14:07:20.253 INFO:tasks.workunit.client.1.vm04.stdout:7/29: symlink d2/l8 0 2026-03-10T14:07:20.256 INFO:tasks.workunit.client.1.vm04.stdout:3/40: dwrite f4 [0,4194304] 0 2026-03-10T14:07:20.257 INFO:tasks.workunit.client.1.vm04.stdout:7/30: dread d2/f3 [0,4194304] 0 2026-03-10T14:07:20.264 INFO:tasks.workunit.client.1.vm04.stdout:7/31: dwrite d2/f7 [0,4194304] 0 2026-03-10T14:07:20.269 INFO:tasks.workunit.client.1.vm04.stdout:7/32: dwrite d2/f4 [0,4194304] 0 2026-03-10T14:07:20.274 INFO:tasks.workunit.client.1.vm04.stdout:7/33: unlink d2/f3 0 2026-03-10T14:07:20.274 INFO:tasks.workunit.client.1.vm04.stdout:7/34: stat d2/l6 0 2026-03-10T14:07:20.292 INFO:tasks.workunit.client.1.vm04.stdout:1/57: rename d3/l12 to d3/d5/d13/l15 0 2026-03-10T14:07:20.293 INFO:tasks.workunit.client.1.vm04.stdout:2/36: write d0/d3/d8/f7 [4549405,81203] 0 2026-03-10T14:07:20.295 INFO:tasks.workunit.client.1.vm04.stdout:5/72: rename c3 to d7/ce 0 2026-03-10T14:07:20.299 INFO:tasks.workunit.client.1.vm04.stdout:8/29: rename d0/d3 to d0/d3/d5/d7 22 2026-03-10T14:07:20.304 INFO:tasks.workunit.client.1.vm04.stdout:1/58: mkdir d3/d5/d16 0 2026-03-10T14:07:20.305 INFO:tasks.workunit.client.1.vm04.stdout:2/37: unlink d0/d3/d8/f7 0 2026-03-10T14:07:20.307 
INFO:tasks.workunit.client.1.vm04.stdout:2/38: creat d0/d3/f9 x:0 0 0 2026-03-10T14:07:20.308 INFO:tasks.workunit.client.1.vm04.stdout:2/39: dread - d0/d3/f9 zero size 2026-03-10T14:07:20.308 INFO:tasks.workunit.client.1.vm04.stdout:2/40: fsync d0/d3/d8/f6 0 2026-03-10T14:07:20.309 INFO:tasks.workunit.client.1.vm04.stdout:2/41: write d0/d3/f9 [47921,18671] 0 2026-03-10T14:07:20.313 INFO:tasks.workunit.client.1.vm04.stdout:9/40: rename f3 to d9/da/f10 0 2026-03-10T14:07:20.317 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:20 vm04.local ceph-mon[55966]: from='client.24367 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:20.318 INFO:tasks.workunit.client.1.vm04.stdout:2/42: symlink d0/d3/la 0 2026-03-10T14:07:20.318 INFO:tasks.workunit.client.1.vm04.stdout:3/41: rename f9 to da/dc/f11 0 2026-03-10T14:07:20.319 INFO:tasks.workunit.client.1.vm04.stdout:8/30: rename d0 to d0/d3/d8 22 2026-03-10T14:07:20.323 INFO:tasks.workunit.client.1.vm04.stdout:3/42: dwrite f8 [0,4194304] 0 2026-03-10T14:07:20.326 INFO:tasks.workunit.client.1.vm04.stdout:8/31: unlink d0/d3/c6 0 2026-03-10T14:07:20.326 INFO:tasks.workunit.client.1.vm04.stdout:8/32: fsync d0/f2 0 2026-03-10T14:07:20.330 INFO:tasks.workunit.client.1.vm04.stdout:8/33: dwrite d0/f2 [0,4194304] 0 2026-03-10T14:07:20.330 INFO:tasks.workunit.client.1.vm04.stdout:2/43: symlink d0/d3/lb 0 2026-03-10T14:07:20.330 INFO:tasks.workunit.client.1.vm04.stdout:5/73: sync 2026-03-10T14:07:20.332 INFO:tasks.workunit.client.1.vm04.stdout:3/43: symlink da/dc/l12 0 2026-03-10T14:07:20.336 INFO:tasks.workunit.client.1.vm04.stdout:5/74: dwrite f4 [0,4194304] 0 2026-03-10T14:07:20.341 INFO:tasks.workunit.client.1.vm04.stdout:3/44: symlink da/dc/l13 0 2026-03-10T14:07:20.341 INFO:tasks.workunit.client.1.vm04.stdout:2/44: creat d0/d3/d8/fc x:0 0 0 2026-03-10T14:07:20.341 INFO:tasks.workunit.client.1.vm04.stdout:5/75: mknod d7/d9/cf 0 2026-03-10T14:07:20.342 
INFO:tasks.workunit.client.1.vm04.stdout:8/34: mkdir d0/d3/d5/d9 0 2026-03-10T14:07:20.344 INFO:tasks.workunit.client.1.vm04.stdout:8/35: dread d0/f2 [0,4194304] 0 2026-03-10T14:07:20.347 INFO:tasks.workunit.client.1.vm04.stdout:8/36: stat d0/d3/d5/d9 0 2026-03-10T14:07:20.347 INFO:tasks.workunit.client.1.vm04.stdout:8/37: dread d0/f2 [0,4194304] 0 2026-03-10T14:07:20.348 INFO:tasks.workunit.client.1.vm04.stdout:5/76: mknod d7/c10 0 2026-03-10T14:07:20.349 INFO:tasks.workunit.client.1.vm04.stdout:3/45: creat da/f14 x:0 0 0 2026-03-10T14:07:20.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:20 vm03.local ceph-mon[49718]: from='client.24367 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:20.378 INFO:tasks.workunit.client.1.vm04.stdout:8/38: mkdir d0/da 0 2026-03-10T14:07:20.380 INFO:tasks.workunit.client.1.vm04.stdout:5/77: rename f2 to d7/f11 0 2026-03-10T14:07:20.381 INFO:tasks.workunit.client.1.vm04.stdout:3/46: creat da/f15 x:0 0 0 2026-03-10T14:07:20.382 INFO:tasks.workunit.client.1.vm04.stdout:3/47: truncate f8 4434895 0 2026-03-10T14:07:20.382 INFO:tasks.workunit.client.1.vm04.stdout:3/48: stat da/f14 0 2026-03-10T14:07:20.384 INFO:tasks.workunit.client.1.vm04.stdout:8/39: mkdir d0/d3/db 0 2026-03-10T14:07:20.385 INFO:tasks.workunit.client.1.vm04.stdout:5/78: mkdir d7/d12 0 2026-03-10T14:07:20.385 INFO:tasks.workunit.client.1.vm04.stdout:5/79: chown d7/d12 3313 1 2026-03-10T14:07:20.386 INFO:tasks.workunit.client.1.vm04.stdout:3/49: write da/dc/f11 [1165191,81061] 0 2026-03-10T14:07:20.389 INFO:tasks.workunit.client.1.vm04.stdout:5/80: rename d7/f8 to d7/f13 0 2026-03-10T14:07:20.392 INFO:tasks.workunit.client.1.vm04.stdout:3/50: link c7 da/c16 0 2026-03-10T14:07:20.396 INFO:tasks.workunit.client.1.vm04.stdout:3/51: rename da/c16 to da/c17 0 2026-03-10T14:07:20.400 INFO:tasks.workunit.client.1.vm04.stdout:3/52: link l2 da/dc/l18 0 2026-03-10T14:07:20.401 
INFO:tasks.workunit.client.1.vm04.stdout:3/53: rename f3 to da/f19 0 2026-03-10T14:07:20.402 INFO:tasks.workunit.client.1.vm04.stdout:3/54: write f4 [4638351,42372] 0 2026-03-10T14:07:20.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.416+0000 7fb992db0700 1 -- 192.168.123.103:0/1330128079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fb98c02d0b0 con 0x7fb98c131920 2026-03-10T14:07:20.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.418+0000 7fb989ffb700 1 -- 192.168.123.103:0/1330128079 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7fb98c02d0b0 con 0x7fb98c131920 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: }, 
2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:07:20.418 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:07:20.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.423+0000 7fb9737fe700 1 -- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb97406c6d0 msgr2=0x7fb97406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:20.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.423+0000 7fb9737fe700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb97406c6d0 0x7fb97406eb80 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fb98c07b4f0 tx=0x7fb97c00b410 comp rx=0 tx=0).stop 2026-03-10T14:07:20.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.423+0000 7fb9737fe700 1 -- 192.168.123.103:0/1330128079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c131920 msgr2=0x7fb98c07f580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:20.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.423+0000 7fb9737fe700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c131920 0x7fb98c07f580 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb9840060b0 tx=0x7fb984003c30 comp rx=0 tx=0).stop 2026-03-10T14:07:20.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.424+0000 7fb9737fe700 1 -- 192.168.123.103:0/1330128079 shutdown_connections 2026-03-10T14:07:20.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.424+0000 7fb9737fe700 1 --2- 192.168.123.103:0/1330128079 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb98c071980 0x7fb98c1313e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.424+0000 7fb9737fe700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb97406c6d0 0x7fb97406eb80 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.424+0000 7fb9737fe700 1 --2- 192.168.123.103:0/1330128079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb98c131920 0x7fb98c07f580 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.424+0000 7fb9737fe700 1 -- 192.168.123.103:0/1330128079 >> 192.168.123.103:0/1330128079 conn(0x7fb98c06d1a0 msgr2=0x7fb98c0764b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:20.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.425+0000 7fb9737fe700 1 -- 192.168.123.103:0/1330128079 shutdown_connections 2026-03-10T14:07:20.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.425+0000 7fb9737fe700 1 -- 192.168.123.103:0/1330128079 wait complete. 
2026-03-10T14:07:20.444 INFO:tasks.workunit.client.1.vm04.stdout:2/45: dread d0/d3/f9 [0,4194304] 0 2026-03-10T14:07:20.446 INFO:tasks.workunit.client.1.vm04.stdout:2/46: mkdir d0/d3/d8/dd 0 2026-03-10T14:07:20.447 INFO:tasks.workunit.client.1.vm04.stdout:2/47: mknod d0/d3/d8/dd/ce 0 2026-03-10T14:07:20.448 INFO:tasks.workunit.client.1.vm04.stdout:2/48: dread d0/d3/f9 [0,4194304] 0 2026-03-10T14:07:20.452 INFO:tasks.workunit.client.1.vm04.stdout:2/49: dwrite d0/d3/f9 [0,4194304] 0 2026-03-10T14:07:20.456 INFO:tasks.workunit.client.1.vm04.stdout:4/48: truncate f3 4024869 0 2026-03-10T14:07:20.458 INFO:tasks.workunit.client.1.vm04.stdout:2/50: mknod d0/d3/cf 0 2026-03-10T14:07:20.458 INFO:tasks.workunit.client.1.vm04.stdout:2/51: stat d0/d3/f9 0 2026-03-10T14:07:20.461 INFO:tasks.workunit.client.1.vm04.stdout:2/52: unlink d0/d3/lb 0 2026-03-10T14:07:20.461 INFO:tasks.workunit.client.1.vm04.stdout:2/53: readlink d0/d3/la 0 2026-03-10T14:07:20.463 INFO:tasks.workunit.client.1.vm04.stdout:4/49: rename d4/df/c13 to d4/df/c16 0 2026-03-10T14:07:20.465 INFO:tasks.workunit.client.1.vm04.stdout:2/54: symlink d0/l10 0 2026-03-10T14:07:20.465 INFO:tasks.workunit.client.1.vm04.stdout:4/50: dread d4/f6 [0,4194304] 0 2026-03-10T14:07:20.466 INFO:tasks.workunit.client.1.vm04.stdout:4/51: chown d4/l8 2464 1 2026-03-10T14:07:20.469 INFO:tasks.workunit.client.1.vm04.stdout:4/52: unlink f1 0 2026-03-10T14:07:20.473 INFO:tasks.workunit.client.1.vm04.stdout:4/53: symlink d4/l17 0 2026-03-10T14:07:20.474 INFO:tasks.workunit.client.1.vm04.stdout:4/54: creat d4/df/f18 x:0 0 0 2026-03-10T14:07:20.474 INFO:tasks.workunit.client.1.vm04.stdout:8/40: fdatasync d0/f2 0 2026-03-10T14:07:20.476 INFO:tasks.workunit.client.1.vm04.stdout:4/55: rename d4/l8 to d4/l19 0 2026-03-10T14:07:20.480 INFO:tasks.workunit.client.1.vm04.stdout:4/56: dwrite d4/f9 [0,4194304] 0 2026-03-10T14:07:20.481 INFO:tasks.workunit.client.1.vm04.stdout:4/57: write d4/f6 [33352,86898] 0 2026-03-10T14:07:20.486 
INFO:tasks.workunit.client.1.vm04.stdout:8/41: creat d0/d3/db/fc x:0 0 0 2026-03-10T14:07:20.487 INFO:tasks.workunit.client.1.vm04.stdout:8/42: chown d0/d3/db/fc 23426 1 2026-03-10T14:07:20.492 INFO:tasks.workunit.client.1.vm04.stdout:4/58: mknod d4/df/c1a 0 2026-03-10T14:07:20.492 INFO:tasks.workunit.client.1.vm04.stdout:8/43: unlink d0/c1 0 2026-03-10T14:07:20.493 INFO:tasks.workunit.client.1.vm04.stdout:8/44: chown d0/f2 226308735 1 2026-03-10T14:07:20.495 INFO:tasks.workunit.client.1.vm04.stdout:4/59: read f2 [164999,1076] 0 2026-03-10T14:07:20.500 INFO:tasks.workunit.client.1.vm04.stdout:8/45: dwrite d0/f2 [0,4194304] 0 2026-03-10T14:07:20.500 INFO:tasks.workunit.client.1.vm04.stdout:8/46: chown d0/d3 973 1 2026-03-10T14:07:20.500 INFO:tasks.workunit.client.1.vm04.stdout:8/47: readlink - no filename 2026-03-10T14:07:20.503 INFO:tasks.workunit.client.1.vm04.stdout:4/60: mkdir d4/d14/d1b 0 2026-03-10T14:07:20.504 INFO:tasks.workunit.client.1.vm04.stdout:8/48: unlink d0/f2 0 2026-03-10T14:07:20.505 INFO:tasks.workunit.client.1.vm04.stdout:8/49: write d0/d3/db/fc [438307,39249] 0 2026-03-10T14:07:20.552 INFO:tasks.workunit.client.1.vm04.stdout:8/50: sync 2026-03-10T14:07:20.555 INFO:tasks.workunit.client.1.vm04.stdout:8/51: rmdir d0/d3/d5/d9 0 2026-03-10T14:07:20.556 INFO:tasks.workunit.client.1.vm04.stdout:8/52: truncate d0/d3/db/fc 765173 0 2026-03-10T14:07:20.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.558+0000 7f708de89700 1 -- 192.168.123.103:0/2648838394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7088071980 msgr2=0x7f7088071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:20.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.558+0000 7f708de89700 1 --2- 192.168.123.103:0/2648838394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7088071980 0x7f7088071d90 secure :-1 s=READY pgs=308 cs=0 l=1 rev1=1 crypto rx=0x7f7078005fd0 tx=0x7f707800adf0 comp rx=0 
tx=0).stop 2026-03-10T14:07:20.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.560+0000 7f708de89700 1 -- 192.168.123.103:0/2648838394 shutdown_connections 2026-03-10T14:07:20.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.560+0000 7f708de89700 1 --2- 192.168.123.103:0/2648838394 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7088072360 0x7f70880770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.560+0000 7f708de89700 1 --2- 192.168.123.103:0/2648838394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7088071980 0x7f7088071d90 unknown :-1 s=CLOSED pgs=308 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.560+0000 7f708de89700 1 -- 192.168.123.103:0/2648838394 >> 192.168.123.103:0/2648838394 conn(0x7f708806d1a0 msgr2=0x7f708806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:20.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.561+0000 7f708de89700 1 -- 192.168.123.103:0/2648838394 shutdown_connections 2026-03-10T14:07:20.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.561+0000 7f708de89700 1 -- 192.168.123.103:0/2648838394 wait complete. 
2026-03-10T14:07:20.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.562+0000 7f708de89700 1 Processor -- start 2026-03-10T14:07:20.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.562+0000 7f708de89700 1 -- start start 2026-03-10T14:07:20.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.562+0000 7f708de89700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7088072360 0x7f70880824f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:20.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.562+0000 7f708de89700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7088082a30 0x7f7088082ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:20.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.562+0000 7f708de89700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7088083e50 con 0x7f7088072360 2026-03-10T14:07:20.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.562+0000 7f708de89700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f708812dd80 con 0x7f7088082a30 2026-03-10T14:07:20.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.563+0000 7f7086ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7088082a30 0x7f7088082ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:20.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.563+0000 7f7086ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7088082a30 0x7f7088082ea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:36310/0 (socket says 192.168.123.103:36310) 2026-03-10T14:07:20.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.563+0000 7f7086ffd700 1 -- 192.168.123.103:0/2305711322 learned_addr learned my addr 192.168.123.103:0/2305711322 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:20.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.563+0000 7f70877fe700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7088072360 0x7f70880824f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:20.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.563+0000 7f7086ffd700 1 -- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7088072360 msgr2=0x7f70880824f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:20.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.563+0000 7f7086ffd700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7088072360 0x7f70880824f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:20.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.563+0000 7f7086ffd700 1 -- 192.168.123.103:0/2305711322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f70780082d0 con 0x7f7088082a30 2026-03-10T14:07:20.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.564+0000 7f7086ffd700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7088082a30 0x7f7088082ea0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f708000e550 tx=0x7f708000e860 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:07:20.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.565+0000 7f7084ff9700 1 -- 192.168.123.103:0/2305711322 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7080004d60 con 0x7f7088082a30 2026-03-10T14:07:20.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.566+0000 7f708de89700 1 -- 192.168.123.103:0/2305711322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f708812e000 con 0x7f7088082a30 2026-03-10T14:07:20.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.566+0000 7f708de89700 1 -- 192.168.123.103:0/2305711322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f708812e550 con 0x7f7088082a30 2026-03-10T14:07:20.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.569+0000 7f7084ff9700 1 -- 192.168.123.103:0/2305711322 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7080011930 con 0x7f7088082a30 2026-03-10T14:07:20.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.570+0000 7f7084ff9700 1 -- 192.168.123.103:0/2305711322 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7080005600 con 0x7f7088082a30 2026-03-10T14:07:20.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.570+0000 7f7084ff9700 1 -- 192.168.123.103:0/2305711322 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7080005820 con 0x7f7088082a30 2026-03-10T14:07:20.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.571+0000 7f7084ff9700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f707006c870 0x7f707006ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:20.571 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.572+0000 7f7084ff9700 1 -- 192.168.123.103:0/2305711322 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f708008c8a0 con 0x7f7088082a30 2026-03-10T14:07:20.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.573+0000 7f708de89700 1 -- 192.168.123.103:0/2305711322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7074005320 con 0x7f7088082a30 2026-03-10T14:07:20.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.573+0000 7f70877fe700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f707006c870 0x7f707006ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:20.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.574+0000 7f70877fe700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f707006c870 0x7f707006ed20 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f707800b3a0 tx=0x7f7078005c40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:20.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.577+0000 7f7084ff9700 1 -- 192.168.123.103:0/2305711322 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f708005ab30 con 0x7f7088082a30 2026-03-10T14:07:20.682 INFO:tasks.workunit.client.1.vm04.stdout:6/72: truncate d3/f9 355360 0 2026-03-10T14:07:20.711 INFO:tasks.workunit.client.1.vm04.stdout:0/22: getdents d0 0 2026-03-10T14:07:20.711 INFO:tasks.workunit.client.1.vm04.stdout:0/23: write d0/f3 [88330,95025] 0 2026-03-10T14:07:20.712 
INFO:tasks.workunit.client.1.vm04.stdout:0/24: write d0/f3 [1026696,96317] 0 2026-03-10T14:07:20.713 INFO:tasks.workunit.client.1.vm04.stdout:0/25: write d0/f3 [1086959,98862] 0 2026-03-10T14:07:20.725 INFO:tasks.workunit.client.1.vm04.stdout:0/26: dread d0/f3 [0,4194304] 0 2026-03-10T14:07:20.728 INFO:tasks.workunit.client.1.vm04.stdout:0/27: dwrite d0/f3 [0,4194304] 0 2026-03-10T14:07:20.777 INFO:tasks.workunit.client.1.vm04.stdout:0/28: rmdir d0/d2/d7 0 2026-03-10T14:07:20.778 INFO:tasks.workunit.client.1.vm04.stdout:0/29: rename d0/f3 to d0/d2/f8 0 2026-03-10T14:07:20.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.800+0000 7f708de89700 1 -- 192.168.123.103:0/2305711322 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f7074006200 con 0x7f7088082a30 2026-03-10T14:07:20.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.802+0000 7f7084ff9700 1 -- 192.168.123.103:0/2305711322 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1865 (secure 0 0 0) 0x7f7080015030 con 0x7f7088082a30 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:e13 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:epoch 13 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:06:44.825228+0000 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:07:20.801 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:07:20.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.804+0000 7f706e7fc700 1 -- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f707006c870 msgr2=0x7f707006ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:20.804 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.806+0000 7f706e7fc700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f707006c870 0x7f707006ed20 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f707800b3a0 tx=0x7f7078005c40 comp rx=0 tx=0).stop
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.806+0000 7f706e7fc700 1 -- 192.168.123.103:0/2305711322 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7088082a30 msgr2=0x7f7088082ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.806+0000 7f706e7fc700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7088082a30 0x7f7088082ea0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f708000e550 tx=0x7f708000e860 comp rx=0 tx=0).stop
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.806+0000 7f706e7fc700 1 -- 192.168.123.103:0/2305711322 shutdown_connections
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.806+0000 7f706e7fc700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7088072360 0x7f70880824f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.806+0000 7f706e7fc700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f707006c870 0x7f707006ed20 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.806+0000 7f706e7fc700 1 --2- 192.168.123.103:0/2305711322 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7088082a30 0x7f7088082ea0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.806+0000 7f706e7fc700 1 -- 192.168.123.103:0/2305711322 >> 192.168.123.103:0/2305711322 conn(0x7f708806d1a0 msgr2=0x7f708806e090 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.807+0000 7f706e7fc700 1 -- 192.168.123.103:0/2305711322 shutdown_connections
2026-03-10T14:07:20.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:20.807+0000 7f706e7fc700 1 -- 192.168.123.103:0/2305711322 wait complete.
2026-03-10T14:07:20.807 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 13
2026-03-10T14:07:21.101 INFO:tasks.workunit.client.1.vm04.stdout:7/35: getdents d2 0
2026-03-10T14:07:21.101 INFO:tasks.workunit.client.1.vm04.stdout:7/36: readlink d2/l6 0
2026-03-10T14:07:21.106 INFO:tasks.workunit.client.1.vm04.stdout:7/37: dwrite d2/f4 [4194304,4194304] 0
2026-03-10T14:07:21.109 INFO:tasks.workunit.client.1.vm04.stdout:7/38: link d2/l6 d2/l9 0
2026-03-10T14:07:21.112 INFO:tasks.workunit.client.1.vm04.stdout:7/39: link d2/l9 d2/la 0
2026-03-10T14:07:21.113 INFO:tasks.workunit.client.1.vm04.stdout:7/40: write d2/f4 [7643690,81358] 0
2026-03-10T14:07:21.120 INFO:tasks.workunit.client.1.vm04.stdout:7/41: mkdir d2/db 0
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.183+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/795527805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5e81080e0 msgr2=0x7fe5e81084f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.183+0000 7fe5ee3b6700 1 --2- 192.168.123.103:0/795527805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5e81080e0 0x7fe5e81084f0 secure :-1 s=READY pgs=309 cs=0 l=1 rev1=1 crypto rx=0x7fe5d8007780 tx=0x7fe5d800c050 comp rx=0 tx=0).stop
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.183+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/795527805 shutdown_connections
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.183+0000 7fe5ee3b6700 1 --2- 192.168.123.103:0/795527805 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe5e8071960 0x7fe5e8071dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.183+0000 7fe5ee3b6700 1 --2- 192.168.123.103:0/795527805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5e81080e0 0x7fe5e81084f0 unknown :-1 s=CLOSED pgs=309 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.183+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/795527805 >> 192.168.123.103:0/795527805 conn(0x7fe5e806d3e0 msgr2=0x7fe5e806f830 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.184+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/795527805 shutdown_connections
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.184+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/795527805 wait complete.
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.184+0000 7fe5ee3b6700 1 Processor -- start
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.184+0000 7fe5ee3b6700 1 -- start start
2026-03-10T14:07:21.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.184+0000 7fe5ee3b6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe5e8071960 0x7fe5e8132550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.184+0000 7fe5ee3b6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5e8132a90 0x7fe5e807e570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.184+0000 7fe5ee3b6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe5e8132f90 con 0x7fe5e8132a90
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.184+0000 7fe5ee3b6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe5e8133100 con 0x7fe5e8071960
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.185+0000 7fe5e7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe5e8071960 0x7fe5e8132550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.185+0000 7fe5e7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe5e8071960 0x7fe5e8132550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:36340/0 (socket says 192.168.123.103:36340)
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.185+0000 7fe5e7fff700 1 -- 192.168.123.103:0/3937750689 learned_addr learned my addr 192.168.123.103:0/3937750689 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.185+0000 7fe5e7fff700 1 -- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5e8132a90 msgr2=0x7fe5e807e570 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.185+0000 7fe5e7fff700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5e8132a90 0x7fe5e807e570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.185+0000 7fe5e7fff700 1 -- 192.168.123.103:0/3937750689 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe5d8007430 con 0x7fe5e8071960
2026-03-10T14:07:21.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.186+0000 7fe5e7fff700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe5e8071960 0x7fe5e8132550 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fe5d8007fd0 tx=0x7fe5d800ca60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:07:21.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.186+0000 7fe5e57fa700 1 -- 192.168.123.103:0/3937750689 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe5d800f050 con 0x7fe5e8071960
2026-03-10T14:07:21.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.187+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/3937750689 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe5e807eab0 con 0x7fe5e8071960
2026-03-10T14:07:21.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.187+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/3937750689 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe5e807efd0 con 0x7fe5e8071960
2026-03-10T14:07:21.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.187+0000 7fe5e57fa700 1 -- 192.168.123.103:0/3937750689 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe5d800a5b0 con 0x7fe5e8071960
2026-03-10T14:07:21.185 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.187+0000 7fe5e57fa700 1 -- 192.168.123.103:0/3937750689 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe5d8008790 con 0x7fe5e8071960
2026-03-10T14:07:21.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.188+0000 7fe5e57fa700 1 -- 192.168.123.103:0/3937750689 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe5d801a040 con 0x7fe5e8071960
2026-03-10T14:07:21.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.188+0000 7fe5e57fa700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe5d006c7a0 0x7fe5d006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:07:21.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.189+0000 7fe5e57fa700 1 -- 192.168.123.103:0/3937750689 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fe5d808cf40 con 0x7fe5e8071960
2026-03-10T14:07:21.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.189+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/3937750689 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe5d4005320 con 0x7fe5e8071960
2026-03-10T14:07:21.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.190+0000 7fe5e77fe700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe5d006c7a0 0x7fe5d006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:07:21.190 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.191+0000 7fe5e77fe700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe5d006c7a0 0x7fe5d006ec50 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fe5e8107e10 tx=0x7fe5e0009250 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:07:21.193 INFO:tasks.workunit.client.1.vm04.stdout:1/59: getdents d3 0
2026-03-10T14:07:21.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.196+0000 7fe5e57fa700 1 -- 192.168.123.103:0/3937750689 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe5d805b1d0 con 0x7fe5e8071960
2026-03-10T14:07:21.195 INFO:tasks.workunit.client.1.vm04.stdout:9/41: fsync d9/da/f10 0
2026-03-10T14:07:21.196 INFO:tasks.workunit.client.1.vm04.stdout:1/60: fsync f2 0
2026-03-10T14:07:21.196 INFO:tasks.workunit.client.1.vm04.stdout:1/61: write d3/fb [414758,32856] 0
2026-03-10T14:07:21.197 INFO:tasks.workunit.client.1.vm04.stdout:9/42: mknod d9/c11 0
2026-03-10T14:07:21.202 INFO:tasks.workunit.client.1.vm04.stdout:9/43: rmdir d9/da 39
2026-03-10T14:07:21.204 INFO:tasks.workunit.client.1.vm04.stdout:9/44: fsync d9/da/f10 0
2026-03-10T14:07:21.204 INFO:tasks.workunit.client.1.vm04.stdout:9/45: readlink l6 0
2026-03-10T14:07:21.208 INFO:tasks.workunit.client.1.vm04.stdout:9/46: dwrite d9/da/dd/fe [0,4194304] 0
2026-03-10T14:07:21.211 INFO:tasks.workunit.client.1.vm04.stdout:9/47: symlink d9/da/l12 0
2026-03-10T14:07:21.211 INFO:tasks.workunit.client.1.vm04.stdout:9/48: readlink l6 0
2026-03-10T14:07:21.212 INFO:tasks.workunit.client.1.vm04.stdout:9/49: write f5 [467602,56927] 0
2026-03-10T14:07:21.216 INFO:tasks.workunit.client.1.vm04.stdout:9/50: dwrite f5 [0,4194304] 0
2026-03-10T14:07:21.219 INFO:tasks.workunit.client.1.vm04.stdout:9/51: dread f5 [0,4194304] 0
2026-03-10T14:07:21.296 INFO:tasks.workunit.client.1.vm04.stdout:9/52: fsync d9/da/dd/fe 0
2026-03-10T14:07:21.299 INFO:tasks.workunit.client.1.vm04.stdout:9/53: creat d9/da/f13 x:0 0 0
2026-03-10T14:07:21.299 INFO:tasks.workunit.client.1.vm04.stdout:9/54: read d9/da/dd/fe [490454,98938] 0
2026-03-10T14:07:21.301 INFO:tasks.workunit.client.1.vm04.stdout:9/55: rename l4 to d9/da/l14 0
2026-03-10T14:07:21.371 INFO:tasks.workunit.client.1.vm04.stdout:9/56: sync
2026-03-10T14:07:21.378 INFO:tasks.workunit.client.1.vm04.stdout:9/57: dwrite d9/da/f10 [0,4194304] 0
2026-03-10T14:07:21.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.389+0000 7fe5ee3b6700 1 -- 192.168.123.103:0/3937750689 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe5d4000bf0 con 0x7fe5d006c7a0
2026-03-10T14:07:21.389 INFO:tasks.workunit.client.1.vm04.stdout:9/58: dwrite f8 [0,4194304] 0
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [],
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "0/23 daemons upgraded",
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm04",
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-10T14:07:21.394 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.393+0000 7fe5e57fa700 1 -- 192.168.123.103:0/3937750689 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fe5d4000bf0 con 0x7fe5d006c7a0
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 -- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe5d006c7a0 msgr2=0x7fe5d006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe5d006c7a0 0x7fe5d006ec50 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fe5e8107e10 tx=0x7fe5e0009250 comp rx=0 tx=0).stop
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 -- 192.168.123.103:0/3937750689 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe5e8071960 msgr2=0x7fe5e8132550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe5e8071960 0x7fe5e8132550 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fe5d8007fd0 tx=0x7fe5d800ca60 comp rx=0 tx=0).stop
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 -- 192.168.123.103:0/3937750689 shutdown_connections
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe5d006c7a0 0x7fe5d006ec50 secure :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fe5e8107e10 tx=0x7fe5e0009250 comp rx=0 tx=0).stop
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe5e8071960 0x7fe5e8132550 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 --2- 192.168.123.103:0/3937750689 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe5e8132a90 0x7fe5e807e570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 -- 192.168.123.103:0/3937750689 >> 192.168.123.103:0/3937750689 conn(0x7fe5e806d3e0 msgr2=0x7fe5e8070620 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:07:21.395 INFO:tasks.workunit.client.1.vm04.stdout:2/55: fsync d0/d3/d8/fc 0
2026-03-10T14:07:21.395 INFO:tasks.workunit.client.1.vm04.stdout:5/81: getdents d7/d9 0
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 -- 192.168.123.103:0/3937750689 shutdown_connections
2026-03-10T14:07:21.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.396+0000 7fe5ceffd700 1 -- 192.168.123.103:0/3937750689 wait complete.
2026-03-10T14:07:21.396 INFO:tasks.workunit.client.1.vm04.stdout:2/56: fsync d0/d3/d8/fc 0
2026-03-10T14:07:21.396 INFO:tasks.workunit.client.1.vm04.stdout:2/57: chown d0/d3/d8/dd 3105757 1
2026-03-10T14:07:21.397 INFO:tasks.workunit.client.1.vm04.stdout:2/58: dread - d0/d3/d8/fc zero size
2026-03-10T14:07:21.398 INFO:tasks.workunit.client.1.vm04.stdout:3/55: fsync da/f14 0
2026-03-10T14:07:21.409 INFO:tasks.workunit.client.1.vm04.stdout:2/59: creat d0/d3/f11 x:0 0 0
2026-03-10T14:07:21.409 INFO:tasks.workunit.client.1.vm04.stdout:2/60: readlink d0/d3/la 0
2026-03-10T14:07:21.412 INFO:tasks.workunit.client.1.vm04.stdout:3/56: creat da/dc/f1a x:0 0 0
2026-03-10T14:07:21.417 INFO:tasks.workunit.client.1.vm04.stdout:3/57: stat da/f10 0
2026-03-10T14:07:21.417 INFO:tasks.workunit.client.1.vm04.stdout:3/58: dread - da/f14 zero size
2026-03-10T14:07:21.417 INFO:tasks.workunit.client.1.vm04.stdout:3/59: dread - da/f15 zero size
2026-03-10T14:07:21.417 INFO:tasks.workunit.client.1.vm04.stdout:3/60: fdatasync da/fd 0
2026-03-10T14:07:21.417 INFO:tasks.workunit.client.1.vm04.stdout:5/82: symlink d7/d12/l14 0
2026-03-10T14:07:21.421 INFO:tasks.workunit.client.1.vm04.stdout:9/59: creat d9/f15 x:0 0 0
2026-03-10T14:07:21.424 INFO:tasks.workunit.client.1.vm04.stdout:4/61: chown d4/l19 2 1
2026-03-10T14:07:21.424 INFO:tasks.workunit.client.1.vm04.stdout:4/62: readlink d4/l19 0
2026-03-10T14:07:21.429 INFO:tasks.workunit.client.1.vm04.stdout:8/53: rename d0/d3/db to d0/d3/dd 0
2026-03-10T14:07:21.435 INFO:tasks.workunit.client.1.vm04.stdout:4/63: symlink d4/df/l1c 0
2026-03-10T14:07:21.436 INFO:tasks.workunit.client.1.vm04.stdout:4/64: fdatasync d4/fa 0
2026-03-10T14:07:21.438 INFO:tasks.workunit.client.1.vm04.stdout:2/61: link d0/l10 d0/d3/d8/l12 0
2026-03-10T14:07:21.446 INFO:tasks.workunit.client.1.vm04.stdout:8/54: symlink d0/da/le 0
2026-03-10T14:07:21.447 INFO:tasks.workunit.client.1.vm04.stdout:2/62: symlink d0/d3/l13 0
2026-03-10T14:07:21.452 INFO:tasks.workunit.client.1.vm04.stdout:2/63: dwrite d0/d3/d8/fc [0,4194304] 0
2026-03-10T14:07:21.455 INFO:tasks.workunit.client.1.vm04.stdout:2/64: write d0/d3/d8/fc [1905970,21927] 0
2026-03-10T14:07:21.464 INFO:tasks.workunit.client.1.vm04.stdout:3/61: getdents da 0
2026-03-10T14:07:21.464 INFO:tasks.workunit.client.1.vm04.stdout:3/62: write da/dc/f1a [22728,60520] 0
2026-03-10T14:07:21.477 INFO:tasks.workunit.client.1.vm04.stdout:8/55: symlink d0/lf 0
2026-03-10T14:07:21.479 INFO:tasks.workunit.client.1.vm04.stdout:8/56: stat d0/da/le 0
2026-03-10T14:07:21.479 INFO:tasks.workunit.client.1.vm04.stdout:8/57: readlink d0/lf 0
2026-03-10T14:07:21.479 INFO:tasks.workunit.client.1.vm04.stdout:8/58: write d0/d3/dd/fc [1487506,4388] 0
2026-03-10T14:07:21.486 INFO:tasks.workunit.client.1.vm04.stdout:2/65: mkdir d0/d14 0
2026-03-10T14:07:21.487 INFO:tasks.workunit.client.1.vm04.stdout:2/66: chown d0/c1 0 1
2026-03-10T14:07:21.490 INFO:tasks.workunit.client.1.vm04.stdout:3/63: symlink da/dc/l1b 0
2026-03-10T14:07:21.490 INFO:tasks.workunit.client.1.vm04.stdout:3/64: rename da to da/dc/d1c 22
2026-03-10T14:07:21.491 INFO:tasks.workunit.client.1.vm04.stdout:3/65: chown c6 2396 1
2026-03-10T14:07:21.494 INFO:tasks.workunit.client.1.vm04.stdout:8/59: mknod d0/d3/c10 0
2026-03-10T14:07:21.498 INFO:tasks.workunit.client.1.vm04.stdout:8/60: dwrite d0/d3/dd/fc [0,4194304] 0
2026-03-10T14:07:21.505 INFO:tasks.workunit.client.1.vm04.stdout:2/67: mknod d0/c15 0
2026-03-10T14:07:21.506 INFO:tasks.workunit.client.1.vm04.stdout:2/68: write d0/d3/f11 [832424,51529] 0
2026-03-10T14:07:21.511 INFO:tasks.workunit.client.1.vm04.stdout:3/66: creat da/dc/f1d x:0 0 0
2026-03-10T14:07:21.516 INFO:tasks.workunit.client.1.vm04.stdout:3/67: dwrite f4 [4194304,4194304] 0
2026-03-10T14:07:21.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.526+0000 7fe9489ef700 1 -- 192.168.123.103:0/3054972627 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944071950 msgr2=0x7fe944071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:21.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.526+0000 7fe9489ef700 1 --2- 192.168.123.103:0/3054972627 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944071950 0x7fe944071d60 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fe934007780 tx=0x7fe93400c050 comp rx=0 tx=0).stop
2026-03-10T14:07:21.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.526+0000 7fe9489ef700 1 -- 192.168.123.103:0/3054972627 shutdown_connections
2026-03-10T14:07:21.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.526+0000 7fe9489ef700 1 --2- 192.168.123.103:0/3054972627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe944072330 0x7fe9440770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:21.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.526+0000 7fe9489ef700 1 --2- 192.168.123.103:0/3054972627 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944071950 0x7fe944071d60 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:21.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.526+0000 7fe9489ef700 1 -- 192.168.123.103:0/3054972627 >> 192.168.123.103:0/3054972627 conn(0x7fe94406d1a0 msgr2=0x7fe94406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:07:21.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.526+0000 7fe9489ef700 1 -- 192.168.123.103:0/3054972627 shutdown_connections
2026-03-10T14:07:21.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.526+0000 7fe9489ef700 1 -- 192.168.123.103:0/3054972627 wait complete.
2026-03-10T14:07:21.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9489ef700 1 Processor -- start
2026-03-10T14:07:21.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9489ef700 1 -- start start
2026-03-10T14:07:21.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9489ef700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944072330 0x7fe9440824d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:07:21.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9489ef700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe944082a10 0x7fe944082e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:07:21.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9489ef700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9441b2a90 con 0x7fe944082a10
2026-03-10T14:07:21.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9489ef700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9441b2bd0 con 0x7fe944072330
2026-03-10T14:07:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9437fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944072330 0x7fe9440824d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:07:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9437fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944072330 0x7fe9440824d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:36358/0 (socket says 192.168.123.103:36358)
2026-03-10T14:07:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe9437fe700 1 -- 192.168.123.103:0/3525828060 learned_addr learned my addr 192.168.123.103:0/3525828060 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:07:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe942ffd700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe944082a10 0x7fe944082e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:07:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe942ffd700 1 -- 192.168.123.103:0/3525828060 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944072330 msgr2=0x7fe9440824d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe942ffd700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944072330 0x7fe9440824d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:07:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.527+0000 7fe942ffd700 1 -- 192.168.123.103:0/3525828060 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe934007430 con 0x7fe944082a10
2026-03-10T14:07:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.528+0000 7fe942ffd700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe944082a10 0x7fe944082e80 secure :-1 s=READY pgs=310 cs=0 l=1 rev1=1 crypto rx=0x7fe93c00b6a0 tx=0x7fe93c00b9b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:07:21.527 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:21 vm03.local ceph-mon[49718]: from='client.14632 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:07:21.527 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:21 vm03.local ceph-mon[49718]: from='client.24375 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:07:21.527 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:21 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/1330128079' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:07:21.527 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:21 vm03.local ceph-mon[49718]: pgmap v139: 65 pgs: 65 active+clean; 164 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 4.4 MiB/s wr, 319 op/s
2026-03-10T14:07:21.527 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:21 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/2305711322' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T14:07:21.530 INFO:tasks.workunit.client.1.vm04.stdout:2/69: mknod d0/d3/d8/dd/c16 0
2026-03-10T14:07:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.531+0000 7fe940ff9700 1 -- 192.168.123.103:0/3525828060 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe93c00e070 con 0x7fe944082a10
2026-03-10T14:07:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.531+0000 7fe9489ef700 1 -- 192.168.123.103:0/3525828060 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe9441b2d10 con 0x7fe944082a10
2026-03-10T14:07:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.531+0000 7fe9489ef700 1 -- 192.168.123.103:0/3525828060 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe9441b3210 con 0x7fe944082a10
2026-03-10T14:07:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.532+0000 7fe940ff9700 1 -- 192.168.123.103:0/3525828060 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe93c0092c0 con 0x7fe944082a10
2026-03-10T14:07:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.532+0000 7fe940ff9700 1 -- 192.168.123.103:0/3525828060 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe93c018860 con 0x7fe944082a10
2026-03-10T14:07:21.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.533+0000 7fe9489ef700 1 -- 192.168.123.103:0/3525828060 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe930005320 con 0x7fe944082a10
2026-03-10T14:07:21.535 INFO:tasks.workunit.client.1.vm04.stdout:2/70: mkdir d0/d3/d8/d17 0
2026-03-10T14:07:21.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.535+0000 7fe940ff9700 1 -- 192.168.123.103:0/3525828060 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe93c0189c0 con 0x7fe944082a10
2026-03-10T14:07:21.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.536+0000 7fe940ff9700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe92c06c7a0 0x7fe92c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:07:21.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.536+0000 7fe940ff9700 1 -- 192.168.123.103:0/3525828060 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fe93c08b4b0 con 0x7fe944082a10
2026-03-10T14:07:21.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.539+0000 7fe9437fe700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe92c06c7a0 0x7fe92c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:07:21.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.539+0000 7fe9437fe700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe92c06c7a0 0x7fe92c06ec50 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fe93400c420 tx=0x7fe93400af60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:07:21.538 INFO:tasks.workunit.client.1.vm04.stdout:3/68: link da/fd da/dc/f1e 0
2026-03-10T14:07:21.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.541+0000 7fe940ff9700 1 -- 192.168.123.103:0/3525828060 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe93c059740 con 0x7fe944082a10
2026-03-10T14:07:21.542 INFO:tasks.workunit.client.1.vm04.stdout:2/71: stat d0/d3/d8/l12 0
2026-03-10T14:07:21.546 INFO:tasks.workunit.client.1.vm04.stdout:3/69: dwrite da/fb [0,4194304] 0
2026-03-10T14:07:21.550 INFO:tasks.workunit.client.1.vm04.stdout:3/70: symlink da/dc/l1f 0
2026-03-10T14:07:21.551 INFO:tasks.workunit.client.1.vm04.stdout:2/72: dwrite d0/d3/d8/f6 [0,4194304] 0
2026-03-10T14:07:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:21 vm04.local ceph-mon[55966]: from='client.14632 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:07:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:21 vm04.local ceph-mon[55966]: from='client.24375 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:07:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:21 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/1330128079' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:07:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:21 vm04.local ceph-mon[55966]: pgmap v139: 65 pgs: 65 active+clean; 164 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 1.4 MiB/s rd, 4.4 MiB/s wr, 319 op/s
2026-03-10T14:07:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:21 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/2305711322' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T14:07:21.569 INFO:tasks.workunit.client.1.vm04.stdout:3/71: dwrite da/fd [0,4194304] 0
2026-03-10T14:07:21.578 INFO:tasks.workunit.client.1.vm04.stdout:2/73: symlink d0/d3/d8/l18 0
2026-03-10T14:07:21.580 INFO:tasks.workunit.client.1.vm04.stdout:2/74: read d0/d3/f9 [3569995,82848] 0
2026-03-10T14:07:21.582 INFO:tasks.workunit.client.1.vm04.stdout:2/75: link d0/d3/d8/dd/c16 d0/d14/c19 0
2026-03-10T14:07:21.583 INFO:tasks.workunit.client.1.vm04.stdout:2/76: read d0/d3/f11 [769156,103712] 0
2026-03-10T14:07:21.589 INFO:tasks.workunit.client.1.vm04.stdout:2/77: link d0/d3/d8/dd/c16 d0/d3/c1a 0
2026-03-10T14:07:21.605 INFO:tasks.workunit.client.1.vm04.stdout:2/78: dread d0/d3/f9 [0,4194304] 0
2026-03-10T14:07:21.620 INFO:tasks.workunit.client.1.vm04.stdout:3/72: sync
2026-03-10T14:07:21.725 INFO:tasks.workunit.client.1.vm04.stdout:0/30: getdents d0 0
2026-03-10T14:07:21.725 INFO:tasks.workunit.client.1.vm04.stdout:0/31: chown d0/d2/l6 0 1
2026-03-10T14:07:21.730 INFO:tasks.workunit.client.1.vm04.stdout:7/42: truncate d2/f7 1786283 0
2026-03-10T14:07:21.730 INFO:tasks.workunit.client.1.vm04.stdout:1/62: write f2 [1556683,128030] 0
2026-03-10T14:07:21.731 INFO:tasks.workunit.client.1.vm04.stdout:1/63: dread f2 [0,4194304] 0
2026-03-10T14:07:21.731 INFO:tasks.workunit.client.1.vm04.stdout:1/64: dread - d3/f14 zero size
2026-03-10T14:07:21.735 INFO:tasks.workunit.client.1.vm04.stdout:1/65: dwrite d3/fb [0,4194304] 0
2026-03-10T14:07:21.752 INFO:tasks.workunit.client.1.vm04.stdout:9/60: write f5 [4975553,99096] 0
2026-03-10T14:07:21.754 INFO:tasks.workunit.client.1.vm04.stdout:9/61: write d9/da/f13 [870727,62118] 0
2026-03-10T14:07:21.755 INFO:tasks.workunit.client.1.vm04.stdout:9/62: write d9/ff [1020653,122091] 0
2026-03-10T14:07:21.756 INFO:tasks.workunit.client.1.vm04.stdout:9/63: write f5 [5546746,125579] 0
2026-03-10T14:07:21.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.767+0000 7fe9489ef700 1 -- 192.168.123.103:0/3525828060 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fe930005190 con 0x7fe944082a10
2026-03-10T14:07:21.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.769+0000 7fe940ff9700 1 -- 192.168.123.103:0/3525828060 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+151 (secure 0 0 0) 0x7fe93c0592d0 con 0x7fe944082a10
2026-03-10T14:07:21.772 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 failed cephadm daemon(s)
2026-03-10T14:07:21.772 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
2026-03-10T14:07:21.772 INFO:teuthology.orchestra.run.vm03.stdout: daemon ceph-exporter.vm03 on vm03 is in error state
2026-03-10T14:07:21.772 INFO:tasks.workunit.client.1.vm04.stdout:5/83: getdents d7/d12 0
2026-03-10T14:07:21.772 INFO:tasks.workunit.client.1.vm04.stdout:5/84: read - d7/fc zero size
2026-03-10T14:07:21.772 INFO:tasks.workunit.client.1.vm04.stdout:5/85: dwrite d7/d9/fd [0,4194304] 0
2026-03-10T14:07:21.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.773+0000 7fe92a7fc700 1 -- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe92c06c7a0 msgr2=0x7fe92c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:07:21.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.773+0000 7fe92a7fc700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe92c06c7a0 0x7fe92c06ec50 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fe93400c420 tx=0x7fe93400af60 comp rx=0 tx=0).stop
2026-03-10T14:07:21.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.773+0000 7fe92a7fc700 1 --
192.168.123.103:0/3525828060 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe944082a10 msgr2=0x7fe944082e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:21.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.773+0000 7fe92a7fc700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe944082a10 0x7fe944082e80 secure :-1 s=READY pgs=310 cs=0 l=1 rev1=1 crypto rx=0x7fe93c00b6a0 tx=0x7fe93c00b9b0 comp rx=0 tx=0).stop 2026-03-10T14:07:21.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.774+0000 7fe92a7fc700 1 -- 192.168.123.103:0/3525828060 shutdown_connections 2026-03-10T14:07:21.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.774+0000 7fe92a7fc700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe92c06c7a0 0x7fe92c06ec50 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:21.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.774+0000 7fe92a7fc700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe944072330 0x7fe9440824d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:21.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.774+0000 7fe92a7fc700 1 --2- 192.168.123.103:0/3525828060 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe944082a10 0x7fe944082e80 unknown :-1 s=CLOSED pgs=310 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:21.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.774+0000 7fe92a7fc700 1 -- 192.168.123.103:0/3525828060 >> 192.168.123.103:0/3525828060 conn(0x7fe94406d1a0 msgr2=0x7fe944070600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:21.773 INFO:tasks.workunit.client.1.vm04.stdout:1/66: sync 2026-03-10T14:07:21.773 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.774+0000 7fe92a7fc700 1 -- 192.168.123.103:0/3525828060 shutdown_connections 2026-03-10T14:07:21.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:21.775+0000 7fe92a7fc700 1 -- 192.168.123.103:0/3525828060 wait complete. 2026-03-10T14:07:21.787 INFO:tasks.workunit.client.1.vm04.stdout:8/61: rename d0/da to d0/d3/dd/d11 0 2026-03-10T14:07:21.885 INFO:tasks.workunit.client.1.vm04.stdout:6/73: dwrite d3/f9 [0,4194304] 0 2026-03-10T14:07:21.943 INFO:tasks.workunit.client.1.vm04.stdout:0/32: unlink d0/d2/l6 0 2026-03-10T14:07:21.945 INFO:tasks.workunit.client.1.vm04.stdout:0/33: dread d0/d2/f8 [0,4194304] 0 2026-03-10T14:07:21.953 INFO:tasks.workunit.client.1.vm04.stdout:5/86: dwrite f4 [0,4194304] 0 2026-03-10T14:07:21.956 INFO:tasks.workunit.client.1.vm04.stdout:5/87: dread f4 [0,4194304] 0 2026-03-10T14:07:21.956 INFO:tasks.workunit.client.1.vm04.stdout:5/88: chown d7/f13 93493956 1 2026-03-10T14:07:21.957 INFO:tasks.workunit.client.1.vm04.stdout:5/89: write f4 [338667,110972] 0 2026-03-10T14:07:21.962 INFO:tasks.workunit.client.1.vm04.stdout:4/65: truncate f3 5042311 0 2026-03-10T14:07:21.966 INFO:tasks.workunit.client.1.vm04.stdout:4/66: dwrite d4/fd [0,4194304] 0 2026-03-10T14:07:21.966 INFO:tasks.workunit.client.1.vm04.stdout:4/67: dread - d4/fe zero size 2026-03-10T14:07:21.971 INFO:tasks.workunit.client.1.vm04.stdout:4/68: dwrite d4/fd [0,4194304] 0 2026-03-10T14:07:21.975 INFO:tasks.workunit.client.1.vm04.stdout:4/69: write d4/df/f11 [1763099,17053] 0 2026-03-10T14:07:21.978 INFO:tasks.workunit.client.1.vm04.stdout:8/62: mkdir d0/d3/dd/d11/d12 0 2026-03-10T14:07:21.979 INFO:tasks.workunit.client.1.vm04.stdout:6/74: mkdir d3/de/d11 0 2026-03-10T14:07:21.979 INFO:tasks.workunit.client.1.vm04.stdout:2/79: mkdir d0/d14/d1b 0 2026-03-10T14:07:21.979 INFO:tasks.workunit.client.1.vm04.stdout:0/34: fdatasync d0/d2/f8 0 2026-03-10T14:07:21.979 INFO:tasks.workunit.client.1.vm04.stdout:0/35: 
readlink - no filename 2026-03-10T14:07:21.980 INFO:tasks.workunit.client.1.vm04.stdout:7/43: fsync d2/f7 0 2026-03-10T14:07:21.982 INFO:tasks.workunit.client.1.vm04.stdout:2/80: rename d0 to d0/d14/d1b/d1c 22 2026-03-10T14:07:21.993 INFO:tasks.workunit.client.1.vm04.stdout:6/75: dwrite d3/de/f10 [0,4194304] 0 2026-03-10T14:07:21.995 INFO:tasks.workunit.client.1.vm04.stdout:2/81: dread d0/d3/d8/f6 [0,4194304] 0 2026-03-10T14:07:22.017 INFO:tasks.workunit.client.1.vm04.stdout:6/76: dread - d3/d8/fa zero size 2026-03-10T14:07:22.018 INFO:tasks.workunit.client.1.vm04.stdout:2/82: write d0/d3/d8/f6 [2282902,107521] 0 2026-03-10T14:07:22.018 INFO:tasks.workunit.client.1.vm04.stdout:1/67: rename d3/d5/d13/l15 to d3/d5/d13/l17 0 2026-03-10T14:07:22.018 INFO:tasks.workunit.client.1.vm04.stdout:2/83: write d0/d3/d8/f6 [2436447,120696] 0 2026-03-10T14:07:22.018 INFO:tasks.workunit.client.1.vm04.stdout:5/90: mknod d7/d12/c15 0 2026-03-10T14:07:22.018 INFO:tasks.workunit.client.1.vm04.stdout:5/91: dread f4 [0,4194304] 0 2026-03-10T14:07:22.018 INFO:tasks.workunit.client.1.vm04.stdout:5/92: chown d7/f11 6 1 2026-03-10T14:07:22.018 INFO:tasks.workunit.client.1.vm04.stdout:5/93: write d7/d9/fd [2988000,123970] 0 2026-03-10T14:07:22.018 INFO:tasks.workunit.client.1.vm04.stdout:7/44: mkdir d2/dc 0 2026-03-10T14:07:22.019 INFO:tasks.workunit.client.1.vm04.stdout:4/70: sync 2026-03-10T14:07:22.020 INFO:tasks.workunit.client.1.vm04.stdout:4/71: chown d4/fa 77211664 1 2026-03-10T14:07:22.040 INFO:tasks.workunit.client.1.vm04.stdout:1/68: creat d3/f18 x:0 0 0 2026-03-10T14:07:22.046 INFO:tasks.workunit.client.1.vm04.stdout:3/73: link da/c17 da/c20 0 2026-03-10T14:07:22.048 INFO:tasks.workunit.client.1.vm04.stdout:3/74: dread f8 [0,4194304] 0 2026-03-10T14:07:22.052 INFO:tasks.workunit.client.1.vm04.stdout:5/94: chown d7/ce 74070 1 2026-03-10T14:07:22.054 INFO:tasks.workunit.client.1.vm04.stdout:5/95: dread f4 [0,4194304] 0 2026-03-10T14:07:22.057 
INFO:tasks.workunit.client.1.vm04.stdout:0/36: link d0/d2/f8 d0/d2/f9 0 2026-03-10T14:07:22.059 INFO:tasks.workunit.client.1.vm04.stdout:0/37: dread d0/d2/f9 [0,4194304] 0 2026-03-10T14:07:22.067 INFO:tasks.workunit.client.1.vm04.stdout:8/63: mknod d0/d3/dd/d11/d12/c13 0 2026-03-10T14:07:22.074 INFO:tasks.workunit.client.1.vm04.stdout:2/84: fsync d0/d3/d8/f6 0 2026-03-10T14:07:22.077 INFO:tasks.workunit.client.1.vm04.stdout:4/72: mknod d4/c1d 0 2026-03-10T14:07:22.083 INFO:tasks.workunit.client.1.vm04.stdout:6/77: symlink d3/de/d11/l12 0 2026-03-10T14:07:22.084 INFO:tasks.workunit.client.1.vm04.stdout:6/78: dread - d3/ff zero size 2026-03-10T14:07:22.085 INFO:tasks.workunit.client.1.vm04.stdout:6/79: chown d3/ff 27 1 2026-03-10T14:07:22.085 INFO:tasks.workunit.client.1.vm04.stdout:6/80: write d3/fb [541049,46646] 0 2026-03-10T14:07:22.087 INFO:tasks.workunit.client.1.vm04.stdout:6/81: write d3/d8/fa [226060,69158] 0 2026-03-10T14:07:22.089 INFO:tasks.workunit.client.1.vm04.stdout:9/64: getdents d9/da/dd 0 2026-03-10T14:07:22.089 INFO:tasks.workunit.client.1.vm04.stdout:9/65: write d9/f15 [1017654,92910] 0 2026-03-10T14:07:22.090 INFO:tasks.workunit.client.1.vm04.stdout:9/66: stat c1 0 2026-03-10T14:07:22.099 INFO:tasks.workunit.client.1.vm04.stdout:3/75: mknod da/dc/c21 0 2026-03-10T14:07:22.100 INFO:tasks.workunit.client.1.vm04.stdout:3/76: readlink da/dc/l12 0 2026-03-10T14:07:22.111 INFO:tasks.workunit.client.1.vm04.stdout:0/38: write d0/d2/f9 [34407,87087] 0 2026-03-10T14:07:22.117 INFO:tasks.workunit.client.1.vm04.stdout:8/64: unlink d0/lf 0 2026-03-10T14:07:22.117 INFO:tasks.workunit.client.1.vm04.stdout:8/65: rename d0/d3/dd to d0/d3/dd/d14 22 2026-03-10T14:07:22.119 INFO:tasks.workunit.client.1.vm04.stdout:3/77: sync 2026-03-10T14:07:22.124 INFO:tasks.workunit.client.1.vm04.stdout:4/73: unlink d4/c5 0 2026-03-10T14:07:22.124 INFO:tasks.workunit.client.1.vm04.stdout:4/74: write d4/fe [905873,100946] 0 2026-03-10T14:07:22.129 
INFO:tasks.workunit.client.1.vm04.stdout:9/67: mknod d9/da/dd/c16 0 2026-03-10T14:07:22.130 INFO:tasks.workunit.client.1.vm04.stdout:9/68: write f5 [4166015,9208] 0 2026-03-10T14:07:22.131 INFO:tasks.workunit.client.1.vm04.stdout:1/69: getdents d3/d5/d16 0 2026-03-10T14:07:22.135 INFO:tasks.workunit.client.1.vm04.stdout:9/69: dwrite d9/da/f10 [4194304,4194304] 0 2026-03-10T14:07:22.145 INFO:tasks.workunit.client.1.vm04.stdout:0/39: creat d0/d2/fa x:0 0 0 2026-03-10T14:07:22.151 INFO:tasks.workunit.client.1.vm04.stdout:7/45: rmdir d2/db 0 2026-03-10T14:07:22.154 INFO:tasks.workunit.client.1.vm04.stdout:8/66: creat d0/d3/d5/f15 x:0 0 0 2026-03-10T14:07:22.157 INFO:tasks.workunit.client.1.vm04.stdout:3/78: unlink da/dc/f1e 0 2026-03-10T14:07:22.159 INFO:tasks.workunit.client.1.vm04.stdout:3/79: dread da/fb [0,4194304] 0 2026-03-10T14:07:22.186 INFO:tasks.workunit.client.1.vm04.stdout:9/70: rename l6 to d9/da/dd/l17 0 2026-03-10T14:07:22.186 INFO:tasks.workunit.client.1.vm04.stdout:5/96: link d7/ce d7/c16 0 2026-03-10T14:07:22.186 INFO:tasks.workunit.client.1.vm04.stdout:5/97: dread - d7/f11 zero size 2026-03-10T14:07:22.186 INFO:tasks.workunit.client.1.vm04.stdout:3/80: creat da/f22 x:0 0 0 2026-03-10T14:07:22.186 INFO:tasks.workunit.client.1.vm04.stdout:3/81: write da/fe [2283856,99917] 0 2026-03-10T14:07:22.186 INFO:tasks.workunit.client.1.vm04.stdout:3/82: dwrite da/fe [0,4194304] 0 2026-03-10T14:07:22.187 INFO:tasks.workunit.client.1.vm04.stdout:4/75: getdents d4/d14/d1b 0 2026-03-10T14:07:22.189 INFO:tasks.workunit.client.1.vm04.stdout:3/83: dwrite da/f10 [0,4194304] 0 2026-03-10T14:07:22.196 INFO:tasks.workunit.client.1.vm04.stdout:5/98: mknod d7/d12/c17 0 2026-03-10T14:07:22.197 INFO:tasks.workunit.client.1.vm04.stdout:0/40: link d0/d2/f9 d0/fb 0 2026-03-10T14:07:22.200 INFO:tasks.workunit.client.1.vm04.stdout:7/46: creat d2/dc/fd x:0 0 0 2026-03-10T14:07:22.213 INFO:tasks.workunit.client.1.vm04.stdout:8/67: sync 2026-03-10T14:07:22.214 
INFO:tasks.workunit.client.1.vm04.stdout:9/71: sync 2026-03-10T14:07:22.214 INFO:tasks.workunit.client.1.vm04.stdout:5/99: symlink d7/d9/l18 0 2026-03-10T14:07:22.217 INFO:tasks.workunit.client.1.vm04.stdout:9/72: dwrite f8 [0,4194304] 0 2026-03-10T14:07:22.219 INFO:tasks.workunit.client.1.vm04.stdout:9/73: read d9/da/f13 [123836,55859] 0 2026-03-10T14:07:22.220 INFO:tasks.workunit.client.1.vm04.stdout:9/74: fsync d9/da/dd/fe 0 2026-03-10T14:07:22.221 INFO:tasks.workunit.client.1.vm04.stdout:9/75: write d9/da/f10 [1533694,78139] 0 2026-03-10T14:07:22.226 INFO:tasks.workunit.client.1.vm04.stdout:9/76: dwrite d9/f15 [0,4194304] 0 2026-03-10T14:07:22.227 INFO:tasks.workunit.client.1.vm04.stdout:0/41: rename d0/d2/f8 to d0/fc 0 2026-03-10T14:07:22.227 INFO:tasks.workunit.client.1.vm04.stdout:9/77: write d9/da/fb [500841,10864] 0 2026-03-10T14:07:22.234 INFO:tasks.workunit.client.1.vm04.stdout:4/76: chown f3 135 1 2026-03-10T14:07:22.238 INFO:tasks.workunit.client.1.vm04.stdout:1/70: getdents d3/d5 0 2026-03-10T14:07:22.238 INFO:tasks.workunit.client.1.vm04.stdout:1/71: dread - d3/f18 zero size 2026-03-10T14:07:22.240 INFO:tasks.workunit.client.1.vm04.stdout:4/77: dread d4/fe [0,4194304] 0 2026-03-10T14:07:22.244 INFO:tasks.workunit.client.1.vm04.stdout:8/68: creat d0/d3/f16 x:0 0 0 2026-03-10T14:07:22.245 INFO:tasks.workunit.client.1.vm04.stdout:5/100: mknod d7/d12/c19 0 2026-03-10T14:07:22.246 INFO:tasks.workunit.client.1.vm04.stdout:5/101: chown d7/d9/cf 14 1 2026-03-10T14:07:22.269 INFO:tasks.workunit.client.1.vm04.stdout:0/42: rmdir d0 39 2026-03-10T14:07:22.289 INFO:tasks.workunit.client.1.vm04.stdout:3/84: link da/dc/lf da/l23 0 2026-03-10T14:07:22.295 INFO:tasks.workunit.client.1.vm04.stdout:1/72: dwrite f2 [0,4194304] 0 2026-03-10T14:07:22.299 INFO:tasks.workunit.client.1.vm04.stdout:8/69: creat d0/d3/f17 x:0 0 0 2026-03-10T14:07:22.299 INFO:tasks.workunit.client.1.vm04.stdout:8/70: fdatasync d0/d3/f16 0 2026-03-10T14:07:22.300 
INFO:tasks.workunit.client.1.vm04.stdout:8/71: read d0/d3/dd/fc [2316473,112173] 0 2026-03-10T14:07:22.301 INFO:tasks.workunit.client.1.vm04.stdout:8/72: write d0/d3/f17 [187902,119070] 0 2026-03-10T14:07:22.311 INFO:tasks.workunit.client.1.vm04.stdout:8/73: dwrite d0/d3/f17 [0,4194304] 0 2026-03-10T14:07:22.316 INFO:tasks.workunit.client.1.vm04.stdout:8/74: dread d0/d3/f17 [0,4194304] 0 2026-03-10T14:07:22.325 INFO:tasks.workunit.client.1.vm04.stdout:5/102: rmdir d7/d9 39 2026-03-10T14:07:22.344 INFO:tasks.workunit.client.1.vm04.stdout:1/73: creat d3/f19 x:0 0 0 2026-03-10T14:07:22.344 INFO:tasks.workunit.client.1.vm04.stdout:1/74: truncate d3/f14 675498 0 2026-03-10T14:07:22.344 INFO:tasks.workunit.client.1.vm04.stdout:1/75: chown d3/fc 22 1 2026-03-10T14:07:22.348 INFO:tasks.workunit.client.1.vm04.stdout:4/78: mknod d4/d14/d1b/c1e 0 2026-03-10T14:07:22.371 INFO:tasks.workunit.client.1.vm04.stdout:5/103: dwrite d7/f13 [0,4194304] 0 2026-03-10T14:07:22.373 INFO:tasks.workunit.client.1.vm04.stdout:0/43: creat d0/d2/fd x:0 0 0 2026-03-10T14:07:22.420 INFO:tasks.workunit.client.1.vm04.stdout:4/79: rename f3 to d4/df/d10/f1f 0 2026-03-10T14:07:22.423 INFO:tasks.workunit.client.1.vm04.stdout:4/80: dread d4/fa [0,4194304] 0 2026-03-10T14:07:22.426 INFO:tasks.workunit.client.1.vm04.stdout:4/81: dread d4/f9 [0,4194304] 0 2026-03-10T14:07:22.428 INFO:tasks.workunit.client.1.vm04.stdout:0/44: rename d0/fb to d0/d2/fe 0 2026-03-10T14:07:22.431 INFO:tasks.workunit.client.1.vm04.stdout:2/85: getdents d0/d14 0 2026-03-10T14:07:22.435 INFO:tasks.workunit.client.1.vm04.stdout:0/45: creat d0/ff x:0 0 0 2026-03-10T14:07:22.437 INFO:tasks.workunit.client.1.vm04.stdout:1/76: getdents d3/d5/d13 0 2026-03-10T14:07:22.439 INFO:tasks.workunit.client.1.vm04.stdout:1/77: write d3/fc [209807,19742] 0 2026-03-10T14:07:22.447 INFO:tasks.workunit.client.1.vm04.stdout:0/46: creat d0/f10 x:0 0 0 2026-03-10T14:07:22.448 INFO:tasks.workunit.client.1.vm04.stdout:0/47: truncate d0/f10 894308 0 
2026-03-10T14:07:22.449 INFO:tasks.workunit.client.1.vm04.stdout:4/82: link f2 d4/d14/d1b/f20 0 2026-03-10T14:07:22.450 INFO:tasks.workunit.client.1.vm04.stdout:4/83: chown d4/d14/c15 71030221 1 2026-03-10T14:07:22.450 INFO:tasks.workunit.client.1.vm04.stdout:4/84: write d4/df/f18 [33182,32565] 0 2026-03-10T14:07:22.451 INFO:tasks.workunit.client.1.vm04.stdout:4/85: truncate d4/df/f18 519577 0 2026-03-10T14:07:22.461 INFO:tasks.workunit.client.1.vm04.stdout:0/48: mknod d0/d2/c11 0 2026-03-10T14:07:22.471 INFO:tasks.workunit.client.1.vm04.stdout:4/86: dwrite d4/f6 [0,4194304] 0 2026-03-10T14:07:22.509 INFO:tasks.workunit.client.1.vm04.stdout:4/87: creat d4/f21 x:0 0 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:4/88: read d4/fd [262582,5541] 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:1/78: rmdir d3/d5/d16 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:4/89: mkdir d4/df/d22 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:1/79: mkdir d3/d5/d13/d1a 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:1/80: truncate d3/f19 353528 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:4/90: dwrite d4/fe [0,4194304] 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:1/81: creat d3/d5/f1b x:0 0 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:1/82: mknod d3/c1c 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:1/83: dwrite d3/fa [0,4194304] 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:1/84: write d3/d5/f1b [329460,80121] 0 2026-03-10T14:07:22.511 INFO:tasks.workunit.client.1.vm04.stdout:1/85: chown d3/d5/d13 1432 1 2026-03-10T14:07:22.512 INFO:tasks.workunit.client.1.vm04.stdout:1/86: write d3/f18 [840772,59380] 0 2026-03-10T14:07:22.513 INFO:tasks.workunit.client.1.vm04.stdout:1/87: chown d3/fc 57236616 1 2026-03-10T14:07:22.517 INFO:tasks.workunit.client.1.vm04.stdout:1/88: dwrite f1 
[0,4194304] 0 2026-03-10T14:07:22.519 INFO:tasks.workunit.client.1.vm04.stdout:1/89: write d3/fc [1208900,98276] 0 2026-03-10T14:07:22.543 INFO:tasks.workunit.client.1.vm04.stdout:1/90: creat d3/f1d x:0 0 0 2026-03-10T14:07:22.543 INFO:tasks.workunit.client.1.vm04.stdout:1/91: chown f1 8 1 2026-03-10T14:07:22.545 INFO:tasks.workunit.client.1.vm04.stdout:1/92: dread d3/fb [0,4194304] 0 2026-03-10T14:07:22.545 INFO:tasks.workunit.client.1.vm04.stdout:1/93: readlink d3/l9 0 2026-03-10T14:07:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:22 vm04.local ceph-mon[55966]: from='client.24385 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:22 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/3525828060' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:07:22.570 INFO:tasks.workunit.client.1.vm04.stdout:3/85: truncate da/fb 4415692 0 2026-03-10T14:07:22.573 INFO:tasks.workunit.client.1.vm04.stdout:3/86: stat da/c17 0 2026-03-10T14:07:22.575 INFO:tasks.workunit.client.1.vm04.stdout:3/87: mknod da/dc/c24 0 2026-03-10T14:07:22.575 INFO:tasks.workunit.client.1.vm04.stdout:3/88: readlink da/dc/l1b 0 2026-03-10T14:07:22.576 INFO:tasks.workunit.client.1.vm04.stdout:5/104: getdents d7/d9 0 2026-03-10T14:07:22.577 INFO:tasks.workunit.client.1.vm04.stdout:3/89: creat da/f25 x:0 0 0 2026-03-10T14:07:22.577 INFO:tasks.workunit.client.1.vm04.stdout:3/90: dread - da/f22 zero size 2026-03-10T14:07:22.578 INFO:tasks.workunit.client.1.vm04.stdout:3/91: write da/dc/f11 [1648761,87832] 0 2026-03-10T14:07:22.583 INFO:tasks.workunit.client.1.vm04.stdout:3/92: dwrite da/f19 [0,4194304] 0 2026-03-10T14:07:22.585 INFO:tasks.workunit.client.1.vm04.stdout:3/93: dread - da/dc/f1d zero size 2026-03-10T14:07:22.585 INFO:tasks.workunit.client.1.vm04.stdout:5/105: mknod d7/c1a 0 2026-03-10T14:07:22.585 
INFO:tasks.workunit.client.1.vm04.stdout:5/106: dread - d7/fc zero size 2026-03-10T14:07:22.592 INFO:tasks.workunit.client.1.vm04.stdout:7/47: truncate d2/f4 7892002 0 2026-03-10T14:07:22.592 INFO:tasks.workunit.client.1.vm04.stdout:7/48: readlink d2/l8 0 2026-03-10T14:07:22.594 INFO:tasks.workunit.client.1.vm04.stdout:5/107: dread f4 [0,4194304] 0 2026-03-10T14:07:22.594 INFO:tasks.workunit.client.1.vm04.stdout:5/108: dread - d7/f11 zero size 2026-03-10T14:07:22.595 INFO:tasks.workunit.client.1.vm04.stdout:3/94: dwrite da/fe [0,4194304] 0 2026-03-10T14:07:22.595 INFO:tasks.workunit.client.1.vm04.stdout:5/109: dread - d7/f11 zero size 2026-03-10T14:07:22.598 INFO:tasks.workunit.client.1.vm04.stdout:9/78: dwrite d9/f15 [4194304,4194304] 0 2026-03-10T14:07:22.602 INFO:tasks.workunit.client.1.vm04.stdout:8/75: rmdir d0 39 2026-03-10T14:07:22.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:22 vm03.local ceph-mon[49718]: from='client.24385 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:22.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:22 vm03.local ceph-mon[49718]: from='client.? 
192.168.123.103:0/3525828060' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:07:22.615 INFO:tasks.workunit.client.1.vm04.stdout:7/49: chown d2/l6 2875586 1 2026-03-10T14:07:22.621 INFO:tasks.workunit.client.1.vm04.stdout:3/95: rename da/c17 to da/c26 0 2026-03-10T14:07:22.621 INFO:tasks.workunit.client.1.vm04.stdout:7/50: dwrite d2/dc/fd [0,4194304] 0 2026-03-10T14:07:22.621 INFO:tasks.workunit.client.1.vm04.stdout:5/110: creat d7/d12/f1b x:0 0 0 2026-03-10T14:07:22.623 INFO:tasks.workunit.client.1.vm04.stdout:5/111: fsync d7/fc 0 2026-03-10T14:07:22.627 INFO:tasks.workunit.client.1.vm04.stdout:9/79: rmdir d9/da/dd 39 2026-03-10T14:07:22.628 INFO:tasks.workunit.client.1.vm04.stdout:9/80: write d9/ff [1980564,105128] 0 2026-03-10T14:07:22.628 INFO:tasks.workunit.client.1.vm04.stdout:9/81: chown d9/f15 2721091 1 2026-03-10T14:07:22.629 INFO:tasks.workunit.client.1.vm04.stdout:9/82: write f5 [5471516,20715] 0 2026-03-10T14:07:22.640 INFO:tasks.workunit.client.1.vm04.stdout:6/82: dwrite d3/f4 [0,4194304] 0 2026-03-10T14:07:22.641 INFO:tasks.workunit.client.1.vm04.stdout:6/83: chown d3/de/f10 150139611 1 2026-03-10T14:07:22.660 INFO:tasks.workunit.client.1.vm04.stdout:2/86: truncate d0/d3/f9 581461 0 2026-03-10T14:07:22.667 INFO:tasks.workunit.client.1.vm04.stdout:0/49: rmdir d0 39 2026-03-10T14:07:22.684 INFO:tasks.workunit.client.1.vm04.stdout:4/91: dwrite f2 [0,4194304] 0 2026-03-10T14:07:22.733 INFO:tasks.workunit.client.1.vm04.stdout:1/94: fdatasync f1 0 2026-03-10T14:07:22.736 INFO:tasks.workunit.client.1.vm04.stdout:1/95: dread d3/f18 [0,4194304] 0 2026-03-10T14:07:22.737 INFO:tasks.workunit.client.1.vm04.stdout:1/96: write d3/fc [94347,24737] 0 2026-03-10T14:07:22.741 INFO:tasks.workunit.client.1.vm04.stdout:1/97: dwrite d3/f18 [0,4194304] 0 2026-03-10T14:07:22.749 INFO:tasks.workunit.client.1.vm04.stdout:1/98: dwrite d3/d5/f1b [0,4194304] 0 2026-03-10T14:07:22.759 INFO:tasks.workunit.client.1.vm04.stdout:6/84: sync 
2026-03-10T14:07:22.759 INFO:tasks.workunit.client.1.vm04.stdout:6/85: dread - d3/ff zero size 2026-03-10T14:07:22.843 INFO:tasks.workunit.client.1.vm04.stdout:7/51: mkdir d2/dc/de 0 2026-03-10T14:07:22.843 INFO:tasks.workunit.client.1.vm04.stdout:3/96: chown da/c26 180343 1 2026-03-10T14:07:22.844 INFO:tasks.workunit.client.1.vm04.stdout:3/97: truncate da/dc/f1d 222761 0 2026-03-10T14:07:22.845 INFO:tasks.workunit.client.1.vm04.stdout:5/112: symlink d7/d9/l1c 0 2026-03-10T14:07:22.845 INFO:tasks.workunit.client.1.vm04.stdout:5/113: dread - d7/d12/f1b zero size 2026-03-10T14:07:22.855 INFO:tasks.workunit.client.1.vm04.stdout:9/83: write d9/da/dd/fe [3590360,44622] 0 2026-03-10T14:07:22.857 INFO:tasks.workunit.client.1.vm04.stdout:8/76: symlink d0/d3/dd/d11/d12/l18 0 2026-03-10T14:07:22.858 INFO:tasks.workunit.client.1.vm04.stdout:4/92: fsync f0 0 2026-03-10T14:07:22.864 INFO:tasks.workunit.client.1.vm04.stdout:1/99: unlink d3/fb 0 2026-03-10T14:07:22.864 INFO:tasks.workunit.client.1.vm04.stdout:1/100: chown d3 19 1 2026-03-10T14:07:22.864 INFO:tasks.workunit.client.1.vm04.stdout:7/52: mknod d2/dc/cf 0 2026-03-10T14:07:22.864 INFO:tasks.workunit.client.1.vm04.stdout:6/86: creat d3/de/f13 x:0 0 0 2026-03-10T14:07:22.864 INFO:tasks.workunit.client.1.vm04.stdout:3/98: symlink da/l27 0 2026-03-10T14:07:22.865 INFO:tasks.workunit.client.1.vm04.stdout:1/101: write f1 [4192938,29906] 0 2026-03-10T14:07:22.874 INFO:tasks.workunit.client.1.vm04.stdout:5/114: dwrite f4 [4194304,4194304] 0 2026-03-10T14:07:22.890 INFO:tasks.workunit.client.1.vm04.stdout:8/77: mknod d0/c19 0 2026-03-10T14:07:22.891 INFO:tasks.workunit.client.1.vm04.stdout:8/78: chown d0 1777362795 1 2026-03-10T14:07:22.897 INFO:tasks.workunit.client.1.vm04.stdout:0/50: write d0/d2/f9 [1866957,18538] 0 2026-03-10T14:07:22.903 INFO:tasks.workunit.client.1.vm04.stdout:4/93: dwrite d4/f6 [0,4194304] 0 2026-03-10T14:07:22.908 INFO:tasks.workunit.client.1.vm04.stdout:6/87: mkdir d3/d14 0 2026-03-10T14:07:22.911 
INFO:tasks.workunit.client.1.vm04.stdout:7/53: creat d2/f10 x:0 0 0 2026-03-10T14:07:22.912 INFO:tasks.workunit.client.1.vm04.stdout:5/115: chown d7/ce 4 1 2026-03-10T14:07:22.912 INFO:tasks.workunit.client.1.vm04.stdout:5/116: chown d7/d12/f1b 87619 1 2026-03-10T14:07:22.922 INFO:tasks.workunit.client.1.vm04.stdout:9/84: symlink d9/da/l18 0 2026-03-10T14:07:22.925 INFO:tasks.workunit.client.1.vm04.stdout:0/51: rename d0/ff to d0/d2/f12 0 2026-03-10T14:07:22.926 INFO:tasks.workunit.client.1.vm04.stdout:8/79: dread d0/d3/dd/fc [0,4194304] 0 2026-03-10T14:07:22.926 INFO:tasks.workunit.client.1.vm04.stdout:8/80: truncate d0/d3/f16 899429 0 2026-03-10T14:07:22.936 INFO:tasks.workunit.client.1.vm04.stdout:6/88: dread d3/d8/fc [0,4194304] 0 2026-03-10T14:07:22.945 INFO:tasks.workunit.client.1.vm04.stdout:9/85: creat d9/da/dd/f19 x:0 0 0 2026-03-10T14:07:22.945 INFO:tasks.workunit.client.1.vm04.stdout:9/86: chown d9/f15 6932 1 2026-03-10T14:07:22.946 INFO:tasks.workunit.client.1.vm04.stdout:9/87: fdatasync d9/ff 0 2026-03-10T14:07:22.948 INFO:tasks.workunit.client.1.vm04.stdout:0/52: creat d0/d2/f13 x:0 0 0 2026-03-10T14:07:22.949 INFO:tasks.workunit.client.1.vm04.stdout:8/81: rename d0/d3/f17 to d0/d3/dd/d11/d12/f1a 0 2026-03-10T14:07:22.949 INFO:tasks.workunit.client.1.vm04.stdout:8/82: write d0/d3/f16 [1688149,104602] 0 2026-03-10T14:07:22.950 INFO:tasks.workunit.client.1.vm04.stdout:8/83: fsync d0/d3/d5/f15 0 2026-03-10T14:07:22.952 INFO:tasks.workunit.client.1.vm04.stdout:4/94: link d4/fa d4/df/d10/f23 0 2026-03-10T14:07:22.952 INFO:tasks.workunit.client.1.vm04.stdout:4/95: chown d4/df 962937 1 2026-03-10T14:07:22.958 INFO:tasks.workunit.client.1.vm04.stdout:7/54: mkdir d2/dc/de/d11 0 2026-03-10T14:07:22.968 INFO:tasks.workunit.client.1.vm04.stdout:3/99: link da/fd da/f28 0 2026-03-10T14:07:22.980 INFO:tasks.workunit.client.1.vm04.stdout:3/100: symlink da/dc/l29 0 2026-03-10T14:07:22.988 INFO:tasks.workunit.client.1.vm04.stdout:9/88: dread d9/da/f13 [0,4194304] 0 
2026-03-10T14:07:22.997 INFO:tasks.workunit.client.1.vm04.stdout:4/96: link d4/l19 d4/df/d22/l24 0 2026-03-10T14:07:23.001 INFO:tasks.workunit.client.1.vm04.stdout:7/55: dwrite f0 [0,4194304] 0 2026-03-10T14:07:23.001 INFO:tasks.workunit.client.1.vm04.stdout:7/56: dread - d2/f10 zero size 2026-03-10T14:07:23.003 INFO:tasks.workunit.client.1.vm04.stdout:9/89: symlink d9/da/dd/l1a 0 2026-03-10T14:07:23.004 INFO:tasks.workunit.client.1.vm04.stdout:4/97: write d4/df/d10/f23 [168467,104240] 0 2026-03-10T14:07:23.005 INFO:tasks.workunit.client.1.vm04.stdout:9/90: dread d9/da/f13 [0,4194304] 0 2026-03-10T14:07:23.010 INFO:tasks.workunit.client.1.vm04.stdout:4/98: write d4/f9 [1467554,97356] 0 2026-03-10T14:07:23.010 INFO:tasks.workunit.client.1.vm04.stdout:7/57: dread f0 [0,4194304] 0 2026-03-10T14:07:23.010 INFO:tasks.workunit.client.1.vm04.stdout:7/58: truncate d2/f10 576369 0 2026-03-10T14:07:23.010 INFO:tasks.workunit.client.1.vm04.stdout:7/59: chown d2 162 1 2026-03-10T14:07:23.012 INFO:tasks.workunit.client.1.vm04.stdout:7/60: creat d2/dc/de/f12 x:0 0 0 2026-03-10T14:07:23.012 INFO:tasks.workunit.client.1.vm04.stdout:7/61: stat d2/dc/cf 0 2026-03-10T14:07:23.013 INFO:tasks.workunit.client.1.vm04.stdout:4/99: chown d4/df/d22/l24 1 1 2026-03-10T14:07:23.014 INFO:tasks.workunit.client.1.vm04.stdout:4/100: write d4/f9 [2164025,16136] 0 2026-03-10T14:07:23.016 INFO:tasks.workunit.client.1.vm04.stdout:9/91: creat d9/da/f1b x:0 0 0 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:9/92: truncate d9/da/fb 1259678 0 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:4/101: chown d4/l19 0 1 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:9/93: mkdir d9/da/dd/d1c 0 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:9/94: dwrite d9/ff [0,4194304] 0 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:9/95: chown d9/da/f13 92 1 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:9/96: readlink 
d9/da/l14 0 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:9/97: write d9/da/fb [1335403,117456] 0 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:9/98: write f5 [5769705,111275] 0 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:4/102: link d4/l7 d4/df/l25 0 2026-03-10T14:07:23.031 INFO:tasks.workunit.client.1.vm04.stdout:9/99: write f8 [5196410,111520] 0 2026-03-10T14:07:23.033 INFO:tasks.workunit.client.1.vm04.stdout:3/101: sync 2026-03-10T14:07:23.037 INFO:tasks.workunit.client.1.vm04.stdout:9/100: dread f5 [0,4194304] 0 2026-03-10T14:07:23.039 INFO:tasks.workunit.client.1.vm04.stdout:5/117: fdatasync d7/f13 0 2026-03-10T14:07:23.040 INFO:tasks.workunit.client.1.vm04.stdout:4/103: creat d4/d14/d1b/f26 x:0 0 0 2026-03-10T14:07:23.045 INFO:tasks.workunit.client.1.vm04.stdout:4/104: creat d4/d14/f27 x:0 0 0 2026-03-10T14:07:23.055 INFO:tasks.workunit.client.1.vm04.stdout:4/105: readlink d4/df/l1c 0 2026-03-10T14:07:23.055 INFO:tasks.workunit.client.1.vm04.stdout:4/106: dread f2 [0,4194304] 0 2026-03-10T14:07:23.055 INFO:tasks.workunit.client.1.vm04.stdout:2/87: dwrite d0/d3/f11 [0,4194304] 0 2026-03-10T14:07:23.055 INFO:tasks.workunit.client.1.vm04.stdout:2/88: chown d0 9 1 2026-03-10T14:07:23.056 INFO:tasks.workunit.client.1.vm04.stdout:2/89: dread d0/d3/f11 [0,4194304] 0 2026-03-10T14:07:23.061 INFO:tasks.workunit.client.1.vm04.stdout:2/90: dread d0/d3/f11 [0,4194304] 0 2026-03-10T14:07:23.062 INFO:tasks.workunit.client.1.vm04.stdout:7/62: dread d2/f4 [0,4194304] 0 2026-03-10T14:07:23.063 INFO:tasks.workunit.client.1.vm04.stdout:7/63: chown d2/l6 48492 1 2026-03-10T14:07:23.068 INFO:tasks.workunit.client.1.vm04.stdout:9/101: mkdir d9/d1d 0 2026-03-10T14:07:23.071 INFO:tasks.workunit.client.1.vm04.stdout:9/102: fdatasync d9/f15 0 2026-03-10T14:07:23.071 INFO:tasks.workunit.client.1.vm04.stdout:4/107: readlink d4/l19 0 2026-03-10T14:07:23.073 INFO:tasks.workunit.client.1.vm04.stdout:1/102: fdatasync f1 0 
2026-03-10T14:07:23.078 INFO:tasks.workunit.client.1.vm04.stdout:7/64: symlink d2/dc/de/l13 0
2026-03-10T14:07:23.079 INFO:tasks.workunit.client.1.vm04.stdout:7/65: chown d2/dc/de 5560 1
2026-03-10T14:07:23.082 INFO:tasks.workunit.client.1.vm04.stdout:0/53: rmdir d0/d2 39
2026-03-10T14:07:23.082 INFO:tasks.workunit.client.1.vm04.stdout:5/118: creat d7/f1d x:0 0 0
2026-03-10T14:07:23.083 INFO:tasks.workunit.client.1.vm04.stdout:1/103: dwrite d3/f19 [0,4194304] 0
2026-03-10T14:07:23.086 INFO:tasks.workunit.client.1.vm04.stdout:8/84: getdents d0/d3/dd/d11/d12 0
2026-03-10T14:07:23.091 INFO:tasks.workunit.client.1.vm04.stdout:8/85: dread d0/d3/dd/d11/d12/f1a [0,4194304] 0
2026-03-10T14:07:23.092 INFO:tasks.workunit.client.1.vm04.stdout:6/89: truncate d3/f9 334237 0
2026-03-10T14:07:23.100 INFO:tasks.workunit.client.1.vm04.stdout:4/108: creat d4/d14/d1b/f28 x:0 0 0
2026-03-10T14:07:23.101 INFO:tasks.workunit.client.1.vm04.stdout:2/91: stat d0/d3/f9 0
2026-03-10T14:07:23.103 INFO:tasks.workunit.client.1.vm04.stdout:7/66: unlink d2/f7 0
2026-03-10T14:07:23.107 INFO:tasks.workunit.client.1.vm04.stdout:1/104: mkdir d3/d5/d1e 0
2026-03-10T14:07:23.118 INFO:tasks.workunit.client.1.vm04.stdout:0/54: dwrite d0/d2/fd [0,4194304] 0
2026-03-10T14:07:23.118 INFO:tasks.workunit.client.1.vm04.stdout:4/109: symlink d4/d14/d1b/l29 0
2026-03-10T14:07:23.118 INFO:tasks.workunit.client.1.vm04.stdout:4/110: chown d4/df/d10 971921 1
2026-03-10T14:07:23.118 INFO:tasks.workunit.client.1.vm04.stdout:4/111: stat d4/lb 0
2026-03-10T14:07:23.118 INFO:tasks.workunit.client.1.vm04.stdout:1/105: dwrite d3/f14 [0,4194304] 0
2026-03-10T14:07:23.118 INFO:tasks.workunit.client.1.vm04.stdout:2/92: creat d0/d3/f1d x:0 0 0
2026-03-10T14:07:23.122 INFO:tasks.workunit.client.1.vm04.stdout:7/67: mkdir d2/d14 0
2026-03-10T14:07:23.127 INFO:tasks.workunit.client.1.vm04.stdout:9/103: creat d9/da/f1e x:0 0 0
2026-03-10T14:07:23.127 INFO:tasks.workunit.client.1.vm04.stdout:4/112: dwrite d4/d14/d1b/f26 [0,4194304] 0
2026-03-10T14:07:23.129 INFO:tasks.workunit.client.1.vm04.stdout:2/93: symlink d0/d3/d8/l1e 0
2026-03-10T14:07:23.129 INFO:tasks.workunit.client.1.vm04.stdout:5/119: rename d7/d9/l1c to d7/l1e 0
2026-03-10T14:07:23.130 INFO:tasks.workunit.client.1.vm04.stdout:0/55: write d0/d2/f12 [646150,108075] 0
2026-03-10T14:07:23.131 INFO:tasks.workunit.client.1.vm04.stdout:5/120: write d7/fa [931848,33007] 0
2026-03-10T14:07:23.133 INFO:tasks.workunit.client.1.vm04.stdout:0/56: write d0/d2/f9 [2826668,129766] 0
2026-03-10T14:07:23.141 INFO:tasks.workunit.client.1.vm04.stdout:0/57: write d0/d2/f13 [737445,33247] 0
2026-03-10T14:07:23.142 INFO:tasks.workunit.client.1.vm04.stdout:3/102: truncate da/fe 931901 0
2026-03-10T14:07:23.144 INFO:tasks.workunit.client.1.vm04.stdout:6/90: rmdir d3 39
2026-03-10T14:07:23.146 INFO:tasks.workunit.client.1.vm04.stdout:0/58: truncate d0/d2/fa 355956 0
2026-03-10T14:07:23.146 INFO:tasks.workunit.client.1.vm04.stdout:3/103: write da/f25 [297699,1970] 0
2026-03-10T14:07:23.147 INFO:tasks.workunit.client.1.vm04.stdout:5/121: dwrite d7/fc [0,4194304] 0
2026-03-10T14:07:23.147 INFO:tasks.workunit.client.1.vm04.stdout:0/59: stat d0/fc 0
2026-03-10T14:07:23.154 INFO:tasks.workunit.client.1.vm04.stdout:2/94: dwrite d0/d3/d8/fc [0,4194304] 0
2026-03-10T14:07:23.155 INFO:tasks.workunit.client.1.vm04.stdout:9/104: dwrite d9/da/dd/f19 [0,4194304] 0
2026-03-10T14:07:23.160 INFO:tasks.workunit.client.1.vm04.stdout:9/105: dread d9/da/dd/fe [0,4194304] 0
2026-03-10T14:07:23.163 INFO:tasks.workunit.client.1.vm04.stdout:9/106: write d9/da/dd/f19 [3493074,129999] 0
2026-03-10T14:07:23.174 INFO:tasks.workunit.client.1.vm04.stdout:8/86: rename d0/d3/f16 to d0/f1b 0
2026-03-10T14:07:23.181 INFO:tasks.workunit.client.1.vm04.stdout:4/113: mkdir d4/d2a 0
2026-03-10T14:07:23.182 INFO:tasks.workunit.client.1.vm04.stdout:6/91: fdatasync d3/d8/fa 0
2026-03-10T14:07:23.183 INFO:tasks.workunit.client.1.vm04.stdout:6/92: write d3/de/f10 [878979,132] 0
2026-03-10T14:07:23.184 INFO:tasks.workunit.client.1.vm04.stdout:3/104: creat da/dc/f2a x:0 0 0
2026-03-10T14:07:23.190 INFO:tasks.workunit.client.1.vm04.stdout:7/68: creat d2/d14/f15 x:0 0 0
2026-03-10T14:07:23.191 INFO:tasks.workunit.client.1.vm04.stdout:7/69: stat d2/dc/fd 0
2026-03-10T14:07:23.192 INFO:tasks.workunit.client.1.vm04.stdout:7/70: write d2/d14/f15 [340213,80038] 0
2026-03-10T14:07:23.197 INFO:tasks.workunit.client.1.vm04.stdout:1/106: rename d3/f1d to d3/d5/d1e/f1f 0
2026-03-10T14:07:23.202 INFO:tasks.workunit.client.1.vm04.stdout:8/87: fdatasync d0/f1b 0
2026-03-10T14:07:23.202 INFO:tasks.workunit.client.1.vm04.stdout:8/88: chown d0/d3/d5 2 1
2026-03-10T14:07:23.203 INFO:tasks.workunit.client.1.vm04.stdout:8/89: write d0/d3/d5/f15 [367048,13072] 0
2026-03-10T14:07:23.210 INFO:tasks.workunit.client.1.vm04.stdout:5/122: mknod d7/c1f 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:3/105: mkdir da/dc/d2b 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:2/95: creat d0/d3/d8/d17/f1f x:0 0 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:3/106: stat da/l27 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:9/107: creat d9/d1d/f1f x:0 0 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:8/90: fdatasync d0/d3/dd/fc 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:9/108: chown d9/d1d 812 1
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:8/91: truncate d0/d3/dd/d11/d12/f1a 5022779 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:8/92: chown d0/d3/dd 254 1
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:4/114: link d4/f21 d4/df/d22/f2b 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:3/107: dwrite da/f19 [0,4194304] 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:5/123: rename d7/d12/f1b to d7/d9/f20 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:0/60: creat d0/f14 x:0 0 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:5/124: write d7/f13 [7530994,113498] 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:3/108: write da/dc/f1d [678747,50569] 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:9/109: unlink d9/da/f10 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:4/115: rmdir d4/df/d10 39
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:0/61: mkdir d0/d2/d15 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:4/116: dwrite d4/d14/d1b/f26 [0,4194304] 0
2026-03-10T14:07:23.242 INFO:tasks.workunit.client.1.vm04.stdout:0/62: rename d0/fc to d0/d2/d15/f16 0
2026-03-10T14:07:23.244 INFO:tasks.workunit.client.1.vm04.stdout:8/93: link d0/d3/dd/d11/le d0/l1c 0
2026-03-10T14:07:23.246 INFO:tasks.workunit.client.1.vm04.stdout:4/117: write d4/df/d10/f23 [34478,17423] 0
2026-03-10T14:07:23.247 INFO:tasks.workunit.client.1.vm04.stdout:4/118: chown d4/df/l1c 309513 1
2026-03-10T14:07:23.249 INFO:tasks.workunit.client.1.vm04.stdout:3/109: rmdir da/dc/d2b 0
2026-03-10T14:07:23.250 INFO:tasks.workunit.client.1.vm04.stdout:4/119: dread f0 [0,4194304] 0
2026-03-10T14:07:23.252 INFO:tasks.workunit.client.1.vm04.stdout:0/63: rename d0/d2/fe to d0/d2/f17 0
2026-03-10T14:07:23.253 INFO:tasks.workunit.client.1.vm04.stdout:1/107: sync
2026-03-10T14:07:23.253 INFO:tasks.workunit.client.1.vm04.stdout:9/110: creat d9/f20 x:0 0 0
2026-03-10T14:07:23.253 INFO:tasks.workunit.client.1.vm04.stdout:8/94: fdatasync d0/d3/dd/d11/d12/f1a 0
2026-03-10T14:07:23.255 INFO:tasks.workunit.client.1.vm04.stdout:8/95: write d0/d3/dd/d11/d12/f1a [3149510,94701] 0
2026-03-10T14:07:23.261 INFO:tasks.workunit.client.1.vm04.stdout:0/64: mknod d0/d2/d15/c18 0
2026-03-10T14:07:23.262 INFO:tasks.workunit.client.1.vm04.stdout:9/111: write f5 [201371,47587] 0
2026-03-10T14:07:23.262 INFO:tasks.workunit.client.1.vm04.stdout:9/112: symlink d9/da/dd/l21 0
2026-03-10T14:07:23.262 INFO:tasks.workunit.client.1.vm04.stdout:9/113: chown d9/da/dd/f19 62121886 1
2026-03-10T14:07:23.263 INFO:tasks.workunit.client.1.vm04.stdout:8/96: chown d0/d3/dd/d11/le 44721 1
2026-03-10T14:07:23.264 INFO:tasks.workunit.client.1.vm04.stdout:8/97: chown d0/d3/dd/d11/d12 2818177 1
2026-03-10T14:07:23.266 INFO:tasks.workunit.client.1.vm04.stdout:0/65: creat d0/f19 x:0 0 0
2026-03-10T14:07:23.272 INFO:tasks.workunit.client.1.vm04.stdout:8/98: creat d0/d3/dd/d11/d12/f1d x:0 0 0
2026-03-10T14:07:23.272 INFO:tasks.workunit.client.1.vm04.stdout:9/114: creat d9/da/dd/d1c/f22 x:0 0 0
2026-03-10T14:07:23.273 INFO:tasks.workunit.client.1.vm04.stdout:8/99: write d0/d3/dd/d11/d12/f1a [1959475,91541] 0
2026-03-10T14:07:23.273 INFO:tasks.workunit.client.1.vm04.stdout:9/115: write d9/da/f1e [165962,49076] 0
2026-03-10T14:07:23.274 INFO:tasks.workunit.client.1.vm04.stdout:8/100: chown d0/d3/c10 835149071 1
2026-03-10T14:07:23.275 INFO:tasks.workunit.client.1.vm04.stdout:9/116: truncate d9/d1d/f1f 513528 0
2026-03-10T14:07:23.279 INFO:tasks.workunit.client.1.vm04.stdout:9/117: write d9/da/dd/f19 [1357980,54839] 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/118: creat d9/d1d/f23 x:0 0 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/119: chown d9/da/dd/d1c 187826265 1
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/120: stat d9/d1d/f1f 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/121: creat d9/da/dd/f24 x:0 0 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/122: write d9/da/fb [799455,56974] 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/123: chown d9/d1d/f23 0 1
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/124: symlink d9/l25 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/125: dwrite f5 [4194304,4194304] 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/126: mknod d9/d1d/c26 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/127: creat d9/da/dd/d1c/f27 x:0 0 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/128: stat d9/da/f1e 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/129: creat d9/da/dd/f28 x:0 0 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/130: stat d9/da/dd/fe 0
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/131: dread - d9/da/dd/f28 zero size
2026-03-10T14:07:23.309 INFO:tasks.workunit.client.1.vm04.stdout:9/132: mknod d9/da/dd/d1c/c29 0
2026-03-10T14:07:23.339 INFO:tasks.workunit.client.1.vm04.stdout:8/101: sync
2026-03-10T14:07:23.341 INFO:tasks.workunit.client.1.vm04.stdout:8/102: mknod d0/d3/d5/c1e 0
2026-03-10T14:07:23.341 INFO:tasks.workunit.client.1.vm04.stdout:8/103: fsync d0/f1b 0
2026-03-10T14:07:23.342 INFO:tasks.workunit.client.1.vm04.stdout:8/104: write d0/d3/dd/d11/d12/f1a [5060406,90605] 0
2026-03-10T14:07:23.343 INFO:tasks.workunit.client.1.vm04.stdout:8/105: chown d0/d3/dd/d11/d12/l18 7990 1
2026-03-10T14:07:23.345 INFO:tasks.workunit.client.1.vm04.stdout:8/106: unlink d0/l1c 0
2026-03-10T14:07:23.349 INFO:tasks.workunit.client.1.vm04.stdout:8/107: link d0/f1b d0/d3/dd/d11/d12/f1f 0
2026-03-10T14:07:23.349 INFO:tasks.workunit.client.1.vm04.stdout:8/108: readlink d0/d3/dd/d11/d12/l18 0
2026-03-10T14:07:23.384 INFO:tasks.workunit.client.1.vm04.stdout:1/108: getdents d3 0
2026-03-10T14:07:23.386 INFO:tasks.workunit.client.1.vm04.stdout:1/109: getdents d3 0
2026-03-10T14:07:23.387 INFO:tasks.workunit.client.1.vm04.stdout:2/96: rmdir d0/d3/d8/d17 39
2026-03-10T14:07:23.388 INFO:tasks.workunit.client.1.vm04.stdout:1/110: getdents d3 0
2026-03-10T14:07:23.389 INFO:tasks.workunit.client.1.vm04.stdout:7/71: write d2/f4 [3279123,46582] 0
2026-03-10T14:07:23.393 INFO:tasks.workunit.client.1.vm04.stdout:7/72: dwrite d2/f4 [0,4194304] 0
2026-03-10T14:07:23.404 INFO:tasks.workunit.client.1.vm04.stdout:1/111: mkdir d3/d20 0
2026-03-10T14:07:23.414 INFO:tasks.workunit.client.1.vm04.stdout:5/125: dread d7/fa [0,4194304] 0
2026-03-10T14:07:23.414 INFO:tasks.workunit.client.1.vm04.stdout:1/112: rename d3/d5/d1e/f1f to d3/d5/d1e/f21 0
2026-03-10T14:07:23.416 INFO:tasks.workunit.client.1.vm04.stdout:7/73: creat d2/dc/de/d11/f16 x:0 0 0
2026-03-10T14:07:23.419 INFO:tasks.workunit.client.1.vm04.stdout:1/113: write d3/d5/d1e/f21 [789240,110399] 0
2026-03-10T14:07:23.423 INFO:tasks.workunit.client.1.vm04.stdout:1/114: dread d3/fc [0,4194304] 0
2026-03-10T14:07:23.427 INFO:tasks.workunit.client.1.vm04.stdout:7/74: creat d2/d14/f17 x:0 0 0
2026-03-10T14:07:23.429 INFO:tasks.workunit.client.1.vm04.stdout:8/109: dread d0/d3/dd/d11/d12/f1a [4194304,4194304] 0
2026-03-10T14:07:23.429 INFO:tasks.workunit.client.1.vm04.stdout:8/110: fsync d0/d3/d5/f15 0
2026-03-10T14:07:23.434 INFO:tasks.workunit.client.1.vm04.stdout:7/75: symlink d2/dc/l18 0
2026-03-10T14:07:23.435 INFO:tasks.workunit.client.1.vm04.stdout:7/76: read - d2/dc/de/d11/f16 zero size
2026-03-10T14:07:23.435 INFO:tasks.workunit.client.1.vm04.stdout:7/77: stat d2/l6 0
2026-03-10T14:07:23.436 INFO:tasks.workunit.client.1.vm04.stdout:7/78: truncate d2/dc/de/f12 610112 0
2026-03-10T14:07:23.439 INFO:tasks.workunit.client.1.vm04.stdout:3/110: dread da/dc/f11 [0,4194304] 0
2026-03-10T14:07:23.442 INFO:tasks.workunit.client.1.vm04.stdout:7/79: dwrite d2/dc/de/d11/f16 [0,4194304] 0
2026-03-10T14:07:23.454 INFO:tasks.workunit.client.1.vm04.stdout:7/80: dwrite d2/f4 [0,4194304] 0
2026-03-10T14:07:23.462 INFO:tasks.workunit.client.1.vm04.stdout:7/81: dread d2/dc/de/d11/f16 [0,4194304] 0
2026-03-10T14:07:23.466 INFO:tasks.workunit.client.1.vm04.stdout:7/82: dwrite d2/dc/de/f12 [0,4194304] 0
2026-03-10T14:07:23.466 INFO:tasks.workunit.client.1.vm04.stdout:0/66: fsync d0/d2/d15/f16 0
2026-03-10T14:07:23.468 INFO:tasks.workunit.client.1.vm04.stdout:3/111: dread da/dc/f1d [0,4194304] 0
2026-03-10T14:07:23.472 INFO:tasks.workunit.client.1.vm04.stdout:0/67: dwrite d0/d2/f17 [0,4194304] 0
2026-03-10T14:07:23.474 INFO:tasks.workunit.client.1.vm04.stdout:8/111: creat d0/d3/dd/d11/d12/f20 x:0 0 0
2026-03-10T14:07:23.474 INFO:tasks.workunit.client.1.vm04.stdout:8/112: chown d0/d3/dd 229 1
2026-03-10T14:07:23.482 INFO:tasks.workunit.client.1.vm04.stdout:4/120: write d4/df/d10/f1f [4410132,127047] 0
2026-03-10T14:07:23.486 INFO:tasks.workunit.client.1.vm04.stdout:4/121: dwrite d4/df/d10/f1f [0,4194304] 0
2026-03-10T14:07:23.493 INFO:tasks.workunit.client.1.vm04.stdout:4/122: dwrite d4/fd [0,4194304] 0
2026-03-10T14:07:23.497 INFO:tasks.workunit.client.1.vm04.stdout:4/123: read f0 [3976292,115421] 0
2026-03-10T14:07:23.498 INFO:tasks.workunit.client.1.vm04.stdout:5/126: getdents d7 0
2026-03-10T14:07:23.498 INFO:tasks.workunit.client.1.vm04.stdout:5/127: write f4 [671637,58440] 0
2026-03-10T14:07:23.510 INFO:tasks.workunit.client.1.vm04.stdout:9/133: truncate d9/da/dd/f19 1501593 0
2026-03-10T14:07:23.511 INFO:tasks.workunit.client.1.vm04.stdout:7/83: creat d2/dc/de/d11/f19 x:0 0 0
2026-03-10T14:07:23.511 INFO:tasks.workunit.client.1.vm04.stdout:3/112: creat da/dc/f2c x:0 0 0
2026-03-10T14:07:23.513 INFO:tasks.workunit.client.1.vm04.stdout:8/113: unlink d0/d3/dd/d11/d12/f1a 0
2026-03-10T14:07:23.514 INFO:tasks.workunit.client.1.vm04.stdout:3/113: dwrite da/f14 [0,4194304] 0
2026-03-10T14:07:23.531 INFO:tasks.workunit.client.1.vm04.stdout:5/128: rename d7/fc to d7/f21 0
2026-03-10T14:07:23.535 INFO:tasks.workunit.client.1.vm04.stdout:7/84: creat d2/dc/de/d11/f1a x:0 0 0
2026-03-10T14:07:23.536 INFO:tasks.workunit.client.1.vm04.stdout:7/85: chown d2/dc/de/d11 60 1
2026-03-10T14:07:23.536 INFO:tasks.workunit.client.1.vm04.stdout:7/86: chown d2/dc/de/d11 9 1
2026-03-10T14:07:23.540 INFO:tasks.workunit.client.1.vm04.stdout:6/93: dwrite d3/f9 [0,4194304] 0
2026-03-10T14:07:23.540 INFO:tasks.workunit.client.1.vm04.stdout:2/97: write d0/d3/f9 [419124,75187] 0
2026-03-10T14:07:23.549 INFO:tasks.workunit.client.1.vm04.stdout:1/115: write d3/d5/f1b [5142189,129521] 0
2026-03-10T14:07:23.549 INFO:tasks.workunit.client.1.vm04.stdout:8/114: rmdir d0/d3/dd/d11 39
2026-03-10T14:07:23.553 INFO:tasks.workunit.client.1.vm04.stdout:4/124: link d4/fe d4/f2c 0
2026-03-10T14:07:23.555 INFO:tasks.workunit.client.1.vm04.stdout:4/125: truncate d4/d14/f27 694710 0
2026-03-10T14:07:23.565 INFO:tasks.workunit.client.1.vm04.stdout:4/126: stat d4/df/c1a 0
2026-03-10T14:07:23.565 INFO:tasks.workunit.client.1.vm04.stdout:9/134: mknod d9/da/c2a 0
2026-03-10T14:07:23.565 INFO:tasks.workunit.client.1.vm04.stdout:7/87: symlink d2/dc/de/l1b 0
2026-03-10T14:07:23.565 INFO:tasks.workunit.client.1.vm04.stdout:6/94: symlink d3/de/d11/l15 0
2026-03-10T14:07:23.565 INFO:tasks.workunit.client.1.vm04.stdout:6/95: chown d3/de 22 1
2026-03-10T14:07:23.565 INFO:tasks.workunit.client.1.vm04.stdout:1/116: mkdir d3/d22 0
2026-03-10T14:07:23.565 INFO:tasks.workunit.client.1.vm04.stdout:8/115: chown d0/d3/dd/d11/d12/f1f 1369568 1
2026-03-10T14:07:23.565 INFO:tasks.workunit.client.1.vm04.stdout:5/129: unlink d7/d9/lb 0
2026-03-10T14:07:23.567 INFO:tasks.workunit.client.1.vm04.stdout:7/88: dread f0 [0,4194304] 0
2026-03-10T14:07:23.567 INFO:tasks.workunit.client.1.vm04.stdout:1/117: dread d3/f14 [0,4194304] 0
2026-03-10T14:07:23.568 INFO:tasks.workunit.client.1.vm04.stdout:9/135: dwrite d9/d1d/f23 [0,4194304] 0
2026-03-10T14:07:23.573 INFO:tasks.workunit.client.1.vm04.stdout:9/136: write d9/da/f1e [512182,104130] 0
2026-03-10T14:07:23.573 INFO:tasks.workunit.client.1.vm04.stdout:9/137: fsync d9/da/dd/d1c/f22 0
2026-03-10T14:07:23.577 INFO:tasks.workunit.client.1.vm04.stdout:9/138: dread - d9/da/dd/d1c/f27 zero size
2026-03-10T14:07:23.580 INFO:tasks.workunit.client.1.vm04.stdout:8/116: dread d0/d3/dd/d11/d12/f1f [0,4194304] 0
2026-03-10T14:07:23.581 INFO:tasks.workunit.client.1.vm04.stdout:7/89: dwrite d2/f4 [4194304,4194304] 0
2026-03-10T14:07:23.584 INFO:tasks.workunit.client.1.vm04.stdout:7/90: stat d2/f10 0
2026-03-10T14:07:23.597 INFO:tasks.workunit.client.1.vm04.stdout:4/127: rename f0 to d4/df/d22/f2d 0
2026-03-10T14:07:23.597 INFO:tasks.workunit.client.1.vm04.stdout:0/68: getdents d0/d2 0
2026-03-10T14:07:23.599 INFO:tasks.workunit.client.1.vm04.stdout:3/114: link da/dc/l1b da/l2d 0
2026-03-10T14:07:23.604 INFO:tasks.workunit.client.1.vm04.stdout:5/130: chown d7/d12 38460 1
2026-03-10T14:07:23.604 INFO:tasks.workunit.client.1.vm04.stdout:1/118: creat d3/d5/d13/f23 x:0 0 0
2026-03-10T14:07:23.607 INFO:tasks.workunit.client.1.vm04.stdout:8/117: mknod d0/d3/dd/d11/d12/c21 0
2026-03-10T14:07:23.609 INFO:tasks.workunit.client.1.vm04.stdout:2/98: link d0/c15 d0/d14/d1b/c20 0
2026-03-10T14:07:23.609 INFO:tasks.workunit.client.1.vm04.stdout:7/91: write f0 [1014649,87246] 0
2026-03-10T14:07:23.609 INFO:tasks.workunit.client.1.vm04.stdout:2/99: dread - d0/d3/f1d zero size
2026-03-10T14:07:23.616 INFO:tasks.workunit.client.1.vm04.stdout:2/100: fsync d0/d3/d8/f6 0
2026-03-10T14:07:23.616 INFO:tasks.workunit.client.1.vm04.stdout:3/115: symlink da/l2e 0
2026-03-10T14:07:23.617 INFO:tasks.workunit.client.1.vm04.stdout:3/116: fsync da/f19 0
2026-03-10T14:07:23.622 INFO:tasks.workunit.client.1.vm04.stdout:9/139: rename d9/da/c2a to d9/d1d/c2b 0
2026-03-10T14:07:23.622 INFO:tasks.workunit.client.1.vm04.stdout:8/118: dwrite d0/d3/dd/d11/d12/f20 [0,4194304] 0
2026-03-10T14:07:23.624 INFO:tasks.workunit.client.1.vm04.stdout:1/119: unlink d3/d5/f1b 0
2026-03-10T14:07:23.624 INFO:tasks.workunit.client.1.vm04.stdout:8/119: write d0/d3/dd/d11/d12/f20 [370162,41500] 0
2026-03-10T14:07:23.625 INFO:tasks.workunit.client.1.vm04.stdout:6/96: link d3/de/d11/l12 d3/d14/l16 0
2026-03-10T14:07:23.626 INFO:tasks.workunit.client.1.vm04.stdout:6/97: truncate d3/de/f13 1236 0
2026-03-10T14:07:23.629 INFO:tasks.workunit.client.1.vm04.stdout:6/98: truncate d3/ff 793136 0
2026-03-10T14:07:23.630 INFO:tasks.workunit.client.1.vm04.stdout:7/92: rename d2/dc/l18 to d2/dc/l1c 0
2026-03-10T14:07:23.636 INFO:tasks.workunit.client.1.vm04.stdout:2/101: chown d0/d3/d8/d17/f1f 72997835 1
2026-03-10T14:07:23.637 INFO:tasks.workunit.client.1.vm04.stdout:6/99: dread d3/d8/fd [0,4194304] 0
2026-03-10T14:07:23.639 INFO:tasks.workunit.client.1.vm04.stdout:9/140: symlink d9/d1d/l2c 0
2026-03-10T14:07:23.640 INFO:tasks.workunit.client.1.vm04.stdout:9/141: write d9/f20 [31626,75750] 0
2026-03-10T14:07:23.641 INFO:tasks.workunit.client.1.vm04.stdout:3/117: dwrite da/dc/f1d [0,4194304] 0
2026-03-10T14:07:23.643 INFO:tasks.workunit.client.1.vm04.stdout:2/102: dread d0/d3/d8/f6 [0,4194304] 0
2026-03-10T14:07:23.648 INFO:tasks.workunit.client.1.vm04.stdout:0/69: sync
2026-03-10T14:07:23.657 INFO:tasks.workunit.client.1.vm04.stdout:5/131: unlink d7/c10 0
2026-03-10T14:07:23.660 INFO:tasks.workunit.client.1.vm04.stdout:9/142: readlink d9/da/dd/l17 0
2026-03-10T14:07:23.665 INFO:tasks.workunit.client.1.vm04.stdout:0/70: symlink d0/d2/d15/l1a 0
2026-03-10T14:07:23.669 INFO:tasks.workunit.client.1.vm04.stdout:0/71: dwrite d0/d2/f9 [0,4194304] 0
2026-03-10T14:07:23.672 INFO:tasks.workunit.client.1.vm04.stdout:5/132: symlink d7/d12/l22 0
2026-03-10T14:07:23.673 INFO:tasks.workunit.client.1.vm04.stdout:1/120: creat d3/d5/d13/d1a/f24 x:0 0 0
2026-03-10T14:07:23.673 INFO:tasks.workunit.client.1.vm04.stdout:8/120: mknod d0/d3/dd/c22 0
2026-03-10T14:07:23.675 INFO:tasks.workunit.client.1.vm04.stdout:4/128: getdents d4/d14 0
2026-03-10T14:07:23.676 INFO:tasks.workunit.client.1.vm04.stdout:4/129: stat d4/df/d22/f2b 0
2026-03-10T14:07:23.678 INFO:tasks.workunit.client.1.vm04.stdout:0/72: dwrite d0/d2/fd [0,4194304] 0
2026-03-10T14:07:23.680 INFO:tasks.workunit.client.1.vm04.stdout:2/103: rename d0/d14/d1b/c20 to d0/d14/d1b/c21 0
2026-03-10T14:07:23.684 INFO:tasks.workunit.client.1.vm04.stdout:2/104: dread d0/d3/f9 [0,4194304] 0
2026-03-10T14:07:23.684 INFO:tasks.workunit.client.1.vm04.stdout:5/133: fdatasync d7/f1d 0
2026-03-10T14:07:23.685 INFO:tasks.workunit.client.1.vm04.stdout:2/105: truncate d0/d3/f9 982495 0
2026-03-10T14:07:23.695 INFO:tasks.workunit.client.1.vm04.stdout:8/121: creat d0/f23 x:0 0 0
2026-03-10T14:07:23.696 INFO:tasks.workunit.client.1.vm04.stdout:8/122: write d0/d3/d5/f15 [1271910,95438] 0
2026-03-10T14:07:23.697 INFO:tasks.workunit.client.1.vm04.stdout:8/123: truncate d0/d3/dd/d11/d12/f20 5077948 0
2026-03-10T14:07:23.698 INFO:tasks.workunit.client.1.vm04.stdout:8/124: dread - d0/d3/dd/d11/d12/f1d zero size
2026-03-10T14:07:23.700 INFO:tasks.workunit.client.1.vm04.stdout:4/130: readlink d4/df/d22/l24 0
2026-03-10T14:07:23.702 INFO:tasks.workunit.client.1.vm04.stdout:5/134: mknod d7/d9/c23 0
2026-03-10T14:07:23.703 INFO:tasks.workunit.client.1.vm04.stdout:2/106: symlink d0/d3/d8/dd/l22 0
2026-03-10T14:07:23.703 INFO:tasks.workunit.client.1.vm04.stdout:2/107: write d0/d3/d8/f6 [2537861,103053] 0
2026-03-10T14:07:23.706 INFO:tasks.workunit.client.1.vm04.stdout:8/125: unlink d0/d3/d5/c1e 0
2026-03-10T14:07:23.709 INFO:tasks.workunit.client.1.vm04.stdout:1/121: unlink d3/d5/d13/l17 0
2026-03-10T14:07:23.713 INFO:tasks.workunit.client.1.vm04.stdout:4/131: dread f2 [0,4194304] 0
2026-03-10T14:07:23.716 INFO:tasks.workunit.client.1.vm04.stdout:2/108: mknod d0/d14/d1b/c23 0
2026-03-10T14:07:23.718 INFO:tasks.workunit.client.1.vm04.stdout:8/126: mkdir d0/d3/d24 0
2026-03-10T14:07:23.720 INFO:tasks.workunit.client.1.vm04.stdout:0/73: creat d0/f1b x:0 0 0
2026-03-10T14:07:23.724 INFO:tasks.workunit.client.1.vm04.stdout:0/74: dwrite d0/d2/f17 [0,4194304] 0
2026-03-10T14:07:23.734 INFO:tasks.workunit.client.1.vm04.stdout:0/75: write d0/d2/fa [364688,80579] 0
2026-03-10T14:07:23.734 INFO:tasks.workunit.client.1.vm04.stdout:5/135: dwrite d7/fa [0,4194304] 0
2026-03-10T14:07:23.734 INFO:tasks.workunit.client.1.vm04.stdout:0/76: dread - d0/f19 zero size
2026-03-10T14:07:23.734 INFO:tasks.workunit.client.1.vm04.stdout:0/77: dread - d0/f19 zero size
2026-03-10T14:07:23.734 INFO:tasks.workunit.client.1.vm04.stdout:0/78: dread - d0/f19 zero size
2026-03-10T14:07:23.734 INFO:tasks.workunit.client.1.vm04.stdout:5/136: chown f4 207204561 1
2026-03-10T14:07:23.739 INFO:tasks.workunit.client.1.vm04.stdout:8/127: mknod d0/d3/dd/d11/c25 0
2026-03-10T14:07:23.739 INFO:tasks.workunit.client.1.vm04.stdout:1/122: symlink d3/d22/l25 0
2026-03-10T14:07:23.739 INFO:tasks.workunit.client.1.vm04.stdout:8/128: chown d0/d3/d24 124437 1
2026-03-10T14:07:23.740 INFO:tasks.workunit.client.1.vm04.stdout:2/109: write d0/d3/f9 [1474401,90944] 0
2026-03-10T14:07:23.740 INFO:tasks.workunit.client.1.vm04.stdout:8/129: chown d0/f23 34129 1
2026-03-10T14:07:23.744 INFO:tasks.workunit.client.1.vm04.stdout:0/79: unlink d0/f10 0
2026-03-10T14:07:23.744 INFO:tasks.workunit.client.1.vm04.stdout:0/80: dread - d0/f19 zero size
2026-03-10T14:07:23.744 INFO:tasks.workunit.client.1.vm04.stdout:0/81: chown d0/d2 178954735 1
2026-03-10T14:07:23.745 INFO:tasks.workunit.client.1.vm04.stdout:0/82: chown d0/d2/f17 335 1
2026-03-10T14:07:23.749 INFO:tasks.workunit.client.1.vm04.stdout:1/123: rmdir d3/d5/d1e 39
2026-03-10T14:07:23.749 INFO:tasks.workunit.client.1.vm04.stdout:1/124: fdatasync d3/fa 0
2026-03-10T14:07:23.753 INFO:tasks.workunit.client.1.vm04.stdout:2/110: creat d0/d3/f24 x:0 0 0
2026-03-10T14:07:23.755 INFO:tasks.workunit.client.1.vm04.stdout:4/132: creat d4/df/f2e x:0 0 0
2026-03-10T14:07:23.755 INFO:tasks.workunit.client.1.vm04.stdout:8/130: rmdir d0/d3 39
2026-03-10T14:07:23.757 INFO:tasks.workunit.client.1.vm04.stdout:2/111: dwrite d0/d3/f1d [0,4194304] 0
2026-03-10T14:07:23.761 INFO:tasks.workunit.client.1.vm04.stdout:0/83: dwrite d0/d2/f13 [0,4194304] 0
2026-03-10T14:07:23.768 INFO:tasks.workunit.client.1.vm04.stdout:0/84: write d0/d2/fd [4553128,64613] 0
2026-03-10T14:07:23.775 INFO:tasks.workunit.client.1.vm04.stdout:4/133: dwrite d4/df/f2e [0,4194304] 0
2026-03-10T14:07:23.777 INFO:tasks.workunit.client.1.vm04.stdout:2/112: dwrite d0/d3/f24 [0,4194304] 0
2026-03-10T14:07:23.789 INFO:tasks.workunit.client.1.vm04.stdout:5/137: creat d7/f24 x:0 0 0
2026-03-10T14:07:23.800 INFO:tasks.workunit.client.1.vm04.stdout:5/138: dwrite d7/d9/fd [4194304,4194304] 0
2026-03-10T14:07:23.800 INFO:tasks.workunit.client.1.vm04.stdout:4/134: dwrite d4/f6 [0,4194304] 0
2026-03-10T14:07:23.805 INFO:tasks.workunit.client.1.vm04.stdout:8/131: sync
2026-03-10T14:07:23.813 INFO:tasks.workunit.client.1.vm04.stdout:2/113: mkdir d0/d14/d1b/d25 0
2026-03-10T14:07:23.813 INFO:tasks.workunit.client.1.vm04.stdout:2/114: write d0/d3/d8/f6 [4605417,48037] 0
2026-03-10T14:07:23.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:23 vm04.local ceph-mon[55966]: pgmap v140: 65 pgs: 65 active+clean; 206 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.9 MiB/s rd, 9.6 MiB/s wr, 346 op/s
2026-03-10T14:07:23.814 INFO:tasks.workunit.client.1.vm04.stdout:1/125: link d3/d22/l25 d3/l26 0
2026-03-10T14:07:23.814 INFO:tasks.workunit.client.1.vm04.stdout:0/85: mkdir d0/d1c 0
2026-03-10T14:07:23.818 INFO:tasks.workunit.client.1.vm04.stdout:0/86: dread d0/d2/f9 [0,4194304] 0
2026-03-10T14:07:23.819 INFO:tasks.workunit.client.1.vm04.stdout:0/87: write d0/d2/fd [976703,38566] 0
2026-03-10T14:07:23.825 INFO:tasks.workunit.client.1.vm04.stdout:5/139: dwrite d7/f13 [0,4194304] 0
2026-03-10T14:07:23.836 INFO:tasks.workunit.client.1.vm04.stdout:0/88: dwrite d0/d2/f12 [0,4194304] 0
2026-03-10T14:07:23.836 INFO:tasks.workunit.client.1.vm04.stdout:8/132: dread d0/d3/dd/fc [0,4194304] 0
2026-03-10T14:07:23.840 INFO:tasks.workunit.client.1.vm04.stdout:8/133: dread d0/d3/dd/d11/d12/f1f [0,4194304] 0
2026-03-10T14:07:23.844 INFO:tasks.workunit.client.1.vm04.stdout:2/115: dwrite d0/d3/f11 [0,4194304] 0
2026-03-10T14:07:23.855 INFO:tasks.workunit.client.1.vm04.stdout:8/134: dwrite d0/d3/dd/d11/d12/f20 [0,4194304] 0
2026-03-10T14:07:23.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:23 vm03.local ceph-mon[49718]: pgmap v140: 65 pgs: 65 active+clean; 206 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.9 MiB/s rd, 9.6 MiB/s wr, 346 op/s
2026-03-10T14:07:23.859 INFO:tasks.workunit.client.1.vm04.stdout:4/135: dread d4/df/f11 [0,4194304] 0
2026-03-10T14:07:23.862 INFO:tasks.workunit.client.1.vm04.stdout:5/140: dread - d7/f11 zero size
2026-03-10T14:07:23.863 INFO:tasks.workunit.client.1.vm04.stdout:5/141: write d7/f24 [565210,88633] 0
2026-03-10T14:07:23.864 INFO:tasks.workunit.client.1.vm04.stdout:4/136: dread d4/df/d10/f1f [0,4194304] 0
2026-03-10T14:07:23.866 INFO:tasks.workunit.client.1.vm04.stdout:7/93: rmdir d2/dc/de/d11 39
2026-03-10T14:07:23.866 INFO:tasks.workunit.client.1.vm04.stdout:1/126: creat d3/d20/f27 x:0 0 0
2026-03-10T14:07:23.868 INFO:tasks.workunit.client.1.vm04.stdout:2/116: mkdir d0/d3/d8/dd/d26 0
2026-03-10T14:07:23.877 INFO:tasks.workunit.client.1.vm04.stdout:7/94: symlink d2/l1d 0
2026-03-10T14:07:23.878 INFO:tasks.workunit.client.1.vm04.stdout:7/95: truncate d2/d14/f17 347590 0
2026-03-10T14:07:23.878 INFO:tasks.workunit.client.1.vm04.stdout:7/96: truncate d2/f10 875154 0
2026-03-10T14:07:23.880 INFO:tasks.workunit.client.1.vm04.stdout:5/142: sync
2026-03-10T14:07:23.881 INFO:tasks.workunit.client.1.vm04.stdout:4/137: rename d4/df/f11 to d4/d14/d1b/f2f 0
2026-03-10T14:07:23.886 INFO:tasks.workunit.client.1.vm04.stdout:2/117: symlink d0/d3/l27 0
2026-03-10T14:07:23.888 INFO:tasks.workunit.client.1.vm04.stdout:5/143: dwrite d7/f24 [0,4194304] 0
2026-03-10T14:07:23.890 INFO:tasks.workunit.client.1.vm04.stdout:4/138: dwrite d4/fa [0,4194304] 0
2026-03-10T14:07:23.891 INFO:tasks.workunit.client.1.vm04.stdout:2/118: dread d0/d3/f11 [0,4194304] 0
2026-03-10T14:07:23.900 INFO:tasks.workunit.client.1.vm04.stdout:1/127: rename d3/d5/d1e/f21 to d3/d5/d13/d1a/f28 0
2026-03-10T14:07:23.903 INFO:tasks.workunit.client.1.vm04.stdout:5/144: dwrite d7/fa [0,4194304] 0
2026-03-10T14:07:23.903 INFO:tasks.workunit.client.1.vm04.stdout:1/128: dread d3/fa [0,4194304] 0
2026-03-10T14:07:23.907 INFO:tasks.workunit.client.1.vm04.stdout:5/145: stat d7/d9/f20 0
2026-03-10T14:07:23.907 INFO:tasks.workunit.client.1.vm04.stdout:5/146: read d7/fa [1692171,22136] 0
2026-03-10T14:07:23.914 INFO:tasks.workunit.client.1.vm04.stdout:4/139: creat d4/d14/d1b/f30 x:0 0 0
2026-03-10T14:07:23.917 INFO:tasks.workunit.client.1.vm04.stdout:1/129: mkdir d3/d5/d13/d1a/d29 0
2026-03-10T14:07:23.917 INFO:tasks.workunit.client.1.vm04.stdout:1/130: truncate f2 5104612 0
2026-03-10T14:07:23.924 INFO:tasks.workunit.client.1.vm04.stdout:5/147: unlink d7/d12/l14 0
2026-03-10T14:07:23.925 INFO:tasks.workunit.client.1.vm04.stdout:5/148: write d7/f24 [3205793,11935] 0
2026-03-10T14:07:23.929 INFO:tasks.workunit.client.1.vm04.stdout:7/97: link d2/dc/de/f12 d2/dc/de/f1e 0
2026-03-10T14:07:23.930 INFO:tasks.workunit.client.1.vm04.stdout:1/131: read d3/fc [1003597,52728] 0
2026-03-10T14:07:23.934 INFO:tasks.workunit.client.1.vm04.stdout:4/140: mkdir d4/df/d31 0
2026-03-10T14:07:23.935 INFO:tasks.workunit.client.1.vm04.stdout:5/149: write d7/f11 [104490,11889] 0
2026-03-10T14:07:23.937 INFO:tasks.workunit.client.1.vm04.stdout:1/132: unlink d3/c7 0
2026-03-10T14:07:23.938 INFO:tasks.workunit.client.1.vm04.stdout:4/141: mknod d4/d14/d1b/c32 0
2026-03-10T14:07:23.938 INFO:tasks.workunit.client.1.vm04.stdout:7/98: sync
2026-03-10T14:07:23.939 INFO:tasks.workunit.client.1.vm04.stdout:4/142: write d4/d14/f27 [846688,101093] 0
2026-03-10T14:07:23.945 INFO:tasks.workunit.client.1.vm04.stdout:1/133: mknod d3/c2a 0
2026-03-10T14:07:23.949 INFO:tasks.workunit.client.1.vm04.stdout:7/99: creat d2/dc/f1f x:0 0 0
2026-03-10T14:07:23.955 INFO:tasks.workunit.client.1.vm04.stdout:7/100: dwrite d2/f10 [0,4194304] 0
2026-03-10T14:07:23.963 INFO:tasks.workunit.client.1.vm04.stdout:5/150: rename d7/d9/cf to d7/c25 0
2026-03-10T14:07:23.970 INFO:tasks.workunit.client.1.vm04.stdout:7/101: creat d2/f20 x:0 0 0
2026-03-10T14:07:23.970 INFO:tasks.workunit.client.1.vm04.stdout:7/102: dwrite d2/dc/fd [0,4194304] 0
2026-03-10T14:07:23.976 INFO:tasks.workunit.client.1.vm04.stdout:5/151: mkdir d7/d26 0
2026-03-10T14:07:23.976 INFO:tasks.workunit.client.1.vm04.stdout:7/103: fsync d2/dc/de/d11/f1a 0
2026-03-10T14:07:23.979 INFO:tasks.workunit.client.1.vm04.stdout:5/152: creat d7/d12/f27 x:0 0 0
2026-03-10T14:07:23.979 INFO:tasks.workunit.client.1.vm04.stdout:5/153: fdatasync d7/d12/f27 0
2026-03-10T14:07:23.980 INFO:tasks.workunit.client.1.vm04.stdout:7/104: mkdir d2/dc/de/d21 0
2026-03-10T14:07:23.984 INFO:tasks.workunit.client.1.vm04.stdout:7/105: creat d2/dc/de/f22 x:0 0 0
2026-03-10T14:07:23.985 INFO:tasks.workunit.client.1.vm04.stdout:7/106: mknod d2/d14/c23 0
2026-03-10T14:07:23.986 INFO:tasks.workunit.client.1.vm04.stdout:7/107: unlink d2/d14/f17 0
2026-03-10T14:07:24.000 INFO:tasks.workunit.client.1.vm04.stdout:9/143: getdents d9/da 0
2026-03-10T14:07:24.007 INFO:tasks.workunit.client.1.vm04.stdout:9/144: dwrite d9/da/dd/d1c/f27 [0,4194304] 0
2026-03-10T14:07:24.012 INFO:tasks.workunit.client.1.vm04.stdout:5/154: sync
2026-03-10T14:07:24.020 INFO:tasks.workunit.client.1.vm04.stdout:6/100: truncate d3/f9 2103766 0
2026-03-10T14:07:24.020 INFO:tasks.workunit.client.1.vm04.stdout:3/118: write da/fe [123825,54677] 0
2026-03-10T14:07:24.022 INFO:tasks.workunit.client.1.vm04.stdout:1/134: getdents d3/d22 0
2026-03-10T14:07:24.022 INFO:tasks.workunit.client.1.vm04.stdout:3/119: write da/fe [1945539,117528] 0
2026-03-10T14:07:24.023 INFO:tasks.workunit.client.1.vm04.stdout:3/120: fdatasync da/dc/f1d 0
2026-03-10T14:07:24.023 INFO:tasks.workunit.client.1.vm04.stdout:1/135: write d3/d5/d13/d1a/f24 [837309,55036] 0
2026-03-10T14:07:24.025 INFO:tasks.workunit.client.1.vm04.stdout:9/145: mknod d9/da/dd/d1c/c2d 0
2026-03-10T14:07:24.026 INFO:tasks.workunit.client.1.vm04.stdout:5/155: creat d7/d9/f28 x:0 0 0
2026-03-10T14:07:24.027 INFO:tasks.workunit.client.1.vm04.stdout:9/146: dread d9/da/f13 [0,4194304] 0
2026-03-10T14:07:24.028 INFO:tasks.workunit.client.1.vm04.stdout:9/147: chown d9/da/dd/d1c/c2d 518510393 1
2026-03-10T14:07:24.041 INFO:tasks.workunit.client.1.vm04.stdout:3/121: unlink da/dc/l1f 0
2026-03-10T14:07:24.047 INFO:tasks.workunit.client.1.vm04.stdout:3/122: chown da/l27 5 1
2026-03-10T14:07:24.047 INFO:tasks.workunit.client.1.vm04.stdout:3/123: chown f8 7352 1
2026-03-10T14:07:24.049 INFO:tasks.workunit.client.1.vm04.stdout:1/136: creat d3/d22/f2b x:0 0 0
2026-03-10T14:07:24.051 INFO:tasks.workunit.client.1.vm04.stdout:9/148: truncate d9/da/dd/fe 2024372 0
2026-03-10T14:07:24.053 INFO:tasks.workunit.client.1.vm04.stdout:5/156: symlink d7/l29 0
2026-03-10T14:07:24.057 INFO:tasks.workunit.client.1.vm04.stdout:5/157: dwrite d7/d12/f27 [0,4194304] 0
2026-03-10T14:07:24.062 INFO:tasks.workunit.client.1.vm04.stdout:1/137: rename d3/f19 to d3/f2c 0
2026-03-10T14:07:24.063 INFO:tasks.workunit.client.1.vm04.stdout:9/149: creat d9/da/dd/d1c/f2e x:0 0 0
2026-03-10T14:07:24.066 INFO:tasks.workunit.client.1.vm04.stdout:5/158: dwrite d7/f1d [0,4194304] 0
2026-03-10T14:07:24.069 INFO:tasks.workunit.client.1.vm04.stdout:5/159: write d7/d9/fd [5456576,126688] 0
2026-03-10T14:07:24.075 INFO:tasks.workunit.client.1.vm04.stdout:9/150: dwrite d9/da/dd/f19 [0,4194304] 0
2026-03-10T14:07:24.076 INFO:tasks.workunit.client.1.vm04.stdout:9/151: chown d9/da/f13 201 1
2026-03-10T14:07:24.077 INFO:tasks.workunit.client.1.vm04.stdout:5/160: dwrite d7/d12/f27 [0,4194304] 0
2026-03-10T14:07:24.080 INFO:tasks.workunit.client.1.vm04.stdout:5/161: write d7/f13 [3229138,78274] 0
2026-03-10T14:07:24.083 INFO:tasks.workunit.client.1.vm04.stdout:2/119: dread d0/d3/f9 [0,4194304] 0
2026-03-10T14:07:24.094 INFO:tasks.workunit.client.1.vm04.stdout:9/152: mknod d9/da/c2f 0
2026-03-10T14:07:24.094 INFO:tasks.workunit.client.1.vm04.stdout:9/153: creat d9/da/dd/d1c/f30 x:0 0 0
2026-03-10T14:07:24.094 INFO:tasks.workunit.client.1.vm04.stdout:9/154: write d9/d1d/f1f [306131,73000] 0
2026-03-10T14:07:24.104 INFO:tasks.workunit.client.1.vm04.stdout:3/124: sync
2026-03-10T14:07:24.104 INFO:tasks.workunit.client.1.vm04.stdout:1/138: sync
2026-03-10T14:07:24.104 INFO:tasks.workunit.client.1.vm04.stdout:2/120: sync
2026-03-10T14:07:24.105 INFO:tasks.workunit.client.1.vm04.stdout:3/125: write da/f22 [822571,122227] 0
2026-03-10T14:07:24.105 INFO:tasks.workunit.client.1.vm04.stdout:3/126: dread - da/dc/f2c zero size
2026-03-10T14:07:24.106 INFO:tasks.workunit.client.1.vm04.stdout:3/127: chown da/c20 159592136 1
2026-03-10T14:07:24.112 INFO:tasks.workunit.client.1.vm04.stdout:1/139: creat d3/d5/d1e/f2d x:0 0 0
2026-03-10T14:07:24.115 INFO:tasks.workunit.client.1.vm04.stdout:3/128: dwrite da/f25 [0,4194304] 0
2026-03-10T14:07:24.128 INFO:tasks.workunit.client.1.vm04.stdout:3/129: write da/f10 [6701797,78280] 0
2026-03-10T14:07:24.128 INFO:tasks.workunit.client.1.vm04.stdout:2/121: rename d0/d3/d8/fc to d0/f28 0
2026-03-10T14:07:24.128 INFO:tasks.workunit.client.1.vm04.stdout:0/89: fsync d0/d2/fd 0
2026-03-10T14:07:24.128 INFO:tasks.workunit.client.1.vm04.stdout:0/90: chown d0/d2/fa 6 1
2026-03-10T14:07:24.128 INFO:tasks.workunit.client.1.vm04.stdout:0/91: dread - d0/f14 zero size
2026-03-10T14:07:24.128 INFO:tasks.workunit.client.1.vm04.stdout:2/122: dread d0/d3/f11 [0,4194304] 0
2026-03-10T14:07:24.135 INFO:tasks.workunit.client.1.vm04.stdout:3/130: creat da/dc/f2f x:0 0 0
2026-03-10T14:07:24.136 INFO:tasks.workunit.client.1.vm04.stdout:3/131: truncate da/f15 540293 0
2026-03-10T14:07:24.140 INFO:tasks.workunit.client.1.vm04.stdout:1/140: getdents d3/d5/d13/d1a/d29 0
2026-03-10T14:07:24.150 INFO:tasks.workunit.client.1.vm04.stdout:2/123: rmdir d0/d3/d8/dd 39
2026-03-10T14:07:24.150 INFO:tasks.workunit.client.1.vm04.stdout:2/124: read d0/d3/f9 [107706,52166] 0
2026-03-10T14:07:24.150 INFO:tasks.workunit.client.1.vm04.stdout:2/125: readlink d0/d3/l27 0
2026-03-10T14:07:24.150 INFO:tasks.workunit.client.1.vm04.stdout:3/132: mkdir da/d30 0 2026-03-10T14:07:24.155 INFO:tasks.workunit.client.1.vm04.stdout:1/141: mknod d3/c2e 0 2026-03-10T14:07:24.155 INFO:tasks.workunit.client.1.vm04.stdout:2/126: creat d0/d14/d1b/f29 x:0 0 0 2026-03-10T14:07:24.159 INFO:tasks.workunit.client.1.vm04.stdout:1/142: dwrite d3/f8 [0,4194304] 0 2026-03-10T14:07:24.161 INFO:tasks.workunit.client.1.vm04.stdout:1/143: readlink d3/d5/l11 0 2026-03-10T14:07:24.164 INFO:tasks.workunit.client.1.vm04.stdout:1/144: write d3/fa [353498,123892] 0 2026-03-10T14:07:24.166 INFO:tasks.workunit.client.1.vm04.stdout:3/133: rename da/c26 to da/d30/c31 0 2026-03-10T14:07:24.167 INFO:tasks.workunit.client.1.vm04.stdout:8/135: dread d0/d3/d5/f15 [0,4194304] 0 2026-03-10T14:07:24.169 INFO:tasks.workunit.client.1.vm04.stdout:8/136: chown d0/d3/dd/d11/d12/f1d 510 1 2026-03-10T14:07:24.170 INFO:tasks.workunit.client.1.vm04.stdout:0/92: getdents d0/d2 0 2026-03-10T14:07:24.170 INFO:tasks.workunit.client.1.vm04.stdout:8/137: read d0/f1b [345286,75453] 0 2026-03-10T14:07:24.171 INFO:tasks.workunit.client.1.vm04.stdout:1/145: dread f2 [0,4194304] 0 2026-03-10T14:07:24.177 INFO:tasks.workunit.client.1.vm04.stdout:3/134: creat da/d30/f32 x:0 0 0 2026-03-10T14:07:24.178 INFO:tasks.workunit.client.1.vm04.stdout:0/93: write d0/d2/d15/f16 [2397610,5126] 0 2026-03-10T14:07:24.180 INFO:tasks.workunit.client.1.vm04.stdout:1/146: mkdir d3/d22/d2f 0 2026-03-10T14:07:24.184 INFO:tasks.workunit.client.1.vm04.stdout:7/108: truncate d2/dc/de/f12 4185419 0 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:0/94: creat d0/d2/f1d x:0 0 0 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:0/95: rename d0 to d0/d2/d15/d1e 22 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:0/96: write d0/d2/d15/f16 [349271,94647] 0 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:0/97: fdatasync d0/d2/fa 0 2026-03-10T14:07:24.200 
INFO:tasks.workunit.client.1.vm04.stdout:2/127: creat d0/d3/d8/dd/d26/f2a x:0 0 0 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:2/128: truncate d0/d14/d1b/f29 351268 0 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:7/109: creat d2/dc/f24 x:0 0 0 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:7/110: dread - d2/dc/f1f zero size 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:4/143: write d4/df/d22/f2b [576710,101440] 0 2026-03-10T14:07:24.200 INFO:tasks.workunit.client.1.vm04.stdout:2/129: symlink d0/l2b 0 2026-03-10T14:07:24.205 INFO:tasks.workunit.client.1.vm04.stdout:9/155: read d9/d1d/f1f [462409,39752] 0 2026-03-10T14:07:24.216 INFO:tasks.workunit.client.1.vm04.stdout:3/135: truncate da/f19 4150839 0 2026-03-10T14:07:24.225 INFO:tasks.workunit.client.1.vm04.stdout:3/136: dwrite da/f22 [0,4194304] 0 2026-03-10T14:07:24.225 INFO:tasks.workunit.client.1.vm04.stdout:6/101: unlink d3/fb 0 2026-03-10T14:07:24.227 INFO:tasks.workunit.client.1.vm04.stdout:6/102: dread d3/d8/fc [0,4194304] 0 2026-03-10T14:07:24.233 INFO:tasks.workunit.client.1.vm04.stdout:8/138: getdents d0/d3/d5 0 2026-03-10T14:07:24.235 INFO:tasks.workunit.client.1.vm04.stdout:4/144: rmdir d4/df/d10 39 2026-03-10T14:07:24.235 INFO:tasks.workunit.client.1.vm04.stdout:4/145: write d4/f21 [316808,59293] 0 2026-03-10T14:07:24.236 INFO:tasks.workunit.client.1.vm04.stdout:4/146: write d4/df/d22/f2b [343720,29646] 0 2026-03-10T14:07:24.239 INFO:tasks.workunit.client.1.vm04.stdout:2/130: symlink d0/d3/l2c 0 2026-03-10T14:07:24.240 INFO:tasks.workunit.client.1.vm04.stdout:2/131: fdatasync d0/d14/d1b/f29 0 2026-03-10T14:07:24.244 INFO:tasks.workunit.client.1.vm04.stdout:5/162: stat d7/c25 0 2026-03-10T14:07:24.250 INFO:tasks.workunit.client.1.vm04.stdout:9/156: creat d9/da/dd/f31 x:0 0 0 2026-03-10T14:07:24.254 INFO:tasks.workunit.client.1.vm04.stdout:4/147: dread d4/f2c [0,4194304] 0 2026-03-10T14:07:24.257 
INFO:tasks.workunit.client.1.vm04.stdout:1/147: fsync d3/d5/d13/d1a/f24 0 2026-03-10T14:07:24.267 INFO:tasks.workunit.client.1.vm04.stdout:6/103: readlink d3/d14/l16 0 2026-03-10T14:07:24.277 INFO:tasks.workunit.client.1.vm04.stdout:5/163: symlink d7/d12/l2a 0 2026-03-10T14:07:24.279 INFO:tasks.workunit.client.1.vm04.stdout:2/132: dwrite d0/d3/f11 [4194304,4194304] 0 2026-03-10T14:07:24.280 INFO:tasks.workunit.client.1.vm04.stdout:2/133: stat d0/l2b 0 2026-03-10T14:07:24.285 INFO:tasks.workunit.client.1.vm04.stdout:0/98: fsync d0/d2/f9 0 2026-03-10T14:07:24.285 INFO:tasks.workunit.client.1.vm04.stdout:0/99: chown d0/f19 60179794 1 2026-03-10T14:07:24.307 INFO:tasks.workunit.client.1.vm04.stdout:3/137: rename da/dc/l18 to da/d30/l33 0 2026-03-10T14:07:24.307 INFO:tasks.workunit.client.1.vm04.stdout:3/138: write f4 [6969864,97122] 0 2026-03-10T14:07:24.314 INFO:tasks.workunit.client.1.vm04.stdout:5/164: chown d7/c16 129220885 1 2026-03-10T14:07:24.322 INFO:tasks.workunit.client.1.vm04.stdout:4/148: symlink d4/df/d10/l33 0 2026-03-10T14:07:24.324 INFO:tasks.workunit.client.1.vm04.stdout:3/139: symlink da/dc/l34 0 2026-03-10T14:07:24.327 INFO:tasks.workunit.client.1.vm04.stdout:5/165: mkdir d7/d12/d2b 0 2026-03-10T14:07:24.330 INFO:tasks.workunit.client.1.vm04.stdout:3/140: dwrite da/dc/f1d [0,4194304] 0 2026-03-10T14:07:24.342 INFO:tasks.workunit.client.1.vm04.stdout:5/166: dread d7/fa [0,4194304] 0 2026-03-10T14:07:24.342 INFO:tasks.workunit.client.1.vm04.stdout:3/141: dread da/f14 [0,4194304] 0 2026-03-10T14:07:24.342 INFO:tasks.workunit.client.1.vm04.stdout:2/134: link d0/d3/f1d d0/d14/d1b/d25/f2d 0 2026-03-10T14:07:24.349 INFO:tasks.workunit.client.1.vm04.stdout:7/111: fsync d2/dc/de/f12 0 2026-03-10T14:07:24.360 INFO:tasks.workunit.client.1.vm04.stdout:4/149: sync 2026-03-10T14:07:24.368 INFO:tasks.workunit.client.1.vm04.stdout:9/157: link d9/da/c2f d9/da/dd/d1c/c32 0 2026-03-10T14:07:24.369 INFO:tasks.workunit.client.1.vm04.stdout:9/158: truncate d9/da/dd/f28 
720212 0 2026-03-10T14:07:24.379 INFO:tasks.workunit.client.1.vm04.stdout:2/135: rename d0/d3/d8/dd/ce to d0/d14/d1b/c2e 0 2026-03-10T14:07:24.379 INFO:tasks.workunit.client.1.vm04.stdout:7/112: creat d2/dc/f25 x:0 0 0 2026-03-10T14:07:24.380 INFO:tasks.workunit.client.1.vm04.stdout:2/136: write d0/d3/d8/dd/d26/f2a [425835,4633] 0 2026-03-10T14:07:24.383 INFO:tasks.workunit.client.1.vm04.stdout:5/167: mknod d7/d26/c2c 0 2026-03-10T14:07:24.384 INFO:tasks.workunit.client.1.vm04.stdout:5/168: write d7/d9/fd [5656786,94764] 0 2026-03-10T14:07:24.394 INFO:tasks.workunit.client.1.vm04.stdout:9/159: mkdir d9/d33 0 2026-03-10T14:07:24.398 INFO:tasks.workunit.client.1.vm04.stdout:0/100: getdents d0 0 2026-03-10T14:07:24.398 INFO:tasks.workunit.client.1.vm04.stdout:0/101: chown d0/f14 1 1 2026-03-10T14:07:24.420 INFO:tasks.workunit.client.1.vm04.stdout:0/102: mkdir d0/d2/d15/d1f 0 2026-03-10T14:07:24.420 INFO:tasks.workunit.client.1.vm04.stdout:0/103: write d0/f1b [465363,107853] 0 2026-03-10T14:07:24.426 INFO:tasks.workunit.client.1.vm04.stdout:4/150: getdents d4/d14/d1b 0 2026-03-10T14:07:24.439 INFO:tasks.workunit.client.1.vm04.stdout:9/160: creat d9/d33/f34 x:0 0 0 2026-03-10T14:07:24.448 INFO:tasks.workunit.client.1.vm04.stdout:0/104: symlink d0/d2/d15/l20 0 2026-03-10T14:07:24.455 INFO:tasks.workunit.client.1.vm04.stdout:2/137: getdents d0/d3/d8 0 2026-03-10T14:07:24.471 INFO:tasks.workunit.client.1.vm04.stdout:9/161: rename d9/da/dd/d1c/c32 to d9/da/dd/c35 0 2026-03-10T14:07:24.472 INFO:tasks.workunit.client.1.vm04.stdout:8/139: write d0/f1b [793209,97636] 0 2026-03-10T14:07:24.479 INFO:tasks.workunit.client.1.vm04.stdout:0/105: rename d0/d2/c11 to d0/d2/d15/d1f/c21 0 2026-03-10T14:07:24.482 INFO:tasks.workunit.client.1.vm04.stdout:9/162: symlink d9/d1d/l36 0 2026-03-10T14:07:24.486 INFO:tasks.workunit.client.1.vm04.stdout:0/106: mkdir d0/d2/d15/d22 0 2026-03-10T14:07:24.489 INFO:tasks.workunit.client.1.vm04.stdout:9/163: read - d9/da/f1b zero size 
2026-03-10T14:07:24.490 INFO:tasks.workunit.client.1.vm04.stdout:9/164: truncate d9/da/f1b 904607 0 2026-03-10T14:07:24.493 INFO:tasks.workunit.client.1.vm04.stdout:0/107: mknod d0/c23 0 2026-03-10T14:07:24.495 INFO:tasks.workunit.client.1.vm04.stdout:8/140: creat d0/f26 x:0 0 0 2026-03-10T14:07:24.497 INFO:tasks.workunit.client.1.vm04.stdout:3/142: rmdir da/d30 39 2026-03-10T14:07:24.503 INFO:tasks.workunit.client.1.vm04.stdout:9/165: symlink d9/da/l37 0 2026-03-10T14:07:24.504 INFO:tasks.workunit.client.1.vm04.stdout:9/166: readlink d9/da/dd/l17 0 2026-03-10T14:07:24.505 INFO:tasks.workunit.client.1.vm04.stdout:1/148: truncate d3/f8 87589 0 2026-03-10T14:07:24.509 INFO:tasks.workunit.client.1.vm04.stdout:9/167: dwrite d9/d33/f34 [0,4194304] 0 2026-03-10T14:07:24.510 INFO:tasks.workunit.client.1.vm04.stdout:1/149: dwrite d3/d20/f27 [0,4194304] 0 2026-03-10T14:07:24.514 INFO:tasks.workunit.client.1.vm04.stdout:8/141: mknod d0/d3/dd/c27 0 2026-03-10T14:07:24.516 INFO:tasks.workunit.client.1.vm04.stdout:8/142: dread - d0/f23 zero size 2026-03-10T14:07:24.517 INFO:tasks.workunit.client.1.vm04.stdout:8/143: write d0/f26 [236112,128631] 0 2026-03-10T14:07:24.522 INFO:tasks.workunit.client.1.vm04.stdout:1/150: symlink d3/d5/d13/d1a/l30 0 2026-03-10T14:07:24.523 INFO:tasks.workunit.client.1.vm04.stdout:1/151: readlink d3/d5/d13/d1a/l30 0 2026-03-10T14:07:24.527 INFO:tasks.workunit.client.1.vm04.stdout:4/151: rename d4/df/d10 to d4/df/d34 0 2026-03-10T14:07:24.539 INFO:tasks.workunit.client.1.vm04.stdout:6/104: truncate d3/de/f10 549145 0 2026-03-10T14:07:24.539 INFO:tasks.workunit.client.1.vm04.stdout:6/105: readlink d3/d14/l16 0 2026-03-10T14:07:24.554 INFO:tasks.workunit.client.1.vm04.stdout:8/144: rename d0/d3/c10 to d0/d3/c28 0 2026-03-10T14:07:24.554 INFO:tasks.workunit.client.1.vm04.stdout:4/152: mknod d4/d14/d1b/c35 0 2026-03-10T14:07:24.558 INFO:tasks.workunit.client.1.vm04.stdout:8/145: dwrite d0/f26 [0,4194304] 0 2026-03-10T14:07:24.561 
INFO:tasks.workunit.client.1.vm04.stdout:8/146: dread d0/d3/dd/d11/d12/f1f [0,4194304] 0 2026-03-10T14:07:24.565 INFO:tasks.workunit.client.1.vm04.stdout:8/147: dwrite d0/d3/dd/d11/d12/f1f [0,4194304] 0 2026-03-10T14:07:24.572 INFO:tasks.workunit.client.1.vm04.stdout:6/106: creat d3/de/d11/f17 x:0 0 0 2026-03-10T14:07:24.577 INFO:tasks.workunit.client.1.vm04.stdout:6/107: dwrite d3/d8/fa [0,4194304] 0 2026-03-10T14:07:24.577 INFO:tasks.workunit.client.1.vm04.stdout:6/108: read - d3/de/d11/f17 zero size 2026-03-10T14:07:24.598 INFO:tasks.workunit.client.1.vm04.stdout:8/148: truncate d0/d3/dd/fc 4392108 0 2026-03-10T14:07:24.598 INFO:tasks.workunit.client.1.vm04.stdout:8/149: truncate d0/f23 961393 0 2026-03-10T14:07:24.605 INFO:tasks.workunit.client.1.vm04.stdout:1/152: link d3/c1c d3/d5/d13/d1a/c31 0 2026-03-10T14:07:24.618 INFO:tasks.workunit.client.1.vm04.stdout:6/109: creat d3/d14/f18 x:0 0 0 2026-03-10T14:07:24.624 INFO:tasks.workunit.client.1.vm04.stdout:1/153: getdents d3/d5 0 2026-03-10T14:07:24.624 INFO:tasks.workunit.client.1.vm04.stdout:1/154: fdatasync f1 0 2026-03-10T14:07:24.625 INFO:tasks.workunit.client.1.vm04.stdout:1/155: stat d3/d5/d13/d1a 0 2026-03-10T14:07:24.636 INFO:tasks.workunit.client.1.vm04.stdout:1/156: creat d3/d20/f32 x:0 0 0 2026-03-10T14:07:24.645 INFO:tasks.workunit.client.1.vm04.stdout:1/157: rename d3/d5/d13/d1a/d29 to d3/d5/d13/d1a/d33 0 2026-03-10T14:07:24.654 INFO:tasks.workunit.client.1.vm04.stdout:1/158: creat d3/d22/d2f/f34 x:0 0 0 2026-03-10T14:07:24.656 INFO:tasks.workunit.client.1.vm04.stdout:8/150: fdatasync d0/f1b 0 2026-03-10T14:07:24.662 INFO:tasks.workunit.client.1.vm04.stdout:8/151: mkdir d0/d3/dd/d11/d29 0 2026-03-10T14:07:24.674 INFO:tasks.workunit.client.1.vm04.stdout:8/152: mknod d0/d3/dd/d11/d29/c2a 0 2026-03-10T14:07:24.681 INFO:tasks.workunit.client.1.vm04.stdout:8/153: creat d0/d3/dd/d11/d29/f2b x:0 0 0 2026-03-10T14:07:24.684 INFO:tasks.workunit.client.1.vm04.stdout:8/154: dread d0/f26 [0,4194304] 0 
2026-03-10T14:07:24.684 INFO:tasks.workunit.client.1.vm04.stdout:8/155: write d0/f26 [675600,105866] 0 2026-03-10T14:07:24.689 INFO:tasks.workunit.client.1.vm04.stdout:8/156: dwrite d0/d3/dd/d11/d12/f20 [4194304,4194304] 0 2026-03-10T14:07:24.703 INFO:tasks.workunit.client.1.vm04.stdout:8/157: creat d0/d3/dd/d11/d12/f2c x:0 0 0 2026-03-10T14:07:24.712 INFO:tasks.workunit.client.1.vm04.stdout:8/158: symlink d0/d3/d5/l2d 0 2026-03-10T14:07:24.716 INFO:tasks.workunit.client.1.vm04.stdout:8/159: getdents d0/d3/d24 0 2026-03-10T14:07:24.722 INFO:tasks.workunit.client.1.vm04.stdout:8/160: creat d0/d3/dd/d11/d12/f2e x:0 0 0 2026-03-10T14:07:24.732 INFO:tasks.workunit.client.1.vm04.stdout:8/161: symlink d0/d3/dd/d11/d29/l2f 0 2026-03-10T14:07:24.733 INFO:tasks.workunit.client.1.vm04.stdout:8/162: chown d0/c19 8192544 1 2026-03-10T14:07:24.736 INFO:tasks.workunit.client.1.vm04.stdout:8/163: creat d0/d3/d5/f30 x:0 0 0 2026-03-10T14:07:24.742 INFO:tasks.workunit.client.1.vm04.stdout:8/164: stat d0/d3/dd/fc 0 2026-03-10T14:07:24.743 INFO:tasks.workunit.client.1.vm04.stdout:8/165: chown d0/d3/d5/l2d 132848 1 2026-03-10T14:07:24.743 INFO:tasks.workunit.client.1.vm04.stdout:8/166: read d0/d3/d5/f15 [664035,102511] 0 2026-03-10T14:07:24.876 INFO:tasks.workunit.client.1.vm04.stdout:7/113: getdents d2/dc 0 2026-03-10T14:07:24.877 INFO:tasks.workunit.client.1.vm04.stdout:7/114: rename d2/dc/f24 to d2/dc/f26 0 2026-03-10T14:07:24.881 INFO:tasks.workunit.client.1.vm04.stdout:7/115: getdents d2/dc/de/d11 0 2026-03-10T14:07:24.900 INFO:tasks.workunit.client.1.vm04.stdout:5/169: truncate d7/f13 142640 0 2026-03-10T14:07:24.901 INFO:tasks.workunit.client.1.vm04.stdout:5/170: mkdir d7/d2d 0 2026-03-10T14:07:24.902 INFO:tasks.workunit.client.1.vm04.stdout:5/171: creat d7/d9/f2e x:0 0 0 2026-03-10T14:07:24.902 INFO:tasks.workunit.client.1.vm04.stdout:5/172: write d7/d9/fd [1445429,97122] 0 2026-03-10T14:07:24.909 INFO:tasks.workunit.client.1.vm04.stdout:2/138: truncate d0/d3/f24 2676822 0 
2026-03-10T14:07:24.909 INFO:tasks.workunit.client.1.vm04.stdout:2/139: dread - d0/d3/d8/d17/f1f zero size 2026-03-10T14:07:24.925 INFO:tasks.workunit.client.1.vm04.stdout:0/108: dwrite d0/d2/f17 [0,4194304] 0 2026-03-10T14:07:24.936 INFO:tasks.workunit.client.1.vm04.stdout:3/143: write da/f22 [4860080,16995] 0 2026-03-10T14:07:24.938 INFO:tasks.workunit.client.1.vm04.stdout:0/109: chown d0/f1b 21259 1 2026-03-10T14:07:24.942 INFO:tasks.workunit.client.1.vm04.stdout:3/144: mkdir da/dc/d35 0 2026-03-10T14:07:24.947 INFO:tasks.workunit.client.1.vm04.stdout:3/145: rename da/dc/l34 to da/dc/l36 0 2026-03-10T14:07:24.956 INFO:tasks.workunit.client.1.vm04.stdout:3/146: read - da/dc/f2c zero size 2026-03-10T14:07:24.956 INFO:tasks.workunit.client.1.vm04.stdout:9/168: truncate d9/da/dd/f28 708165 0 2026-03-10T14:07:24.956 INFO:tasks.workunit.client.1.vm04.stdout:9/169: dread d9/da/dd/f19 [0,4194304] 0 2026-03-10T14:07:24.956 INFO:tasks.workunit.client.1.vm04.stdout:9/170: dwrite d9/d1d/f23 [0,4194304] 0 2026-03-10T14:07:24.959 INFO:tasks.workunit.client.1.vm04.stdout:0/110: creat d0/d1c/f24 x:0 0 0 2026-03-10T14:07:24.962 INFO:tasks.workunit.client.1.vm04.stdout:3/147: mkdir da/dc/d35/d37 0 2026-03-10T14:07:24.962 INFO:tasks.workunit.client.1.vm04.stdout:0/111: dread d0/d2/fa [0,4194304] 0 2026-03-10T14:07:24.971 INFO:tasks.workunit.client.1.vm04.stdout:9/171: fdatasync d9/da/f1e 0 2026-03-10T14:07:24.972 INFO:tasks.workunit.client.1.vm04.stdout:9/172: fdatasync d9/d33/f34 0 2026-03-10T14:07:24.980 INFO:tasks.workunit.client.1.vm04.stdout:9/173: fsync d9/d1d/f1f 0 2026-03-10T14:07:24.980 INFO:tasks.workunit.client.1.vm04.stdout:9/174: write d9/da/dd/d1c/f22 [241592,4796] 0 2026-03-10T14:07:24.987 INFO:tasks.workunit.client.1.vm04.stdout:9/175: mknod d9/da/dd/c38 0 2026-03-10T14:07:24.989 INFO:tasks.workunit.client.1.vm04.stdout:3/148: dread da/f19 [0,4194304] 0 2026-03-10T14:07:24.992 INFO:tasks.workunit.client.1.vm04.stdout:9/176: dwrite d9/da/f1e [0,4194304] 0 
2026-03-10T14:07:24.996 INFO:tasks.workunit.client.1.vm04.stdout:3/149: unlink da/f15 0 2026-03-10T14:07:25.002 INFO:tasks.workunit.client.1.vm04.stdout:4/153: truncate d4/df/d34/f1f 1851227 0 2026-03-10T14:07:25.003 INFO:tasks.workunit.client.1.vm04.stdout:4/154: write d4/df/f18 [238684,95176] 0 2026-03-10T14:07:25.005 INFO:tasks.workunit.client.1.vm04.stdout:4/155: dread d4/d14/d1b/f2f [0,4194304] 0 2026-03-10T14:07:25.009 INFO:tasks.workunit.client.1.vm04.stdout:8/167: write d0/d3/dd/d11/d12/f1f [4469309,75764] 0 2026-03-10T14:07:25.025 INFO:tasks.workunit.client.1.vm04.stdout:1/159: rename d3/d5/d13/d1a/d33 to d3/d5/d1e/d35 0 2026-03-10T14:07:25.029 INFO:tasks.workunit.client.1.vm04.stdout:8/168: symlink d0/d3/dd/d11/d12/l31 0 2026-03-10T14:07:25.030 INFO:tasks.workunit.client.1.vm04.stdout:8/169: dread - d0/d3/d5/f30 zero size 2026-03-10T14:07:25.031 INFO:tasks.workunit.client.1.vm04.stdout:3/150: truncate da/f28 1507139 0 2026-03-10T14:07:25.033 INFO:tasks.workunit.client.1.vm04.stdout:7/116: rename d2/dc/de/d11/f16 to d2/d14/f27 0 2026-03-10T14:07:25.035 INFO:tasks.workunit.client.1.vm04.stdout:1/160: creat d3/d5/d13/f36 x:0 0 0 2026-03-10T14:07:25.037 INFO:tasks.workunit.client.1.vm04.stdout:8/170: symlink d0/d3/dd/d11/d29/l32 0 2026-03-10T14:07:25.037 INFO:tasks.workunit.client.1.vm04.stdout:8/171: readlink d0/d3/dd/d11/d29/l32 0 2026-03-10T14:07:25.039 INFO:tasks.workunit.client.1.vm04.stdout:0/112: rename d0/d2/d15/d1f to d0/d2/d25 0 2026-03-10T14:07:25.040 INFO:tasks.workunit.client.1.vm04.stdout:0/113: fdatasync d0/d2/fa 0 2026-03-10T14:07:25.040 INFO:tasks.workunit.client.1.vm04.stdout:0/114: chown d0/d2/f9 394449 1 2026-03-10T14:07:25.041 INFO:tasks.workunit.client.1.vm04.stdout:7/117: write d2/d14/f27 [66812,361] 0 2026-03-10T14:07:25.042 INFO:tasks.workunit.client.1.vm04.stdout:7/118: dread - d2/dc/de/d11/f19 zero size 2026-03-10T14:07:25.044 INFO:tasks.workunit.client.1.vm04.stdout:1/161: symlink d3/d22/l37 0 2026-03-10T14:07:25.044 
INFO:tasks.workunit.client.1.vm04.stdout:1/162: chown d3/d20 7951031 1 2026-03-10T14:07:25.045 INFO:tasks.workunit.client.1.vm04.stdout:1/163: write d3/d20/f32 [414573,81546] 0 2026-03-10T14:07:25.046 INFO:tasks.workunit.client.1.vm04.stdout:3/151: mknod da/dc/d35/d37/c38 0 2026-03-10T14:07:25.047 INFO:tasks.workunit.client.1.vm04.stdout:0/115: mknod d0/d2/c26 0 2026-03-10T14:07:25.048 INFO:tasks.workunit.client.1.vm04.stdout:0/116: write d0/d2/f9 [3592171,15776] 0 2026-03-10T14:07:25.051 INFO:tasks.workunit.client.1.vm04.stdout:0/117: dread d0/d2/fd [0,4194304] 0 2026-03-10T14:07:25.052 INFO:tasks.workunit.client.1.vm04.stdout:7/119: mkdir d2/d28 0 2026-03-10T14:07:25.053 INFO:tasks.workunit.client.1.vm04.stdout:1/164: mkdir d3/d5/d13/d38 0 2026-03-10T14:07:25.054 INFO:tasks.workunit.client.1.vm04.stdout:3/152: creat da/dc/f39 x:0 0 0 2026-03-10T14:07:25.057 INFO:tasks.workunit.client.1.vm04.stdout:8/172: rename d0/d3/d24 to d0/d3/dd/d33 0 2026-03-10T14:07:25.058 INFO:tasks.workunit.client.1.vm04.stdout:5/173: dread f4 [0,4194304] 0 2026-03-10T14:07:25.064 INFO:tasks.workunit.client.1.vm04.stdout:0/118: getdents d0/d2/d15/d22 0 2026-03-10T14:07:25.064 INFO:tasks.workunit.client.1.vm04.stdout:0/119: write d0/d2/f9 [1988012,41757] 0 2026-03-10T14:07:25.071 INFO:tasks.workunit.client.1.vm04.stdout:2/140: dwrite d0/d14/d1b/d25/f2d [0,4194304] 0 2026-03-10T14:07:25.081 INFO:tasks.workunit.client.1.vm04.stdout:1/165: link f1 d3/d22/d2f/f39 0 2026-03-10T14:07:25.082 INFO:tasks.workunit.client.1.vm04.stdout:1/166: dread - d3/d5/d13/f23 zero size 2026-03-10T14:07:25.082 INFO:tasks.workunit.client.1.vm04.stdout:1/167: readlink d3/l4 0 2026-03-10T14:07:25.083 INFO:tasks.workunit.client.1.vm04.stdout:1/168: truncate d3/d5/d13/f23 46911 0 2026-03-10T14:07:25.085 INFO:tasks.workunit.client.1.vm04.stdout:8/173: symlink d0/d3/dd/d11/d12/l34 0 2026-03-10T14:07:25.087 INFO:tasks.workunit.client.1.vm04.stdout:0/120: rename d0/d2/d15/l1a to d0/d2/d25/l27 0 2026-03-10T14:07:25.096 
INFO:tasks.workunit.client.1.vm04.stdout:2/141: fsync d0/d3/f24 0 2026-03-10T14:07:25.096 INFO:tasks.workunit.client.1.vm04.stdout:1/169: creat d3/d22/d2f/f3a x:0 0 0 2026-03-10T14:07:25.096 INFO:tasks.workunit.client.1.vm04.stdout:8/174: read d0/d3/dd/fc [4323638,5487] 0 2026-03-10T14:07:25.096 INFO:tasks.workunit.client.1.vm04.stdout:5/174: symlink d7/d2d/l2f 0 2026-03-10T14:07:25.100 INFO:tasks.workunit.client.1.vm04.stdout:0/121: dwrite d0/f14 [0,4194304] 0 2026-03-10T14:07:25.101 INFO:tasks.workunit.client.1.vm04.stdout:0/122: write d0/f14 [2502013,126569] 0 2026-03-10T14:07:25.101 INFO:tasks.workunit.client.1.vm04.stdout:0/123: stat d0/d2/d15/f16 0 2026-03-10T14:07:25.104 INFO:tasks.workunit.client.1.vm04.stdout:7/120: getdents d2/dc 0 2026-03-10T14:07:25.104 INFO:tasks.workunit.client.1.vm04.stdout:4/156: rmdir d4/d2a 0 2026-03-10T14:07:25.104 INFO:tasks.workunit.client.1.vm04.stdout:6/110: creat d3/f19 x:0 0 0 2026-03-10T14:07:25.105 INFO:tasks.workunit.client.1.vm04.stdout:4/157: fsync d4/f6 0 2026-03-10T14:07:25.106 INFO:tasks.workunit.client.1.vm04.stdout:6/111: fdatasync d3/d14/f18 0 2026-03-10T14:07:25.106 INFO:tasks.workunit.client.1.vm04.stdout:9/177: write d9/da/dd/fe [2057713,50095] 0 2026-03-10T14:07:25.108 INFO:tasks.workunit.client.1.vm04.stdout:6/112: write d3/de/d11/f17 [1014007,82396] 0 2026-03-10T14:07:25.119 INFO:tasks.workunit.client.1.vm04.stdout:8/175: rename d0/d3/dd/d11/d12/f20 to d0/d3/f35 0 2026-03-10T14:07:25.125 INFO:tasks.workunit.client.1.vm04.stdout:0/124: dwrite d0/d2/f1d [0,4194304] 0 2026-03-10T14:07:25.125 INFO:tasks.workunit.client.1.vm04.stdout:5/175: creat d7/d26/f30 x:0 0 0 2026-03-10T14:07:25.125 INFO:tasks.workunit.client.1.vm04.stdout:8/176: fdatasync d0/d3/dd/d11/d12/f2e 0 2026-03-10T14:07:25.127 INFO:tasks.workunit.client.1.vm04.stdout:4/158: symlink d4/d14/d1b/l36 0 2026-03-10T14:07:25.127 INFO:tasks.workunit.client.1.vm04.stdout:1/170: dwrite d3/d5/d13/d1a/f28 [0,4194304] 0 2026-03-10T14:07:25.128 
INFO:tasks.workunit.client.1.vm04.stdout:6/113: fdatasync d3/d8/fc 0 2026-03-10T14:07:25.139 INFO:tasks.workunit.client.1.vm04.stdout:7/121: rename d2/dc/de/d11/f1a to d2/d28/f29 0 2026-03-10T14:07:25.143 INFO:tasks.workunit.client.1.vm04.stdout:0/125: dread d0/f14 [0,4194304] 0 2026-03-10T14:07:25.146 INFO:tasks.workunit.client.1.vm04.stdout:5/176: symlink d7/d26/l31 0 2026-03-10T14:07:25.147 INFO:tasks.workunit.client.1.vm04.stdout:5/177: stat d7/d9/c23 0 2026-03-10T14:07:25.147 INFO:tasks.workunit.client.1.vm04.stdout:0/126: read d0/d2/d15/f16 [3855425,12085] 0 2026-03-10T14:07:25.148 INFO:tasks.workunit.client.1.vm04.stdout:0/127: dread d0/d2/fa [0,4194304] 0 2026-03-10T14:07:25.151 INFO:tasks.workunit.client.1.vm04.stdout:0/128: stat d0/d2/c26 0 2026-03-10T14:07:25.152 INFO:tasks.workunit.client.1.vm04.stdout:0/129: fdatasync d0/d1c/f24 0 2026-03-10T14:07:25.159 INFO:tasks.workunit.client.1.vm04.stdout:8/177: rename d0/d3/dd/d11/d29/c2a to d0/d3/dd/d11/d29/c36 0 2026-03-10T14:07:25.160 INFO:tasks.workunit.client.1.vm04.stdout:5/178: dwrite d7/d9/fd [4194304,4194304] 0 2026-03-10T14:07:25.161 INFO:tasks.workunit.client.1.vm04.stdout:8/178: dread - d0/d3/dd/d11/d29/f2b zero size 2026-03-10T14:07:25.166 INFO:tasks.workunit.client.1.vm04.stdout:5/179: chown d7/l29 447020 1 2026-03-10T14:07:25.168 INFO:tasks.workunit.client.1.vm04.stdout:6/114: dread d3/f4 [0,4194304] 0 2026-03-10T14:07:25.181 INFO:tasks.workunit.client.1.vm04.stdout:6/115: truncate d3/de/f13 267145 0 2026-03-10T14:07:25.181 INFO:tasks.workunit.client.1.vm04.stdout:6/116: chown d3/de/d11/l12 20232247 1 2026-03-10T14:07:25.181 INFO:tasks.workunit.client.1.vm04.stdout:8/179: dwrite d0/f1b [0,4194304] 0 2026-03-10T14:07:25.181 INFO:tasks.workunit.client.1.vm04.stdout:3/153: truncate da/f22 1017172 0 2026-03-10T14:07:25.185 INFO:tasks.workunit.client.1.vm04.stdout:3/154: write da/dc/f39 [550725,110996] 0 2026-03-10T14:07:25.186 INFO:tasks.workunit.client.1.vm04.stdout:3/155: dread - da/dc/f2a zero size 
2026-03-10T14:07:25.187 INFO:tasks.workunit.client.1.vm04.stdout:9/178: rename d9/f15 to d9/da/dd/d1c/f39 0 2026-03-10T14:07:25.187 INFO:tasks.workunit.client.1.vm04.stdout:9/179: stat d9/da/l37 0 2026-03-10T14:07:25.189 INFO:tasks.workunit.client.1.vm04.stdout:7/122: mkdir d2/d2a 0 2026-03-10T14:07:25.190 INFO:tasks.workunit.client.1.vm04.stdout:2/142: getdents d0/d3/d8/dd 0 2026-03-10T14:07:25.193 INFO:tasks.workunit.client.1.vm04.stdout:2/143: dwrite d0/d3/f24 [0,4194304] 0 2026-03-10T14:07:25.193 INFO:tasks.workunit.client.1.vm04.stdout:5/180: mkdir d7/d2d/d32 0 2026-03-10T14:07:25.193 INFO:tasks.workunit.client.1.vm04.stdout:8/180: mknod d0/d3/d5/c37 0 2026-03-10T14:07:25.205 INFO:tasks.workunit.client.1.vm04.stdout:1/171: rename d3/d5/d13/f23 to d3/d22/f3b 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:1/172: dread d3/d22/f3b [0,4194304] 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:0/130: creat d0/d2/d15/d22/f28 x:0 0 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:1/173: write d3/d22/d2f/f3a [956866,39298] 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:2/144: mkdir d0/d3/d8/d2f 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:5/181: mknod d7/d26/c33 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:7/123: rename d2/l6 to d2/d28/l2b 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:0/131: fdatasync d0/f19 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:2/145: dwrite d0/d14/d1b/d25/f2d [0,4194304] 0 2026-03-10T14:07:25.228 INFO:tasks.workunit.client.1.vm04.stdout:1/174: creat d3/d22/d2f/f3c x:0 0 0 2026-03-10T14:07:25.238 INFO:tasks.workunit.client.1.vm04.stdout:4/159: sync 2026-03-10T14:07:25.240 INFO:tasks.workunit.client.1.vm04.stdout:5/182: write d7/f13 [992256,12037] 0 2026-03-10T14:07:25.244 INFO:tasks.workunit.client.1.vm04.stdout:5/183: write d7/d12/f27 [1142276,58839] 0 2026-03-10T14:07:25.245 
INFO:tasks.workunit.client.1.vm04.stdout:3/156: rename da/dc/l13 to da/d30/l3a 0
2026-03-10T14:07:25.246 INFO:tasks.workunit.client.1.vm04.stdout:3/157: write da/dc/f39 [1480512,77385] 0
2026-03-10T14:07:25.252 INFO:tasks.workunit.client.1.vm04.stdout:7/124: mknod d2/dc/c2c 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:5/184: dwrite d7/f13 [0,4194304] 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:7/125: truncate f0 4302284 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:5/185: readlink d7/l29 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:0/132: creat d0/d1c/f29 x:0 0 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:0/133: readlink d0/d2/d15/l20 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:0/134: stat d0/d2/d15 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:2/146: creat d0/d3/d8/f30 x:0 0 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:7/126: dwrite d2/d14/f27 [0,4194304] 0
2026-03-10T14:07:25.270 INFO:tasks.workunit.client.1.vm04.stdout:6/117: getdents d3/de/d11 0
2026-03-10T14:07:25.273 INFO:tasks.workunit.client.1.vm04.stdout:4/160: sync
2026-03-10T14:07:25.285 INFO:tasks.workunit.client.1.vm04.stdout:2/147: symlink d0/d14/d1b/l31 0
2026-03-10T14:07:25.287 INFO:tasks.workunit.client.1.vm04.stdout:1/175: getdents d3/d5/d13/d38 0
2026-03-10T14:07:25.293 INFO:tasks.workunit.client.1.vm04.stdout:3/158: rename da/d30/c31 to da/dc/d35/d37/c3b 0
2026-03-10T14:07:25.314 INFO:tasks.workunit.client.1.vm04.stdout:2/148: creat d0/d14/d1b/f32 x:0 0 0
2026-03-10T14:07:25.314 INFO:tasks.workunit.client.1.vm04.stdout:1/176: dread d3/f18 [0,4194304] 0
2026-03-10T14:07:25.314 INFO:tasks.workunit.client.1.vm04.stdout:3/159: fdatasync da/f14 0
2026-03-10T14:07:25.314 INFO:tasks.workunit.client.1.vm04.stdout:1/177: dwrite d3/fa [4194304,4194304] 0
2026-03-10T14:07:25.314 INFO:tasks.workunit.client.1.vm04.stdout:4/161: sync
2026-03-10T14:07:25.314 INFO:tasks.workunit.client.1.vm04.stdout:2/149: creat d0/d3/d8/dd/d26/f33 x:0 0 0
2026-03-10T14:07:25.314 INFO:tasks.workunit.client.1.vm04.stdout:1/178: fsync f1 0
2026-03-10T14:07:25.314 INFO:tasks.workunit.client.1.vm04.stdout:1/179: stat d3/d5/d1e/d35 0
2026-03-10T14:07:25.322 INFO:tasks.workunit.client.1.vm04.stdout:4/162: creat d4/df/d22/f37 x:0 0 0
2026-03-10T14:07:25.322 INFO:tasks.workunit.client.1.vm04.stdout:4/163: stat d4/df/l1c 0
2026-03-10T14:07:25.325 INFO:tasks.workunit.client.1.vm04.stdout:0/135: dwrite d0/d2/f1d [4194304,4194304] 0
2026-03-10T14:07:25.325 INFO:tasks.workunit.client.1.vm04.stdout:0/136: readlink d0/d2/d15/l20 0
2026-03-10T14:07:25.333 INFO:tasks.workunit.client.1.vm04.stdout:2/150: mknod d0/d3/c34 0
2026-03-10T14:07:25.334 INFO:tasks.workunit.client.1.vm04.stdout:6/118: getdents d3/d14 0
2026-03-10T14:07:25.338 INFO:tasks.workunit.client.1.vm04.stdout:3/160: mknod da/c3c 0
2026-03-10T14:07:25.343 INFO:tasks.workunit.client.1.vm04.stdout:0/137: rmdir d0/d2/d25 39
2026-03-10T14:07:25.345 INFO:tasks.workunit.client.1.vm04.stdout:6/119: sync
2026-03-10T14:07:25.352 INFO:tasks.workunit.client.1.vm04.stdout:2/151: mkdir d0/d3/d8/d17/d35 0
2026-03-10T14:07:25.356 INFO:tasks.workunit.client.1.vm04.stdout:2/152: dwrite d0/d14/d1b/f29 [0,4194304] 0
2026-03-10T14:07:25.358 INFO:tasks.workunit.client.1.vm04.stdout:1/180: creat d3/d5/d13/d38/f3d x:0 0 0
2026-03-10T14:07:25.375 INFO:tasks.workunit.client.1.vm04.stdout:6/120: creat d3/f1a x:0 0 0
2026-03-10T14:07:25.375 INFO:tasks.workunit.client.1.vm04.stdout:1/181: symlink d3/d5/d13/d38/l3e 0
2026-03-10T14:07:25.375 INFO:tasks.workunit.client.1.vm04.stdout:6/121: chown d3/f19 4104 1
2026-03-10T14:07:25.375 INFO:tasks.workunit.client.1.vm04.stdout:2/153: creat d0/d3/d8/dd/d26/f36 x:0 0 0
2026-03-10T14:07:25.377 INFO:tasks.workunit.client.1.vm04.stdout:2/154: dread - d0/d3/d8/dd/d26/f33 zero size
2026-03-10T14:07:25.390 INFO:tasks.workunit.client.1.vm04.stdout:0/138: rename d0/d2/d15/f16 to d0/d2/d25/f2a 0
2026-03-10T14:07:25.395 INFO:tasks.workunit.client.1.vm04.stdout:6/122: mkdir d3/d1b 0
2026-03-10T14:07:25.399 INFO:tasks.workunit.client.1.vm04.stdout:1/182: symlink d3/d5/d13/l3f 0
2026-03-10T14:07:25.406 INFO:tasks.workunit.client.1.vm04.stdout:0/139: unlink d0/d2/d15/c18 0
2026-03-10T14:07:25.406 INFO:tasks.workunit.client.1.vm04.stdout:0/140: chown d0/f14 179 1
2026-03-10T14:07:25.406 INFO:tasks.workunit.client.1.vm04.stdout:0/141: write d0/d2/f12 [1806764,124749] 0
2026-03-10T14:07:25.411 INFO:tasks.workunit.client.1.vm04.stdout:1/183: rename d3/c10 to d3/d5/c40 0
2026-03-10T14:07:25.411 INFO:tasks.workunit.client.1.vm04.stdout:6/123: write d3/f4 [4113002,94177] 0
2026-03-10T14:07:25.412 INFO:tasks.workunit.client.1.vm04.stdout:6/124: chown d3/d14/f18 8707 1
2026-03-10T14:07:25.419 INFO:tasks.workunit.client.1.vm04.stdout:0/142: creat d0/d1c/f2b x:0 0 0
2026-03-10T14:07:25.428 INFO:tasks.workunit.client.1.vm04.stdout:0/143: chown d0/d2/d25/c21 330152 1
2026-03-10T14:07:25.428 INFO:tasks.workunit.client.1.vm04.stdout:0/144: fdatasync d0/d2/f13 0
2026-03-10T14:07:25.435 INFO:tasks.workunit.client.1.vm04.stdout:5/186: rmdir d7 39
2026-03-10T14:07:25.436 INFO:tasks.workunit.client.1.vm04.stdout:9/180: write d9/da/f13 [904367,51938] 0
2026-03-10T14:07:25.444 INFO:tasks.workunit.client.1.vm04.stdout:8/181: truncate d0/d3/dd/d11/d12/f1f 1255307 0
2026-03-10T14:07:25.447 INFO:tasks.workunit.client.1.vm04.stdout:2/155: getdents d0/d3/d8 0
2026-03-10T14:07:25.448 INFO:tasks.workunit.client.1.vm04.stdout:8/182: dread d0/f23 [0,4194304] 0
2026-03-10T14:07:25.448 INFO:tasks.workunit.client.1.vm04.stdout:8/183: read d0/d3/dd/fc [3426709,70008] 0
2026-03-10T14:07:25.453 INFO:tasks.workunit.client.1.vm04.stdout:3/161: write da/f14 [995655,53556] 0
2026-03-10T14:07:25.457 INFO:tasks.workunit.client.1.vm04.stdout:4/164: getdents d4/df/d22 0
2026-03-10T14:07:25.459 INFO:tasks.workunit.client.1.vm04.stdout:5/187: dread - d7/d9/f20 zero size
2026-03-10T14:07:25.465 INFO:tasks.workunit.client.1.vm04.stdout:0/145: link d0/d2/f13 d0/d1c/f2c 0
2026-03-10T14:07:25.470 INFO:tasks.workunit.client.1.vm04.stdout:7/127: dread - d2/dc/f26 zero size
2026-03-10T14:07:25.472 INFO:tasks.workunit.client.1.vm04.stdout:2/156: write d0/d3/f9 [2240765,92675] 0
2026-03-10T14:07:25.473 INFO:tasks.workunit.client.1.vm04.stdout:2/157: dread - d0/d3/d8/f30 zero size
2026-03-10T14:07:25.476 INFO:tasks.workunit.client.1.vm04.stdout:2/158: dwrite d0/d3/d8/dd/d26/f2a [0,4194304] 0
2026-03-10T14:07:25.481 INFO:tasks.workunit.client.1.vm04.stdout:8/184: write d0/d3/d5/f15 [735278,8452] 0
2026-03-10T14:07:25.483 INFO:tasks.workunit.client.1.vm04.stdout:3/162: mknod da/dc/d35/c3d 0
2026-03-10T14:07:25.483 INFO:tasks.workunit.client.1.vm04.stdout:3/163: stat f8 0
2026-03-10T14:07:25.496 INFO:tasks.workunit.client.1.vm04.stdout:9/181: rename d9/da/dd/c35 to d9/d33/c3a 0
2026-03-10T14:07:25.504 INFO:tasks.workunit.client.1.vm04.stdout:6/125: getdents d3 0
2026-03-10T14:07:25.505 INFO:tasks.workunit.client.1.vm04.stdout:6/126: chown d3/de/d11/l15 130936 1
2026-03-10T14:07:25.514 INFO:tasks.workunit.client.1.vm04.stdout:2/159: mkdir d0/d3/d8/dd/d37 0
2026-03-10T14:07:25.517 INFO:tasks.workunit.client.1.vm04.stdout:2/160: dread d0/d3/f24 [0,4194304] 0
2026-03-10T14:07:25.520 INFO:tasks.workunit.client.1.vm04.stdout:2/161: dwrite d0/d14/d1b/f32 [0,4194304] 0
2026-03-10T14:07:25.523 INFO:tasks.workunit.client.1.vm04.stdout:2/162: dwrite d0/d3/f1d [0,4194304] 0
2026-03-10T14:07:25.541 INFO:tasks.workunit.client.1.vm04.stdout:4/165: link d4/f6 d4/d14/f38 0
2026-03-10T14:07:25.547 INFO:tasks.workunit.client.1.vm04.stdout:5/188: mkdir d7/d2d/d32/d34 0
2026-03-10T14:07:25.547 INFO:tasks.workunit.client.1.vm04.stdout:1/184: truncate d3/d22/d2f/f3a 5921 0
2026-03-10T14:07:25.556 INFO:tasks.workunit.client.1.vm04.stdout:9/182: creat d9/da/dd/d1c/f3b x:0 0 0
2026-03-10T14:07:25.558 INFO:tasks.workunit.client.1.vm04.stdout:6/127: rmdir d3/d8 39
2026-03-10T14:07:25.568 INFO:tasks.workunit.client.1.vm04.stdout:6/128: dread d3/de/d11/f17 [0,4194304] 0
2026-03-10T14:07:25.597 INFO:tasks.workunit.client.1.vm04.stdout:4/166: fsync d4/d14/f38 0
2026-03-10T14:07:25.601 INFO:tasks.workunit.client.1.vm04.stdout:5/189: unlink d7/d9/f2e 0
2026-03-10T14:07:25.601 INFO:tasks.workunit.client.1.vm04.stdout:5/190: truncate d7/f24 4501662 0
2026-03-10T14:07:25.601 INFO:tasks.workunit.client.1.vm04.stdout:5/191: stat d7/f21 0
2026-03-10T14:07:25.602 INFO:tasks.workunit.client.1.vm04.stdout:5/192: write f4 [3102327,66810] 0
2026-03-10T14:07:25.611 INFO:tasks.workunit.client.1.vm04.stdout:7/128: mkdir d2/dc/de/d2d 0
2026-03-10T14:07:25.624 INFO:tasks.workunit.client.1.vm04.stdout:4/167: mknod d4/d14/c39 0
2026-03-10T14:07:25.632 INFO:tasks.workunit.client.1.vm04.stdout:1/185: write d3/d5/d13/d1a/f28 [4372806,87451] 0
2026-03-10T14:07:25.637 INFO:tasks.workunit.client.1.vm04.stdout:7/129: sync
2026-03-10T14:07:25.638 INFO:tasks.workunit.client.1.vm04.stdout:7/130: write d2/dc/de/d11/f19 [226322,87106] 0
2026-03-10T14:07:25.641 INFO:tasks.workunit.client.1.vm04.stdout:0/146: rename d0/d2/d15/d22/f28 to d0/d2/f2d 0
2026-03-10T14:07:25.643 INFO:tasks.workunit.client.1.vm04.stdout:9/183: symlink d9/l3c 0
2026-03-10T14:07:25.647 INFO:tasks.workunit.client.1.vm04.stdout:9/184: dwrite d9/da/f1e [0,4194304] 0
2026-03-10T14:07:25.651 INFO:tasks.workunit.client.1.vm04.stdout:2/163: truncate d0/d14/d1b/f29 2170974 0
2026-03-10T14:07:25.652 INFO:tasks.workunit.client.1.vm04.stdout:2/164: write d0/d3/f1d [3858850,14655] 0
2026-03-10T14:07:25.653 INFO:tasks.workunit.client.1.vm04.stdout:2/165: chown d0/d3/la 630 1
2026-03-10T14:07:25.654 INFO:tasks.workunit.client.1.vm04.stdout:3/164: dwrite da/f22 [0,4194304] 0
2026-03-10T14:07:25.655 INFO:tasks.workunit.client.1.vm04.stdout:3/165: stat f4 0
2026-03-10T14:07:25.658 INFO:tasks.workunit.client.1.vm04.stdout:8/185: getdents d0/d3/dd 0
2026-03-10T14:07:25.661 INFO:tasks.workunit.client.1.vm04.stdout:1/186: mknod d3/d22/c41 0
2026-03-10T14:07:25.663 INFO:tasks.workunit.client.1.vm04.stdout:3/166: dwrite da/dc/f2f [0,4194304] 0
2026-03-10T14:07:25.677 INFO:tasks.workunit.client.1.vm04.stdout:7/131: creat d2/dc/de/d11/f2e x:0 0 0
2026-03-10T14:07:25.687 INFO:tasks.workunit.client.1.vm04.stdout:6/129: rename d3/f1a to d3/d14/f1c 0
2026-03-10T14:07:25.687 INFO:tasks.workunit.client.1.vm04.stdout:4/168: rename d4/df to d4/df/d31/d3a 22
2026-03-10T14:07:25.693 INFO:tasks.workunit.client.1.vm04.stdout:0/147: creat d0/d1c/f2e x:0 0 0
2026-03-10T14:07:25.696 INFO:tasks.workunit.client.1.vm04.stdout:9/185: creat d9/d1d/f3d x:0 0 0
2026-03-10T14:07:25.713 INFO:tasks.workunit.client.1.vm04.stdout:3/167: chown da/dc/lf 27624163 1
2026-03-10T14:07:25.716 INFO:tasks.workunit.client.1.vm04.stdout:6/130: stat d3/f9 0
2026-03-10T14:07:25.716 INFO:tasks.workunit.client.1.vm04.stdout:6/131: chown d3/f4 3343905 1
2026-03-10T14:07:25.722 INFO:tasks.workunit.client.1.vm04.stdout:4/169: rename f2 to d4/d14/f3b 0
2026-03-10T14:07:25.723 INFO:tasks.workunit.client.1.vm04.stdout:0/148: creat d0/d2/d15/f2f x:0 0 0
2026-03-10T14:07:25.728 INFO:tasks.workunit.client.1.vm04.stdout:2/166: unlink d0/d3/c1a 0
2026-03-10T14:07:25.728 INFO:tasks.workunit.client.1.vm04.stdout:2/167: fdatasync d0/d3/f1d 0
2026-03-10T14:07:25.733 INFO:tasks.workunit.client.1.vm04.stdout:5/193: link d7/l1e d7/d9/l35 0
2026-03-10T14:07:25.739 INFO:tasks.workunit.client.1.vm04.stdout:8/186: dwrite d0/f23 [0,4194304] 0
2026-03-10T14:07:25.740 INFO:tasks.workunit.client.1.vm04.stdout:2/168: sync
2026-03-10T14:07:25.754 INFO:tasks.workunit.client.1.vm04.stdout:7/132: mknod d2/dc/de/d21/c2f 0
2026-03-10T14:07:25.755 INFO:tasks.workunit.client.1.vm04.stdout:0/149: creat d0/d2/d15/d22/f30 x:0 0 0
2026-03-10T14:07:25.758 INFO:tasks.workunit.client.1.vm04.stdout:0/150: dwrite d0/d2/d15/d22/f30 [0,4194304] 0
2026-03-10T14:07:25.764 INFO:tasks.workunit.client.1.vm04.stdout:0/151: dwrite d0/d1c/f2b [0,4194304] 0
2026-03-10T14:07:25.765 INFO:tasks.workunit.client.1.vm04.stdout:0/152: chown d0/d2/fd 7 1
2026-03-10T14:07:25.777 INFO:tasks.workunit.client.1.vm04.stdout:8/187: unlink d0/d3/dd/d11/d12/l34 0
2026-03-10T14:07:25.781 INFO:tasks.workunit.client.1.vm04.stdout:8/188: dwrite d0/d3/dd/d11/d12/f2c [0,4194304] 0
2026-03-10T14:07:25.784 INFO:tasks.workunit.client.1.vm04.stdout:2/169: unlink d0/d14/d1b/l31 0
2026-03-10T14:07:25.786 INFO:tasks.workunit.client.1.vm04.stdout:3/168: truncate da/fd 2148736 0
2026-03-10T14:07:25.787 INFO:tasks.workunit.client.1.vm04.stdout:3/169: stat da/d30/f32 0
2026-03-10T14:07:25.788 INFO:tasks.workunit.client.1.vm04.stdout:6/132: getdents d3/de/d11 0
2026-03-10T14:07:25.788 INFO:tasks.workunit.client.1.vm04.stdout:7/133: mknod d2/dc/de/d11/c30 0
2026-03-10T14:07:25.788 INFO:tasks.workunit.client.1.vm04.stdout:7/134: readlink d2/l1d 0
2026-03-10T14:07:25.788 INFO:tasks.workunit.client.1.vm04.stdout:7/135: fsync f0 0
2026-03-10T14:07:25.789 INFO:tasks.workunit.client.1.vm04.stdout:7/136: write d2/dc/de/d11/f2e [527657,56796] 0
2026-03-10T14:07:25.798 INFO:tasks.workunit.client.1.vm04.stdout:1/187: getdents d3/d5/d13/d1a 0
2026-03-10T14:07:25.806 INFO:tasks.workunit.client.1.vm04.stdout:8/189: mknod d0/d3/c38 0
2026-03-10T14:07:25.807 INFO:tasks.workunit.client.1.vm04.stdout:8/190: dread - d0/d3/d5/f30 zero size
2026-03-10T14:07:25.810 INFO:tasks.workunit.client.1.vm04.stdout:8/191: dwrite d0/d3/d5/f30 [0,4194304] 0
2026-03-10T14:07:25.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:25 vm04.local ceph-mon[55966]: pgmap v141: 65 pgs: 65 active+clean; 357 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 3.0 MiB/s rd, 30 MiB/s wr, 415 op/s
2026-03-10T14:07:25.832 INFO:tasks.workunit.client.1.vm04.stdout:0/153: creat d0/f31 x:0 0 0
2026-03-10T14:07:25.834 INFO:tasks.workunit.client.1.vm04.stdout:1/188: truncate d3/d5/fd 258988 0
2026-03-10T14:07:25.835 INFO:tasks.workunit.client.1.vm04.stdout:2/170: rmdir d0/d3/d8/d2f 0
2026-03-10T14:07:25.839 INFO:tasks.workunit.client.1.vm04.stdout:7/137: creat d2/dc/de/f31 x:0 0 0
2026-03-10T14:07:25.845 INFO:tasks.workunit.client.1.vm04.stdout:1/189: truncate d3/fc 1065557 0
2026-03-10T14:07:25.850 INFO:tasks.workunit.client.1.vm04.stdout:6/133: getdents d3/d14 0
2026-03-10T14:07:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:25 vm03.local ceph-mon[49718]: pgmap v141: 65 pgs: 65 active+clean; 357 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 3.0 MiB/s rd, 30 MiB/s wr, 415 op/s
2026-03-10T14:07:25.860 INFO:tasks.workunit.client.1.vm04.stdout:6/134: chown d3/d8/fd 43544 1
2026-03-10T14:07:25.861 INFO:tasks.workunit.client.1.vm04.stdout:7/138: mkdir d2/dc/d32 0
2026-03-10T14:07:25.868 INFO:tasks.workunit.client.1.vm04.stdout:6/135: mkdir d3/d1d 0
2026-03-10T14:07:25.875 INFO:tasks.workunit.client.1.vm04.stdout:7/139: symlink d2/dc/l33 0
2026-03-10T14:07:25.891 INFO:tasks.workunit.client.1.vm04.stdout:7/140: dread d2/dc/de/f12 [0,4194304] 0
2026-03-10T14:07:25.893 INFO:tasks.workunit.client.1.vm04.stdout:7/141: dread d2/dc/de/f1e [0,4194304] 0
2026-03-10T14:07:25.896 INFO:tasks.workunit.client.1.vm04.stdout:1/190: getdents d3/d5/d1e 0
2026-03-10T14:07:25.899 INFO:tasks.workunit.client.1.vm04.stdout:6/136: getdents d3/de/d11 0
2026-03-10T14:07:25.902 INFO:tasks.workunit.client.1.vm04.stdout:6/137: symlink d3/de/d11/l1e 0
2026-03-10T14:07:25.902 INFO:tasks.workunit.client.1.vm04.stdout:4/170: dwrite d4/f9 [4194304,4194304] 0
2026-03-10T14:07:25.903 INFO:tasks.workunit.client.1.vm04.stdout:7/142: mknod d2/dc/de/d2d/c34 0
2026-03-10T14:07:25.905 INFO:tasks.workunit.client.1.vm04.stdout:4/171: truncate d4/d14/f27 1950243 0
2026-03-10T14:07:25.905 INFO:tasks.workunit.client.1.vm04.stdout:7/143: stat d2/dc/de/d11 0
2026-03-10T14:07:25.907 INFO:tasks.workunit.client.1.vm04.stdout:6/138: dread d3/d8/fa [0,4194304] 0
2026-03-10T14:07:25.910 INFO:tasks.workunit.client.1.vm04.stdout:6/139: chown d3/de/d11/l15 971189 1
2026-03-10T14:07:25.910 INFO:tasks.workunit.client.1.vm04.stdout:6/140: chown d3/de 58793446 1
2026-03-10T14:07:25.911 INFO:tasks.workunit.client.1.vm04.stdout:6/141: chown d3/de/d11/f17 36972626 1
2026-03-10T14:07:25.920 INFO:tasks.workunit.client.1.vm04.stdout:9/186: truncate d9/da/dd/fe 1203767 0
2026-03-10T14:07:25.920 INFO:tasks.workunit.client.1.vm04.stdout:9/187: readlink d9/da/dd/l21 0
2026-03-10T14:07:25.921 INFO:tasks.workunit.client.1.vm04.stdout:9/188: write d9/da/dd/d1c/f2e [312400,120] 0
2026-03-10T14:07:25.923 INFO:tasks.workunit.client.1.vm04.stdout:2/171: truncate d0/d14/d1b/d25/f2d 2349044 0
2026-03-10T14:07:25.925 INFO:tasks.workunit.client.1.vm04.stdout:9/189: unlink d9/da/dd/l17 0
2026-03-10T14:07:25.930 INFO:tasks.workunit.client.1.vm04.stdout:2/172: rename d0/f28 to d0/d14/d1b/d25/f38 0
2026-03-10T14:07:25.931 INFO:tasks.workunit.client.1.vm04.stdout:4/172: sync
2026-03-10T14:07:25.931 INFO:tasks.workunit.client.1.vm04.stdout:7/144: sync
2026-03-10T14:07:25.931 INFO:tasks.workunit.client.1.vm04.stdout:6/142: sync
2026-03-10T14:07:25.931 INFO:tasks.workunit.client.1.vm04.stdout:6/143: write d3/f19 [912635,33812] 0
2026-03-10T14:07:25.940 INFO:tasks.workunit.client.1.vm04.stdout:2/173: mkdir d0/d14/d39 0
2026-03-10T14:07:25.945 INFO:tasks.workunit.client.1.vm04.stdout:2/174: mkdir d0/d3/d3a 0
2026-03-10T14:07:25.950 INFO:tasks.workunit.client.1.vm04.stdout:7/145: rename d2/dc/fd to d2/d2a/f35 0
2026-03-10T14:07:25.954 INFO:tasks.workunit.client.1.vm04.stdout:2/175: mknod d0/d3/d8/d17/c3b 0
2026-03-10T14:07:25.954 INFO:tasks.workunit.client.1.vm04.stdout:9/190: creat d9/f3e x:0 0 0
2026-03-10T14:07:25.959 INFO:tasks.workunit.client.1.vm04.stdout:7/146: dwrite d2/d2a/f35 [0,4194304] 0
2026-03-10T14:07:25.963 INFO:tasks.workunit.client.1.vm04.stdout:2/176: symlink d0/d3/d8/dd/d26/l3c 0
2026-03-10T14:07:25.966 INFO:tasks.workunit.client.1.vm04.stdout:9/191: mknod d9/da/c3f 0
2026-03-10T14:07:25.966 INFO:tasks.workunit.client.1.vm04.stdout:9/192: write d9/da/dd/d1c/f30 [112761,6225] 0
2026-03-10T14:07:25.972 INFO:tasks.workunit.client.1.vm04.stdout:2/177: truncate d0/d14/d1b/f29 2216755 0
2026-03-10T14:07:25.972 INFO:tasks.workunit.client.1.vm04.stdout:9/193: mknod d9/c40 0
2026-03-10T14:07:25.973 INFO:tasks.workunit.client.1.vm04.stdout:7/147: getdents d2/dc/de/d11 0
2026-03-10T14:07:25.973 INFO:tasks.workunit.client.1.vm04.stdout:7/148: stat d2/dc/de/d21/c2f 0
2026-03-10T14:07:25.978 INFO:tasks.workunit.client.1.vm04.stdout:7/149: mkdir d2/d14/d36 0
2026-03-10T14:07:25.978 INFO:tasks.workunit.client.1.vm04.stdout:7/150: read - d2/f20 zero size
2026-03-10T14:07:25.988 INFO:tasks.workunit.client.1.vm04.stdout:5/194: truncate d7/fa 3968194 0
2026-03-10T14:07:25.988 INFO:tasks.workunit.client.1.vm04.stdout:5/195: dread - d7/d9/f28 zero size
2026-03-10T14:07:25.991 INFO:tasks.workunit.client.1.vm04.stdout:9/194: creat d9/da/f41 x:0 0 0
2026-03-10T14:07:25.997 INFO:tasks.workunit.client.1.vm04.stdout:5/196: mknod d7/d26/c36 0
2026-03-10T14:07:26.000 INFO:tasks.workunit.client.1.vm04.stdout:7/151: creat d2/dc/d32/f37 x:0 0 0
2026-03-10T14:07:26.001 INFO:tasks.workunit.client.1.vm04.stdout:7/152: read - d2/f20 zero size
2026-03-10T14:07:26.003 INFO:tasks.workunit.client.1.vm04.stdout:9/195: rename d9/d33/c3a to d9/da/dd/d1c/c42 0
2026-03-10T14:07:26.022 INFO:tasks.workunit.client.1.vm04.stdout:8/192: truncate d0/d3/d5/f30 242494 0
2026-03-10T14:07:26.022 INFO:tasks.workunit.client.1.vm04.stdout:8/193: stat d0/d3/d5 0
2026-03-10T14:07:26.024 INFO:tasks.workunit.client.1.vm04.stdout:8/194: dread d0/d3/f35 [4194304,4194304] 0
2026-03-10T14:07:26.026 INFO:tasks.workunit.client.1.vm04.stdout:8/195: symlink d0/d3/dd/l39 0
2026-03-10T14:07:26.031 INFO:tasks.workunit.client.1.vm04.stdout:0/154: write d0/d1c/f2c [953782,70312] 0
2026-03-10T14:07:26.050 INFO:tasks.workunit.client.1.vm04.stdout:1/191: write d3/d22/d2f/f3a [700441,23570] 0
2026-03-10T14:07:26.052 INFO:tasks.workunit.client.1.vm04.stdout:1/192: link f1 d3/d22/f42 0
2026-03-10T14:07:26.056 INFO:tasks.workunit.client.1.vm04.stdout:1/193: dwrite d3/d20/f27 [0,4194304] 0
2026-03-10T14:07:26.061 INFO:tasks.workunit.client.1.vm04.stdout:1/194: symlink d3/d22/d2f/l43 0
2026-03-10T14:07:26.062 INFO:tasks.workunit.client.1.vm04.stdout:1/195: mknod d3/d22/d2f/c44 0
2026-03-10T14:07:26.063 INFO:tasks.workunit.client.1.vm04.stdout:1/196: symlink d3/d22/l45 0
2026-03-10T14:07:26.065 INFO:tasks.workunit.client.1.vm04.stdout:1/197: mknod d3/d5/c46 0
2026-03-10T14:07:26.068 INFO:tasks.workunit.client.1.vm04.stdout:1/198: dwrite d3/d22/d2f/f3a [0,4194304] 0
2026-03-10T14:07:26.069 INFO:tasks.workunit.client.1.vm04.stdout:1/199: read - d3/d5/d1e/f2d zero size
2026-03-10T14:07:26.075 INFO:tasks.workunit.client.1.vm04.stdout:2/178: dread d0/d3/f1d [0,4194304] 0
2026-03-10T14:07:26.075 INFO:tasks.workunit.client.1.vm04.stdout:2/179: chown d0/d3/f9 329 1
2026-03-10T14:07:26.077 INFO:tasks.workunit.client.1.vm04.stdout:2/180: fdatasync d0/d3/d8/dd/d26/f36 0
2026-03-10T14:07:26.080 INFO:tasks.workunit.client.1.vm04.stdout:3/170: dwrite da/f28 [0,4194304] 0
2026-03-10T14:07:26.082 INFO:tasks.workunit.client.1.vm04.stdout:2/181: dread d0/d3/d8/dd/d26/f2a [0,4194304] 0
2026-03-10T14:07:26.093 INFO:tasks.workunit.client.1.vm04.stdout:3/171: mkdir da/d3e 0
2026-03-10T14:07:26.093 INFO:tasks.workunit.client.1.vm04.stdout:2/182: dwrite d0/d3/d8/dd/d26/f33 [0,4194304] 0
2026-03-10T14:07:26.095 INFO:tasks.workunit.client.1.vm04.stdout:2/183: write d0/d3/d8/dd/d26/f36 [825385,114771] 0
2026-03-10T14:07:26.099 INFO:tasks.workunit.client.1.vm04.stdout:3/172: truncate da/fb 3459122 0
2026-03-10T14:07:26.100 INFO:tasks.workunit.client.1.vm04.stdout:2/184: symlink d0/d3/d8/d17/d35/l3d 0
2026-03-10T14:07:26.150 INFO:tasks.workunit.client.1.vm04.stdout:3/173: sync
2026-03-10T14:07:26.150 INFO:tasks.workunit.client.1.vm04.stdout:3/174: stat da/dc/f2a 0
2026-03-10T14:07:26.152 INFO:tasks.workunit.client.1.vm04.stdout:3/175: mkdir da/dc/d3f 0
2026-03-10T14:07:26.153 INFO:tasks.workunit.client.1.vm04.stdout:3/176: mknod da/d3e/c40 0
2026-03-10T14:07:26.170 INFO:tasks.workunit.client.1.vm04.stdout:6/144: dwrite d3/d8/fc [0,4194304] 0
2026-03-10T14:07:26.172 INFO:tasks.workunit.client.1.vm04.stdout:4/173: truncate d4/f21 649340 0
2026-03-10T14:07:26.175 INFO:tasks.workunit.client.1.vm04.stdout:4/174: mkdir d4/d14/d3c 0
2026-03-10T14:07:26.175 INFO:tasks.workunit.client.1.vm04.stdout:6/145: creat d3/d1b/f1f x:0 0 0
2026-03-10T14:07:26.175 INFO:tasks.workunit.client.1.vm04.stdout:4/175: chown d4/df/c1a 30368 1
2026-03-10T14:07:26.177 INFO:tasks.workunit.client.1.vm04.stdout:4/176: creat d4/df/d31/f3d x:0 0 0
2026-03-10T14:07:26.222 INFO:tasks.workunit.client.1.vm04.stdout:5/197: dread d7/fa [0,4194304] 0
2026-03-10T14:07:26.222 INFO:tasks.workunit.client.1.vm04.stdout:5/198: stat d7/d26/c2c 0
2026-03-10T14:07:26.224 INFO:tasks.workunit.client.1.vm04.stdout:7/153: rename d2/dc/d32 to d2/dc/de/d2d/d38 0
2026-03-10T14:07:26.225 INFO:tasks.workunit.client.1.vm04.stdout:1/200: rename d3/d5/d1e to d3/d5/d1e/d47 22
2026-03-10T14:07:26.225 INFO:tasks.workunit.client.1.vm04.stdout:7/154: write f0 [2718354,63746] 0
2026-03-10T14:07:26.229 INFO:tasks.workunit.client.1.vm04.stdout:5/199: rmdir d7/d26 39
2026-03-10T14:07:26.232 INFO:tasks.workunit.client.1.vm04.stdout:2/185: rename d0/d14/d1b/d25 to d0/d3/d3a/d3e 0
2026-03-10T14:07:26.234 INFO:tasks.workunit.client.1.vm04.stdout:7/155: sync
2026-03-10T14:07:26.236 INFO:tasks.workunit.client.1.vm04.stdout:5/200: symlink d7/d2d/d32/l37 0
2026-03-10T14:07:26.237 INFO:tasks.workunit.client.1.vm04.stdout:9/196: truncate d9/ff 1529451 0
2026-03-10T14:07:26.239 INFO:tasks.workunit.client.1.vm04.stdout:9/197: dwrite f8 [0,4194304] 0
2026-03-10T14:07:26.255 INFO:tasks.workunit.client.1.vm04.stdout:7/156: truncate d2/dc/f26 920340 0
2026-03-10T14:07:26.259 INFO:tasks.workunit.client.1.vm04.stdout:0/155: dwrite d0/d2/d25/f2a [4194304,4194304] 0
2026-03-10T14:07:26.263 INFO:tasks.workunit.client.1.vm04.stdout:0/156: truncate d0/d1c/f2e 743219 0
2026-03-10T14:07:26.263 INFO:tasks.workunit.client.1.vm04.stdout:9/198: mknod d9/da/dd/d1c/c43 0
2026-03-10T14:07:26.265 INFO:tasks.workunit.client.1.vm04.stdout:3/177: rename l2 to da/dc/d35/d37/l41 0
2026-03-10T14:07:26.265 INFO:tasks.workunit.client.1.vm04.stdout:6/146: rename d3/d14 to d3/d14/d20 22
2026-03-10T14:07:26.275 INFO:tasks.workunit.client.1.vm04.stdout:8/196: fsync d0/d3/dd/d11/d12/f2e 0
2026-03-10T14:07:26.281 INFO:tasks.workunit.client.1.vm04.stdout:0/157: symlink d0/d1c/l32 0
2026-03-10T14:07:26.286 INFO:tasks.workunit.client.1.vm04.stdout:0/158: read d0/d1c/f2c [1647592,76359] 0
2026-03-10T14:07:26.288 INFO:tasks.workunit.client.1.vm04.stdout:9/199: truncate d9/d1d/f1f 1059236 0
2026-03-10T14:07:26.290 INFO:tasks.workunit.client.1.vm04.stdout:4/177: rename d4/d14/f38 to d4/d14/d3c/f3e 0
2026-03-10T14:07:26.292 INFO:tasks.workunit.client.1.vm04.stdout:4/178: dread d4/f2c [0,4194304] 0
2026-03-10T14:07:26.296 INFO:tasks.workunit.client.1.vm04.stdout:6/147: creat d3/d14/f21 x:0 0 0
2026-03-10T14:07:26.296 INFO:tasks.workunit.client.1.vm04.stdout:6/148: truncate d3/d14/f18 72388 0
2026-03-10T14:07:26.297 INFO:tasks.workunit.client.1.vm04.stdout:2/186: link d0/d3/la d0/d3/l3f 0
2026-03-10T14:07:26.297 INFO:tasks.workunit.client.1.vm04.stdout:1/201: getdents d3/d5/d13/d38 0
2026-03-10T14:07:26.298 INFO:tasks.workunit.client.1.vm04.stdout:7/157: link d2/f4 d2/dc/de/d2d/f39 0
2026-03-10T14:07:26.298 INFO:tasks.workunit.client.1.vm04.stdout:1/202: fsync d3/d5/d13/d1a/f24 0
2026-03-10T14:07:26.299 INFO:tasks.workunit.client.1.vm04.stdout:7/158: chown d2/dc/de/d11/f19 358 1
2026-03-10T14:07:26.300 INFO:tasks.workunit.client.1.vm04.stdout:5/201: creat d7/f38 x:0 0 0
2026-03-10T14:07:26.301 INFO:tasks.workunit.client.1.vm04.stdout:5/202: fdatasync d7/f11 0
2026-03-10T14:07:26.306 INFO:tasks.workunit.client.1.vm04.stdout:0/159: readlink d0/d2/d25/l27 0
2026-03-10T14:07:26.312 INFO:tasks.workunit.client.1.vm04.stdout:3/178: link da/dc/f1d da/d30/f42 0
2026-03-10T14:07:26.312 INFO:tasks.workunit.client.1.vm04.stdout:3/179: chown da/dc/f1d 13545 1
2026-03-10T14:07:26.316 INFO:tasks.workunit.client.1.vm04.stdout:4/179: mknod d4/df/d34/c3f 0
2026-03-10T14:07:26.326 INFO:tasks.workunit.client.1.vm04.stdout:1/203: creat d3/d22/f48 x:0 0 0
2026-03-10T14:07:26.326 INFO:tasks.workunit.client.1.vm04.stdout:2/187: symlink d0/d3/d8/d17/d35/l40 0
2026-03-10T14:07:26.326 INFO:tasks.workunit.client.1.vm04.stdout:1/204: dread - d3/d5/d13/d38/f3d zero size
2026-03-10T14:07:26.327 INFO:tasks.workunit.client.1.vm04.stdout:2/188: write d0/d3/d8/d17/f1f [262746,84506] 0
2026-03-10T14:07:26.336 INFO:tasks.workunit.client.1.vm04.stdout:0/160: rmdir d0/d2 39
2026-03-10T14:07:26.339 INFO:tasks.workunit.client.1.vm04.stdout:9/200: mkdir d9/d44 0
2026-03-10T14:07:26.340 INFO:tasks.workunit.client.1.vm04.stdout:9/201: write d9/d1d/f3d [346892,33106] 0
2026-03-10T14:07:26.342 INFO:tasks.workunit.client.1.vm04.stdout:8/197: dwrite d0/d3/dd/fc [0,4194304] 0
2026-03-10T14:07:26.348 INFO:tasks.workunit.client.1.vm04.stdout:3/180: symlink da/dc/l43 0
2026-03-10T14:07:26.348 INFO:tasks.workunit.client.1.vm04.stdout:3/181: chown da/dc/c21 109 1
2026-03-10T14:07:26.350 INFO:tasks.workunit.client.1.vm04.stdout:4/180: creat d4/df/d31/f40 x:0 0 0
2026-03-10T14:07:26.350 INFO:tasks.workunit.client.1.vm04.stdout:4/181: readlink d4/df/l1c 0
2026-03-10T14:07:26.353 INFO:tasks.workunit.client.1.vm04.stdout:4/182: dread d4/f2c [0,4194304] 0
2026-03-10T14:07:26.354 INFO:tasks.workunit.client.1.vm04.stdout:6/149: link d3/d14/f18 d3/de/d11/f22 0
2026-03-10T14:07:26.355 INFO:tasks.workunit.client.1.vm04.stdout:1/205: write d3/f2c [837994,20740] 0
2026-03-10T14:07:26.358 INFO:tasks.workunit.client.1.vm04.stdout:2/189: symlink d0/d3/d8/l41 0
2026-03-10T14:07:26.363 INFO:tasks.workunit.client.1.vm04.stdout:5/203: symlink d7/d26/l39 0
2026-03-10T14:07:26.363 INFO:tasks.workunit.client.1.vm04.stdout:5/204: read d7/fa [3937340,11594] 0
2026-03-10T14:07:26.364 INFO:tasks.workunit.client.1.vm04.stdout:5/205: stat d7/f1d 0
2026-03-10T14:07:26.367 INFO:tasks.workunit.client.1.vm04.stdout:9/202: unlink d9/da/dd/d1c/f39 0
2026-03-10T14:07:26.370 INFO:tasks.workunit.client.1.vm04.stdout:9/203: write d9/da/dd/d1c/f2e [855293,38407] 0
2026-03-10T14:07:26.370 INFO:tasks.workunit.client.1.vm04.stdout:9/204: truncate d9/da/f13 1424846 0
2026-03-10T14:07:26.370 INFO:tasks.workunit.client.1.vm04.stdout:3/182: chown c7 217956 1
2026-03-10T14:07:26.372 INFO:tasks.workunit.client.1.vm04.stdout:3/183: write da/dc/f39 [1290332,55633] 0
2026-03-10T14:07:26.373 INFO:tasks.workunit.client.1.vm04.stdout:4/183: creat d4/d14/d3c/f41 x:0 0 0
2026-03-10T14:07:26.377 INFO:tasks.workunit.client.1.vm04.stdout:9/205: dwrite d9/da/dd/d1c/f27 [0,4194304] 0
2026-03-10T14:07:26.379 INFO:tasks.workunit.client.1.vm04.stdout:9/206: read d9/da/f1e [244810,105868] 0
2026-03-10T14:07:26.385 INFO:tasks.workunit.client.1.vm04.stdout:6/150: mkdir d3/d1b/d23 0
2026-03-10T14:07:26.385 INFO:tasks.workunit.client.1.vm04.stdout:6/151: stat d3 0
2026-03-10T14:07:26.391 INFO:tasks.workunit.client.1.vm04.stdout:1/206: symlink d3/d22/l49 0
2026-03-10T14:07:26.396 INFO:tasks.workunit.client.1.vm04.stdout:2/190: creat d0/d3/d8/f42 x:0 0 0
2026-03-10T14:07:26.396 INFO:tasks.workunit.client.1.vm04.stdout:2/191: dread d0/d3/f1d [0,4194304] 0
2026-03-10T14:07:26.396 INFO:tasks.workunit.client.1.vm04.stdout:2/192: chown d0/d3/d3a 2786 1
2026-03-10T14:07:26.397 INFO:tasks.workunit.client.1.vm04.stdout:7/159: link d2/dc/de/d2d/c34 d2/c3a 0
2026-03-10T14:07:26.399 INFO:tasks.workunit.client.1.vm04.stdout:0/161: creat d0/d2/d25/f33 x:0 0 0
2026-03-10T14:07:26.400 INFO:tasks.workunit.client.1.vm04.stdout:3/184: sync
2026-03-10T14:07:26.405 INFO:tasks.workunit.client.1.vm04.stdout:5/206: rename d7/f38 to d7/d9/f3a 0
2026-03-10T14:07:26.406 INFO:tasks.workunit.client.1.vm04.stdout:5/207: write d7/d12/f27 [3595018,68273] 0
2026-03-10T14:07:26.406 INFO:tasks.workunit.client.1.vm04.stdout:5/208: write d7/f24 [4698518,107393] 0
2026-03-10T14:07:26.412 INFO:tasks.workunit.client.1.vm04.stdout:8/198: symlink d0/l3a 0
2026-03-10T14:07:26.413 INFO:tasks.workunit.client.1.vm04.stdout:8/199: stat d0/d3/dd/l39 0
2026-03-10T14:07:26.418 INFO:tasks.workunit.client.1.vm04.stdout:8/200: sync
2026-03-10T14:07:26.429 INFO:tasks.workunit.client.1.vm04.stdout:9/207: creat d9/d33/f45 x:0 0 0
2026-03-10T14:07:26.440 INFO:tasks.workunit.client.1.vm04.stdout:6/152: read d3/d8/fd [68262,111338] 0
2026-03-10T14:07:26.441 INFO:tasks.workunit.client.1.vm04.stdout:1/207: mknod d3/d5/d1e/c4a 0
2026-03-10T14:07:26.442 INFO:tasks.workunit.client.1.vm04.stdout:1/208: write d3/d22/d2f/f34 [406006,121898] 0
2026-03-10T14:07:26.448 INFO:tasks.workunit.client.1.vm04.stdout:7/160: fdatasync d2/dc/f25 0
2026-03-10T14:07:26.452 INFO:tasks.workunit.client.1.vm04.stdout:3/185: fsync da/dc/f1d 0
2026-03-10T14:07:26.453 INFO:tasks.workunit.client.1.vm04.stdout:3/186: write da/dc/f2f [3851104,36703] 0
2026-03-10T14:07:26.456 INFO:tasks.workunit.client.1.vm04.stdout:3/187: dread da/dc/f1d [0,4194304] 0
2026-03-10T14:07:26.460 INFO:tasks.workunit.client.1.vm04.stdout:2/193: rename d0/c1 to d0/d3/d8/dd/d37/c43 0
2026-03-10T14:07:26.460 INFO:tasks.workunit.client.1.vm04.stdout:2/194: chown d0 11698 1
2026-03-10T14:07:26.461 INFO:tasks.workunit.client.1.vm04.stdout:2/195: truncate d0/d3/d3a/d3e/f2d 2893980 0
2026-03-10T14:07:26.466 INFO:tasks.workunit.client.1.vm04.stdout:5/209: truncate d7/fa 4774339 0
2026-03-10T14:07:26.466 INFO:tasks.workunit.client.1.vm04.stdout:8/201: mkdir d0/d3/dd/d11/d3b 0
2026-03-10T14:07:26.467 INFO:tasks.workunit.client.1.vm04.stdout:1/209: fsync d3/f18 0
2026-03-10T14:07:26.468 INFO:tasks.workunit.client.1.vm04.stdout:5/210: dread d7/f11 [0,4194304] 0
2026-03-10T14:07:26.477 INFO:tasks.workunit.client.1.vm04.stdout:7/161: mkdir d2/d14/d3b 0
2026-03-10T14:07:26.481 INFO:tasks.workunit.client.1.vm04.stdout:8/202: write d0/d3/dd/d11/d12/f1f [2197725,17504] 0
2026-03-10T14:07:26.485 INFO:tasks.workunit.client.1.vm04.stdout:1/210: chown d3/d22/l25 23774 1
2026-03-10T14:07:26.485 INFO:tasks.workunit.client.1.vm04.stdout:1/211: chown f2 207 1
2026-03-10T14:07:26.486 INFO:tasks.workunit.client.1.vm04.stdout:5/211: write d7/d9/f3a [320782,83496] 0
2026-03-10T14:07:26.487 INFO:tasks.workunit.client.1.vm04.stdout:5/212: write d7/d26/f30 [391669,117860] 0
2026-03-10T14:07:26.489 INFO:tasks.workunit.client.1.vm04.stdout:7/162: readlink d2/l9 0
2026-03-10T14:07:26.492 INFO:tasks.workunit.client.1.vm04.stdout:7/163: dwrite d2/d14/f27 [0,4194304] 0
2026-03-10T14:07:26.494 INFO:tasks.workunit.client.1.vm04.stdout:2/196: rmdir d0/d3 39
2026-03-10T14:07:26.495 INFO:tasks.workunit.client.1.vm04.stdout:4/184: write d4/d14/d1b/f20 [4981439,101536] 0
2026-03-10T14:07:26.497 INFO:tasks.workunit.client.1.vm04.stdout:0/162: creat d0/d2/f34 x:0 0 0
2026-03-10T14:07:26.497 INFO:tasks.workunit.client.1.vm04.stdout:0/163: readlink d0/d1c/l32 0
2026-03-10T14:07:26.498 INFO:tasks.workunit.client.1.vm04.stdout:8/203: creat d0/d3/dd/d11/f3c x:0 0 0
2026-03-10T14:07:26.500 INFO:tasks.workunit.client.1.vm04.stdout:0/164: dread d0/d2/d15/d22/f30 [0,4194304] 0
2026-03-10T14:07:26.501 INFO:tasks.workunit.client.1.vm04.stdout:0/165: fsync d0/d2/fd 0
2026-03-10T14:07:26.512 INFO:tasks.workunit.client.1.vm04.stdout:5/213: creat d7/d2d/d32/f3b x:0 0 0
2026-03-10T14:07:26.514 INFO:tasks.workunit.client.1.vm04.stdout:9/208: rename c1 to d9/c46 0
2026-03-10T14:07:26.514 INFO:tasks.workunit.client.1.vm04.stdout:8/204: mkdir d0/d3/dd/d11/d12/d3d 0
2026-03-10T14:07:26.515 INFO:tasks.workunit.client.1.vm04.stdout:6/153: getdents d3/de/d11 0
2026-03-10T14:07:26.515 INFO:tasks.workunit.client.1.vm04.stdout:8/205: chown d0/d3/dd/l39 960148 1
2026-03-10T14:07:26.519 INFO:tasks.workunit.client.1.vm04.stdout:7/164: unlink d2/dc/f1f 0
2026-03-10T14:07:26.530 INFO:tasks.workunit.client.1.vm04.stdout:7/165: chown d2/dc/de/l13 9141380 1
2026-03-10T14:07:26.530 INFO:tasks.workunit.client.1.vm04.stdout:5/214: readlink d7/l1e 0
2026-03-10T14:07:26.530 INFO:tasks.workunit.client.1.vm04.stdout:2/197: unlink d0/d14/d1b/c2e 0
2026-03-10T14:07:26.530 INFO:tasks.workunit.client.1.vm04.stdout:3/188: rename da/dc/f2f to da/d3e/f44 0
2026-03-10T14:07:26.530 INFO:tasks.workunit.client.1.vm04.stdout:8/206: mknod d0/d3/dd/d11/d12/c3e 0
2026-03-10T14:07:26.530 INFO:tasks.workunit.client.1.vm04.stdout:8/207: write d0/d3/dd/fc [3763746,26570] 0
2026-03-10T14:07:26.531 INFO:tasks.workunit.client.1.vm04.stdout:0/166: sync
2026-03-10T14:07:26.532 INFO:tasks.workunit.client.1.vm04.stdout:0/167: chown d0/d2/d15/d22/f30 169530 1
2026-03-10T14:07:26.544 INFO:tasks.workunit.client.1.vm04.stdout:4/185: dwrite d4/f2c [0,4194304] 0
2026-03-10T14:07:26.548 INFO:tasks.workunit.client.1.vm04.stdout:1/212: write d3/fc [1219038,110166] 0
2026-03-10T14:07:26.549 INFO:tasks.workunit.client.1.vm04.stdout:7/166: write d2/f4 [8655471,51340] 0
2026-03-10T14:07:26.551 INFO:tasks.workunit.client.1.vm04.stdout:7/167: fdatasync d2/dc/de/d2d/d38/f37 0
2026-03-10T14:07:26.552 INFO:tasks.workunit.client.1.vm04.stdout:4/186: dwrite d4/f2c [0,4194304] 0
2026-03-10T14:07:26.561 INFO:tasks.workunit.client.1.vm04.stdout:9/209: rename d9/d33/f34 to d9/da/dd/f47 0
2026-03-10T14:07:26.561 INFO:tasks.workunit.client.1.vm04.stdout:8/208: rename d0 to d0/d3/dd/d11/d12/d3d/d3f 22
2026-03-10T14:07:26.562 INFO:tasks.workunit.client.1.vm04.stdout:8/209: write d0/d3/dd/d11/d29/f2b [670287,99404] 0
2026-03-10T14:07:26.563 INFO:tasks.workunit.client.1.vm04.stdout:8/210: write d0/d3/dd/d11/d12/f1f [979501,51220] 0
2026-03-10T14:07:26.574 INFO:tasks.workunit.client.1.vm04.stdout:7/168: dread d2/dc/de/d2d/f39 [0,4194304] 0
2026-03-10T14:07:26.581 INFO:tasks.workunit.client.1.vm04.stdout:2/198: dwrite d0/d3/d3a/d3e/f2d [0,4194304] 0
2026-03-10T14:07:26.581 INFO:tasks.workunit.client.1.vm04.stdout:2/199: fsync d0/d3/d8/dd/d26/f36 0
2026-03-10T14:07:26.590 INFO:tasks.workunit.client.1.vm04.stdout:1/213: creat d3/d5/d13/d1a/f4b x:0 0 0
2026-03-10T14:07:26.592 INFO:tasks.workunit.client.1.vm04.stdout:2/200: dread d0/d3/d8/dd/d26/f36 [0,4194304] 0
2026-03-10T14:07:26.593 INFO:tasks.workunit.client.1.vm04.stdout:2/201: write d0/d3/f11 [7733697,126561] 0
2026-03-10T14:07:26.594 INFO:tasks.workunit.client.1.vm04.stdout:2/202: stat d0/d3/d8/dd 0
2026-03-10T14:07:26.594 INFO:tasks.workunit.client.1.vm04.stdout:2/203: fsync d0/d14/d1b/f32 0
2026-03-10T14:07:26.599 INFO:tasks.workunit.client.1.vm04.stdout:4/187: mknod d4/d14/d3c/c42 0
2026-03-10T14:07:26.600 INFO:tasks.workunit.client.1.vm04.stdout:4/188: write d4/df/d31/f40 [181569,17392] 0
2026-03-10T14:07:26.616 INFO:tasks.workunit.client.1.vm04.stdout:7/169: creat d2/dc/de/d21/f3c x:0 0 0
2026-03-10T14:07:26.617 INFO:tasks.workunit.client.1.vm04.stdout:7/170: chown d2/dc/de/d21/f3c 103247091 1
2026-03-10T14:07:26.621 INFO:tasks.workunit.client.1.vm04.stdout:7/171: dwrite d2/d14/f15 [0,4194304] 0
2026-03-10T14:07:26.624 INFO:tasks.workunit.client.1.vm04.stdout:7/172: dread d2/dc/de/f1e [0,4194304] 0
2026-03-10T14:07:26.626 INFO:tasks.workunit.client.1.vm04.stdout:7/173: write d2/d14/f15 [2233922,102737] 0
2026-03-10T14:07:26.628 INFO:tasks.workunit.client.1.vm04.stdout:5/215: creat d7/f3c x:0 0 0
2026-03-10T14:07:26.629 INFO:tasks.workunit.client.1.vm04.stdout:5/216: dread - d7/f3c zero size
2026-03-10T14:07:26.629 INFO:tasks.workunit.client.1.vm04.stdout:5/217: dread - d7/d2d/d32/f3b zero size
2026-03-10T14:07:26.637 INFO:tasks.workunit.client.1.vm04.stdout:3/189: dwrite da/f19 [0,4194304] 0
2026-03-10T14:07:26.645 INFO:tasks.workunit.client.1.vm04.stdout:3/190: dwrite da/dc/f2c [0,4194304] 0
2026-03-10T14:07:26.645 INFO:tasks.workunit.client.1.vm04.stdout:6/154: write d3/de/f10 [462332,16332] 0
2026-03-10T14:07:26.648 INFO:tasks.workunit.client.1.vm04.stdout:1/214: rmdir d3/d5 39
2026-03-10T14:07:26.659 INFO:tasks.workunit.client.1.vm04.stdout:4/189: read d4/df/d22/f2b [129384,125267] 0
2026-03-10T14:07:26.663 INFO:tasks.workunit.client.1.vm04.stdout:0/168: creat d0/f35 x:0 0 0
2026-03-10T14:07:26.664 INFO:tasks.workunit.client.1.vm04.stdout:0/169: chown d0/d2/d15/d22/f30 233646 1
2026-03-10T14:07:26.676 INFO:tasks.workunit.client.1.vm04.stdout:3/191: symlink da/d30/l45 0
2026-03-10T14:07:26.681 INFO:tasks.workunit.client.1.vm04.stdout:1/215: write d3/d5/d1e/f2d [162637,79817] 0
2026-03-10T14:07:26.683 INFO:tasks.workunit.client.1.vm04.stdout:2/204: creat d0/d14/d39/f44 x:0 0 0
2026-03-10T14:07:26.683 INFO:tasks.workunit.client.1.vm04.stdout:2/205: read d0/d3/d8/d17/f1f [281923,40379] 0
2026-03-10T14:07:26.684 INFO:tasks.workunit.client.1.vm04.stdout:2/206: chown d0/d3/d8/dd/l22 16 1
2026-03-10T14:07:26.698 INFO:tasks.workunit.client.1.vm04.stdout:4/190: mkdir d4/d43 0
2026-03-10T14:07:26.702 INFO:tasks.workunit.client.1.vm04.stdout:4/191: dread d4/d14/f27 [0,4194304] 0
2026-03-10T14:07:26.702 INFO:tasks.workunit.client.1.vm04.stdout:4/192: chown d4/df/f2e 20197 1
2026-03-10T14:07:26.710 INFO:tasks.workunit.client.1.vm04.stdout:8/211: rename d0/d3/dd/d11/d12/l18 to d0/l40 0
2026-03-10T14:07:26.712 INFO:tasks.workunit.client.1.vm04.stdout:0/170: fsync d0/d2/f13 0
2026-03-10T14:07:26.724 INFO:tasks.workunit.client.1.vm04.stdout:6/155: mkdir d3/d1b/d23/d24 0
2026-03-10T14:07:26.726 INFO:tasks.workunit.client.1.vm04.stdout:1/216: mknod d3/d5/d13/c4c 0
2026-03-10T14:07:26.727 INFO:tasks.workunit.client.1.vm04.stdout:6/156: read d3/de/f10 [81219,50274] 0
2026-03-10T14:07:26.734 INFO:tasks.workunit.client.1.vm04.stdout:2/207: read d0/d14/d1b/f29 [1974384,124765] 0
2026-03-10T14:07:26.735 INFO:tasks.workunit.client.1.vm04.stdout:2/208: fdatasync d0/d14/d1b/f32 0
2026-03-10T14:07:26.739 INFO:tasks.workunit.client.1.vm04.stdout:9/210: getdents d9/da 0
2026-03-10T14:07:26.742 INFO:tasks.workunit.client.1.vm04.stdout:4/193: dread d4/d14/d3c/f3e [0,4194304] 0
2026-03-10T14:07:26.747 INFO:tasks.workunit.client.1.vm04.stdout:0/171: write d0/d2/f2d [727192,122838] 0
2026-03-10T14:07:26.757 INFO:tasks.workunit.client.1.vm04.stdout:5/218: link d7/d26/l31 d7/d26/l3d 0
2026-03-10T14:07:26.757 INFO:tasks.workunit.client.1.vm04.stdout:5/219: readlink d7/l29 0
2026-03-10T14:07:26.758 INFO:tasks.workunit.client.1.vm04.stdout:5/220: write d7/f1d [3493047,115281] 0
2026-03-10T14:07:26.773 INFO:tasks.workunit.client.1.vm04.stdout:7/174: truncate d2/f10 2417396 0
2026-03-10T14:07:26.776 INFO:tasks.workunit.client.1.vm04.stdout:4/194: mknod d4/df/d22/c44 0
2026-03-10T14:07:26.776 INFO:tasks.workunit.client.1.vm04.stdout:8/212: mkdir d0/d41 0
2026-03-10T14:07:26.777 INFO:tasks.workunit.client.1.vm04.stdout:8/213: write d0/d3/dd/d11/d29/f2b [1506896,62961] 0
2026-03-10T14:07:26.783 INFO:tasks.workunit.client.1.vm04.stdout:5/221: read d7/fa [4308511,33184] 0
2026-03-10T14:07:26.784 INFO:tasks.workunit.client.1.vm04.stdout:5/222: write d7/d2d/d32/f3b [705077,21020] 0
2026-03-10T14:07:26.784 INFO:tasks.workunit.client.1.vm04.stdout:5/223: dread - d7/d9/f20 zero size
2026-03-10T14:07:26.795 INFO:tasks.workunit.client.1.vm04.stdout:9/211: rename d9/da/dd/fe to d9/da/dd/f48 0
2026-03-10T14:07:26.799 INFO:tasks.workunit.client.1.vm04.stdout:9/212: dwrite d9/da/dd/f24 [0,4194304] 0
2026-03-10T14:07:26.801 INFO:tasks.workunit.client.1.vm04.stdout:4/195: rmdir d4/df/d22 39
2026-03-10T14:07:26.802 INFO:tasks.workunit.client.1.vm04.stdout:4/196: write d4/d14/d3c/f41 [123751,62528] 0
2026-03-10T14:07:26.808 INFO:tasks.workunit.client.1.vm04.stdout:0/172: dwrite d0/f14 [0,4194304] 0
2026-03-10T14:07:26.811 INFO:tasks.workunit.client.1.vm04.stdout:3/192: getdents da/dc/d35/d37 0
2026-03-10T14:07:26.813 INFO:tasks.workunit.client.1.vm04.stdout:5/224: rmdir d7 39
2026-03-10T14:07:26.820 INFO:tasks.workunit.client.1.vm04.stdout:6/157: write d3/d8/fa
[3985196,121737] 0 2026-03-10T14:07:26.821 INFO:tasks.workunit.client.1.vm04.stdout:6/158: readlink d3/de/d11/l1e 0 2026-03-10T14:07:26.830 INFO:tasks.workunit.client.1.vm04.stdout:7/175: rename d2/f10 to d2/d28/f3d 0 2026-03-10T14:07:26.830 INFO:tasks.workunit.client.1.vm04.stdout:7/176: truncate d2/f20 504819 0 2026-03-10T14:07:26.831 INFO:tasks.workunit.client.1.vm04.stdout:4/197: readlink d4/l7 0 2026-03-10T14:07:26.833 INFO:tasks.workunit.client.1.vm04.stdout:0/173: symlink d0/d2/d15/d22/l36 0 2026-03-10T14:07:26.844 INFO:tasks.workunit.client.1.vm04.stdout:1/217: dwrite d3/d22/d2f/f39 [0,4194304] 0 2026-03-10T14:07:26.846 INFO:tasks.workunit.client.1.vm04.stdout:9/213: creat d9/d44/f49 x:0 0 0 2026-03-10T14:07:26.854 INFO:tasks.workunit.client.1.vm04.stdout:7/177: dwrite d2/dc/f26 [0,4194304] 0 2026-03-10T14:07:26.855 INFO:tasks.workunit.client.1.vm04.stdout:7/178: write d2/dc/de/d11/f2e [30043,30257] 0 2026-03-10T14:07:26.859 INFO:tasks.workunit.client.1.vm04.stdout:4/198: fdatasync d4/d14/d1b/f2f 0 2026-03-10T14:07:26.861 INFO:tasks.workunit.client.1.vm04.stdout:8/214: creat d0/f42 x:0 0 0 2026-03-10T14:07:26.863 INFO:tasks.workunit.client.1.vm04.stdout:4/199: write d4/d14/d1b/f20 [2093324,68503] 0 2026-03-10T14:07:26.868 INFO:tasks.workunit.client.1.vm04.stdout:1/218: chown d3/d5/c40 142256 1 2026-03-10T14:07:26.869 INFO:tasks.workunit.client.1.vm04.stdout:1/219: read d3/f2c [4014862,10416] 0 2026-03-10T14:07:26.872 INFO:tasks.workunit.client.1.vm04.stdout:7/179: rmdir d2/d14 39 2026-03-10T14:07:26.882 INFO:tasks.workunit.client.1.vm04.stdout:4/200: mknod d4/c45 0 2026-03-10T14:07:26.882 INFO:tasks.workunit.client.1.vm04.stdout:2/209: truncate d0/d3/d8/dd/d26/f33 3629613 0 2026-03-10T14:07:26.882 INFO:tasks.workunit.client.1.vm04.stdout:5/225: mkdir d7/d12/d2b/d3e 0 2026-03-10T14:07:26.882 INFO:tasks.workunit.client.1.vm04.stdout:0/174: mknod d0/d2/c37 0 2026-03-10T14:07:26.886 INFO:tasks.workunit.client.1.vm04.stdout:7/180: mknod d2/dc/de/d21/c3e 0 
2026-03-10T14:07:26.886 INFO:tasks.workunit.client.1.vm04.stdout:7/181: fsync d2/dc/f25 0 2026-03-10T14:07:26.890 INFO:tasks.workunit.client.1.vm04.stdout:3/193: getdents da 0 2026-03-10T14:07:26.890 INFO:tasks.workunit.client.1.vm04.stdout:2/210: rmdir d0/d3/d8 39 2026-03-10T14:07:26.890 INFO:tasks.workunit.client.1.vm04.stdout:3/194: dread - da/dc/f2a zero size 2026-03-10T14:07:26.894 INFO:tasks.workunit.client.1.vm04.stdout:1/220: getdents d3/d5/d13/d38 0 2026-03-10T14:07:26.895 INFO:tasks.workunit.client.1.vm04.stdout:7/182: symlink d2/dc/de/d2d/l3f 0 2026-03-10T14:07:26.902 INFO:tasks.workunit.client.1.vm04.stdout:5/226: mkdir d7/d12/d2b/d3e/d3f 0 2026-03-10T14:07:26.902 INFO:tasks.workunit.client.1.vm04.stdout:5/227: fdatasync f4 0 2026-03-10T14:07:26.902 INFO:tasks.workunit.client.1.vm04.stdout:3/195: rename da/dc/f11 to da/dc/d35/f46 0 2026-03-10T14:07:26.902 INFO:tasks.workunit.client.1.vm04.stdout:2/211: chown d0/d3/d8/d17/c3b 41566180 1 2026-03-10T14:07:26.902 INFO:tasks.workunit.client.1.vm04.stdout:3/196: write da/f22 [631900,20136] 0 2026-03-10T14:07:26.902 INFO:tasks.workunit.client.1.vm04.stdout:1/221: symlink d3/d5/d13/d1a/l4d 0 2026-03-10T14:07:26.902 INFO:tasks.workunit.client.1.vm04.stdout:9/214: getdents d9 0 2026-03-10T14:07:26.903 INFO:tasks.workunit.client.1.vm04.stdout:7/183: mknod d2/dc/de/d21/c40 0 2026-03-10T14:07:26.905 INFO:tasks.workunit.client.1.vm04.stdout:4/201: rmdir d4/d43 0 2026-03-10T14:07:26.906 INFO:tasks.workunit.client.1.vm04.stdout:4/202: truncate d4/d14/d3c/f3e 9067415 0 2026-03-10T14:07:26.909 INFO:tasks.workunit.client.1.vm04.stdout:5/228: rename d7/l29 to d7/d2d/d32/l40 0 2026-03-10T14:07:26.909 INFO:tasks.workunit.client.1.vm04.stdout:5/229: stat d7/d2d/d32 0 2026-03-10T14:07:26.910 INFO:tasks.workunit.client.1.vm04.stdout:8/215: sync 2026-03-10T14:07:26.910 INFO:tasks.workunit.client.1.vm04.stdout:8/216: write d0/d3/dd/fc [2465380,26991] 0 2026-03-10T14:07:26.925 INFO:tasks.workunit.client.1.vm04.stdout:3/197: unlink 
c5 0 2026-03-10T14:07:26.926 INFO:tasks.workunit.client.1.vm04.stdout:0/175: dread d0/d2/f2d [0,4194304] 0 2026-03-10T14:07:26.929 INFO:tasks.workunit.client.1.vm04.stdout:9/215: rmdir d9/da 39 2026-03-10T14:07:26.940 INFO:tasks.workunit.client.1.vm04.stdout:4/203: creat d4/d14/d3c/f46 x:0 0 0 2026-03-10T14:07:26.943 INFO:tasks.workunit.client.1.vm04.stdout:4/204: dwrite d4/df/f18 [0,4194304] 0 2026-03-10T14:07:26.961 INFO:tasks.workunit.client.1.vm04.stdout:3/198: readlink da/d30/l3a 0 2026-03-10T14:07:26.962 INFO:tasks.workunit.client.1.vm04.stdout:3/199: write da/fd [4150271,93654] 0 2026-03-10T14:07:26.974 INFO:tasks.workunit.client.1.vm04.stdout:7/184: rename d2/d14/f27 to d2/dc/de/d2d/d38/f41 0 2026-03-10T14:07:26.975 INFO:tasks.workunit.client.1.vm04.stdout:6/159: truncate d3/d8/fa 2388196 0 2026-03-10T14:07:26.984 INFO:tasks.workunit.client.1.vm04.stdout:3/200: mkdir da/dc/d47 0 2026-03-10T14:07:26.985 INFO:tasks.workunit.client.1.vm04.stdout:3/201: fdatasync da/dc/f2c 0 2026-03-10T14:07:26.994 INFO:tasks.workunit.client.1.vm04.stdout:4/205: mkdir d4/df/d22/d47 0 2026-03-10T14:07:27.000 INFO:tasks.workunit.client.1.vm04.stdout:0/176: truncate d0/d2/fd 2578316 0 2026-03-10T14:07:27.005 INFO:tasks.workunit.client.1.vm04.stdout:2/212: dwrite d0/d3/d8/dd/d26/f2a [0,4194304] 0 2026-03-10T14:07:27.009 INFO:tasks.workunit.client.1.vm04.stdout:3/202: dread f8 [0,4194304] 0 2026-03-10T14:07:27.017 INFO:tasks.workunit.client.1.vm04.stdout:1/222: getdents d3 0 2026-03-10T14:07:27.021 INFO:tasks.workunit.client.1.vm04.stdout:0/177: mkdir d0/d2/d15/d22/d38 0 2026-03-10T14:07:27.022 INFO:tasks.workunit.client.1.vm04.stdout:9/216: creat d9/f4a x:0 0 0 2026-03-10T14:07:27.022 INFO:tasks.workunit.client.1.vm04.stdout:2/213: mkdir d0/d14/d1b/d45 0 2026-03-10T14:07:27.023 INFO:tasks.workunit.client.1.vm04.stdout:3/203: chown da/dc/d35/d37/l41 386 1 2026-03-10T14:07:27.026 INFO:tasks.workunit.client.1.vm04.stdout:4/206: dwrite d4/f6 [0,4194304] 0 2026-03-10T14:07:27.029 
INFO:tasks.workunit.client.1.vm04.stdout:4/207: chown d4/d14/d3c/f3e 221530 1 2026-03-10T14:07:27.029 INFO:tasks.workunit.client.1.vm04.stdout:4/208: read d4/df/f18 [3137920,7924] 0 2026-03-10T14:07:27.032 INFO:tasks.workunit.client.1.vm04.stdout:3/204: dwrite f4 [4194304,4194304] 0 2026-03-10T14:07:27.043 INFO:tasks.workunit.client.1.vm04.stdout:2/214: mkdir d0/d3/d8/dd/d26/d46 0 2026-03-10T14:07:27.046 INFO:tasks.workunit.client.1.vm04.stdout:3/205: write da/dc/d35/f46 [817468,17191] 0 2026-03-10T14:07:27.047 INFO:tasks.workunit.client.1.vm04.stdout:3/206: write da/f10 [3483092,81865] 0 2026-03-10T14:07:27.059 INFO:tasks.workunit.client.1.vm04.stdout:1/223: dread d3/f2c [0,4194304] 0 2026-03-10T14:07:27.062 INFO:tasks.workunit.client.1.vm04.stdout:2/215: rename d0/d3/d8/dd/d37 to d0/d14/d39/d47 0 2026-03-10T14:07:27.065 INFO:tasks.workunit.client.1.vm04.stdout:1/224: dwrite d3/d22/d2f/f39 [4194304,4194304] 0 2026-03-10T14:07:27.065 INFO:tasks.workunit.client.1.vm04.stdout:2/216: dread d0/d3/d8/dd/d26/f36 [0,4194304] 0 2026-03-10T14:07:27.066 INFO:tasks.workunit.client.1.vm04.stdout:2/217: chown d0/d3/cf 25753417 1 2026-03-10T14:07:27.066 INFO:tasks.workunit.client.1.vm04.stdout:2/218: stat d0/d3/c34 0 2026-03-10T14:07:27.067 INFO:tasks.workunit.client.1.vm04.stdout:4/209: rmdir d4/d14/d3c 39 2026-03-10T14:07:27.075 INFO:tasks.workunit.client.1.vm04.stdout:0/178: link d0/d2/d25/l27 d0/d2/l39 0 2026-03-10T14:07:27.078 INFO:tasks.workunit.client.1.vm04.stdout:8/217: dwrite d0/f26 [0,4194304] 0 2026-03-10T14:07:27.079 INFO:tasks.workunit.client.1.vm04.stdout:8/218: fsync d0/d3/dd/d11/d12/f2c 0 2026-03-10T14:07:27.083 INFO:tasks.workunit.client.1.vm04.stdout:1/225: fdatasync d3/f8 0 2026-03-10T14:07:27.094 INFO:tasks.workunit.client.1.vm04.stdout:4/210: mknod d4/df/d31/c48 0 2026-03-10T14:07:27.094 INFO:tasks.workunit.client.1.vm04.stdout:5/230: truncate d7/d2d/d32/f3b 407080 0 2026-03-10T14:07:27.094 INFO:tasks.workunit.client.1.vm04.stdout:5/231: dread d7/d9/f3a 
[0,4194304] 0 2026-03-10T14:07:27.094 INFO:tasks.workunit.client.1.vm04.stdout:0/179: unlink d0/d2/f1d 0 2026-03-10T14:07:27.094 INFO:tasks.workunit.client.1.vm04.stdout:5/232: write d7/f11 [868317,12012] 0 2026-03-10T14:07:27.102 INFO:tasks.workunit.client.1.vm04.stdout:8/219: dwrite d0/d3/f35 [4194304,4194304] 0 2026-03-10T14:07:27.105 INFO:tasks.workunit.client.1.vm04.stdout:1/226: symlink d3/d20/l4e 0 2026-03-10T14:07:27.108 INFO:tasks.workunit.client.1.vm04.stdout:6/160: dwrite d3/d14/f18 [0,4194304] 0 2026-03-10T14:07:27.109 INFO:tasks.workunit.client.1.vm04.stdout:6/161: readlink d3/d14/l16 0 2026-03-10T14:07:27.118 INFO:tasks.workunit.client.1.vm04.stdout:8/220: dwrite d0/d3/dd/d11/d12/f2c [0,4194304] 0 2026-03-10T14:07:27.127 INFO:tasks.workunit.client.1.vm04.stdout:4/211: rename d4/l17 to d4/df/d22/d47/l49 0 2026-03-10T14:07:27.133 INFO:tasks.workunit.client.1.vm04.stdout:3/207: dread da/dc/d35/f46 [0,4194304] 0 2026-03-10T14:07:27.135 INFO:tasks.workunit.client.1.vm04.stdout:0/180: symlink d0/d1c/l3a 0 2026-03-10T14:07:27.136 INFO:tasks.workunit.client.1.vm04.stdout:0/181: chown d0/d2/c26 1725558 1 2026-03-10T14:07:27.137 INFO:tasks.workunit.client.1.vm04.stdout:1/227: symlink d3/d5/d13/d1a/l4f 0 2026-03-10T14:07:27.140 INFO:tasks.workunit.client.1.vm04.stdout:7/185: dwrite d2/d28/f3d [0,4194304] 0 2026-03-10T14:07:27.147 INFO:tasks.workunit.client.1.vm04.stdout:1/228: dread d3/d22/d2f/f3a [0,4194304] 0 2026-03-10T14:07:27.160 INFO:tasks.workunit.client.1.vm04.stdout:2/219: fdatasync d0/d3/d8/dd/d26/f33 0 2026-03-10T14:07:27.160 INFO:tasks.workunit.client.1.vm04.stdout:2/220: stat d0/d3/f9 0 2026-03-10T14:07:27.165 INFO:tasks.workunit.client.1.vm04.stdout:2/221: dread d0/d3/d8/dd/d26/f36 [0,4194304] 0 2026-03-10T14:07:27.172 INFO:tasks.workunit.client.1.vm04.stdout:7/186: dwrite d2/dc/de/d2d/d38/f41 [0,4194304] 0 2026-03-10T14:07:27.189 INFO:tasks.workunit.client.1.vm04.stdout:4/212: symlink d4/d14/l4a 0 2026-03-10T14:07:27.189 
INFO:tasks.workunit.client.1.vm04.stdout:5/233: rename d7/c1f to d7/d12/d2b/d3e/d3f/c41 0 2026-03-10T14:07:27.189 INFO:tasks.workunit.client.1.vm04.stdout:0/182: symlink d0/d2/d15/d22/d38/l3b 0 2026-03-10T14:07:27.189 INFO:tasks.workunit.client.1.vm04.stdout:7/187: mkdir d2/d2a/d42 0 2026-03-10T14:07:27.189 INFO:tasks.workunit.client.1.vm04.stdout:9/217: creat d9/d33/f4b x:0 0 0 2026-03-10T14:07:27.189 INFO:tasks.workunit.client.1.vm04.stdout:5/234: rmdir d7/d2d/d32 39 2026-03-10T14:07:27.197 INFO:tasks.workunit.client.1.vm04.stdout:6/162: rename d3/d8/fd to d3/d1b/d23/d24/f25 0 2026-03-10T14:07:27.207 INFO:tasks.workunit.client.1.vm04.stdout:8/221: rename d0 to d0/d3/dd/d11/d3b/d43 22 2026-03-10T14:07:27.207 INFO:tasks.workunit.client.1.vm04.stdout:6/163: stat d3/d14/l16 0 2026-03-10T14:07:27.208 INFO:tasks.workunit.client.1.vm04.stdout:8/222: stat d0/d3/dd/d11/d12/c21 0 2026-03-10T14:07:27.208 INFO:tasks.workunit.client.1.vm04.stdout:4/213: creat d4/df/d22/f4b x:0 0 0 2026-03-10T14:07:27.208 INFO:tasks.workunit.client.1.vm04.stdout:0/183: mknod d0/d2/d25/c3c 0 2026-03-10T14:07:27.208 INFO:tasks.workunit.client.1.vm04.stdout:0/184: chown d0/d2/f34 371 1 2026-03-10T14:07:27.208 INFO:tasks.workunit.client.1.vm04.stdout:2/222: link d0/d3/l27 d0/d14/d1b/l48 0 2026-03-10T14:07:27.212 INFO:tasks.workunit.client.1.vm04.stdout:2/223: dwrite d0/d3/d8/f30 [0,4194304] 0 2026-03-10T14:07:27.212 INFO:tasks.workunit.client.1.vm04.stdout:2/224: read d0/d3/d8/dd/d26/f33 [3527776,82756] 0 2026-03-10T14:07:27.221 INFO:tasks.workunit.client.1.vm04.stdout:5/235: creat d7/d12/f42 x:0 0 0 2026-03-10T14:07:27.227 INFO:tasks.workunit.client.1.vm04.stdout:6/164: symlink d3/d14/l26 0 2026-03-10T14:07:27.231 INFO:tasks.workunit.client.1.vm04.stdout:1/229: sync 2026-03-10T14:07:27.231 INFO:tasks.workunit.client.1.vm04.stdout:6/165: dwrite d3/d14/f18 [0,4194304] 0 2026-03-10T14:07:27.236 INFO:tasks.workunit.client.1.vm04.stdout:0/185: dread d0/d2/fa [0,4194304] 0 2026-03-10T14:07:27.250 
INFO:tasks.workunit.client.1.vm04.stdout:4/214: creat d4/df/d22/f4c x:0 0 0 2026-03-10T14:07:27.252 INFO:tasks.workunit.client.1.vm04.stdout:7/188: creat d2/d14/d3b/f43 x:0 0 0 2026-03-10T14:07:27.254 INFO:tasks.workunit.client.1.vm04.stdout:2/225: creat d0/d3/d8/d17/d35/f49 x:0 0 0 2026-03-10T14:07:27.255 INFO:tasks.workunit.client.1.vm04.stdout:9/218: truncate d9/ff 2049956 0 2026-03-10T14:07:27.255 INFO:tasks.workunit.client.1.vm04.stdout:2/226: read - d0/d3/d8/d17/d35/f49 zero size 2026-03-10T14:07:27.256 INFO:tasks.workunit.client.1.vm04.stdout:8/223: rename d0/d3/dd/d11/d29/c36 to d0/d3/dd/d11/c44 0 2026-03-10T14:07:27.261 INFO:tasks.workunit.client.1.vm04.stdout:2/227: dwrite d0/d3/d8/dd/d26/f36 [0,4194304] 0 2026-03-10T14:07:27.263 INFO:tasks.workunit.client.1.vm04.stdout:2/228: write d0/d3/d8/dd/d26/f36 [1320780,112439] 0 2026-03-10T14:07:27.266 INFO:tasks.workunit.client.1.vm04.stdout:6/166: creat d3/d14/f27 x:0 0 0 2026-03-10T14:07:27.266 INFO:tasks.workunit.client.1.vm04.stdout:2/229: dread - d0/d3/d8/f42 zero size 2026-03-10T14:07:27.267 INFO:tasks.workunit.client.1.vm04.stdout:1/230: fdatasync d3/f18 0 2026-03-10T14:07:27.269 INFO:tasks.workunit.client.1.vm04.stdout:7/189: mkdir d2/d14/d44 0 2026-03-10T14:07:27.269 INFO:tasks.workunit.client.1.vm04.stdout:7/190: stat d2/dc/de/d2d 0 2026-03-10T14:07:27.270 INFO:tasks.workunit.client.1.vm04.stdout:5/236: rename d7/d26/l31 to d7/d26/l43 0 2026-03-10T14:07:27.276 INFO:tasks.workunit.client.1.vm04.stdout:6/167: symlink d3/d1b/l28 0 2026-03-10T14:07:27.284 INFO:tasks.workunit.client.1.vm04.stdout:6/168: dwrite d3/d14/f27 [0,4194304] 0 2026-03-10T14:07:27.284 INFO:tasks.workunit.client.1.vm04.stdout:4/215: sync 2026-03-10T14:07:27.286 INFO:tasks.workunit.client.1.vm04.stdout:2/230: dread d0/d3/f24 [0,4194304] 0 2026-03-10T14:07:27.292 INFO:tasks.workunit.client.1.vm04.stdout:2/231: dread - d0/d14/d39/f44 zero size 2026-03-10T14:07:27.294 INFO:tasks.workunit.client.1.vm04.stdout:7/191: creat d2/dc/de/d21/f45 
x:0 0 0 2026-03-10T14:07:27.294 INFO:tasks.workunit.client.1.vm04.stdout:1/231: symlink d3/d5/d1e/l50 0 2026-03-10T14:07:27.295 INFO:tasks.workunit.client.1.vm04.stdout:2/232: dread d0/d3/f24 [0,4194304] 0 2026-03-10T14:07:27.301 INFO:tasks.workunit.client.1.vm04.stdout:5/237: symlink d7/d12/d2b/d3e/d3f/l44 0 2026-03-10T14:07:27.306 INFO:tasks.workunit.client.1.vm04.stdout:0/186: rename d0/d2/c26 to d0/c3d 0 2026-03-10T14:07:27.311 INFO:tasks.workunit.client.1.vm04.stdout:7/192: mknod d2/dc/de/d21/c46 0 2026-03-10T14:07:27.316 INFO:tasks.workunit.client.1.vm04.stdout:5/238: dwrite d7/f21 [0,4194304] 0 2026-03-10T14:07:27.328 INFO:tasks.workunit.client.1.vm04.stdout:6/169: link d3/f19 d3/de/d11/f29 0 2026-03-10T14:07:27.331 INFO:tasks.workunit.client.1.vm04.stdout:6/170: dwrite d3/f4 [0,4194304] 0 2026-03-10T14:07:27.337 INFO:tasks.workunit.client.1.vm04.stdout:0/187: creat d0/d2/d15/d22/d38/f3e x:0 0 0 2026-03-10T14:07:27.337 INFO:tasks.workunit.client.1.vm04.stdout:0/188: chown d0/d2/d25 4 1 2026-03-10T14:07:27.339 INFO:tasks.workunit.client.1.vm04.stdout:8/224: getdents d0/d3/dd/d11/d12 0 2026-03-10T14:07:27.341 INFO:tasks.workunit.client.1.vm04.stdout:7/193: mknod d2/dc/de/d11/c47 0 2026-03-10T14:07:27.341 INFO:tasks.workunit.client.1.vm04.stdout:7/194: stat d2/d28/f3d 0 2026-03-10T14:07:27.342 INFO:tasks.workunit.client.1.vm04.stdout:7/195: read d2/d2a/f35 [674825,57710] 0 2026-03-10T14:07:27.343 INFO:tasks.workunit.client.1.vm04.stdout:2/233: mkdir d0/d3/d4a 0 2026-03-10T14:07:27.350 INFO:tasks.workunit.client.1.vm04.stdout:5/239: mkdir d7/d12/d45 0 2026-03-10T14:07:27.357 INFO:tasks.workunit.client.1.vm04.stdout:5/240: read d7/f1d [1581151,92459] 0 2026-03-10T14:07:27.358 INFO:tasks.workunit.client.1.vm04.stdout:9/219: link d9/da/dd/l1a d9/d1d/l4c 0 2026-03-10T14:07:27.358 INFO:tasks.workunit.client.1.vm04.stdout:8/225: creat d0/d3/dd/d11/d29/f45 x:0 0 0 2026-03-10T14:07:27.359 INFO:tasks.workunit.client.1.vm04.stdout:8/226: chown d0/d3/dd/d11/d29/f45 29105 1 
2026-03-10T14:07:27.362 INFO:tasks.workunit.client.1.vm04.stdout:3/208: truncate da/f19 3738158 0 2026-03-10T14:07:27.378 INFO:tasks.workunit.client.1.vm04.stdout:7/196: creat d2/d14/d3b/f48 x:0 0 0 2026-03-10T14:07:27.378 INFO:tasks.workunit.client.1.vm04.stdout:9/220: chown d9/d44 4 1 2026-03-10T14:07:27.378 INFO:tasks.workunit.client.1.vm04.stdout:9/221: write d9/d33/f4b [881759,117838] 0 2026-03-10T14:07:27.378 INFO:tasks.workunit.client.1.vm04.stdout:4/216: fsync d4/df/d22/f4b 0 2026-03-10T14:07:27.378 INFO:tasks.workunit.client.1.vm04.stdout:2/234: mkdir d0/d3/d8/dd/d26/d46/d4b 0 2026-03-10T14:07:27.378 INFO:tasks.workunit.client.1.vm04.stdout:6/171: rename d3/de/f10 to d3/f2a 0 2026-03-10T14:07:27.378 INFO:tasks.workunit.client.1.vm04.stdout:5/241: link d7/d9/fd d7/d12/d2b/f46 0 2026-03-10T14:07:27.381 INFO:tasks.workunit.client.1.vm04.stdout:2/235: mknod d0/d3/d3a/c4c 0 2026-03-10T14:07:27.382 INFO:tasks.workunit.client.1.vm04.stdout:7/197: rename d2/dc/de/f22 to d2/d14/d3b/f49 0 2026-03-10T14:07:27.383 INFO:tasks.workunit.client.1.vm04.stdout:7/198: stat d2/dc 0 2026-03-10T14:07:27.386 INFO:tasks.workunit.client.1.vm04.stdout:7/199: dwrite d2/dc/de/d21/f45 [0,4194304] 0 2026-03-10T14:07:27.393 INFO:tasks.workunit.client.1.vm04.stdout:6/172: creat d3/d8/f2b x:0 0 0 2026-03-10T14:07:27.400 INFO:tasks.workunit.client.1.vm04.stdout:8/227: getdents d0/d3/dd/d33 0 2026-03-10T14:07:27.401 INFO:tasks.workunit.client.1.vm04.stdout:6/173: symlink d3/de/l2c 0 2026-03-10T14:07:27.402 INFO:tasks.workunit.client.1.vm04.stdout:6/174: truncate d3/d14/f27 4969560 0 2026-03-10T14:07:27.402 INFO:tasks.workunit.client.1.vm04.stdout:8/228: read d0/d3/d5/f15 [973950,117924] 0 2026-03-10T14:07:27.404 INFO:tasks.workunit.client.1.vm04.stdout:6/175: dread d3/d8/fc [0,4194304] 0 2026-03-10T14:07:27.404 INFO:tasks.workunit.client.1.vm04.stdout:6/176: stat d3/de/d11/l15 0 2026-03-10T14:07:27.406 INFO:tasks.workunit.client.1.vm04.stdout:4/217: getdents d4/df 0 2026-03-10T14:07:27.407 
INFO:tasks.workunit.client.1.vm04.stdout:9/222: sync 2026-03-10T14:07:27.410 INFO:tasks.workunit.client.1.vm04.stdout:9/223: fdatasync d9/da/dd/d1c/f22 0 2026-03-10T14:07:27.410 INFO:tasks.workunit.client.1.vm04.stdout:4/218: dread d4/f9 [0,4194304] 0 2026-03-10T14:07:27.410 INFO:tasks.workunit.client.1.vm04.stdout:7/200: creat d2/dc/f4a x:0 0 0 2026-03-10T14:07:27.414 INFO:tasks.workunit.client.1.vm04.stdout:6/177: rename d3/d14 to d3/de/d11/d2d 0 2026-03-10T14:07:27.423 INFO:tasks.workunit.client.1.vm04.stdout:2/236: getdents d0/d3/d8/dd/d26 0 2026-03-10T14:07:27.423 INFO:tasks.workunit.client.1.vm04.stdout:2/237: stat d0/d3/d8/d17/d35/l40 0 2026-03-10T14:07:27.424 INFO:tasks.workunit.client.1.vm04.stdout:6/178: dread d3/f9 [0,4194304] 0 2026-03-10T14:07:27.430 INFO:tasks.workunit.client.1.vm04.stdout:9/224: read d9/da/dd/f19 [745439,58807] 0 2026-03-10T14:07:27.434 INFO:tasks.workunit.client.1.vm04.stdout:0/189: unlink d0/c3d 0 2026-03-10T14:07:27.450 INFO:tasks.workunit.client.1.vm04.stdout:6/179: creat d3/de/d11/d2d/f2e x:0 0 0 2026-03-10T14:07:27.451 INFO:tasks.workunit.client.1.vm04.stdout:1/232: truncate d3/fa 638865 0 2026-03-10T14:07:27.462 INFO:tasks.workunit.client.1.vm04.stdout:8/229: rmdir d0/d3/dd/d11/d12/d3d 0 2026-03-10T14:07:27.463 INFO:tasks.workunit.client.1.vm04.stdout:8/230: chown d0/d3/dd/d11/d29/f45 1014 1 2026-03-10T14:07:27.464 INFO:tasks.workunit.client.1.vm04.stdout:7/201: symlink d2/l4b 0 2026-03-10T14:07:27.475 INFO:tasks.workunit.client.1.vm04.stdout:3/209: truncate da/dc/f39 509362 0 2026-03-10T14:07:27.477 INFO:tasks.workunit.client.1.vm04.stdout:3/210: dwrite da/f28 [0,4194304] 0 2026-03-10T14:07:27.479 INFO:tasks.workunit.client.1.vm04.stdout:3/211: chown da/d3e 5 1 2026-03-10T14:07:27.479 INFO:tasks.workunit.client.1.vm04.stdout:3/212: chown da/dc/lf 9539132 1 2026-03-10T14:07:27.479 INFO:tasks.workunit.client.1.vm04.stdout:3/213: chown da/fd 5396 1 2026-03-10T14:07:27.480 INFO:tasks.workunit.client.1.vm04.stdout:7/202: fdatasync 
d2/dc/de/f12 0 2026-03-10T14:07:27.483 INFO:tasks.workunit.client.1.vm04.stdout:6/180: link d3/de/d11/d2d/f2e d3/d1b/d23/f2f 0 2026-03-10T14:07:27.487 INFO:tasks.workunit.client.1.vm04.stdout:6/181: dwrite d3/de/d11/d2d/f27 [0,4194304] 0 2026-03-10T14:07:27.490 INFO:tasks.workunit.client.1.vm04.stdout:1/233: getdents d3/d5/d13/d38 0 2026-03-10T14:07:27.492 INFO:tasks.workunit.client.1.vm04.stdout:8/231: mknod d0/d3/dd/d11/d3b/c46 0 2026-03-10T14:07:27.493 INFO:tasks.workunit.client.1.vm04.stdout:8/232: chown d0/c19 9 1 2026-03-10T14:07:27.498 INFO:tasks.workunit.client.1.vm04.stdout:1/234: dwrite d3/d22/d2f/f39 [0,4194304] 0 2026-03-10T14:07:27.500 INFO:tasks.workunit.client.1.vm04.stdout:7/203: read d2/f4 [6024776,15273] 0 2026-03-10T14:07:27.503 INFO:tasks.workunit.client.1.vm04.stdout:7/204: truncate d2/d28/f29 928898 0 2026-03-10T14:07:27.514 INFO:tasks.workunit.client.1.vm04.stdout:2/238: dwrite d0/d3/d8/dd/d26/f33 [0,4194304] 0 2026-03-10T14:07:27.521 INFO:tasks.workunit.client.1.vm04.stdout:2/239: chown d0/d14/d1b/f29 21 1 2026-03-10T14:07:27.521 INFO:tasks.workunit.client.1.vm04.stdout:2/240: chown d0/d3/d8 5 1 2026-03-10T14:07:27.521 INFO:tasks.workunit.client.1.vm04.stdout:2/241: write d0/d3/d8/d17/d35/f49 [626355,37294] 0 2026-03-10T14:07:27.521 INFO:tasks.workunit.client.1.vm04.stdout:2/242: fdatasync d0/d3/d8/dd/d26/f2a 0 2026-03-10T14:07:27.521 INFO:tasks.workunit.client.1.vm04.stdout:3/214: getdents da/dc/d3f 0 2026-03-10T14:07:27.523 INFO:tasks.workunit.client.1.vm04.stdout:5/242: write d7/d2d/d32/f3b [1416481,122173] 0 2026-03-10T14:07:27.531 INFO:tasks.workunit.client.1.vm04.stdout:4/219: truncate d4/d14/d1b/f26 3849862 0 2026-03-10T14:07:27.542 INFO:tasks.workunit.client.1.vm04.stdout:0/190: write d0/d2/d15/d22/f30 [4198079,11625] 0 2026-03-10T14:07:27.542 INFO:tasks.workunit.client.1.vm04.stdout:4/220: write d4/fe [5006405,8475] 0 2026-03-10T14:07:27.542 INFO:tasks.workunit.client.1.vm04.stdout:4/221: unlink d4/df/d31/f40 0 
2026-03-10T14:07:27.542 INFO:tasks.workunit.client.1.vm04.stdout:3/215: dread da/dc/f1a [0,4194304] 0 2026-03-10T14:07:27.543 INFO:tasks.workunit.client.1.vm04.stdout:5/243: getdents d7/d12/d45 0 2026-03-10T14:07:27.544 INFO:tasks.workunit.client.1.vm04.stdout:1/235: link d3/l26 d3/d5/d1e/l51 0 2026-03-10T14:07:27.546 INFO:tasks.workunit.client.1.vm04.stdout:4/222: chown d4/df/d34/f1f 1456923 1 2026-03-10T14:07:27.547 INFO:tasks.workunit.client.1.vm04.stdout:4/223: write d4/f2c [2862671,104395] 0 2026-03-10T14:07:27.559 INFO:tasks.workunit.client.1.vm04.stdout:1/236: truncate d3/f2c 2030916 0 2026-03-10T14:07:27.560 INFO:tasks.workunit.client.1.vm04.stdout:7/205: link d2/dc/cf d2/dc/de/d21/c4c 0 2026-03-10T14:07:27.561 INFO:tasks.workunit.client.1.vm04.stdout:7/206: read - d2/dc/de/f31 zero size 2026-03-10T14:07:27.567 INFO:tasks.workunit.client.1.vm04.stdout:1/237: mknod d3/d22/c52 0 2026-03-10T14:07:27.571 INFO:tasks.workunit.client.1.vm04.stdout:1/238: dread - d3/d5/d13/d1a/f4b zero size 2026-03-10T14:07:27.574 INFO:tasks.workunit.client.1.vm04.stdout:1/239: unlink d3/d5/c40 0 2026-03-10T14:07:27.576 INFO:tasks.workunit.client.1.vm04.stdout:7/207: mkdir d2/dc/d4d 0 2026-03-10T14:07:27.577 INFO:tasks.workunit.client.1.vm04.stdout:2/243: sync 2026-03-10T14:07:27.577 INFO:tasks.workunit.client.1.vm04.stdout:0/191: sync 2026-03-10T14:07:27.577 INFO:tasks.workunit.client.1.vm04.stdout:7/208: chown d2/dc/de/d11 5510 1 2026-03-10T14:07:27.581 INFO:tasks.workunit.client.1.vm04.stdout:1/240: unlink d3/d22/d2f/c44 0 2026-03-10T14:07:27.582 INFO:tasks.workunit.client.1.vm04.stdout:1/241: chown d3/d5/d13/d1a/f28 182194 1 2026-03-10T14:07:27.582 INFO:tasks.workunit.client.1.vm04.stdout:1/242: fdatasync d3/d22/f48 0 2026-03-10T14:07:27.589 INFO:tasks.workunit.client.1.vm04.stdout:2/244: dwrite d0/d3/d3a/d3e/f38 [0,4194304] 0 2026-03-10T14:07:27.592 INFO:tasks.workunit.client.1.vm04.stdout:0/192: sync 2026-03-10T14:07:27.595 INFO:tasks.workunit.client.1.vm04.stdout:7/209: link 
d2/dc/de/d2d/d38/f37 d2/d2a/d42/f4e 0 2026-03-10T14:07:27.598 INFO:tasks.workunit.client.1.vm04.stdout:7/210: symlink d2/dc/de/d2d/l4f 0 2026-03-10T14:07:27.599 INFO:tasks.workunit.client.1.vm04.stdout:2/245: creat d0/d3/d4a/f4d x:0 0 0 2026-03-10T14:07:27.600 INFO:tasks.workunit.client.1.vm04.stdout:7/211: mkdir d2/dc/de/d2d/d38/d50 0 2026-03-10T14:07:27.601 INFO:tasks.workunit.client.1.vm04.stdout:7/212: chown d2/dc/de/f31 13 1 2026-03-10T14:07:27.604 INFO:tasks.workunit.client.1.vm04.stdout:7/213: dwrite d2/dc/f4a [0,4194304] 0 2026-03-10T14:07:27.607 INFO:tasks.workunit.client.1.vm04.stdout:7/214: truncate d2/d28/f29 1392993 0 2026-03-10T14:07:27.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:27 vm03.local ceph-mon[49718]: pgmap v142: 65 pgs: 65 active+clean; 403 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 3.6 MiB/s rd, 40 MiB/s wr, 339 op/s 2026-03-10T14:07:27.609 INFO:tasks.workunit.client.1.vm04.stdout:2/246: unlink d0/d3/l27 0 2026-03-10T14:07:27.609 INFO:tasks.workunit.client.1.vm04.stdout:2/247: fdatasync d0/d3/d8/dd/d26/f33 0 2026-03-10T14:07:27.618 INFO:tasks.workunit.client.1.vm04.stdout:7/215: dwrite d2/d28/f29 [0,4194304] 0 2026-03-10T14:07:27.621 INFO:tasks.workunit.client.1.vm04.stdout:2/248: dwrite d0/d3/f1d [0,4194304] 0 2026-03-10T14:07:27.627 INFO:tasks.workunit.client.1.vm04.stdout:7/216: dread d2/f4 [4194304,4194304] 0 2026-03-10T14:07:27.632 INFO:tasks.workunit.client.1.vm04.stdout:2/249: write d0/d3/d4a/f4d [207049,52611] 0 2026-03-10T14:07:27.632 INFO:tasks.workunit.client.1.vm04.stdout:2/250: mkdir d0/d3/d8/d17/d4e 0 2026-03-10T14:07:27.633 INFO:tasks.workunit.client.1.vm04.stdout:2/251: rename d0 to d0/d14/d1b/d45/d4f 22 2026-03-10T14:07:27.639 INFO:tasks.workunit.client.1.vm04.stdout:2/252: dwrite d0/d3/d8/dd/d26/f2a [0,4194304] 0 2026-03-10T14:07:27.643 INFO:tasks.workunit.client.1.vm04.stdout:3/216: write da/dc/f39 [344880,90077] 0 2026-03-10T14:07:27.643 INFO:tasks.workunit.client.1.vm04.stdout:3/217: write f4 
[7049728,40401] 0 2026-03-10T14:07:27.654 INFO:tasks.workunit.client.1.vm04.stdout:9/225: truncate d9/da/f13 450844 0 2026-03-10T14:07:27.657 INFO:tasks.workunit.client.1.vm04.stdout:2/253: mknod d0/d3/d4a/c50 0 2026-03-10T14:07:27.659 INFO:tasks.workunit.client.1.vm04.stdout:2/254: chown d0/d14 205877 1 2026-03-10T14:07:27.659 INFO:tasks.workunit.client.1.vm04.stdout:8/233: write d0/d3/d5/f15 [2104716,57453] 0 2026-03-10T14:07:27.659 INFO:tasks.workunit.client.1.vm04.stdout:6/182: write d3/de/d11/f29 [1417598,65552] 0 2026-03-10T14:07:27.660 INFO:tasks.workunit.client.1.vm04.stdout:8/234: fsync d0/f26 0 2026-03-10T14:07:27.663 INFO:tasks.workunit.client.1.vm04.stdout:4/224: dwrite d4/d14/d1b/f26 [0,4194304] 0 2026-03-10T14:07:27.663 INFO:tasks.workunit.client.1.vm04.stdout:4/225: fdatasync d4/df/d31/f3d 0 2026-03-10T14:07:27.667 INFO:tasks.workunit.client.1.vm04.stdout:7/217: dread d2/dc/de/d11/f19 [0,4194304] 0 2026-03-10T14:07:27.674 INFO:tasks.workunit.client.1.vm04.stdout:5/244: dwrite d7/f1d [0,4194304] 0 2026-03-10T14:07:27.680 INFO:tasks.workunit.client.1.vm04.stdout:3/218: sync 2026-03-10T14:07:27.680 INFO:tasks.workunit.client.1.vm04.stdout:6/183: dread d3/d8/fc [0,4194304] 0 2026-03-10T14:07:27.681 INFO:tasks.workunit.client.1.vm04.stdout:6/184: dread - d3/de/d11/d2d/f1c zero size 2026-03-10T14:07:27.682 INFO:tasks.workunit.client.1.vm04.stdout:8/235: write d0/d3/d5/f30 [327974,110340] 0 2026-03-10T14:07:27.682 INFO:tasks.workunit.client.1.vm04.stdout:4/226: mknod d4/df/c4d 0 2026-03-10T14:07:27.690 INFO:tasks.workunit.client.1.vm04.stdout:6/185: rmdir d3/d1b 39 2026-03-10T14:07:27.693 INFO:tasks.workunit.client.1.vm04.stdout:6/186: fsync d3/de/f13 0 2026-03-10T14:07:27.702 INFO:tasks.workunit.client.1.vm04.stdout:8/236: rename d0/d3/dd/d11/d12/f1f to d0/d3/dd/d11/d12/f47 0 2026-03-10T14:07:27.704 INFO:tasks.workunit.client.1.vm04.stdout:8/237: write d0/d3/dd/d11/d29/f45 [336873,69474] 0 2026-03-10T14:07:27.707 
INFO:tasks.workunit.client.1.vm04.stdout:8/238: dwrite d0/d3/dd/d11/d29/f45 [0,4194304] 0 2026-03-10T14:07:27.707 INFO:tasks.workunit.client.1.vm04.stdout:5/245: symlink d7/d2d/l47 0 2026-03-10T14:07:27.709 INFO:tasks.workunit.client.1.vm04.stdout:6/187: sync 2026-03-10T14:07:27.711 INFO:tasks.workunit.client.1.vm04.stdout:5/246: dread d7/f21 [0,4194304] 0 2026-03-10T14:07:27.714 INFO:tasks.workunit.client.1.vm04.stdout:3/219: link da/dc/c21 da/dc/d35/d37/c48 0 2026-03-10T14:07:27.724 INFO:tasks.workunit.client.1.vm04.stdout:9/226: getdents d9/d1d 0 2026-03-10T14:07:27.725 INFO:tasks.workunit.client.1.vm04.stdout:3/220: dread - da/d30/f32 zero size 2026-03-10T14:07:27.725 INFO:tasks.workunit.client.1.vm04.stdout:3/221: dwrite da/f25 [0,4194304] 0 2026-03-10T14:07:27.725 INFO:tasks.workunit.client.1.vm04.stdout:6/188: mknod d3/d1d/c30 0 2026-03-10T14:07:27.736 INFO:tasks.workunit.client.1.vm04.stdout:7/218: dread f0 [0,4194304] 0 2026-03-10T14:07:27.739 INFO:tasks.workunit.client.1.vm04.stdout:5/247: dread d7/f11 [0,4194304] 0 2026-03-10T14:07:27.740 INFO:tasks.workunit.client.1.vm04.stdout:5/248: chown d7/d2d/l47 1 1 2026-03-10T14:07:27.741 INFO:tasks.workunit.client.1.vm04.stdout:6/189: sync 2026-03-10T14:07:27.746 INFO:tasks.workunit.client.1.vm04.stdout:9/227: mkdir d9/d44/d4d 0 2026-03-10T14:07:27.749 INFO:tasks.workunit.client.1.vm04.stdout:3/222: symlink da/dc/l49 0 2026-03-10T14:07:27.749 INFO:tasks.workunit.client.1.vm04.stdout:3/223: read da/f25 [1280561,47008] 0 2026-03-10T14:07:27.755 INFO:tasks.workunit.client.1.vm04.stdout:3/224: rmdir da 39 2026-03-10T14:07:27.757 INFO:tasks.workunit.client.1.vm04.stdout:9/228: dread d9/da/dd/f48 [0,4194304] 0 2026-03-10T14:07:27.759 INFO:tasks.workunit.client.1.vm04.stdout:6/190: creat d3/d1b/d23/f31 x:0 0 0 2026-03-10T14:07:27.761 INFO:tasks.workunit.client.1.vm04.stdout:6/191: rmdir d3/de 39 2026-03-10T14:07:27.762 INFO:tasks.workunit.client.1.vm04.stdout:7/219: getdents d2/d14 0 2026-03-10T14:07:27.763 
INFO:tasks.workunit.client.1.vm04.stdout:3/225: mknod da/c4a 0 2026-03-10T14:07:27.768 INFO:tasks.workunit.client.1.vm04.stdout:9/229: creat d9/d44/d4d/f4e x:0 0 0 2026-03-10T14:07:27.768 INFO:tasks.workunit.client.1.vm04.stdout:3/226: chown f4 52777086 1 2026-03-10T14:07:27.770 INFO:tasks.workunit.client.1.vm04.stdout:6/192: dwrite d3/de/d11/d2d/f1c [0,4194304] 0 2026-03-10T14:07:27.777 INFO:tasks.workunit.client.1.vm04.stdout:7/220: creat d2/d14/d44/f51 x:0 0 0 2026-03-10T14:07:27.778 INFO:tasks.workunit.client.1.vm04.stdout:7/221: write d2/dc/f4a [1306523,45263] 0 2026-03-10T14:07:27.785 INFO:tasks.workunit.client.1.vm04.stdout:6/193: dwrite d3/d1b/d23/f31 [0,4194304] 0 2026-03-10T14:07:27.785 INFO:tasks.workunit.client.1.vm04.stdout:2/255: dread d0/d14/d1b/f32 [0,4194304] 0 2026-03-10T14:07:27.786 INFO:tasks.workunit.client.1.vm04.stdout:2/256: write d0/d3/d3a/d3e/f38 [2597261,113695] 0 2026-03-10T14:07:27.786 INFO:tasks.workunit.client.1.vm04.stdout:6/194: read d3/f9 [811970,67249] 0 2026-03-10T14:07:27.787 INFO:tasks.workunit.client.1.vm04.stdout:6/195: chown d3/f4 34302736 1 2026-03-10T14:07:27.787 INFO:tasks.workunit.client.1.vm04.stdout:2/257: truncate d0/d3/d8/dd/d26/f33 5016107 0 2026-03-10T14:07:27.792 INFO:tasks.workunit.client.1.vm04.stdout:7/222: rename d2/dc/de/d11/f2e to d2/d14/d36/f52 0 2026-03-10T14:07:27.795 INFO:tasks.workunit.client.1.vm04.stdout:2/258: dwrite d0/d3/d8/f42 [0,4194304] 0 2026-03-10T14:07:27.797 INFO:tasks.workunit.client.1.vm04.stdout:9/230: creat d9/f4f x:0 0 0 2026-03-10T14:07:27.797 INFO:tasks.workunit.client.1.vm04.stdout:2/259: chown d0/d3/d8/dd/d26/f36 1153896241 1 2026-03-10T14:07:27.798 INFO:tasks.workunit.client.1.vm04.stdout:2/260: chown d0/d3/d8/d17 0 1 2026-03-10T14:07:27.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:27 vm04.local ceph-mon[55966]: pgmap v142: 65 pgs: 65 active+clean; 403 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 3.6 MiB/s rd, 40 MiB/s wr, 339 op/s 2026-03-10T14:07:27.814 
INFO:tasks.workunit.client.1.vm04.stdout:6/196: dwrite d3/d8/fa [0,4194304] 0 2026-03-10T14:07:27.817 INFO:tasks.workunit.client.1.vm04.stdout:1/243: truncate d3/d22/f42 7820056 0 2026-03-10T14:07:27.829 INFO:tasks.workunit.client.1.vm04.stdout:7/223: write d2/dc/de/d11/f19 [130443,79750] 0 2026-03-10T14:07:27.829 INFO:tasks.workunit.client.1.vm04.stdout:7/224: read - d2/d14/d3b/f48 zero size 2026-03-10T14:07:27.829 INFO:tasks.workunit.client.1.vm04.stdout:7/225: write d2/dc/de/d21/f3c [8790,120033] 0 2026-03-10T14:07:27.829 INFO:tasks.workunit.client.1.vm04.stdout:0/193: truncate d0/d2/d25/f2a 4414630 0 2026-03-10T14:07:27.829 INFO:tasks.workunit.client.1.vm04.stdout:0/194: write d0/d1c/f2e [986573,107270] 0 2026-03-10T14:07:27.829 INFO:tasks.workunit.client.1.vm04.stdout:0/195: write d0/d1c/f29 [832684,130000] 0 2026-03-10T14:07:27.830 INFO:tasks.workunit.client.1.vm04.stdout:9/231: readlink d9/da/l18 0 2026-03-10T14:07:27.833 INFO:tasks.workunit.client.1.vm04.stdout:2/261: chown d0/d14/d1b/c21 128694 1 2026-03-10T14:07:27.834 INFO:tasks.workunit.client.1.vm04.stdout:2/262: write d0/d14/d1b/f32 [993909,31376] 0 2026-03-10T14:07:27.857 INFO:tasks.workunit.client.1.vm04.stdout:1/244: link d3/d22/l45 d3/d5/d1e/d35/l53 0 2026-03-10T14:07:27.859 INFO:tasks.workunit.client.1.vm04.stdout:7/226: link d2/la d2/dc/d4d/l53 0 2026-03-10T14:07:27.867 INFO:tasks.workunit.client.1.vm04.stdout:9/232: sync 2026-03-10T14:07:27.867 INFO:tasks.workunit.client.1.vm04.stdout:4/227: chown d4/df/d31/c48 2239 1 2026-03-10T14:07:27.868 INFO:tasks.workunit.client.1.vm04.stdout:4/228: stat d4/fe 0 2026-03-10T14:07:27.869 INFO:tasks.workunit.client.1.vm04.stdout:1/245: mknod d3/d20/c54 0 2026-03-10T14:07:27.874 INFO:tasks.workunit.client.1.vm04.stdout:7/227: dwrite d2/d14/d3b/f49 [0,4194304] 0 2026-03-10T14:07:27.888 INFO:tasks.workunit.client.1.vm04.stdout:9/233: link d9/da/c2f d9/da/dd/c50 0 2026-03-10T14:07:27.888 INFO:tasks.workunit.client.1.vm04.stdout:9/234: read d9/da/dd/f48 
[175104,34554] 0 2026-03-10T14:07:27.897 INFO:tasks.workunit.client.1.vm04.stdout:8/239: truncate d0/d3/dd/fc 2736443 0 2026-03-10T14:07:27.901 INFO:tasks.workunit.client.1.vm04.stdout:5/249: dwrite d7/d9/f3a [0,4194304] 0 2026-03-10T14:07:27.904 INFO:tasks.workunit.client.1.vm04.stdout:6/197: rename d3/d1b to d3/de/d11/d2d/d32 0 2026-03-10T14:07:27.918 INFO:tasks.workunit.client.1.vm04.stdout:9/235: rename d9/da/dd/f28 to d9/d44/f51 0 2026-03-10T14:07:27.919 INFO:tasks.workunit.client.1.vm04.stdout:9/236: write d9/da/f41 [1014111,115326] 0 2026-03-10T14:07:27.928 INFO:tasks.workunit.client.1.vm04.stdout:3/227: chown da/dc/d35/d37/c48 10 1 2026-03-10T14:07:27.931 INFO:tasks.workunit.client.1.vm04.stdout:6/198: sync 2026-03-10T14:07:27.937 INFO:tasks.workunit.client.1.vm04.stdout:8/240: rename d0/d41 to d0/d3/dd/d33/d48 0 2026-03-10T14:07:27.942 INFO:tasks.workunit.client.1.vm04.stdout:9/237: symlink d9/d33/l52 0 2026-03-10T14:07:27.943 INFO:tasks.workunit.client.1.vm04.stdout:9/238: rename d9/da/dd/d1c to d9/da/dd/d1c/d53 22 2026-03-10T14:07:27.950 INFO:tasks.workunit.client.1.vm04.stdout:1/246: dread d3/d22/d2f/f39 [0,4194304] 0 2026-03-10T14:07:27.950 INFO:tasks.workunit.client.1.vm04.stdout:9/239: symlink d9/d44/l54 0 2026-03-10T14:07:27.953 INFO:tasks.workunit.client.1.vm04.stdout:1/247: dwrite d3/d20/f32 [0,4194304] 0 2026-03-10T14:07:27.955 INFO:tasks.workunit.client.1.vm04.stdout:3/228: link da/dc/l29 da/dc/d3f/l4b 0 2026-03-10T14:07:27.956 INFO:tasks.workunit.client.1.vm04.stdout:3/229: fdatasync da/dc/f2a 0 2026-03-10T14:07:27.957 INFO:tasks.workunit.client.1.vm04.stdout:8/241: sync 2026-03-10T14:07:27.962 INFO:tasks.workunit.client.1.vm04.stdout:0/196: dwrite d0/d2/d25/f2a [0,4194304] 0 2026-03-10T14:07:27.969 INFO:tasks.workunit.client.1.vm04.stdout:0/197: dwrite d0/f31 [0,4194304] 0 2026-03-10T14:07:27.972 INFO:tasks.workunit.client.1.vm04.stdout:9/240: readlink d9/d1d/l36 0 2026-03-10T14:07:27.973 INFO:tasks.workunit.client.1.vm04.stdout:2/263: rmdir 
d0/d14/d1b 39 2026-03-10T14:07:27.984 INFO:tasks.workunit.client.1.vm04.stdout:6/199: getdents d3/de/d11/d2d/d32/d23 0 2026-03-10T14:07:27.991 INFO:tasks.workunit.client.1.vm04.stdout:0/198: creat d0/d2/d25/f3f x:0 0 0 2026-03-10T14:07:27.994 INFO:tasks.workunit.client.1.vm04.stdout:9/241: symlink d9/d33/l55 0 2026-03-10T14:07:27.995 INFO:tasks.workunit.client.1.vm04.stdout:0/199: dwrite d0/d2/d25/f33 [0,4194304] 0 2026-03-10T14:07:27.996 INFO:tasks.workunit.client.1.vm04.stdout:9/242: truncate d9/f4f 659265 0 2026-03-10T14:07:28.001 INFO:tasks.workunit.client.1.vm04.stdout:0/200: dwrite d0/d2/d25/f33 [0,4194304] 0 2026-03-10T14:07:28.012 INFO:tasks.workunit.client.1.vm04.stdout:3/230: creat da/d3e/f4c x:0 0 0 2026-03-10T14:07:28.016 INFO:tasks.workunit.client.1.vm04.stdout:6/200: rmdir d3/de/d11/d2d/d32/d23/d24 39 2026-03-10T14:07:28.016 INFO:tasks.workunit.client.1.vm04.stdout:6/201: fsync d3/de/d11/d2d/d32/f1f 0 2026-03-10T14:07:28.020 INFO:tasks.workunit.client.1.vm04.stdout:9/243: creat d9/d44/d4d/f56 x:0 0 0 2026-03-10T14:07:28.027 INFO:tasks.workunit.client.1.vm04.stdout:0/201: unlink d0/f31 0 2026-03-10T14:07:28.027 INFO:tasks.workunit.client.1.vm04.stdout:1/248: link d3/d5/d13/c4c d3/d5/d1e/c55 0 2026-03-10T14:07:28.030 INFO:tasks.workunit.client.1.vm04.stdout:0/202: dwrite d0/d2/d25/f2a [0,4194304] 0 2026-03-10T14:07:28.031 INFO:tasks.workunit.client.1.vm04.stdout:0/203: chown d0/d2/f34 318716 1 2026-03-10T14:07:28.038 INFO:tasks.workunit.client.1.vm04.stdout:9/244: sync 2026-03-10T14:07:28.040 INFO:tasks.workunit.client.1.vm04.stdout:2/264: rename d0/d3/d8/l1e to d0/d14/l51 0 2026-03-10T14:07:28.042 INFO:tasks.workunit.client.1.vm04.stdout:3/231: creat da/dc/d3f/f4d x:0 0 0 2026-03-10T14:07:28.044 INFO:tasks.workunit.client.1.vm04.stdout:3/232: fsync f4 0 2026-03-10T14:07:28.049 INFO:tasks.workunit.client.1.vm04.stdout:9/245: rename d9/da/f1b to d9/da/f57 0 2026-03-10T14:07:28.053 INFO:tasks.workunit.client.1.vm04.stdout:5/250: dread d7/fa [0,4194304] 0 
2026-03-10T14:07:28.057 INFO:tasks.workunit.client.1.vm04.stdout:4/229: write d4/fd [4984431,5561] 0 2026-03-10T14:07:28.057 INFO:tasks.workunit.client.1.vm04.stdout:6/202: creat d3/de/d11/d2d/d32/d23/f33 x:0 0 0 2026-03-10T14:07:28.058 INFO:tasks.workunit.client.1.vm04.stdout:9/246: dread d9/da/dd/d1c/f27 [0,4194304] 0 2026-03-10T14:07:28.061 INFO:tasks.workunit.client.1.vm04.stdout:9/247: dwrite d9/da/dd/d1c/f22 [0,4194304] 0 2026-03-10T14:07:28.062 INFO:tasks.workunit.client.1.vm04.stdout:9/248: write d9/f4a [930596,106772] 0 2026-03-10T14:07:28.062 INFO:tasks.workunit.client.1.vm04.stdout:9/249: stat d9/d33/f4b 0 2026-03-10T14:07:28.070 INFO:tasks.workunit.client.1.vm04.stdout:7/228: dwrite d2/dc/de/f1e [0,4194304] 0 2026-03-10T14:07:28.078 INFO:tasks.workunit.client.1.vm04.stdout:3/233: creat da/d30/f4e x:0 0 0 2026-03-10T14:07:28.096 INFO:tasks.workunit.client.1.vm04.stdout:1/249: creat d3/d5/f56 x:0 0 0 2026-03-10T14:07:28.098 INFO:tasks.workunit.client.1.vm04.stdout:0/204: rename d0/d1c/f24 to d0/d2/f40 0 2026-03-10T14:07:28.102 INFO:tasks.workunit.client.1.vm04.stdout:3/234: mknod da/d3e/c4f 0 2026-03-10T14:07:28.108 INFO:tasks.workunit.client.1.vm04.stdout:9/250: mkdir d9/d58 0 2026-03-10T14:07:28.115 INFO:tasks.workunit.client.1.vm04.stdout:1/250: mkdir d3/d22/d2f/d57 0 2026-03-10T14:07:28.116 INFO:tasks.workunit.client.1.vm04.stdout:0/205: symlink d0/d2/d15/d22/d38/l41 0 2026-03-10T14:07:28.116 INFO:tasks.workunit.client.1.vm04.stdout:0/206: fsync d0/d2/f12 0 2026-03-10T14:07:28.116 INFO:tasks.workunit.client.1.vm04.stdout:4/230: creat d4/df/f4e x:0 0 0 2026-03-10T14:07:28.116 INFO:tasks.workunit.client.1.vm04.stdout:7/229: link d2/dc/l33 d2/d2a/d42/l54 0 2026-03-10T14:07:28.116 INFO:tasks.workunit.client.1.vm04.stdout:7/230: truncate d2/dc/de/d21/f3c 841479 0 2026-03-10T14:07:28.116 INFO:tasks.workunit.client.1.vm04.stdout:7/231: write d2/dc/f25 [115488,129947] 0 2026-03-10T14:07:28.125 INFO:tasks.workunit.client.1.vm04.stdout:1/251: mkdir 
d3/d5/d13/d38/d58 0 2026-03-10T14:07:28.125 INFO:tasks.workunit.client.1.vm04.stdout:0/207: symlink d0/d1c/l42 0 2026-03-10T14:07:28.130 INFO:tasks.workunit.client.1.vm04.stdout:2/265: link d0/d14/d1b/c23 d0/d3/d8/dd/d26/d46/c52 0 2026-03-10T14:07:28.138 INFO:tasks.workunit.client.1.vm04.stdout:2/266: truncate d0/d3/d8/f30 4286361 0 2026-03-10T14:07:28.138 INFO:tasks.workunit.client.1.vm04.stdout:7/232: symlink d2/d28/l55 0 2026-03-10T14:07:28.138 INFO:tasks.workunit.client.1.vm04.stdout:3/235: fdatasync da/f25 0 2026-03-10T14:07:28.138 INFO:tasks.workunit.client.1.vm04.stdout:7/233: chown d2/dc/de/d2d/l4f 13 1 2026-03-10T14:07:28.138 INFO:tasks.workunit.client.1.vm04.stdout:3/236: write da/dc/d3f/f4d [247151,23657] 0 2026-03-10T14:07:28.147 INFO:tasks.workunit.client.1.vm04.stdout:2/267: mknod d0/d3/d8/dd/d26/c53 0 2026-03-10T14:07:28.155 INFO:tasks.workunit.client.1.vm04.stdout:4/231: unlink d4/df/d22/f4c 0 2026-03-10T14:07:28.162 INFO:tasks.workunit.client.1.vm04.stdout:3/237: creat da/dc/d35/f50 x:0 0 0 2026-03-10T14:07:28.165 INFO:tasks.workunit.client.1.vm04.stdout:6/203: getdents d3/de/d11/d2d 0 2026-03-10T14:07:28.169 INFO:tasks.workunit.client.1.vm04.stdout:6/204: dwrite d3/ff [0,4194304] 0 2026-03-10T14:07:28.171 INFO:tasks.workunit.client.1.vm04.stdout:6/205: chown d3/d1d/c30 37066021 1 2026-03-10T14:07:28.173 INFO:tasks.workunit.client.1.vm04.stdout:1/252: link d3/d22/l37 d3/d22/d2f/d57/l59 0 2026-03-10T14:07:28.180 INFO:tasks.workunit.client.1.vm04.stdout:1/253: dwrite d3/d20/f32 [0,4194304] 0 2026-03-10T14:07:28.187 INFO:tasks.workunit.client.1.vm04.stdout:4/232: dread d4/df/d22/f2d [0,4194304] 0 2026-03-10T14:07:28.196 INFO:tasks.workunit.client.1.vm04.stdout:5/251: dread d7/f24 [0,4194304] 0 2026-03-10T14:07:28.203 INFO:tasks.workunit.client.1.vm04.stdout:3/238: dread da/fb [0,4194304] 0 2026-03-10T14:07:28.205 INFO:tasks.workunit.client.1.vm04.stdout:6/206: creat d3/de/d11/d2d/f34 x:0 0 0 2026-03-10T14:07:28.209 
INFO:tasks.workunit.client.1.vm04.stdout:6/207: dwrite d3/de/d11/d2d/d32/f1f [0,4194304] 0 2026-03-10T14:07:28.217 INFO:tasks.workunit.client.1.vm04.stdout:6/208: dread d3/de/d11/f17 [0,4194304] 0 2026-03-10T14:07:28.221 INFO:tasks.workunit.client.1.vm04.stdout:0/208: getdents d0/d2/d25 0 2026-03-10T14:07:28.221 INFO:tasks.workunit.client.1.vm04.stdout:1/254: creat d3/d5/d13/f5a x:0 0 0 2026-03-10T14:07:28.226 INFO:tasks.workunit.client.1.vm04.stdout:0/209: dread d0/d2/d25/f2a [0,4194304] 0 2026-03-10T14:07:28.228 INFO:tasks.workunit.client.1.vm04.stdout:2/268: mkdir d0/d14/d54 0 2026-03-10T14:07:28.235 INFO:tasks.workunit.client.1.vm04.stdout:5/252: fsync d7/d12/d2b/f46 0 2026-03-10T14:07:28.236 INFO:tasks.workunit.client.1.vm04.stdout:5/253: fsync f4 0 2026-03-10T14:07:28.236 INFO:tasks.workunit.client.1.vm04.stdout:8/242: truncate d0/d3/d5/f30 18466 0 2026-03-10T14:07:28.240 INFO:tasks.workunit.client.1.vm04.stdout:8/243: dwrite d0/d3/dd/d11/d12/f2e [0,4194304] 0 2026-03-10T14:07:28.247 INFO:tasks.workunit.client.1.vm04.stdout:3/239: dwrite da/f22 [0,4194304] 0 2026-03-10T14:07:28.262 INFO:tasks.workunit.client.1.vm04.stdout:6/209: mkdir d3/de/d35 0 2026-03-10T14:07:28.265 INFO:tasks.workunit.client.1.vm04.stdout:6/210: dwrite d3/d8/fa [0,4194304] 0 2026-03-10T14:07:28.281 INFO:tasks.workunit.client.1.vm04.stdout:5/254: mknod d7/d2d/d32/c48 0 2026-03-10T14:07:28.285 INFO:tasks.workunit.client.1.vm04.stdout:8/244: symlink d0/d3/dd/d33/d48/l49 0 2026-03-10T14:07:28.289 INFO:tasks.workunit.client.1.vm04.stdout:3/240: mknod da/d3e/c51 0 2026-03-10T14:07:28.289 INFO:tasks.workunit.client.1.vm04.stdout:3/241: write da/d30/f4e [447282,117392] 0 2026-03-10T14:07:28.290 INFO:tasks.workunit.client.1.vm04.stdout:6/211: rmdir d3/de 39 2026-03-10T14:07:28.292 INFO:tasks.workunit.client.1.vm04.stdout:1/255: mkdir d3/d5/d13/d38/d58/d5b 0 2026-03-10T14:07:28.292 INFO:tasks.workunit.client.1.vm04.stdout:1/256: truncate d3/d5/d13/f36 299702 0 2026-03-10T14:07:28.293 
INFO:tasks.workunit.client.1.vm04.stdout:1/257: write d3/d20/f27 [1139775,125116] 0 2026-03-10T14:07:28.295 INFO:tasks.workunit.client.1.vm04.stdout:0/210: symlink d0/l43 0 2026-03-10T14:07:28.298 INFO:tasks.workunit.client.1.vm04.stdout:1/258: dwrite d3/d22/d2f/f3c [0,4194304] 0 2026-03-10T14:07:28.310 INFO:tasks.workunit.client.1.vm04.stdout:8/245: rmdir d0/d3 39 2026-03-10T14:07:28.310 INFO:tasks.workunit.client.1.vm04.stdout:5/255: dwrite d7/f24 [4194304,4194304] 0 2026-03-10T14:07:28.317 INFO:tasks.workunit.client.1.vm04.stdout:9/251: dread d9/da/dd/f19 [0,4194304] 0 2026-03-10T14:07:28.321 INFO:tasks.workunit.client.1.vm04.stdout:9/252: dwrite d9/f4f [0,4194304] 0 2026-03-10T14:07:28.332 INFO:tasks.workunit.client.1.vm04.stdout:6/212: dwrite d3/de/d11/d2d/f18 [0,4194304] 0 2026-03-10T14:07:28.333 INFO:tasks.workunit.client.1.vm04.stdout:6/213: write d3/de/d11/f29 [732995,77009] 0 2026-03-10T14:07:28.337 INFO:tasks.workunit.client.1.vm04.stdout:0/211: fdatasync d0/d1c/f2c 0 2026-03-10T14:07:28.341 INFO:tasks.workunit.client.1.vm04.stdout:1/259: mkdir d3/d5c 0 2026-03-10T14:07:28.342 INFO:tasks.workunit.client.1.vm04.stdout:2/269: creat d0/d14/d1b/f55 x:0 0 0 2026-03-10T14:07:28.347 INFO:tasks.workunit.client.1.vm04.stdout:2/270: dwrite d0/d3/d8/dd/d26/f36 [0,4194304] 0 2026-03-10T14:07:28.347 INFO:tasks.workunit.client.1.vm04.stdout:8/246: chown d0/d3/dd/d11/d12/f47 480891614 1 2026-03-10T14:07:28.359 INFO:tasks.workunit.client.1.vm04.stdout:7/234: write d2/dc/de/d2d/f39 [8404239,80536] 0 2026-03-10T14:07:28.364 INFO:tasks.workunit.client.1.vm04.stdout:9/253: write d9/f20 [694820,69892] 0 2026-03-10T14:07:28.365 INFO:tasks.workunit.client.1.vm04.stdout:7/235: dwrite d2/dc/de/f12 [0,4194304] 0 2026-03-10T14:07:28.376 INFO:tasks.workunit.client.1.vm04.stdout:0/212: creat d0/d2/d15/f44 x:0 0 0 2026-03-10T14:07:28.376 INFO:tasks.workunit.client.1.vm04.stdout:0/213: write d0/d2/fa [1471701,12320] 0 2026-03-10T14:07:28.377 
INFO:tasks.workunit.client.1.vm04.stdout:0/214: write d0/d2/d15/d22/d38/f3e [972649,117575] 0 2026-03-10T14:07:28.396 INFO:tasks.workunit.client.1.vm04.stdout:6/214: creat d3/de/d11/d2d/d32/d23/d24/f36 x:0 0 0 2026-03-10T14:07:28.396 INFO:tasks.workunit.client.1.vm04.stdout:0/215: creat d0/d2/d25/f45 x:0 0 0 2026-03-10T14:07:28.397 INFO:tasks.workunit.client.1.vm04.stdout:2/271: mkdir d0/d3/d8/dd/d26/d46/d4b/d56 0 2026-03-10T14:07:28.398 INFO:tasks.workunit.client.1.vm04.stdout:5/256: link d7/c16 d7/d12/d2b/d3e/c49 0 2026-03-10T14:07:28.399 INFO:tasks.workunit.client.1.vm04.stdout:5/257: readlink d7/d12/d2b/d3e/d3f/l44 0 2026-03-10T14:07:28.401 INFO:tasks.workunit.client.1.vm04.stdout:5/258: dread d7/f1d [0,4194304] 0 2026-03-10T14:07:28.401 INFO:tasks.workunit.client.1.vm04.stdout:5/259: fsync d7/d2d/d32/f3b 0 2026-03-10T14:07:28.403 INFO:tasks.workunit.client.1.vm04.stdout:7/236: mkdir d2/d2a/d42/d56 0 2026-03-10T14:07:28.413 INFO:tasks.workunit.client.1.vm04.stdout:5/260: creat d7/d12/d2b/d3e/f4a x:0 0 0 2026-03-10T14:07:28.413 INFO:tasks.workunit.client.1.vm04.stdout:7/237: readlink d2/l9 0 2026-03-10T14:07:28.416 INFO:tasks.workunit.client.1.vm04.stdout:8/247: getdents d0/d3/dd/d11/d3b 0 2026-03-10T14:07:28.417 INFO:tasks.workunit.client.1.vm04.stdout:7/238: creat d2/dc/de/d2d/d38/f57 x:0 0 0 2026-03-10T14:07:28.419 INFO:tasks.workunit.client.1.vm04.stdout:6/215: getdents d3 0 2026-03-10T14:07:28.427 INFO:tasks.workunit.client.1.vm04.stdout:7/239: creat d2/d14/f58 x:0 0 0 2026-03-10T14:07:28.427 INFO:tasks.workunit.client.1.vm04.stdout:5/261: rename d7/f13 to d7/f4b 0 2026-03-10T14:07:28.428 INFO:tasks.workunit.client.1.vm04.stdout:7/240: write d2/dc/de/f12 [653757,3259] 0 2026-03-10T14:07:28.429 INFO:tasks.workunit.client.1.vm04.stdout:8/248: symlink d0/l4a 0 2026-03-10T14:07:28.429 INFO:tasks.workunit.client.1.vm04.stdout:8/249: fdatasync d0/d3/dd/d11/d12/f1d 0 2026-03-10T14:07:28.430 INFO:tasks.workunit.client.1.vm04.stdout:8/250: readlink d0/l4a 0 
2026-03-10T14:07:28.430 INFO:tasks.workunit.client.1.vm04.stdout:8/251: read - d0/d3/dd/d11/f3c zero size 2026-03-10T14:07:28.436 INFO:tasks.workunit.client.1.vm04.stdout:6/216: mknod d3/de/d11/d2d/c37 0 2026-03-10T14:07:28.436 INFO:tasks.workunit.client.1.vm04.stdout:7/241: mknod d2/dc/de/d11/c59 0 2026-03-10T14:07:28.437 INFO:tasks.workunit.client.1.vm04.stdout:8/252: mknod d0/d3/dd/d11/c4b 0 2026-03-10T14:07:28.444 INFO:tasks.workunit.client.1.vm04.stdout:6/217: mkdir d3/de/d11/d2d/d38 0 2026-03-10T14:07:28.453 INFO:tasks.workunit.client.1.vm04.stdout:6/218: dwrite d3/de/d11/f22 [0,4194304] 0 2026-03-10T14:07:28.454 INFO:tasks.workunit.client.1.vm04.stdout:6/219: truncate d3/de/d11/d2d/f21 56098 0 2026-03-10T14:07:28.463 INFO:tasks.workunit.client.1.vm04.stdout:7/242: dread d2/d14/f15 [0,4194304] 0 2026-03-10T14:07:28.469 INFO:tasks.workunit.client.1.vm04.stdout:7/243: dwrite d2/d28/f3d [4194304,4194304] 0 2026-03-10T14:07:28.470 INFO:tasks.workunit.client.1.vm04.stdout:7/244: chown d2/dc/f25 124476340 1 2026-03-10T14:07:28.471 INFO:tasks.workunit.client.1.vm04.stdout:7/245: write d2/d14/d3b/f49 [3789656,56878] 0 2026-03-10T14:07:28.479 INFO:tasks.workunit.client.1.vm04.stdout:8/253: rename d0/d3/dd/d11/c44 to d0/d3/dd/d11/c4c 0 2026-03-10T14:07:28.487 INFO:tasks.workunit.client.1.vm04.stdout:7/246: link d2/d14/f58 d2/dc/de/d2d/d38/f5a 0 2026-03-10T14:07:28.488 INFO:tasks.workunit.client.1.vm04.stdout:7/247: mknod d2/dc/de/c5b 0 2026-03-10T14:07:28.490 INFO:tasks.workunit.client.1.vm04.stdout:7/248: write d2/dc/de/f1e [4612447,54093] 0 2026-03-10T14:07:28.490 INFO:tasks.workunit.client.1.vm04.stdout:7/249: chown d2/d28 0 1 2026-03-10T14:07:28.496 INFO:tasks.workunit.client.1.vm04.stdout:7/250: dwrite d2/f20 [0,4194304] 0 2026-03-10T14:07:28.498 INFO:tasks.workunit.client.1.vm04.stdout:7/251: readlink d2/dc/de/d2d/l3f 0 2026-03-10T14:07:28.574 INFO:tasks.workunit.client.1.vm04.stdout:8/254: sync 2026-03-10T14:07:28.588 
INFO:tasks.workunit.client.1.vm04.stdout:7/252: sync 2026-03-10T14:07:28.589 INFO:tasks.workunit.client.1.vm04.stdout:7/253: write d2/dc/de/d2d/f39 [8902192,5420] 0 2026-03-10T14:07:28.594 INFO:tasks.workunit.client.1.vm04.stdout:7/254: mkdir d2/dc/de/d2d/d5c 0 2026-03-10T14:07:28.595 INFO:tasks.workunit.client.1.vm04.stdout:7/255: read d2/d14/f15 [3457749,72022] 0 2026-03-10T14:07:28.599 INFO:tasks.workunit.client.1.vm04.stdout:7/256: dwrite d2/f20 [0,4194304] 0 2026-03-10T14:07:28.608 INFO:tasks.workunit.client.1.vm04.stdout:7/257: fdatasync d2/dc/f4a 0 2026-03-10T14:07:28.608 INFO:tasks.workunit.client.1.vm04.stdout:7/258: symlink d2/d14/d36/l5d 0 2026-03-10T14:07:28.608 INFO:tasks.workunit.client.1.vm04.stdout:7/259: chown d2/dc/de/d2d/d38/d50 94 1 2026-03-10T14:07:28.608 INFO:tasks.workunit.client.1.vm04.stdout:7/260: creat d2/d14/d3b/f5e x:0 0 0 2026-03-10T14:07:28.608 INFO:tasks.workunit.client.1.vm04.stdout:7/261: fdatasync d2/d14/d3b/f43 0 2026-03-10T14:07:28.611 INFO:tasks.workunit.client.1.vm04.stdout:7/262: dread d2/dc/de/d21/f45 [0,4194304] 0 2026-03-10T14:07:28.614 INFO:tasks.workunit.client.1.vm04.stdout:7/263: truncate d2/d14/d44/f51 811221 0 2026-03-10T14:07:28.615 INFO:tasks.workunit.client.1.vm04.stdout:7/264: read - d2/d14/d3b/f48 zero size 2026-03-10T14:07:28.620 INFO:tasks.workunit.client.1.vm04.stdout:0/216: dread d0/d2/fa [0,4194304] 0 2026-03-10T14:07:28.620 INFO:tasks.workunit.client.1.vm04.stdout:0/217: readlink d0/d1c/l32 0 2026-03-10T14:07:28.625 INFO:tasks.workunit.client.1.vm04.stdout:7/265: symlink d2/dc/de/d2d/d38/d50/l5f 0 2026-03-10T14:07:28.627 INFO:tasks.workunit.client.1.vm04.stdout:7/266: chown d2/d2a/d42/l54 45375442 1 2026-03-10T14:07:28.628 INFO:tasks.workunit.client.1.vm04.stdout:7/267: write d2/f4 [1930832,78818] 0 2026-03-10T14:07:28.632 INFO:tasks.workunit.client.1.vm04.stdout:7/268: dread d2/dc/de/d2d/d38/f41 [0,4194304] 0 2026-03-10T14:07:28.635 INFO:tasks.workunit.client.1.vm04.stdout:7/269: mkdir d2/dc/de/d2d/d60 0 
2026-03-10T14:07:28.645 INFO:tasks.workunit.client.1.vm04.stdout:7/270: dwrite d2/f4 [0,4194304] 0 2026-03-10T14:07:28.645 INFO:tasks.workunit.client.1.vm04.stdout:8/255: dread d0/d3/dd/d11/d12/f2c [0,4194304] 0 2026-03-10T14:07:28.645 INFO:tasks.workunit.client.1.vm04.stdout:7/271: fdatasync d2/dc/de/d2d/d38/f41 0 2026-03-10T14:07:28.645 INFO:tasks.workunit.client.1.vm04.stdout:7/272: chown d2/d2a/d42 238469 1 2026-03-10T14:07:28.645 INFO:tasks.workunit.client.1.vm04.stdout:7/273: chown d2/dc/f4a 54689 1 2026-03-10T14:07:28.647 INFO:tasks.workunit.client.1.vm04.stdout:7/274: rename d2/d28/l2b to d2/d2a/d42/d56/l61 0 2026-03-10T14:07:28.648 INFO:tasks.workunit.client.1.vm04.stdout:7/275: creat d2/dc/de/f62 x:0 0 0 2026-03-10T14:07:28.656 INFO:tasks.workunit.client.1.vm04.stdout:8/256: dread d0/f26 [0,4194304] 0 2026-03-10T14:07:28.672 INFO:tasks.workunit.client.1.vm04.stdout:9/254: dread d9/d1d/f3d [0,4194304] 0 2026-03-10T14:07:28.672 INFO:tasks.workunit.client.1.vm04.stdout:9/255: fsync f8 0 2026-03-10T14:07:28.685 INFO:tasks.workunit.client.1.vm04.stdout:8/257: write d0/d3/d5/f30 [327633,90451] 0 2026-03-10T14:07:28.685 INFO:tasks.workunit.client.1.vm04.stdout:4/233: write d4/df/d22/f2b [1675158,18132] 0 2026-03-10T14:07:28.688 INFO:tasks.workunit.client.1.vm04.stdout:8/258: dwrite d0/f23 [0,4194304] 0 2026-03-10T14:07:28.694 INFO:tasks.workunit.client.1.vm04.stdout:8/259: dread d0/d3/dd/d11/d12/f2c [0,4194304] 0 2026-03-10T14:07:28.694 INFO:tasks.workunit.client.1.vm04.stdout:8/260: write d0/f23 [5044079,118537] 0 2026-03-10T14:07:28.696 INFO:tasks.workunit.client.1.vm04.stdout:4/234: mkdir d4/df/d22/d47/d4f 0 2026-03-10T14:07:28.703 INFO:tasks.workunit.client.1.vm04.stdout:8/261: fsync d0/d3/dd/fc 0 2026-03-10T14:07:28.706 INFO:tasks.workunit.client.1.vm04.stdout:8/262: creat d0/d3/dd/d11/d29/f4d x:0 0 0 2026-03-10T14:07:28.712 INFO:tasks.workunit.client.1.vm04.stdout:4/235: getdents d4/df 0 2026-03-10T14:07:28.719 
INFO:tasks.workunit.client.1.vm04.stdout:4/236: dread - d4/df/f4e zero size 2026-03-10T14:07:28.719 INFO:tasks.workunit.client.1.vm04.stdout:8/263: symlink d0/l4e 0 2026-03-10T14:07:28.719 INFO:tasks.workunit.client.1.vm04.stdout:4/237: rename d4/lb to d4/df/d22/l50 0 2026-03-10T14:07:28.722 INFO:tasks.workunit.client.1.vm04.stdout:1/260: fdatasync d3/f8 0 2026-03-10T14:07:28.726 INFO:tasks.workunit.client.1.vm04.stdout:8/264: link d0/d3/d5/f15 d0/d3/dd/d33/d48/f4f 0 2026-03-10T14:07:28.745 INFO:tasks.workunit.client.1.vm04.stdout:8/265: write d0/d3/dd/d11/d12/f2c [1479011,121465] 0 2026-03-10T14:07:28.747 INFO:tasks.workunit.client.1.vm04.stdout:4/238: rename d4/df/d31/c48 to d4/d14/c51 0 2026-03-10T14:07:28.748 INFO:tasks.workunit.client.1.vm04.stdout:1/261: creat d3/d22/d2f/f5d x:0 0 0 2026-03-10T14:07:28.751 INFO:tasks.workunit.client.1.vm04.stdout:4/239: dwrite d4/df/f4e [0,4194304] 0 2026-03-10T14:07:28.756 INFO:tasks.workunit.client.1.vm04.stdout:8/266: creat d0/d3/dd/d11/d12/f50 x:0 0 0 2026-03-10T14:07:28.763 INFO:tasks.workunit.client.1.vm04.stdout:4/240: mknod d4/df/c52 0 2026-03-10T14:07:28.766 INFO:tasks.workunit.client.1.vm04.stdout:8/267: dread d0/d3/d5/f15 [0,4194304] 0 2026-03-10T14:07:28.767 INFO:tasks.workunit.client.1.vm04.stdout:0/218: dread d0/d2/f12 [0,4194304] 0 2026-03-10T14:07:28.768 INFO:tasks.workunit.client.1.vm04.stdout:0/219: stat d0/d1c/l3a 0 2026-03-10T14:07:28.781 INFO:tasks.workunit.client.1.vm04.stdout:0/220: getdents d0/d2 0 2026-03-10T14:07:28.782 INFO:tasks.workunit.client.1.vm04.stdout:0/221: chown d0/d2/d25/f45 15839 1 2026-03-10T14:07:28.783 INFO:tasks.workunit.client.1.vm04.stdout:3/242: dread da/f19 [0,4194304] 0 2026-03-10T14:07:28.785 INFO:tasks.workunit.client.1.vm04.stdout:0/222: write d0/d2/f2d [229525,79903] 0 2026-03-10T14:07:28.789 INFO:tasks.workunit.client.1.vm04.stdout:3/243: mkdir da/dc/d35/d52 0 2026-03-10T14:07:28.792 INFO:tasks.workunit.client.1.vm04.stdout:0/223: mknod d0/d2/d25/c46 0 
2026-03-10T14:07:28.794 INFO:tasks.workunit.client.1.vm04.stdout:3/244: mkdir da/dc/d35/d52/d53 0 2026-03-10T14:07:28.794 INFO:tasks.workunit.client.1.vm04.stdout:0/224: dread - d0/f19 zero size 2026-03-10T14:07:28.797 INFO:tasks.workunit.client.1.vm04.stdout:3/245: mkdir da/dc/d3f/d54 0 2026-03-10T14:07:28.798 INFO:tasks.workunit.client.1.vm04.stdout:3/246: truncate da/dc/f2a 111928 0 2026-03-10T14:07:28.798 INFO:tasks.workunit.client.1.vm04.stdout:3/247: chown da/dc/d35/d37/l41 398754416 1 2026-03-10T14:07:28.802 INFO:tasks.workunit.client.1.vm04.stdout:3/248: link da/dc/f2c da/d30/f55 0 2026-03-10T14:07:28.806 INFO:tasks.workunit.client.1.vm04.stdout:2/272: dread d0/d3/f9 [0,4194304] 0 2026-03-10T14:07:28.807 INFO:tasks.workunit.client.1.vm04.stdout:2/273: creat d0/d3/d4a/f57 x:0 0 0 2026-03-10T14:07:28.811 INFO:tasks.workunit.client.1.vm04.stdout:2/274: dwrite d0/d3/d8/d17/d35/f49 [0,4194304] 0 2026-03-10T14:07:28.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:28 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:07:28.821 INFO:tasks.workunit.client.1.vm04.stdout:2/275: dwrite d0/d3/d4a/f4d [0,4194304] 0 2026-03-10T14:07:28.822 INFO:tasks.workunit.client.1.vm04.stdout:2/276: dread - d0/d14/d39/f44 zero size 2026-03-10T14:07:28.823 INFO:tasks.workunit.client.1.vm04.stdout:2/277: chown d0/d3/d8/dd/d26/f2a 315 1 2026-03-10T14:07:28.836 INFO:tasks.workunit.client.1.vm04.stdout:5/262: dwrite d7/f21 [0,4194304] 0 2026-03-10T14:07:28.843 INFO:tasks.workunit.client.1.vm04.stdout:5/263: getdents d7/d2d/d32 0 2026-03-10T14:07:28.846 INFO:tasks.workunit.client.1.vm04.stdout:5/264: dwrite d7/f21 [0,4194304] 0 2026-03-10T14:07:28.848 INFO:tasks.workunit.client.1.vm04.stdout:5/265: write d7/d9/f28 [694270,117640] 0 2026-03-10T14:07:28.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:28 vm03.local ceph-mon[49718]: from='mgr.14223 
192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:07:28.972 INFO:tasks.workunit.client.1.vm04.stdout:5/266: dread d7/d12/d2b/f46 [4194304,4194304] 0 2026-03-10T14:07:28.973 INFO:tasks.workunit.client.1.vm04.stdout:5/267: write d7/f4b [2388770,64888] 0 2026-03-10T14:07:28.973 INFO:tasks.workunit.client.1.vm04.stdout:5/268: chown d7/d12/f27 34413 1 2026-03-10T14:07:28.975 INFO:tasks.workunit.client.1.vm04.stdout:5/269: getdents d7/d2d/d32 0 2026-03-10T14:07:29.000 INFO:tasks.workunit.client.1.vm04.stdout:6/220: write d3/d8/fc [3001389,11556] 0 2026-03-10T14:07:29.003 INFO:tasks.workunit.client.1.vm04.stdout:6/221: dread d3/de/d11/d2d/f18 [0,4194304] 0 2026-03-10T14:07:29.006 INFO:tasks.workunit.client.1.vm04.stdout:6/222: symlink d3/l39 0 2026-03-10T14:07:29.033 INFO:tasks.workunit.client.1.vm04.stdout:7/276: fsync d2/d14/f58 0 2026-03-10T14:07:29.034 INFO:tasks.workunit.client.1.vm04.stdout:7/277: chown d2/dc/de/d11/c47 4120163 1 2026-03-10T14:07:29.034 INFO:tasks.workunit.client.1.vm04.stdout:7/278: chown d2 8135 1 2026-03-10T14:07:29.036 INFO:tasks.workunit.client.1.vm04.stdout:7/279: dread d2/dc/f25 [0,4194304] 0 2026-03-10T14:07:29.038 INFO:tasks.workunit.client.1.vm04.stdout:7/280: dread d2/f20 [0,4194304] 0 2026-03-10T14:07:29.043 INFO:tasks.workunit.client.1.vm04.stdout:7/281: creat d2/dc/de/d2d/d5c/f63 x:0 0 0 2026-03-10T14:07:29.044 INFO:tasks.workunit.client.1.vm04.stdout:7/282: mkdir d2/d14/d64 0 2026-03-10T14:07:29.045 INFO:tasks.workunit.client.1.vm04.stdout:7/283: creat d2/d14/d64/f65 x:0 0 0 2026-03-10T14:07:29.048 INFO:tasks.workunit.client.1.vm04.stdout:7/284: creat d2/d14/d44/f66 x:0 0 0 2026-03-10T14:07:29.050 INFO:tasks.workunit.client.1.vm04.stdout:7/285: symlink d2/dc/d4d/l67 0 2026-03-10T14:07:29.051 INFO:tasks.workunit.client.1.vm04.stdout:7/286: write d2/dc/de/f1e [401394,54288] 0 2026-03-10T14:07:29.066 INFO:tasks.workunit.client.1.vm04.stdout:9/256: truncate 
d9/d33/f4b 788432 0 2026-03-10T14:07:29.066 INFO:tasks.workunit.client.1.vm04.stdout:9/257: stat d9/da/dd/f48 0 2026-03-10T14:07:29.068 INFO:tasks.workunit.client.1.vm04.stdout:9/258: mkdir d9/d44/d59 0 2026-03-10T14:07:29.069 INFO:tasks.workunit.client.1.vm04.stdout:9/259: write d9/da/dd/d1c/f22 [4550623,108929] 0 2026-03-10T14:07:29.070 INFO:tasks.workunit.client.1.vm04.stdout:9/260: write d9/f4f [116655,65246] 0 2026-03-10T14:07:29.087 INFO:tasks.workunit.client.1.vm04.stdout:8/268: rename d0/d3/dd/d33/d48 to d0/d3/dd/d11/d12/d51 0 2026-03-10T14:07:29.090 INFO:tasks.workunit.client.1.vm04.stdout:8/269: creat d0/d3/dd/d11/d29/f52 x:0 0 0 2026-03-10T14:07:29.092 INFO:tasks.workunit.client.1.vm04.stdout:2/278: rename d0/d3/d8/dd/l22 to d0/d14/l58 0 2026-03-10T14:07:29.093 INFO:tasks.workunit.client.1.vm04.stdout:2/279: fsync d0/d3/d8/dd/d26/f36 0 2026-03-10T14:07:29.096 INFO:tasks.workunit.client.1.vm04.stdout:2/280: dwrite d0/d3/f1d [0,4194304] 0 2026-03-10T14:07:29.099 INFO:tasks.workunit.client.1.vm04.stdout:5/270: rename d7/d12/l2a to d7/d12/d2b/d3e/d3f/l4c 0 2026-03-10T14:07:29.103 INFO:tasks.workunit.client.1.vm04.stdout:2/281: mkdir d0/d3/d8/d59 0 2026-03-10T14:07:29.105 INFO:tasks.workunit.client.1.vm04.stdout:7/287: rename d2/dc/de/d11/c47 to d2/dc/de/d2d/d38/d50/c68 0 2026-03-10T14:07:29.109 INFO:tasks.workunit.client.1.vm04.stdout:5/271: creat d7/d12/d2b/f4d x:0 0 0 2026-03-10T14:07:29.109 INFO:tasks.workunit.client.1.vm04.stdout:5/272: truncate d7/d9/f20 838729 0 2026-03-10T14:07:29.112 INFO:tasks.workunit.client.1.vm04.stdout:4/241: symlink d4/df/d34/l53 0 2026-03-10T14:07:29.115 INFO:tasks.workunit.client.1.vm04.stdout:2/282: mknod d0/d3/d8/dd/d26/d46/c5a 0 2026-03-10T14:07:29.116 INFO:tasks.workunit.client.1.vm04.stdout:2/283: chown d0/d3/d8/f30 7364 1 2026-03-10T14:07:29.116 INFO:tasks.workunit.client.1.vm04.stdout:4/242: dwrite d4/f9 [4194304,4194304] 0 2026-03-10T14:07:29.120 INFO:tasks.workunit.client.1.vm04.stdout:2/284: dwrite d0/d3/f1d 
[0,4194304] 0 2026-03-10T14:07:29.131 INFO:tasks.workunit.client.1.vm04.stdout:9/261: rename d9/d44/d4d/f56 to d9/d44/d59/f5a 0 2026-03-10T14:07:29.135 INFO:tasks.workunit.client.1.vm04.stdout:7/288: symlink d2/dc/de/d2d/d38/d50/l69 0 2026-03-10T14:07:29.141 INFO:tasks.workunit.client.1.vm04.stdout:1/262: dwrite d3/d5/fd [0,4194304] 0 2026-03-10T14:07:29.141 INFO:tasks.workunit.client.1.vm04.stdout:1/263: readlink d3/d5/d13/d38/l3e 0 2026-03-10T14:07:29.141 INFO:tasks.workunit.client.1.vm04.stdout:4/243: fdatasync d4/df/d22/f4b 0 2026-03-10T14:07:29.142 INFO:tasks.workunit.client.1.vm04.stdout:2/285: symlink d0/d14/d39/d47/l5b 0 2026-03-10T14:07:29.142 INFO:tasks.workunit.client.1.vm04.stdout:8/270: rename d0/l4a to d0/d3/d5/l53 0 2026-03-10T14:07:29.142 INFO:tasks.workunit.client.1.vm04.stdout:4/244: rename d4/df/d22 to d4/df/d22/d54 22 2026-03-10T14:07:29.146 INFO:tasks.workunit.client.1.vm04.stdout:2/286: read d0/d3/d8/dd/d26/f33 [2452719,102038] 0 2026-03-10T14:07:29.146 INFO:tasks.workunit.client.1.vm04.stdout:2/287: write d0/d3/f11 [5171479,44750] 0 2026-03-10T14:07:29.152 INFO:tasks.workunit.client.1.vm04.stdout:7/289: creat d2/d14/d64/f6a x:0 0 0 2026-03-10T14:07:29.159 INFO:tasks.workunit.client.1.vm04.stdout:1/264: rmdir d3/d5/d1e/d35 39 2026-03-10T14:07:29.159 INFO:tasks.workunit.client.1.vm04.stdout:8/271: unlink d0/d3/dd/d11/d12/f2e 0 2026-03-10T14:07:29.160 INFO:tasks.workunit.client.1.vm04.stdout:5/273: creat d7/d2d/d32/d34/f4e x:0 0 0 2026-03-10T14:07:29.161 INFO:tasks.workunit.client.1.vm04.stdout:4/245: readlink d4/df/d22/l50 0 2026-03-10T14:07:29.162 INFO:tasks.workunit.client.1.vm04.stdout:4/246: chown d4/df/d34/f23 6 1 2026-03-10T14:07:29.164 INFO:tasks.workunit.client.1.vm04.stdout:1/265: dwrite d3/d5/d13/f5a [0,4194304] 0 2026-03-10T14:07:29.172 INFO:tasks.workunit.client.1.vm04.stdout:9/262: fsync d9/d44/f51 0 2026-03-10T14:07:29.177 INFO:tasks.workunit.client.1.vm04.stdout:8/272: chown d0/d3/dd/d11/c4c 13 1 2026-03-10T14:07:29.182 
INFO:tasks.workunit.client.1.vm04.stdout:5/274: write d7/f11 [559470,13998] 0 2026-03-10T14:07:29.195 INFO:tasks.workunit.client.1.vm04.stdout:9/263: write d9/f3e [914531,75060] 0 2026-03-10T14:07:29.198 INFO:tasks.workunit.client.1.vm04.stdout:7/290: mkdir d2/d6b 0 2026-03-10T14:07:29.201 INFO:tasks.workunit.client.1.vm04.stdout:5/275: mknod d7/d12/d2b/d3e/c4f 0 2026-03-10T14:07:29.203 INFO:tasks.workunit.client.1.vm04.stdout:4/247: creat d4/df/d31/f55 x:0 0 0 2026-03-10T14:07:29.208 INFO:tasks.workunit.client.1.vm04.stdout:4/248: dwrite d4/df/d31/f3d [0,4194304] 0 2026-03-10T14:07:29.225 INFO:tasks.workunit.client.1.vm04.stdout:5/276: symlink d7/d2d/l50 0 2026-03-10T14:07:29.225 INFO:tasks.workunit.client.1.vm04.stdout:5/277: chown d7/d2d 8094 1 2026-03-10T14:07:29.230 INFO:tasks.workunit.client.1.vm04.stdout:2/288: sync 2026-03-10T14:07:29.231 INFO:tasks.workunit.client.1.vm04.stdout:2/289: dread - d0/d3/d4a/f57 zero size 2026-03-10T14:07:29.231 INFO:tasks.workunit.client.1.vm04.stdout:5/278: dread d7/d2d/d32/f3b [0,4194304] 0 2026-03-10T14:07:29.235 INFO:tasks.workunit.client.1.vm04.stdout:1/266: rename d3/l4 to d3/l5e 0 2026-03-10T14:07:29.235 INFO:tasks.workunit.client.1.vm04.stdout:4/249: dread d4/df/f2e [0,4194304] 0 2026-03-10T14:07:29.242 INFO:tasks.workunit.client.1.vm04.stdout:8/273: getdents d0/d3/dd/d11/d12/d51 0 2026-03-10T14:07:29.244 INFO:tasks.workunit.client.1.vm04.stdout:8/274: write d0/f42 [617919,114909] 0 2026-03-10T14:07:29.248 INFO:tasks.workunit.client.1.vm04.stdout:9/264: link d9/da/dd/d1c/c42 d9/d44/d59/c5b 0 2026-03-10T14:07:29.249 INFO:tasks.workunit.client.1.vm04.stdout:9/265: chown d9/d44/d59 1507 1 2026-03-10T14:07:29.250 INFO:tasks.workunit.client.1.vm04.stdout:8/275: mknod d0/d3/d5/c54 0 2026-03-10T14:07:29.251 INFO:tasks.workunit.client.1.vm04.stdout:8/276: stat d0/d3/dd/d11/d29/l2f 0 2026-03-10T14:07:29.251 INFO:tasks.workunit.client.1.vm04.stdout:8/277: fdatasync d0/d3/dd/d11/d12/f2c 0 2026-03-10T14:07:29.254 
INFO:tasks.workunit.client.1.vm04.stdout:2/290: mknod d0/c5c 0 2026-03-10T14:07:29.255 INFO:tasks.workunit.client.1.vm04.stdout:4/250: symlink d4/df/d22/d47/d4f/l56 0 2026-03-10T14:07:29.259 INFO:tasks.workunit.client.1.vm04.stdout:8/278: rename d0/c19 to d0/d3/dd/d11/c55 0 2026-03-10T14:07:29.260 INFO:tasks.workunit.client.1.vm04.stdout:8/279: truncate d0/d3/dd/d11/d12/f50 274273 0 2026-03-10T14:07:29.260 INFO:tasks.workunit.client.1.vm04.stdout:2/291: rmdir d0/d3/d8/d17/d35 39 2026-03-10T14:07:29.262 INFO:tasks.workunit.client.1.vm04.stdout:4/251: creat d4/f57 x:0 0 0 2026-03-10T14:07:29.265 INFO:tasks.workunit.client.1.vm04.stdout:2/292: creat d0/d14/d39/d47/f5d x:0 0 0 2026-03-10T14:07:29.269 INFO:tasks.workunit.client.1.vm04.stdout:8/280: mknod d0/c56 0 2026-03-10T14:07:29.269 INFO:tasks.workunit.client.1.vm04.stdout:4/252: rename d4/df/d22/d47/l49 to d4/df/d22/d47/l58 0 2026-03-10T14:07:29.270 INFO:tasks.workunit.client.1.vm04.stdout:4/253: chown d4/df/d34/f1f 463 1 2026-03-10T14:07:29.271 INFO:tasks.workunit.client.1.vm04.stdout:2/293: symlink d0/d3/d8/dd/d26/d46/d4b/d56/l5e 0 2026-03-10T14:07:29.282 INFO:tasks.workunit.client.1.vm04.stdout:4/254: dwrite d4/df/d22/f2d [0,4194304] 0 2026-03-10T14:07:29.287 INFO:tasks.workunit.client.1.vm04.stdout:4/255: fdatasync d4/d14/d1b/f20 0 2026-03-10T14:07:29.298 INFO:tasks.workunit.client.1.vm04.stdout:2/294: creat d0/d3/d8/d17/d35/f5f x:0 0 0 2026-03-10T14:07:29.307 INFO:tasks.workunit.client.1.vm04.stdout:2/295: dwrite d0/d3/f1d [4194304,4194304] 0 2026-03-10T14:07:29.307 INFO:tasks.workunit.client.1.vm04.stdout:8/281: link d0/d3/dd/fc d0/d3/dd/d11/f57 0 2026-03-10T14:07:29.309 INFO:tasks.workunit.client.1.vm04.stdout:8/282: dwrite d0/f23 [0,4194304] 0 2026-03-10T14:07:29.321 INFO:tasks.workunit.client.1.vm04.stdout:0/225: write d0/d2/f17 [2652974,58194] 0 2026-03-10T14:07:29.321 INFO:tasks.workunit.client.1.vm04.stdout:0/226: chown d0/d2/f40 265306440 1 2026-03-10T14:07:29.322 
INFO:tasks.workunit.client.1.vm04.stdout:0/227: truncate d0/f35 242376 0 2026-03-10T14:07:29.332 INFO:tasks.workunit.client.1.vm04.stdout:4/256: creat d4/df/f59 x:0 0 0 2026-03-10T14:07:29.336 INFO:tasks.workunit.client.1.vm04.stdout:0/228: mknod d0/c47 0 2026-03-10T14:07:29.337 INFO:tasks.workunit.client.1.vm04.stdout:4/257: chown d4/df/d34/c3f 675848 1 2026-03-10T14:07:29.338 INFO:tasks.workunit.client.1.vm04.stdout:2/296: dread d0/d14/d1b/f29 [0,4194304] 0 2026-03-10T14:07:29.339 INFO:tasks.workunit.client.1.vm04.stdout:2/297: chown d0/d3/d3a/d3e 61 1 2026-03-10T14:07:29.339 INFO:tasks.workunit.client.1.vm04.stdout:2/298: dread - d0/d14/d39/d47/f5d zero size 2026-03-10T14:07:29.344 INFO:tasks.workunit.client.1.vm04.stdout:0/229: dwrite d0/d2/fa [0,4194304] 0 2026-03-10T14:07:29.345 INFO:tasks.workunit.client.1.vm04.stdout:2/299: dwrite d0/d3/f1d [4194304,4194304] 0 2026-03-10T14:07:29.350 INFO:tasks.workunit.client.1.vm04.stdout:2/300: dwrite d0/d3/d8/d17/d35/f49 [0,4194304] 0 2026-03-10T14:07:29.375 INFO:tasks.workunit.client.1.vm04.stdout:2/301: symlink d0/d3/d8/dd/d26/d46/d4b/d56/l60 0 2026-03-10T14:07:29.375 INFO:tasks.workunit.client.1.vm04.stdout:2/302: read d0/d14/d1b/f29 [1954947,251] 0 2026-03-10T14:07:29.378 INFO:tasks.workunit.client.1.vm04.stdout:2/303: creat d0/d3/d3a/d3e/f61 x:0 0 0 2026-03-10T14:07:29.395 INFO:tasks.workunit.client.1.vm04.stdout:3/249: write da/f19 [988641,109867] 0 2026-03-10T14:07:29.450 INFO:tasks.workunit.client.1.vm04.stdout:8/283: dread d0/d3/dd/d11/d12/f50 [0,4194304] 0 2026-03-10T14:07:29.454 INFO:tasks.workunit.client.1.vm04.stdout:5/279: dread d7/f11 [0,4194304] 0 2026-03-10T14:07:29.466 INFO:tasks.workunit.client.1.vm04.stdout:5/280: readlink d7/d2d/d32/l37 0 2026-03-10T14:07:29.467 INFO:tasks.workunit.client.1.vm04.stdout:5/281: creat d7/d12/f51 x:0 0 0 2026-03-10T14:07:29.467 INFO:tasks.workunit.client.1.vm04.stdout:5/282: write d7/d12/d2b/f4d [91130,19269] 0 2026-03-10T14:07:29.467 
INFO:tasks.workunit.client.1.vm04.stdout:5/283: dwrite d7/f11 [0,4194304] 0 2026-03-10T14:07:29.467 INFO:tasks.workunit.client.1.vm04.stdout:5/284: creat d7/d12/d2b/d3e/d3f/f52 x:0 0 0 2026-03-10T14:07:29.467 INFO:tasks.workunit.client.1.vm04.stdout:5/285: creat d7/d12/d2b/f53 x:0 0 0 2026-03-10T14:07:29.467 INFO:tasks.workunit.client.1.vm04.stdout:5/286: creat d7/d26/f54 x:0 0 0 2026-03-10T14:07:29.468 INFO:tasks.workunit.client.1.vm04.stdout:5/287: dwrite d7/d12/f42 [0,4194304] 0 2026-03-10T14:07:29.468 INFO:tasks.workunit.client.1.vm04.stdout:5/288: fdatasync d7/d12/f27 0 2026-03-10T14:07:29.468 INFO:tasks.workunit.client.1.vm04.stdout:5/289: chown f4 193 1 2026-03-10T14:07:29.473 INFO:tasks.workunit.client.1.vm04.stdout:5/290: write d7/d9/f20 [1873671,25698] 0 2026-03-10T14:07:29.508 INFO:tasks.workunit.client.1.vm04.stdout:5/291: dread d7/d12/f27 [0,4194304] 0 2026-03-10T14:07:29.630 INFO:tasks.workunit.client.1.vm04.stdout:7/291: truncate d2/dc/de/f1e 2166869 0 2026-03-10T14:07:29.631 INFO:tasks.workunit.client.1.vm04.stdout:7/292: mknod d2/d2a/d42/c6c 0 2026-03-10T14:07:29.659 INFO:tasks.workunit.client.1.vm04.stdout:1/267: dread d3/d22/d2f/f39 [4194304,4194304] 0 2026-03-10T14:07:29.663 INFO:tasks.workunit.client.1.vm04.stdout:1/268: symlink d3/d22/d2f/d57/l5f 0 2026-03-10T14:07:29.667 INFO:tasks.workunit.client.1.vm04.stdout:1/269: dwrite f1 [4194304,4194304] 0 2026-03-10T14:07:29.673 INFO:tasks.workunit.client.1.vm04.stdout:1/270: mkdir d3/d20/d60 0 2026-03-10T14:07:29.678 INFO:tasks.workunit.client.1.vm04.stdout:1/271: link d3/d5/d1e/l51 d3/d5/d1e/d35/l61 0 2026-03-10T14:07:29.682 INFO:tasks.workunit.client.1.vm04.stdout:1/272: dwrite d3/d5/d1e/f2d [0,4194304] 0 2026-03-10T14:07:29.699 INFO:tasks.workunit.client.1.vm04.stdout:9/266: truncate d9/f20 517967 0 2026-03-10T14:07:29.775 INFO:tasks.workunit.client.1.vm04.stdout:9/267: sync 2026-03-10T14:07:29.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:29 vm04.local ceph-mon[55966]: pgmap v143: 
65 pgs: 65 active+clean; 552 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 6.7 MiB/s rd, 64 MiB/s wr, 383 op/s 2026-03-10T14:07:29.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:29 vm03.local ceph-mon[49718]: pgmap v143: 65 pgs: 65 active+clean; 552 MiB data, 2.6 GiB used, 117 GiB / 120 GiB avail; 6.7 MiB/s rd, 64 MiB/s wr, 383 op/s 2026-03-10T14:07:29.866 INFO:tasks.workunit.client.1.vm04.stdout:5/292: write d7/f1d [2690484,120373] 0 2026-03-10T14:07:29.872 INFO:tasks.workunit.client.1.vm04.stdout:5/293: dread d7/d12/d2b/f4d [0,4194304] 0 2026-03-10T14:07:29.875 INFO:tasks.workunit.client.1.vm04.stdout:5/294: dwrite d7/d12/f51 [0,4194304] 0 2026-03-10T14:07:29.878 INFO:tasks.workunit.client.1.vm04.stdout:6/223: truncate d3/de/d11/d2d/d32/f1f 3348601 0 2026-03-10T14:07:29.881 INFO:tasks.workunit.client.1.vm04.stdout:6/224: mkdir d3/de/d35/d3a 0 2026-03-10T14:07:29.881 INFO:tasks.workunit.client.1.vm04.stdout:6/225: read - d3/de/d11/d2d/d32/d23/d24/f36 zero size 2026-03-10T14:07:29.882 INFO:tasks.workunit.client.1.vm04.stdout:6/226: dread - d3/de/d11/d2d/f34 zero size 2026-03-10T14:07:29.882 INFO:tasks.workunit.client.1.vm04.stdout:6/227: chown d3/c7 222 1 2026-03-10T14:07:29.883 INFO:tasks.workunit.client.1.vm04.stdout:6/228: creat d3/de/d11/f3b x:0 0 0 2026-03-10T14:07:29.885 INFO:tasks.workunit.client.1.vm04.stdout:6/229: truncate d3/f9 1039751 0 2026-03-10T14:07:29.886 INFO:tasks.workunit.client.1.vm04.stdout:6/230: creat d3/d8/f3c x:0 0 0 2026-03-10T14:07:29.913 INFO:tasks.workunit.client.1.vm04.stdout:6/231: read d3/de/d11/d2d/f21 [51544,46637] 0 2026-03-10T14:07:29.914 INFO:tasks.workunit.client.1.vm04.stdout:6/232: write d3/d8/f3c [486175,129088] 0 2026-03-10T14:07:29.982 INFO:tasks.workunit.client.1.vm04.stdout:1/273: dwrite d3/d22/d2f/f3a [0,4194304] 0 2026-03-10T14:07:29.985 INFO:tasks.workunit.client.1.vm04.stdout:1/274: link f2 d3/d5/d13/d1a/f62 0 2026-03-10T14:07:30.055 INFO:tasks.workunit.client.1.vm04.stdout:3/250: link da/dc/l29 
da/l56 0 2026-03-10T14:07:30.056 INFO:tasks.workunit.client.1.vm04.stdout:3/251: mknod da/dc/d35/d52/c57 0 2026-03-10T14:07:30.057 INFO:tasks.workunit.client.1.vm04.stdout:3/252: chown l1 188656094 1 2026-03-10T14:07:30.064 INFO:tasks.workunit.client.1.vm04.stdout:3/253: dread da/f14 [0,4194304] 0 2026-03-10T14:07:30.065 INFO:tasks.workunit.client.1.vm04.stdout:3/254: dread da/f14 [0,4194304] 0 2026-03-10T14:07:30.067 INFO:tasks.workunit.client.1.vm04.stdout:3/255: symlink da/dc/d35/d52/d53/l58 0 2026-03-10T14:07:30.068 INFO:tasks.workunit.client.1.vm04.stdout:3/256: mkdir da/dc/d3f/d59 0 2026-03-10T14:07:30.069 INFO:tasks.workunit.client.1.vm04.stdout:3/257: write da/f10 [5265730,98437] 0 2026-03-10T14:07:30.072 INFO:tasks.workunit.client.1.vm04.stdout:3/258: creat da/dc/d35/d52/f5a x:0 0 0 2026-03-10T14:07:30.076 INFO:tasks.workunit.client.1.vm04.stdout:3/259: unlink da/dc/l36 0 2026-03-10T14:07:30.085 INFO:tasks.workunit.client.1.vm04.stdout:3/260: dread da/d30/f42 [0,4194304] 0 2026-03-10T14:07:30.086 INFO:tasks.workunit.client.1.vm04.stdout:5/295: truncate d7/f21 213634 0 2026-03-10T14:07:30.092 INFO:tasks.workunit.client.1.vm04.stdout:3/261: getdents da/dc/d35/d37 0 2026-03-10T14:07:30.096 INFO:tasks.workunit.client.1.vm04.stdout:3/262: dwrite da/d30/f32 [0,4194304] 0 2026-03-10T14:07:30.101 INFO:tasks.workunit.client.1.vm04.stdout:3/263: dread da/dc/d35/f46 [0,4194304] 0 2026-03-10T14:07:30.102 INFO:tasks.workunit.client.1.vm04.stdout:3/264: write da/f25 [1898340,89004] 0 2026-03-10T14:07:30.104 INFO:tasks.workunit.client.1.vm04.stdout:3/265: creat da/dc/d35/f5b x:0 0 0 2026-03-10T14:07:30.108 INFO:tasks.workunit.client.1.vm04.stdout:3/266: dread da/fb [0,4194304] 0 2026-03-10T14:07:30.110 INFO:tasks.workunit.client.1.vm04.stdout:3/267: mknod da/dc/d47/c5c 0 2026-03-10T14:07:30.111 INFO:tasks.workunit.client.1.vm04.stdout:3/268: write da/dc/d35/d52/f5a [425074,10633] 0 2026-03-10T14:07:30.114 INFO:tasks.workunit.client.1.vm04.stdout:3/269: creat 
da/dc/d3f/d54/f5d x:0 0 0 2026-03-10T14:07:30.122 INFO:tasks.workunit.client.1.vm04.stdout:6/233: write d3/de/d11/f17 [867601,51086] 0 2026-03-10T14:07:30.130 INFO:tasks.workunit.client.1.vm04.stdout:6/234: unlink d3/de/d11/d2d/l16 0 2026-03-10T14:07:30.130 INFO:tasks.workunit.client.1.vm04.stdout:6/235: getdents d3/d8 0 2026-03-10T14:07:30.134 INFO:tasks.workunit.client.1.vm04.stdout:5/296: sync 2026-03-10T14:07:30.136 INFO:tasks.workunit.client.1.vm04.stdout:5/297: link d7/d12/f42 d7/d12/d2b/f55 0 2026-03-10T14:07:30.137 INFO:tasks.workunit.client.1.vm04.stdout:5/298: chown d7/d12/f42 3 1 2026-03-10T14:07:30.139 INFO:tasks.workunit.client.1.vm04.stdout:5/299: creat d7/d26/f56 x:0 0 0 2026-03-10T14:07:30.141 INFO:tasks.workunit.client.1.vm04.stdout:5/300: mkdir d7/d12/d2b/d3e/d57 0 2026-03-10T14:07:30.141 INFO:tasks.workunit.client.1.vm04.stdout:5/301: dread - d7/d12/d2b/d3e/f4a zero size 2026-03-10T14:07:30.146 INFO:tasks.workunit.client.1.vm04.stdout:5/302: dwrite d7/d12/f42 [0,4194304] 0 2026-03-10T14:07:30.161 INFO:tasks.workunit.client.1.vm04.stdout:5/303: link d7/c1a d7/d12/d45/c58 0 2026-03-10T14:07:30.163 INFO:tasks.workunit.client.1.vm04.stdout:5/304: dread d7/d2d/d32/f3b [0,4194304] 0 2026-03-10T14:07:30.163 INFO:tasks.workunit.client.1.vm04.stdout:5/305: write d7/d12/d2b/d3e/f4a [438587,34563] 0 2026-03-10T14:07:30.190 INFO:tasks.workunit.client.1.vm04.stdout:2/304: symlink d0/d3/d8/l62 0 2026-03-10T14:07:30.193 INFO:tasks.workunit.client.1.vm04.stdout:2/305: dwrite d0/d3/d8/f42 [0,4194304] 0 2026-03-10T14:07:30.257 INFO:tasks.workunit.client.1.vm04.stdout:5/306: mkdir d7/d59 0 2026-03-10T14:07:30.260 INFO:tasks.workunit.client.1.vm04.stdout:5/307: truncate d7/d12/f27 740761 0 2026-03-10T14:07:30.260 INFO:tasks.workunit.client.1.vm04.stdout:5/308: chown d7/d59 33153 1 2026-03-10T14:07:30.261 INFO:tasks.workunit.client.1.vm04.stdout:5/309: mknod d7/c5a 0 2026-03-10T14:07:30.264 INFO:tasks.workunit.client.1.vm04.stdout:5/310: rmdir d7/d12/d2b/d3e/d3f 39 
2026-03-10T14:07:30.266 INFO:tasks.workunit.client.1.vm04.stdout:5/311: creat d7/d2d/d32/f5b x:0 0 0 2026-03-10T14:07:30.268 INFO:tasks.workunit.client.1.vm04.stdout:5/312: write d7/d12/d2b/d3e/d3f/f52 [533569,32983] 0 2026-03-10T14:07:30.270 INFO:tasks.workunit.client.1.vm04.stdout:5/313: dread d7/f11 [0,4194304] 0 2026-03-10T14:07:30.274 INFO:tasks.workunit.client.1.vm04.stdout:5/314: getdents d7/d12/d2b/d3e/d57 0 2026-03-10T14:07:30.275 INFO:tasks.workunit.client.1.vm04.stdout:5/315: creat d7/d12/d45/f5c x:0 0 0 2026-03-10T14:07:30.288 INFO:tasks.workunit.client.1.vm04.stdout:9/268: mkdir d9/d5c 0 2026-03-10T14:07:30.288 INFO:tasks.workunit.client.1.vm04.stdout:9/269: write d9/d44/f49 [131401,55430] 0 2026-03-10T14:07:30.289 INFO:tasks.workunit.client.1.vm04.stdout:9/270: fdatasync d9/da/dd/d1c/f22 0 2026-03-10T14:07:30.312 INFO:tasks.workunit.client.1.vm04.stdout:4/258: rename d4/df/f18 to d4/d14/f5a 0 2026-03-10T14:07:30.313 INFO:tasks.workunit.client.1.vm04.stdout:9/271: dread d9/da/dd/d1c/f22 [0,4194304] 0 2026-03-10T14:07:30.314 INFO:tasks.workunit.client.1.vm04.stdout:9/272: readlink d9/da/l12 0 2026-03-10T14:07:30.318 INFO:tasks.workunit.client.1.vm04.stdout:7/293: rename d2/d2a/d42/c6c to d2/dc/de/d2d/d5c/c6d 0 2026-03-10T14:07:30.319 INFO:tasks.workunit.client.1.vm04.stdout:9/273: dwrite d9/d44/d4d/f4e [0,4194304] 0 2026-03-10T14:07:30.325 INFO:tasks.workunit.client.1.vm04.stdout:1/275: rename d3/d5/d1e to d3/d22/d63 0 2026-03-10T14:07:30.327 INFO:tasks.workunit.client.1.vm04.stdout:3/270: rename da/dc/f2c to da/dc/d35/d37/f5e 0 2026-03-10T14:07:30.328 INFO:tasks.workunit.client.1.vm04.stdout:3/271: write da/dc/d3f/f4d [245279,38596] 0 2026-03-10T14:07:30.330 INFO:tasks.workunit.client.1.vm04.stdout:3/272: read da/dc/d35/d52/f5a [147221,79477] 0 2026-03-10T14:07:30.334 INFO:tasks.workunit.client.1.vm04.stdout:9/274: getdents d9/d5c 0 2026-03-10T14:07:30.338 INFO:tasks.workunit.client.1.vm04.stdout:6/236: rename d3/d8/f3c to d3/de/d11/f3d 0 
2026-03-10T14:07:30.342 INFO:tasks.workunit.client.1.vm04.stdout:3/273: mknod da/dc/d3f/c5f 0 2026-03-10T14:07:30.343 INFO:tasks.workunit.client.1.vm04.stdout:3/274: chown da/d30/f4e 603 1 2026-03-10T14:07:30.345 INFO:tasks.workunit.client.1.vm04.stdout:7/294: link d2/dc/de/d2d/l4f d2/d6b/l6e 0 2026-03-10T14:07:30.345 INFO:tasks.workunit.client.1.vm04.stdout:7/295: chown d2/f20 1238 1 2026-03-10T14:07:30.347 INFO:tasks.workunit.client.1.vm04.stdout:9/275: mkdir d9/da/d5d 0 2026-03-10T14:07:30.356 INFO:tasks.workunit.client.1.vm04.stdout:4/259: rename d4/df/d22/l24 to d4/d14/d1b/l5b 0 2026-03-10T14:07:30.358 INFO:tasks.workunit.client.1.vm04.stdout:7/296: symlink d2/d14/l6f 0 2026-03-10T14:07:30.359 INFO:tasks.workunit.client.1.vm04.stdout:6/237: mkdir d3/d1d/d3e 0 2026-03-10T14:07:30.360 INFO:tasks.workunit.client.1.vm04.stdout:6/238: write d3/d8/fc [2195539,45655] 0 2026-03-10T14:07:30.363 INFO:tasks.workunit.client.1.vm04.stdout:6/239: dwrite d3/de/d11/d2d/f27 [0,4194304] 0 2026-03-10T14:07:30.367 INFO:tasks.workunit.client.1.vm04.stdout:4/260: creat d4/d14/d1b/f5c x:0 0 0 2026-03-10T14:07:30.367 INFO:tasks.workunit.client.1.vm04.stdout:3/275: mknod da/c60 0 2026-03-10T14:07:30.369 INFO:tasks.workunit.client.1.vm04.stdout:3/276: truncate da/dc/d3f/f4d 1215656 0 2026-03-10T14:07:30.370 INFO:tasks.workunit.client.1.vm04.stdout:3/277: read da/dc/f1a [73954,81404] 0 2026-03-10T14:07:30.373 INFO:tasks.workunit.client.1.vm04.stdout:7/297: creat d2/d14/f70 x:0 0 0 2026-03-10T14:07:30.380 INFO:tasks.workunit.client.1.vm04.stdout:6/240: write d3/f4 [5191093,53021] 0 2026-03-10T14:07:30.380 INFO:tasks.workunit.client.1.vm04.stdout:6/241: dread - d3/de/d11/d2d/d32/d23/f33 zero size 2026-03-10T14:07:30.384 INFO:tasks.workunit.client.1.vm04.stdout:3/278: unlink da/dc/d35/c3d 0 2026-03-10T14:07:30.385 INFO:tasks.workunit.client.1.vm04.stdout:7/298: rmdir d2/dc/de/d11 39 2026-03-10T14:07:30.388 INFO:tasks.workunit.client.1.vm04.stdout:3/279: mkdir da/dc/d3f/d61 0 
2026-03-10T14:07:30.391 INFO:tasks.workunit.client.1.vm04.stdout:7/299: mknod d2/dc/d4d/c71 0 2026-03-10T14:07:30.392 INFO:tasks.workunit.client.1.vm04.stdout:3/280: mkdir da/dc/d47/d62 0 2026-03-10T14:07:30.393 INFO:tasks.workunit.client.1.vm04.stdout:3/281: chown da/dc/d3f/d54/f5d 482846430 1 2026-03-10T14:07:30.400 INFO:tasks.workunit.client.1.vm04.stdout:7/300: read d2/d28/f29 [1822611,107634] 0 2026-03-10T14:07:30.401 INFO:tasks.workunit.client.1.vm04.stdout:2/306: rmdir d0/d3/d3a 39 2026-03-10T14:07:30.404 INFO:tasks.workunit.client.1.vm04.stdout:7/301: symlink d2/d2a/d42/l72 0 2026-03-10T14:07:30.405 INFO:tasks.workunit.client.1.vm04.stdout:2/307: symlink d0/d3/d8/dd/d26/d46/d4b/d56/l63 0 2026-03-10T14:07:30.409 INFO:tasks.workunit.client.1.vm04.stdout:2/308: dwrite d0/d14/d39/f44 [0,4194304] 0 2026-03-10T14:07:30.411 INFO:tasks.workunit.client.1.vm04.stdout:3/282: creat da/d3e/f63 x:0 0 0 2026-03-10T14:07:30.413 INFO:tasks.workunit.client.1.vm04.stdout:2/309: chown d0/d3/d3a/d3e/f2d 1 1 2026-03-10T14:07:30.414 INFO:tasks.workunit.client.1.vm04.stdout:2/310: write d0/d3/d8/dd/d26/f2a [2135122,14797] 0 2026-03-10T14:07:30.419 INFO:tasks.workunit.client.1.vm04.stdout:3/283: dwrite da/fd [0,4194304] 0 2026-03-10T14:07:30.434 INFO:tasks.workunit.client.1.vm04.stdout:7/302: creat d2/dc/de/f73 x:0 0 0 2026-03-10T14:07:30.437 INFO:tasks.workunit.client.1.vm04.stdout:7/303: truncate d2/dc/f25 56768 0 2026-03-10T14:07:30.440 INFO:tasks.workunit.client.1.vm04.stdout:7/304: creat d2/dc/d4d/f74 x:0 0 0 2026-03-10T14:07:30.441 INFO:tasks.workunit.client.1.vm04.stdout:7/305: fsync d2/d14/d64/f65 0 2026-03-10T14:07:30.441 INFO:tasks.workunit.client.1.vm04.stdout:8/284: rmdir d0/d3/dd/d11/d29 39 2026-03-10T14:07:30.446 INFO:tasks.workunit.client.1.vm04.stdout:7/306: truncate d2/dc/f4a 1417350 0 2026-03-10T14:07:30.448 INFO:tasks.workunit.client.1.vm04.stdout:7/307: symlink d2/dc/de/d2d/d38/d50/l75 0 2026-03-10T14:07:30.458 INFO:tasks.workunit.client.1.vm04.stdout:7/308: 
mknod d2/d14/d64/c76 0 2026-03-10T14:07:30.484 INFO:tasks.workunit.client.1.vm04.stdout:0/230: link d0/d1c/f2b d0/f48 0 2026-03-10T14:07:30.534 INFO:tasks.workunit.client.1.vm04.stdout:0/231: sync 2026-03-10T14:07:30.538 INFO:tasks.workunit.client.1.vm04.stdout:0/232: dwrite d0/d2/fa [0,4194304] 0 2026-03-10T14:07:30.541 INFO:tasks.workunit.client.1.vm04.stdout:0/233: write d0/d2/f12 [3690345,97024] 0 2026-03-10T14:07:30.546 INFO:tasks.workunit.client.1.vm04.stdout:0/234: getdents d0/d2/d15/d22/d38 0 2026-03-10T14:07:30.548 INFO:tasks.workunit.client.1.vm04.stdout:0/235: mkdir d0/d2/d15/d49 0 2026-03-10T14:07:30.552 INFO:tasks.workunit.client.1.vm04.stdout:0/236: dwrite d0/d2/f2d [0,4194304] 0 2026-03-10T14:07:30.666 INFO:tasks.workunit.client.1.vm04.stdout:5/316: dread d7/d12/f27 [0,4194304] 0 2026-03-10T14:07:30.668 INFO:tasks.workunit.client.1.vm04.stdout:5/317: mknod d7/d12/d45/c5d 0 2026-03-10T14:07:30.669 INFO:tasks.workunit.client.1.vm04.stdout:5/318: stat d7/d12/c17 0 2026-03-10T14:07:30.700 INFO:tasks.workunit.client.1.vm04.stdout:1/276: write d3/f18 [469182,82393] 0 2026-03-10T14:07:30.701 INFO:tasks.workunit.client.1.vm04.stdout:1/277: mknod d3/d5/d13/d1a/c64 0 2026-03-10T14:07:30.707 INFO:tasks.workunit.client.1.vm04.stdout:6/242: rename d3/de/d11 to d3/de/d35/d3f 0 2026-03-10T14:07:30.710 INFO:tasks.workunit.client.1.vm04.stdout:2/311: rename d0/d3/d8/dd/d26/l3c to d0/d3/d8/dd/d26/d46/d4b/l64 0 2026-03-10T14:07:30.711 INFO:tasks.workunit.client.1.vm04.stdout:2/312: chown d0/d3/d8/dd/d26/d46/d4b 126149311 1 2026-03-10T14:07:30.713 INFO:tasks.workunit.client.1.vm04.stdout:2/313: dread d0/d3/f1d [4194304,4194304] 0 2026-03-10T14:07:30.716 INFO:tasks.workunit.client.1.vm04.stdout:6/243: mkdir d3/de/d35/d3f/d2d/d38/d40 0 2026-03-10T14:07:30.719 INFO:tasks.workunit.client.1.vm04.stdout:6/244: creat d3/d1d/f41 x:0 0 0 2026-03-10T14:07:30.721 INFO:tasks.workunit.client.1.vm04.stdout:2/314: symlink d0/d3/d8/d17/d4e/l65 0 2026-03-10T14:07:30.722 
INFO:tasks.workunit.client.1.vm04.stdout:9/276: dwrite d9/ff [0,4194304] 0 2026-03-10T14:07:30.724 INFO:tasks.workunit.client.1.vm04.stdout:2/315: mkdir d0/d3/d4a/d66 0 2026-03-10T14:07:30.727 INFO:tasks.workunit.client.1.vm04.stdout:1/278: dread d3/fa [0,4194304] 0 2026-03-10T14:07:30.727 INFO:tasks.workunit.client.1.vm04.stdout:6/245: symlink d3/de/d35/d3f/d2d/d32/d23/l42 0 2026-03-10T14:07:30.730 INFO:tasks.workunit.client.1.vm04.stdout:1/279: dwrite d3/d5/fd [0,4194304] 0 2026-03-10T14:07:30.737 INFO:tasks.workunit.client.1.vm04.stdout:9/277: creat d9/d58/f5e x:0 0 0 2026-03-10T14:07:30.740 INFO:tasks.workunit.client.1.vm04.stdout:9/278: dwrite d9/d44/f49 [0,4194304] 0 2026-03-10T14:07:30.744 INFO:tasks.workunit.client.1.vm04.stdout:9/279: dwrite d9/f4a [0,4194304] 0 2026-03-10T14:07:30.754 INFO:tasks.workunit.client.1.vm04.stdout:6/246: mkdir d3/de/d35/d3a/d43 0 2026-03-10T14:07:30.755 INFO:tasks.workunit.client.1.vm04.stdout:6/247: chown d3/de/d35/d3f/d2d/d32/d23/d24 1575308 1 2026-03-10T14:07:30.758 INFO:tasks.workunit.client.1.vm04.stdout:9/280: creat d9/d33/f5f x:0 0 0 2026-03-10T14:07:30.777 INFO:tasks.workunit.client.1.vm04.stdout:4/261: truncate d4/fd 1708190 0 2026-03-10T14:07:30.780 INFO:tasks.workunit.client.1.vm04.stdout:4/262: rename d4/df/d34/f23 to d4/df/d31/f5d 0 2026-03-10T14:07:30.782 INFO:tasks.workunit.client.1.vm04.stdout:4/263: readlink d4/l7 0 2026-03-10T14:07:30.782 INFO:tasks.workunit.client.1.vm04.stdout:4/264: mkdir d4/d14/d3c/d5e 0 2026-03-10T14:07:30.784 INFO:tasks.workunit.client.1.vm04.stdout:4/265: creat d4/f5f x:0 0 0 2026-03-10T14:07:30.789 INFO:tasks.workunit.client.1.vm04.stdout:4/266: creat d4/df/f60 x:0 0 0 2026-03-10T14:07:30.794 INFO:tasks.workunit.client.1.vm04.stdout:4/267: symlink d4/d14/d3c/d5e/l61 0 2026-03-10T14:07:30.802 INFO:tasks.workunit.client.1.vm04.stdout:4/268: unlink d4/d14/d1b/l29 0 2026-03-10T14:07:30.803 INFO:tasks.workunit.client.1.vm04.stdout:4/269: stat d4/df/d34 0 2026-03-10T14:07:30.803 
INFO:tasks.workunit.client.1.vm04.stdout:4/270: chown d4/df/d22/f37 5680978 1 2026-03-10T14:07:30.808 INFO:tasks.workunit.client.1.vm04.stdout:4/271: mkdir d4/d14/d3c/d62 0 2026-03-10T14:07:30.814 INFO:tasks.workunit.client.1.vm04.stdout:6/248: dread d3/d8/fc [0,4194304] 0 2026-03-10T14:07:30.814 INFO:tasks.workunit.client.1.vm04.stdout:6/249: chown d3/de/d35/d3f/d2d/l26 468 1 2026-03-10T14:07:30.820 INFO:tasks.workunit.client.1.vm04.stdout:4/272: rename d4/d14/d1b/c35 to d4/df/c63 0 2026-03-10T14:07:30.830 INFO:tasks.workunit.client.1.vm04.stdout:6/250: creat d3/d1d/f44 x:0 0 0 2026-03-10T14:07:30.837 INFO:tasks.workunit.client.1.vm04.stdout:6/251: rename d3/de/d35/d3f/d2d to d3/de/d35/d3f/d2d/d32/d23/d24/d45 22 2026-03-10T14:07:30.837 INFO:tasks.workunit.client.1.vm04.stdout:6/252: read d3/de/d35/d3f/d2d/f18 [3533500,84878] 0 2026-03-10T14:07:30.839 INFO:tasks.workunit.client.1.vm04.stdout:4/273: mkdir d4/d14/d64 0 2026-03-10T14:07:30.841 INFO:tasks.workunit.client.1.vm04.stdout:6/253: creat d3/de/f46 x:0 0 0 2026-03-10T14:07:30.852 INFO:tasks.workunit.client.1.vm04.stdout:6/254: truncate d3/de/d35/d3f/d2d/f18 1815515 0 2026-03-10T14:07:30.852 INFO:tasks.workunit.client.1.vm04.stdout:6/255: fsync d3/f4 0 2026-03-10T14:07:30.855 INFO:tasks.workunit.client.1.vm04.stdout:6/256: dwrite d3/f4 [0,4194304] 0 2026-03-10T14:07:30.873 INFO:tasks.workunit.client.1.vm04.stdout:6/257: getdents d3/de/d35/d3f/d2d 0 2026-03-10T14:07:30.968 INFO:tasks.workunit.client.1.vm04.stdout:3/284: write da/fe [2799293,761] 0 2026-03-10T14:07:30.968 INFO:tasks.workunit.client.1.vm04.stdout:3/285: stat da/dc/d3f 0 2026-03-10T14:07:30.972 INFO:tasks.workunit.client.1.vm04.stdout:3/286: rmdir da/dc/d47/d62 0 2026-03-10T14:07:30.976 INFO:tasks.workunit.client.1.vm04.stdout:3/287: link da/dc/d35/d52/f5a da/dc/d35/f64 0 2026-03-10T14:07:30.977 INFO:tasks.workunit.client.1.vm04.stdout:3/288: fdatasync da/fb 0 2026-03-10T14:07:30.977 INFO:tasks.workunit.client.1.vm04.stdout:3/289: readlink 
da/dc/d35/d37/l41 0 2026-03-10T14:07:30.978 INFO:tasks.workunit.client.1.vm04.stdout:3/290: chown da/dc/f1d 1072873 1 2026-03-10T14:07:30.982 INFO:tasks.workunit.client.1.vm04.stdout:3/291: fdatasync da/dc/d35/d52/f5a 0 2026-03-10T14:07:30.990 INFO:tasks.workunit.client.1.vm04.stdout:6/258: fsync d3/f4 0 2026-03-10T14:07:30.993 INFO:tasks.workunit.client.1.vm04.stdout:6/259: mkdir d3/de/d35/d3f/d2d/d32/d23/d47 0 2026-03-10T14:07:30.994 INFO:tasks.workunit.client.1.vm04.stdout:6/260: read - d3/de/d35/d3f/d2d/f2e zero size 2026-03-10T14:07:31.003 INFO:tasks.workunit.client.1.vm04.stdout:8/285: write d0/d3/dd/d11/d12/d51/f4f [14170,21317] 0 2026-03-10T14:07:31.004 INFO:tasks.workunit.client.1.vm04.stdout:8/286: stat d0/d3/dd/d11/d12/f47 0 2026-03-10T14:07:31.047 INFO:tasks.workunit.client.1.vm04.stdout:8/287: sync 2026-03-10T14:07:31.051 INFO:tasks.workunit.client.1.vm04.stdout:8/288: truncate d0/d3/dd/d11/d12/f2c 3427108 0 2026-03-10T14:07:31.053 INFO:tasks.workunit.client.1.vm04.stdout:8/289: link d0/d3/dd/d11/d12/c3e d0/d3/dd/d11/d3b/c58 0 2026-03-10T14:07:31.054 INFO:tasks.workunit.client.1.vm04.stdout:8/290: write d0/d3/dd/d11/d12/f1d [970198,110205] 0 2026-03-10T14:07:31.055 INFO:tasks.workunit.client.1.vm04.stdout:0/237: truncate d0/d1c/f2b 372808 0 2026-03-10T14:07:31.058 INFO:tasks.workunit.client.1.vm04.stdout:0/238: dwrite d0/d2/fa [0,4194304] 0 2026-03-10T14:07:31.062 INFO:tasks.workunit.client.1.vm04.stdout:8/291: symlink d0/d3/dd/d11/l59 0 2026-03-10T14:07:31.064 INFO:tasks.workunit.client.1.vm04.stdout:0/239: creat d0/d2/d15/d22/d38/f4a x:0 0 0 2026-03-10T14:07:31.065 INFO:tasks.workunit.client.1.vm04.stdout:8/292: rmdir d0/d3 39 2026-03-10T14:07:31.077 INFO:tasks.workunit.client.1.vm04.stdout:5/319: dwrite d7/d12/d2b/f46 [0,4194304] 0 2026-03-10T14:07:31.086 INFO:tasks.workunit.client.1.vm04.stdout:5/320: mknod d7/d2d/d32/c5e 0 2026-03-10T14:07:31.087 INFO:tasks.workunit.client.1.vm04.stdout:6/261: read d3/de/d35/d3f/f29 [556248,25531] 0 
2026-03-10T14:07:31.089 INFO:tasks.workunit.client.1.vm04.stdout:5/321: mknod d7/d12/d2b/d3e/c5f 0 2026-03-10T14:07:31.094 INFO:tasks.workunit.client.1.vm04.stdout:5/322: symlink d7/d12/d2b/d3e/l60 0 2026-03-10T14:07:31.095 INFO:tasks.workunit.client.1.vm04.stdout:5/323: stat d7/d2d/d32/f5b 0 2026-03-10T14:07:31.095 INFO:tasks.workunit.client.1.vm04.stdout:5/324: chown d7/d26/l39 17592595 1 2026-03-10T14:07:31.096 INFO:tasks.workunit.client.1.vm04.stdout:6/262: link d3/de/d35/d3f/d2d/c37 d3/de/d35/d3f/c48 0 2026-03-10T14:07:31.096 INFO:tasks.workunit.client.1.vm04.stdout:6/263: chown d3/de/f46 15 1 2026-03-10T14:07:31.099 INFO:tasks.workunit.client.1.vm04.stdout:5/325: rename d7/d12/f27 to d7/d12/d45/f61 0 2026-03-10T14:07:31.099 INFO:tasks.workunit.client.1.vm04.stdout:2/316: write d0/d14/d1b/f29 [1315708,6280] 0 2026-03-10T14:07:31.105 INFO:tasks.workunit.client.1.vm04.stdout:1/280: dwrite d3/d22/f2b [0,4194304] 0 2026-03-10T14:07:31.115 INFO:tasks.workunit.client.1.vm04.stdout:2/317: rename d0/d3/d4a/f4d to d0/d3/d8/d59/f67 0 2026-03-10T14:07:31.124 INFO:tasks.workunit.client.1.vm04.stdout:6/264: dread d3/de/d35/d3f/d2d/d32/f1f [0,4194304] 0 2026-03-10T14:07:31.125 INFO:tasks.workunit.client.1.vm04.stdout:6/265: write d3/d1d/f44 [906282,22674] 0 2026-03-10T14:07:31.125 INFO:tasks.workunit.client.1.vm04.stdout:6/266: fsync d3/de/d35/d3f/f3b 0 2026-03-10T14:07:31.126 INFO:tasks.workunit.client.1.vm04.stdout:4/274: getdents d4/df 0 2026-03-10T14:07:31.126 INFO:tasks.workunit.client.1.vm04.stdout:4/275: chown d4/df/d22/d47 1253095 1 2026-03-10T14:07:31.133 INFO:tasks.workunit.client.1.vm04.stdout:4/276: dwrite d4/f5f [0,4194304] 0 2026-03-10T14:07:31.138 INFO:tasks.workunit.client.1.vm04.stdout:2/318: dread d0/d3/f24 [0,4194304] 0 2026-03-10T14:07:31.155 INFO:tasks.workunit.client.1.vm04.stdout:9/281: dread d9/da/f1e [0,4194304] 0 2026-03-10T14:07:31.157 INFO:tasks.workunit.client.1.vm04.stdout:6/267: dwrite d3/f19 [0,4194304] 0 2026-03-10T14:07:31.160 
INFO:tasks.workunit.client.1.vm04.stdout:6/268: write d3/de/d35/d3f/d2d/d32/d23/f33 [453205,51709] 0 2026-03-10T14:07:31.166 INFO:tasks.workunit.client.1.vm04.stdout:1/281: creat d3/d22/d63/f65 x:0 0 0 2026-03-10T14:07:31.168 INFO:tasks.workunit.client.1.vm04.stdout:3/292: write da/d30/f42 [3991806,28798] 0 2026-03-10T14:07:31.169 INFO:tasks.workunit.client.1.vm04.stdout:3/293: chown da/dc/d3f 132741 1 2026-03-10T14:07:31.171 INFO:tasks.workunit.client.1.vm04.stdout:9/282: unlink d9/f3e 0 2026-03-10T14:07:31.172 INFO:tasks.workunit.client.1.vm04.stdout:9/283: dread d9/da/f57 [0,4194304] 0 2026-03-10T14:07:31.177 INFO:tasks.workunit.client.1.vm04.stdout:1/282: mknod d3/d5/d13/d38/d58/c66 0 2026-03-10T14:07:31.177 INFO:tasks.workunit.client.1.vm04.stdout:1/283: write d3/d5/d13/f36 [813895,88755] 0 2026-03-10T14:07:31.180 INFO:tasks.workunit.client.1.vm04.stdout:7/309: write d2/dc/f4a [150723,51705] 0 2026-03-10T14:07:31.180 INFO:tasks.workunit.client.1.vm04.stdout:7/310: chown d2 39307 1 2026-03-10T14:07:31.181 INFO:tasks.workunit.client.1.vm04.stdout:7/311: truncate d2/d2a/d42/f4e 701023 0 2026-03-10T14:07:31.183 INFO:tasks.workunit.client.1.vm04.stdout:0/240: write d0/f1b [38202,109997] 0 2026-03-10T14:07:31.183 INFO:tasks.workunit.client.1.vm04.stdout:4/277: link d4/df/c4d d4/d14/d64/c65 0 2026-03-10T14:07:31.188 INFO:tasks.workunit.client.1.vm04.stdout:4/278: dwrite d4/f2c [0,4194304] 0 2026-03-10T14:07:31.200 INFO:tasks.workunit.client.1.vm04.stdout:7/312: write d2/f20 [1059401,6665] 0 2026-03-10T14:07:31.201 INFO:tasks.workunit.client.1.vm04.stdout:7/313: chown d2/d14/d3b/f5e 246811 1 2026-03-10T14:07:31.206 INFO:tasks.workunit.client.1.vm04.stdout:4/279: stat d4/d14/c51 0 2026-03-10T14:07:31.208 INFO:tasks.workunit.client.1.vm04.stdout:9/284: symlink d9/d5c/l60 0 2026-03-10T14:07:31.208 INFO:tasks.workunit.client.1.vm04.stdout:1/284: mknod d3/d5c/c67 0 2026-03-10T14:07:31.211 INFO:tasks.workunit.client.1.vm04.stdout:1/285: dread f1 [4194304,4194304] 0 
2026-03-10T14:07:31.211 INFO:tasks.workunit.client.1.vm04.stdout:1/286: write d3/d5/fd [2492120,26997] 0 2026-03-10T14:07:31.215 INFO:tasks.workunit.client.1.vm04.stdout:7/314: dwrite d2/d14/f15 [4194304,4194304] 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:7/315: read - d2/dc/de/d2d/d38/f57 zero size 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:3/294: creat da/f65 x:0 0 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:3/295: write da/dc/f2a [518716,124802] 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:3/296: stat da/c60 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:4/280: unlink d4/df/f59 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:1/287: write d3/fc [1936124,72150] 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:7/316: rmdir d2/d28 39 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:0/241: link d0/c23 d0/d2/c4b 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:3/297: mkdir da/dc/d3f/d54/d66 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:9/285: mknod d9/da/d5d/c61 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:3/298: dwrite da/d30/f42 [0,4194304] 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:1/288: creat d3/d22/d2f/d57/f68 x:0 0 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:9/286: creat d9/d58/f62 x:0 0 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:4/281: creat d4/df/f66 x:0 0 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:4/282: write d4/d14/d1b/f26 [4246780,57116] 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:9/287: creat d9/d5c/f63 x:0 0 0 2026-03-10T14:07:31.246 INFO:tasks.workunit.client.1.vm04.stdout:7/317: link d2/d28/f3d d2/dc/de/d21/f77 0 2026-03-10T14:07:31.249 INFO:tasks.workunit.client.1.vm04.stdout:1/289: rename d3/d5/d13/f36 to 
d3/d22/d63/f69 0 2026-03-10T14:07:31.251 INFO:tasks.workunit.client.1.vm04.stdout:9/288: dread d9/f20 [0,4194304] 0 2026-03-10T14:07:31.252 INFO:tasks.workunit.client.1.vm04.stdout:9/289: readlink d9/d1d/l2c 0 2026-03-10T14:07:31.257 INFO:tasks.workunit.client.1.vm04.stdout:9/290: mknod d9/d1d/c64 0 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:4/283: getdents d4/d14/d3c/d5e 0 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:4/284: chown d4/d14/d3c/f41 10111 1 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:9/291: rename d9/da/f1e to d9/da/dd/d1c/f65 0 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:4/285: write d4/df/f4e [4531557,70864] 0 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:4/286: chown d4/d14 152612 1 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:9/292: creat d9/d44/d4d/f66 x:0 0 0 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:9/293: dwrite d9/ff [0,4194304] 0 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:1/290: link d3/d5/d13/c4c d3/d20/c6a 0 2026-03-10T14:07:31.281 INFO:tasks.workunit.client.1.vm04.stdout:9/294: dread d9/ff [0,4194304] 0 2026-03-10T14:07:31.291 INFO:tasks.workunit.client.1.vm04.stdout:1/291: symlink d3/d22/l6b 0 2026-03-10T14:07:31.293 INFO:tasks.workunit.client.1.vm04.stdout:9/295: creat d9/d1d/f67 x:0 0 0 2026-03-10T14:07:31.349 INFO:tasks.workunit.client.1.vm04.stdout:4/287: sync 2026-03-10T14:07:31.353 INFO:tasks.workunit.client.1.vm04.stdout:4/288: rename d4/d14/f5a to d4/d14/d3c/f67 0 2026-03-10T14:07:31.354 INFO:tasks.workunit.client.1.vm04.stdout:4/289: fdatasync d4/d14/d3c/f46 0 2026-03-10T14:07:31.361 INFO:tasks.workunit.client.1.vm04.stdout:4/290: symlink d4/df/d22/l68 0 2026-03-10T14:07:31.367 INFO:tasks.workunit.client.1.vm04.stdout:1/292: fdatasync d3/d5/fd 0 2026-03-10T14:07:31.371 INFO:tasks.workunit.client.1.vm04.stdout:4/291: creat d4/df/d34/f69 x:0 0 0 
2026-03-10T14:07:31.375 INFO:tasks.workunit.client.1.vm04.stdout:8/293: dwrite d0/d3/dd/d11/d12/f50 [0,4194304] 0 2026-03-10T14:07:31.378 INFO:tasks.workunit.client.1.vm04.stdout:8/294: dread d0/f42 [0,4194304] 0 2026-03-10T14:07:31.386 INFO:tasks.workunit.client.1.vm04.stdout:1/293: mkdir d3/d22/d63/d35/d6c 0 2026-03-10T14:07:31.386 INFO:tasks.workunit.client.1.vm04.stdout:1/294: read - d3/d5/f56 zero size 2026-03-10T14:07:31.388 INFO:tasks.workunit.client.1.vm04.stdout:4/292: creat d4/df/d22/d47/d4f/f6a x:0 0 0 2026-03-10T14:07:31.389 INFO:tasks.workunit.client.1.vm04.stdout:4/293: dread - d4/df/d22/d47/d4f/f6a zero size 2026-03-10T14:07:31.399 INFO:tasks.workunit.client.1.vm04.stdout:1/295: mkdir d3/d22/d6d 0 2026-03-10T14:07:31.404 INFO:tasks.workunit.client.1.vm04.stdout:4/294: dwrite d4/df/d31/f3d [4194304,4194304] 0 2026-03-10T14:07:31.412 INFO:tasks.workunit.client.1.vm04.stdout:5/326: dwrite d7/f21 [0,4194304] 0 2026-03-10T14:07:31.413 INFO:tasks.workunit.client.1.vm04.stdout:8/295: symlink d0/d3/dd/d11/d29/l5a 0 2026-03-10T14:07:31.425 INFO:tasks.workunit.client.1.vm04.stdout:1/296: chown d3/f2c 50365 1 2026-03-10T14:07:31.429 INFO:tasks.workunit.client.1.vm04.stdout:1/297: dwrite d3/d5/fd [0,4194304] 0 2026-03-10T14:07:31.439 INFO:tasks.workunit.client.1.vm04.stdout:4/295: mknod d4/df/d34/c6b 0 2026-03-10T14:07:31.445 INFO:tasks.workunit.client.1.vm04.stdout:2/319: dwrite d0/d3/f1d [4194304,4194304] 0 2026-03-10T14:07:31.452 INFO:tasks.workunit.client.1.vm04.stdout:8/296: stat d0/d3/dd/d11/le 0 2026-03-10T14:07:31.453 INFO:tasks.workunit.client.1.vm04.stdout:8/297: truncate d0/d3/dd/d11/f3c 697371 0 2026-03-10T14:07:31.454 INFO:tasks.workunit.client.1.vm04.stdout:8/298: chown d0/d3/dd/d11/d12/d51/f4f 251 1 2026-03-10T14:07:31.458 INFO:tasks.workunit.client.1.vm04.stdout:8/299: dwrite d0/d3/dd/d11/d29/f52 [0,4194304] 0 2026-03-10T14:07:31.470 INFO:tasks.workunit.client.1.vm04.stdout:4/296: creat d4/d14/d1b/f6c x:0 0 0 2026-03-10T14:07:31.491 
INFO:tasks.workunit.client.1.vm04.stdout:2/320: rename d0/d14/d39/d47/c43 to d0/d14/d54/c68 0 2026-03-10T14:07:31.493 INFO:tasks.workunit.client.1.vm04.stdout:8/300: truncate d0/d3/dd/d11/f57 2468514 0 2026-03-10T14:07:31.496 INFO:tasks.workunit.client.1.vm04.stdout:4/297: mkdir d4/d14/d6d 0 2026-03-10T14:07:31.499 INFO:tasks.workunit.client.1.vm04.stdout:2/321: unlink d0/d3/d8/d17/d35/f49 0 2026-03-10T14:07:31.503 INFO:tasks.workunit.client.1.vm04.stdout:4/298: mkdir d4/d14/d3c/d6e 0 2026-03-10T14:07:31.512 INFO:tasks.workunit.client.1.vm04.stdout:3/299: truncate da/f25 1395877 0 2026-03-10T14:07:31.512 INFO:tasks.workunit.client.1.vm04.stdout:3/300: stat da/dc/d35 0 2026-03-10T14:07:31.515 INFO:tasks.workunit.client.1.vm04.stdout:0/242: mkdir d0/d2/d15/d22/d38/d4c 0 2026-03-10T14:07:31.517 INFO:tasks.workunit.client.1.vm04.stdout:4/299: mkdir d4/df/d34/d6f 0 2026-03-10T14:07:31.518 INFO:tasks.workunit.client.1.vm04.stdout:2/322: symlink d0/d14/l69 0 2026-03-10T14:07:31.520 INFO:tasks.workunit.client.1.vm04.stdout:8/301: creat d0/d3/dd/d11/f5b x:0 0 0 2026-03-10T14:07:31.526 INFO:tasks.workunit.client.1.vm04.stdout:7/318: dwrite d2/dc/de/d2d/d38/f5a [0,4194304] 0 2026-03-10T14:07:31.538 INFO:tasks.workunit.client.1.vm04.stdout:4/300: mkdir d4/df/d22/d47/d70 0 2026-03-10T14:07:31.542 INFO:tasks.workunit.client.1.vm04.stdout:4/301: dwrite d4/df/d31/f55 [0,4194304] 0 2026-03-10T14:07:31.545 INFO:tasks.workunit.client.1.vm04.stdout:4/302: dwrite d4/df/d22/f4b [0,4194304] 0 2026-03-10T14:07:31.553 INFO:tasks.workunit.client.1.vm04.stdout:4/303: dwrite d4/fe [0,4194304] 0 2026-03-10T14:07:31.557 INFO:tasks.workunit.client.1.vm04.stdout:2/323: mknod d0/d14/d39/d47/c6a 0 2026-03-10T14:07:31.562 INFO:tasks.workunit.client.1.vm04.stdout:5/327: dread f4 [0,4194304] 0 2026-03-10T14:07:31.562 INFO:tasks.workunit.client.1.vm04.stdout:5/328: write d7/d2d/d32/f3b [1855533,128600] 0 2026-03-10T14:07:31.566 INFO:tasks.workunit.client.1.vm04.stdout:2/324: dwrite d0/d3/d8/f42 
[4194304,4194304] 0 2026-03-10T14:07:31.567 INFO:tasks.workunit.client.1.vm04.stdout:8/302: rename d0/d3/dd/l39 to d0/d3/dd/d11/d12/l5c 0 2026-03-10T14:07:31.568 INFO:tasks.workunit.client.1.vm04.stdout:8/303: fsync d0/f23 0 2026-03-10T14:07:31.568 INFO:tasks.workunit.client.1.vm04.stdout:2/325: chown d0/d3 28 1 2026-03-10T14:07:31.572 INFO:tasks.workunit.client.1.vm04.stdout:7/319: creat d2/d14/f78 x:0 0 0 2026-03-10T14:07:31.583 INFO:tasks.workunit.client.1.vm04.stdout:4/304: unlink d4/d14/d1b/c32 0 2026-03-10T14:07:31.599 INFO:tasks.workunit.client.1.vm04.stdout:4/305: write d4/d14/d1b/f26 [2364501,88955] 0 2026-03-10T14:07:31.599 INFO:tasks.workunit.client.1.vm04.stdout:4/306: stat d4/f21 0 2026-03-10T14:07:31.599 INFO:tasks.workunit.client.1.vm04.stdout:4/307: dread - d4/df/d34/f69 zero size 2026-03-10T14:07:31.599 INFO:tasks.workunit.client.1.vm04.stdout:2/326: dwrite d0/d3/f9 [0,4194304] 0 2026-03-10T14:07:31.599 INFO:tasks.workunit.client.1.vm04.stdout:7/320: symlink d2/dc/de/d2d/d38/d50/l79 0 2026-03-10T14:07:31.599 INFO:tasks.workunit.client.1.vm04.stdout:0/243: creat d0/d2/f4d x:0 0 0 2026-03-10T14:07:31.599 INFO:tasks.workunit.client.1.vm04.stdout:7/321: fdatasync d2/d14/d3b/f49 0 2026-03-10T14:07:31.599 INFO:tasks.workunit.client.1.vm04.stdout:3/301: getdents da/dc/d47 0 2026-03-10T14:07:31.601 INFO:tasks.workunit.client.1.vm04.stdout:7/322: read d2/d14/d44/f51 [511814,60727] 0 2026-03-10T14:07:31.603 INFO:tasks.workunit.client.1.vm04.stdout:0/244: read - d0/d2/f34 zero size 2026-03-10T14:07:31.604 INFO:tasks.workunit.client.1.vm04.stdout:4/308: creat d4/d14/d64/f71 x:0 0 0 2026-03-10T14:07:31.605 INFO:tasks.workunit.client.1.vm04.stdout:5/329: creat d7/d12/d2b/d3e/f62 x:0 0 0 2026-03-10T14:07:31.608 INFO:tasks.workunit.client.1.vm04.stdout:5/330: dread d7/d12/d2b/f55 [0,4194304] 0 2026-03-10T14:07:31.608 INFO:tasks.workunit.client.1.vm04.stdout:5/331: fsync d7/f1d 0 2026-03-10T14:07:31.609 INFO:tasks.workunit.client.1.vm04.stdout:4/309: mknod 
d4/d14/d3c/d5e/c72 0 2026-03-10T14:07:31.614 INFO:tasks.workunit.client.1.vm04.stdout:7/323: dwrite f0 [0,4194304] 0 2026-03-10T14:07:31.641 INFO:tasks.workunit.client.1.vm04.stdout:7/324: read - d2/d14/d64/f65 zero size 2026-03-10T14:07:31.641 INFO:tasks.workunit.client.1.vm04.stdout:3/302: symlink da/dc/d3f/d61/l67 0 2026-03-10T14:07:31.641 INFO:tasks.workunit.client.1.vm04.stdout:5/332: unlink d7/d9/f3a 0 2026-03-10T14:07:31.641 INFO:tasks.workunit.client.1.vm04.stdout:7/325: creat d2/dc/de/d21/f7a x:0 0 0 2026-03-10T14:07:31.641 INFO:tasks.workunit.client.1.vm04.stdout:7/326: dread d2/dc/de/d21/f77 [4194304,4194304] 0 2026-03-10T14:07:31.641 INFO:tasks.workunit.client.1.vm04.stdout:7/327: readlink d2/d2a/d42/l54 0 2026-03-10T14:07:31.641 INFO:tasks.workunit.client.1.vm04.stdout:3/303: symlink da/dc/d35/d37/l68 0 2026-03-10T14:07:31.644 INFO:tasks.workunit.client.1.vm04.stdout:7/328: creat d2/d2a/d42/d56/f7b x:0 0 0 2026-03-10T14:07:31.645 INFO:tasks.workunit.client.1.vm04.stdout:7/329: write d2/dc/de/d2d/d38/f41 [1733135,120723] 0 2026-03-10T14:07:31.646 INFO:tasks.workunit.client.1.vm04.stdout:3/304: truncate da/dc/d35/d52/f5a 859172 0 2026-03-10T14:07:31.646 INFO:tasks.workunit.client.1.vm04.stdout:3/305: chown da/dc/d35/d37 59 1 2026-03-10T14:07:31.647 INFO:tasks.workunit.client.1.vm04.stdout:3/306: chown da/d30/f4e 3112 1 2026-03-10T14:07:31.647 INFO:tasks.workunit.client.1.vm04.stdout:3/307: stat c7 0 2026-03-10T14:07:31.648 INFO:tasks.workunit.client.1.vm04.stdout:5/333: mknod d7/c63 0 2026-03-10T14:07:31.649 INFO:tasks.workunit.client.1.vm04.stdout:5/334: chown d7/d12/d2b/f4d 430590 1 2026-03-10T14:07:31.651 INFO:tasks.workunit.client.1.vm04.stdout:7/330: rmdir d2/dc/de/d2d/d5c 39 2026-03-10T14:07:31.651 INFO:tasks.workunit.client.1.vm04.stdout:7/331: fdatasync d2/d14/d3b/f43 0 2026-03-10T14:07:31.653 INFO:tasks.workunit.client.1.vm04.stdout:5/335: write d7/d12/d2b/f55 [4702545,105648] 0 2026-03-10T14:07:31.654 
INFO:tasks.workunit.client.1.vm04.stdout:7/332: dread - d2/dc/de/d2d/d5c/f63 zero size 2026-03-10T14:07:31.657 INFO:tasks.workunit.client.1.vm04.stdout:5/336: creat d7/d2d/f64 x:0 0 0 2026-03-10T14:07:31.658 INFO:tasks.workunit.client.1.vm04.stdout:5/337: symlink d7/l65 0 2026-03-10T14:07:31.659 INFO:tasks.workunit.client.1.vm04.stdout:5/338: fsync d7/d2d/d32/d34/f4e 0 2026-03-10T14:07:31.673 INFO:tasks.workunit.client.1.vm04.stdout:5/339: symlink d7/d26/l66 0 2026-03-10T14:07:31.729 INFO:tasks.workunit.client.1.vm04.stdout:2/327: sync 2026-03-10T14:07:31.731 INFO:tasks.workunit.client.1.vm04.stdout:2/328: unlink d0/d3/l13 0 2026-03-10T14:07:31.733 INFO:tasks.workunit.client.1.vm04.stdout:2/329: creat d0/d14/f6b x:0 0 0 2026-03-10T14:07:31.734 INFO:tasks.workunit.client.1.vm04.stdout:2/330: chown d0/d3/d3a/d3e/f2d 1 1 2026-03-10T14:07:31.737 INFO:tasks.workunit.client.1.vm04.stdout:2/331: dwrite d0/d3/f1d [0,4194304] 0 2026-03-10T14:07:31.740 INFO:tasks.workunit.client.1.vm04.stdout:2/332: truncate d0/d3/d8/f6 1335712 0 2026-03-10T14:07:31.740 INFO:tasks.workunit.client.1.vm04.stdout:4/310: fsync d4/df/d31/f3d 0 2026-03-10T14:07:31.740 INFO:tasks.workunit.client.1.vm04.stdout:3/308: sync 2026-03-10T14:07:31.741 INFO:tasks.workunit.client.1.vm04.stdout:4/311: chown d4/d14/d1b/f6c 0 1 2026-03-10T14:07:31.742 INFO:tasks.workunit.client.1.vm04.stdout:3/309: creat da/dc/d35/d52/f69 x:0 0 0 2026-03-10T14:07:31.743 INFO:tasks.workunit.client.1.vm04.stdout:4/312: dread - d4/df/d22/f37 zero size 2026-03-10T14:07:31.749 INFO:tasks.workunit.client.1.vm04.stdout:2/333: dwrite d0/d3/f1d [4194304,4194304] 0 2026-03-10T14:07:31.752 INFO:tasks.workunit.client.1.vm04.stdout:3/310: getdents da/d30 0 2026-03-10T14:07:31.766 INFO:tasks.workunit.client.1.vm04.stdout:2/334: mknod d0/d3/d8/d17/d35/c6c 0 2026-03-10T14:07:31.767 INFO:tasks.workunit.client.1.vm04.stdout:2/335: write d0/d3/d3a/d3e/f61 [473850,29136] 0 2026-03-10T14:07:31.770 INFO:tasks.workunit.client.1.vm04.stdout:3/311: 
creat da/dc/d35/f6a x:0 0 0 2026-03-10T14:07:31.778 INFO:tasks.workunit.client.1.vm04.stdout:2/336: symlink d0/d3/d8/d59/l6d 0 2026-03-10T14:07:31.779 INFO:tasks.workunit.client.1.vm04.stdout:2/337: chown d0/d3/f1d 14 1 2026-03-10T14:07:31.780 INFO:tasks.workunit.client.1.vm04.stdout:2/338: readlink d0/d3/d8/dd/d26/d46/d4b/d56/l5e 0 2026-03-10T14:07:31.783 INFO:tasks.workunit.client.1.vm04.stdout:2/339: dwrite d0/d3/f1d [4194304,4194304] 0 2026-03-10T14:07:31.785 INFO:tasks.workunit.client.1.vm04.stdout:6/269: truncate d3/de/d35/d3f/d2d/f18 831337 0 2026-03-10T14:07:31.786 INFO:tasks.workunit.client.1.vm04.stdout:6/270: write d3/de/d35/d3f/d2d/d32/d23/f31 [4855882,126] 0 2026-03-10T14:07:31.788 INFO:tasks.workunit.client.1.vm04.stdout:3/312: creat da/dc/d47/f6b x:0 0 0 2026-03-10T14:07:31.792 INFO:tasks.workunit.client.1.vm04.stdout:2/340: creat d0/d14/d39/f6e x:0 0 0 2026-03-10T14:07:31.804 INFO:tasks.workunit.client.1.vm04.stdout:2/341: write d0/d14/d1b/f55 [275694,56017] 0 2026-03-10T14:07:31.804 INFO:tasks.workunit.client.1.vm04.stdout:2/342: dwrite d0/d3/d8/d17/d35/f5f [0,4194304] 0 2026-03-10T14:07:31.804 INFO:tasks.workunit.client.1.vm04.stdout:4/313: dread d4/d14/d1b/f20 [0,4194304] 0 2026-03-10T14:07:31.804 INFO:tasks.workunit.client.1.vm04.stdout:4/314: write d4/d14/d1b/f5c [76575,49648] 0 2026-03-10T14:07:31.808 INFO:tasks.workunit.client.1.vm04.stdout:2/343: dwrite d0/d14/f6b [0,4194304] 0 2026-03-10T14:07:31.811 INFO:tasks.workunit.client.1.vm04.stdout:3/313: creat da/d30/f6c x:0 0 0 2026-03-10T14:07:31.819 INFO:tasks.workunit.client.1.vm04.stdout:6/271: symlink d3/l49 0 2026-03-10T14:07:31.823 INFO:tasks.workunit.client.1.vm04.stdout:4/315: readlink d4/df/l25 0 2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:2/344: readlink d0/d14/d1b/l48 0 2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:3/314: mkdir da/dc/d35/d52/d6d 0 2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:6/272: chown d3/de/d35/d3a 715 1 
2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:4/316: symlink d4/d14/l73 0 2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:3/315: fdatasync da/dc/d35/f64 0 2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:3/316: truncate da/dc/d35/f5b 670351 0 2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:6/273: creat d3/d1d/f4a x:0 0 0 2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:3/317: dwrite da/fd [0,4194304] 0 2026-03-10T14:07:31.856 INFO:tasks.workunit.client.1.vm04.stdout:4/317: mkdir d4/df/d22/d47/d70/d74 0 2026-03-10T14:07:31.860 INFO:tasks.workunit.client.1.vm04.stdout:3/318: creat da/dc/d35/d37/f6e x:0 0 0 2026-03-10T14:07:31.864 INFO:tasks.workunit.client.1.vm04.stdout:4/318: rmdir d4/d14/d3c/d6e 0 2026-03-10T14:07:31.866 INFO:tasks.workunit.client.1.vm04.stdout:4/319: symlink d4/df/d34/d6f/l75 0 2026-03-10T14:07:31.868 INFO:tasks.workunit.client.1.vm04.stdout:4/320: creat d4/df/d22/d47/d70/d74/f76 x:0 0 0 2026-03-10T14:07:31.886 INFO:tasks.workunit.client.1.vm04.stdout:2/345: dread d0/d3/d8/dd/d26/f36 [0,4194304] 0 2026-03-10T14:07:31.886 INFO:tasks.workunit.client.1.vm04.stdout:2/346: stat d0/d3/f11 0 2026-03-10T14:07:32.072 INFO:tasks.workunit.client.1.vm04.stdout:2/347: dread d0/d3/d8/dd/d26/f33 [0,4194304] 0 2026-03-10T14:07:32.073 INFO:tasks.workunit.client.1.vm04.stdout:2/348: chown d0/d3/d3a/d3e/f61 10900743 1 2026-03-10T14:07:32.075 INFO:tasks.workunit.client.1.vm04.stdout:2/349: dwrite d0/d14/f6b [0,4194304] 0 2026-03-10T14:07:32.077 INFO:tasks.workunit.client.1.vm04.stdout:2/350: truncate d0/d14/d1b/f32 4268194 0 2026-03-10T14:07:32.113 INFO:tasks.workunit.client.1.vm04.stdout:2/351: sync 2026-03-10T14:07:32.114 INFO:tasks.workunit.client.1.vm04.stdout:2/352: truncate d0/d14/f6b 5160761 0 2026-03-10T14:07:32.114 INFO:tasks.workunit.client.1.vm04.stdout:2/353: chown d0/d3/d8/d17/d35/l40 8333881 1 2026-03-10T14:07:32.127 INFO:tasks.workunit.client.1.vm04.stdout:1/298: write 
d3/f8 [383005,114975] 0 2026-03-10T14:07:32.128 INFO:tasks.workunit.client.1.vm04.stdout:1/299: chown f1 7832 1 2026-03-10T14:07:32.129 INFO:tasks.workunit.client.1.vm04.stdout:1/300: write d3/d5/f56 [893406,59476] 0 2026-03-10T14:07:32.157 INFO:tasks.workunit.client.1.vm04.stdout:8/304: dread d0/d3/dd/d11/d12/f50 [0,4194304] 0 2026-03-10T14:07:32.162 INFO:tasks.workunit.client.1.vm04.stdout:7/333: rename d2/d14 to d2/dc/de/d2d/d60/d7c 0 2026-03-10T14:07:32.166 INFO:tasks.workunit.client.1.vm04.stdout:7/334: dwrite d2/dc/de/d2d/d60/d7c/f70 [0,4194304] 0 2026-03-10T14:07:32.182 INFO:tasks.workunit.client.1.vm04.stdout:7/335: unlink d2/dc/de/d21/c3e 0 2026-03-10T14:07:32.192 INFO:tasks.workunit.client.1.vm04.stdout:7/336: dwrite d2/dc/de/d2d/d60/d7c/d3b/f49 [0,4194304] 0 2026-03-10T14:07:32.193 INFO:tasks.workunit.client.1.vm04.stdout:9/296: write d9/da/dd/d1c/f65 [389501,111749] 0 2026-03-10T14:07:32.194 INFO:tasks.workunit.client.1.vm04.stdout:9/297: chown d9/d44/f51 0 1 2026-03-10T14:07:32.210 INFO:tasks.workunit.client.1.vm04.stdout:8/305: creat d0/d3/dd/f5d x:0 0 0 2026-03-10T14:07:32.211 INFO:tasks.workunit.client.1.vm04.stdout:5/340: rename d7/d2d/d32/c48 to d7/d12/d2b/d3e/c67 0 2026-03-10T14:07:32.214 INFO:tasks.workunit.client.1.vm04.stdout:5/341: dwrite d7/d12/d45/f5c [0,4194304] 0 2026-03-10T14:07:32.215 INFO:tasks.workunit.client.1.vm04.stdout:5/342: chown d7/d12/d2b/d3e 1 1 2026-03-10T14:07:32.215 INFO:tasks.workunit.client.1.vm04.stdout:5/343: dread - d7/d26/f54 zero size 2026-03-10T14:07:32.227 INFO:tasks.workunit.client.1.vm04.stdout:3/319: rename da/dc/f1a to da/dc/d35/d52/f6f 0 2026-03-10T14:07:32.234 INFO:tasks.workunit.client.1.vm04.stdout:3/320: mkdir da/dc/d35/d52/d70 0 2026-03-10T14:07:32.240 INFO:tasks.workunit.client.1.vm04.stdout:5/344: dwrite d7/d12/d2b/f4d [0,4194304] 0 2026-03-10T14:07:32.240 INFO:tasks.workunit.client.1.vm04.stdout:8/306: dwrite d0/d3/dd/d11/d12/f50 [4194304,4194304] 0 2026-03-10T14:07:32.242 
INFO:tasks.workunit.client.1.vm04.stdout:2/354: rename d0/d14/c19 to d0/d14/d1b/d45/c6f 0 2026-03-10T14:07:32.245 INFO:tasks.workunit.client.1.vm04.stdout:5/345: symlink d7/d9/l68 0 2026-03-10T14:07:32.246 INFO:tasks.workunit.client.1.vm04.stdout:3/321: dwrite da/dc/d35/f50 [0,4194304] 0 2026-03-10T14:07:32.246 INFO:tasks.workunit.client.1.vm04.stdout:2/355: write d0/d3/d8/f30 [609047,72025] 0 2026-03-10T14:07:32.246 INFO:tasks.workunit.client.1.vm04.stdout:5/346: write d7/d2d/d32/f5b [937462,58644] 0 2026-03-10T14:07:32.252 INFO:tasks.workunit.client.1.vm04.stdout:5/347: fdatasync d7/d9/fd 0 2026-03-10T14:07:32.261 INFO:tasks.workunit.client.1.vm04.stdout:5/348: stat d7 0 2026-03-10T14:07:32.265 INFO:tasks.workunit.client.1.vm04.stdout:8/307: symlink d0/l5e 0 2026-03-10T14:07:32.270 INFO:tasks.workunit.client.1.vm04.stdout:2/356: readlink d0/d3/d8/dd/d26/d46/d4b/l64 0 2026-03-10T14:07:32.271 INFO:tasks.workunit.client.1.vm04.stdout:2/357: fsync d0/d14/d1b/f32 0 2026-03-10T14:07:32.272 INFO:tasks.workunit.client.1.vm04.stdout:2/358: chown d0/d14/d1b/f29 0 1 2026-03-10T14:07:32.272 INFO:tasks.workunit.client.1.vm04.stdout:2/359: chown d0/d3/d8/d17/d4e 1203 1 2026-03-10T14:07:32.272 INFO:tasks.workunit.client.1.vm04.stdout:2/360: write d0/d3/f1d [6798988,23419] 0 2026-03-10T14:07:32.272 INFO:tasks.workunit.client.1.vm04.stdout:2/361: chown d0/d3/d8/l62 602783621 1 2026-03-10T14:07:32.284 INFO:tasks.workunit.client.1.vm04.stdout:5/349: mkdir d7/d2d/d69 0 2026-03-10T14:07:32.285 INFO:tasks.workunit.client.1.vm04.stdout:5/350: truncate d7/d9/f20 2941862 0 2026-03-10T14:07:32.289 INFO:tasks.workunit.client.1.vm04.stdout:5/351: dwrite d7/d12/d2b/f46 [0,4194304] 0 2026-03-10T14:07:32.295 INFO:tasks.workunit.client.1.vm04.stdout:5/352: chown d7/d9/l18 1665049 1 2026-03-10T14:07:32.295 INFO:tasks.workunit.client.1.vm04.stdout:5/353: chown d7/d12/d45/c5d 506448663 1 2026-03-10T14:07:32.299 INFO:tasks.workunit.client.1.vm04.stdout:8/308: creat d0/d3/dd/d11/f5f x:0 0 0 
2026-03-10T14:07:32.304 INFO:tasks.workunit.client.1.vm04.stdout:8/309: symlink d0/d3/d5/l60 0 2026-03-10T14:07:32.308 INFO:tasks.workunit.client.1.vm04.stdout:8/310: dwrite d0/d3/dd/d11/f5f [0,4194304] 0 2026-03-10T14:07:32.309 INFO:tasks.workunit.client.1.vm04.stdout:8/311: readlink d0/l3a 0 2026-03-10T14:07:32.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:32 vm04.local ceph-mon[55966]: pgmap v144: 65 pgs: 65 active+clean; 612 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 7.9 MiB/s rd, 75 MiB/s wr, 353 op/s 2026-03-10T14:07:32.314 INFO:tasks.workunit.client.1.vm04.stdout:8/312: dwrite d0/d3/d5/f30 [0,4194304] 0 2026-03-10T14:07:32.315 INFO:tasks.workunit.client.1.vm04.stdout:8/313: chown d0/d3/d5/f15 12653776 1 2026-03-10T14:07:32.320 INFO:tasks.workunit.client.1.vm04.stdout:2/362: dread d0/d3/d3a/d3e/f38 [0,4194304] 0 2026-03-10T14:07:32.322 INFO:tasks.workunit.client.1.vm04.stdout:8/314: creat d0/d3/dd/d11/d12/f61 x:0 0 0 2026-03-10T14:07:32.328 INFO:tasks.workunit.client.1.vm04.stdout:8/315: readlink d0/l3a 0 2026-03-10T14:07:32.328 INFO:tasks.workunit.client.1.vm04.stdout:5/354: getdents d7/d12 0 2026-03-10T14:07:32.328 INFO:tasks.workunit.client.1.vm04.stdout:2/363: chown d0/d3/l3f 0 1 2026-03-10T14:07:32.332 INFO:tasks.workunit.client.1.vm04.stdout:8/316: unlink d0/d3/dd/d11/d29/l2f 0 2026-03-10T14:07:32.341 INFO:tasks.workunit.client.1.vm04.stdout:2/364: mkdir d0/d14/d39/d47/d70 0 2026-03-10T14:07:32.344 INFO:tasks.workunit.client.1.vm04.stdout:8/317: symlink d0/d3/d5/l62 0 2026-03-10T14:07:32.348 INFO:tasks.workunit.client.1.vm04.stdout:2/365: dwrite d0/d3/d8/d59/f67 [4194304,4194304] 0 2026-03-10T14:07:32.349 INFO:tasks.workunit.client.1.vm04.stdout:2/366: write d0/d3/d4a/f57 [909823,64943] 0 2026-03-10T14:07:32.354 INFO:tasks.workunit.client.1.vm04.stdout:2/367: dwrite d0/d3/f11 [4194304,4194304] 0 2026-03-10T14:07:32.356 INFO:tasks.workunit.client.1.vm04.stdout:8/318: chown d0/d3/dd/c22 224895 1 2026-03-10T14:07:32.357 
INFO:tasks.workunit.client.1.vm04.stdout:2/368: write d0/d3/d8/dd/d26/f36 [4428345,98293] 0 2026-03-10T14:07:32.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:32 vm03.local ceph-mon[49718]: pgmap v144: 65 pgs: 65 active+clean; 612 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 7.9 MiB/s rd, 75 MiB/s wr, 353 op/s 2026-03-10T14:07:32.361 INFO:tasks.workunit.client.1.vm04.stdout:2/369: dread d0/d3/d8/f42 [4194304,4194304] 0 2026-03-10T14:07:32.364 INFO:tasks.workunit.client.1.vm04.stdout:2/370: dread d0/d3/f9 [0,4194304] 0 2026-03-10T14:07:32.370 INFO:tasks.workunit.client.1.vm04.stdout:2/371: dread d0/d14/d1b/f29 [0,4194304] 0 2026-03-10T14:07:32.371 INFO:tasks.workunit.client.1.vm04.stdout:2/372: write d0/d14/d1b/f29 [179150,45112] 0 2026-03-10T14:07:32.375 INFO:tasks.workunit.client.1.vm04.stdout:2/373: rmdir d0/d3/d8/dd/d26/d46/d4b 39 2026-03-10T14:07:32.382 INFO:tasks.workunit.client.1.vm04.stdout:2/374: dread d0/d3/d8/d17/f1f [0,4194304] 0 2026-03-10T14:07:32.388 INFO:tasks.workunit.client.1.vm04.stdout:2/375: symlink d0/d3/d8/dd/d26/l71 0 2026-03-10T14:07:32.397 INFO:tasks.workunit.client.1.vm04.stdout:2/376: chown d0/d3/l2c 1 1 2026-03-10T14:07:32.397 INFO:tasks.workunit.client.1.vm04.stdout:2/377: rename d0/d3/d3a/d3e/f2d to d0/d3/d4a/d66/f72 0 2026-03-10T14:07:32.402 INFO:tasks.workunit.client.1.vm04.stdout:3/322: dread da/d30/f32 [0,4194304] 0 2026-03-10T14:07:32.405 INFO:tasks.workunit.client.1.vm04.stdout:2/378: creat d0/d3/d8/d17/f73 x:0 0 0 2026-03-10T14:07:32.413 INFO:tasks.workunit.client.1.vm04.stdout:3/323: creat da/dc/d3f/d59/f71 x:0 0 0 2026-03-10T14:07:32.414 INFO:tasks.workunit.client.1.vm04.stdout:5/355: fdatasync d7/d12/d2b/f46 0 2026-03-10T14:07:32.414 INFO:tasks.workunit.client.1.vm04.stdout:5/356: dread - d7/d2d/f64 zero size 2026-03-10T14:07:32.416 INFO:tasks.workunit.client.1.vm04.stdout:2/379: creat d0/d14/d39/d47/d70/f74 x:0 0 0 2026-03-10T14:07:32.417 INFO:tasks.workunit.client.1.vm04.stdout:2/380: truncate d0/d14/d1b/f29 
2506709 0 2026-03-10T14:07:32.417 INFO:tasks.workunit.client.1.vm04.stdout:2/381: readlink d0/d3/d8/d17/d4e/l65 0 2026-03-10T14:07:32.418 INFO:tasks.workunit.client.1.vm04.stdout:3/324: creat da/dc/d35/d52/f72 x:0 0 0 2026-03-10T14:07:32.421 INFO:tasks.workunit.client.1.vm04.stdout:3/325: dwrite da/dc/f1d [4194304,4194304] 0 2026-03-10T14:07:32.440 INFO:tasks.workunit.client.1.vm04.stdout:2/382: rename d0/c15 to d0/d3/d8/dd/d26/d46/d4b/d56/c75 0 2026-03-10T14:07:32.441 INFO:tasks.workunit.client.1.vm04.stdout:2/383: chown d0/d14/d39/d47/f5d 7225293 1 2026-03-10T14:07:32.443 INFO:tasks.workunit.client.1.vm04.stdout:3/326: mknod da/dc/d3f/d54/d66/c73 0 2026-03-10T14:07:32.446 INFO:tasks.workunit.client.1.vm04.stdout:3/327: dwrite da/f10 [0,4194304] 0 2026-03-10T14:07:32.449 INFO:tasks.workunit.client.1.vm04.stdout:3/328: dwrite f4 [4194304,4194304] 0 2026-03-10T14:07:32.451 INFO:tasks.workunit.client.1.vm04.stdout:5/357: link d7/d12/c19 d7/d12/d2b/d3e/d57/c6a 0 2026-03-10T14:07:32.455 INFO:tasks.workunit.client.1.vm04.stdout:5/358: dwrite d7/d12/d45/f5c [0,4194304] 0 2026-03-10T14:07:32.455 INFO:tasks.workunit.client.1.vm04.stdout:5/359: fsync d7/d9/fd 0 2026-03-10T14:07:32.456 INFO:tasks.workunit.client.1.vm04.stdout:5/360: fsync d7/d26/f56 0 2026-03-10T14:07:32.458 INFO:tasks.workunit.client.1.vm04.stdout:2/384: unlink d0/d14/d39/d47/c6a 0 2026-03-10T14:07:32.458 INFO:tasks.workunit.client.1.vm04.stdout:2/385: chown d0/d3/d8/dd 239114 1 2026-03-10T14:07:32.458 INFO:tasks.workunit.client.1.vm04.stdout:2/386: fsync d0/d3/f11 0 2026-03-10T14:07:32.462 INFO:tasks.workunit.client.1.vm04.stdout:2/387: dread d0/d3/f24 [0,4194304] 0 2026-03-10T14:07:32.481 INFO:tasks.workunit.client.1.vm04.stdout:3/329: creat da/dc/d35/d52/d53/f74 x:0 0 0 2026-03-10T14:07:32.481 INFO:tasks.workunit.client.1.vm04.stdout:0/245: truncate d0/d2/f17 208616 0 2026-03-10T14:07:32.482 INFO:tasks.workunit.client.1.vm04.stdout:3/330: truncate da/dc/d35/f5b 816892 0 2026-03-10T14:07:32.482 
INFO:tasks.workunit.client.1.vm04.stdout:5/361: mkdir d7/d26/d6b 0 2026-03-10T14:07:32.487 INFO:tasks.workunit.client.1.vm04.stdout:2/388: write d0/d3/d8/f42 [8794933,9915] 0 2026-03-10T14:07:32.489 INFO:tasks.workunit.client.1.vm04.stdout:3/331: creat da/dc/d35/d52/d6d/f75 x:0 0 0 2026-03-10T14:07:32.491 INFO:tasks.workunit.client.1.vm04.stdout:3/332: dwrite f4 [0,4194304] 0 2026-03-10T14:07:32.493 INFO:tasks.workunit.client.1.vm04.stdout:5/362: rename d7/ce to d7/d2d/d32/c6c 0 2026-03-10T14:07:32.493 INFO:tasks.workunit.client.1.vm04.stdout:2/389: symlink d0/l76 0 2026-03-10T14:07:32.494 INFO:tasks.workunit.client.1.vm04.stdout:3/333: mknod da/dc/d3f/d54/c76 0 2026-03-10T14:07:32.496 INFO:tasks.workunit.client.1.vm04.stdout:5/363: fsync f4 0 2026-03-10T14:07:32.496 INFO:tasks.workunit.client.1.vm04.stdout:5/364: dread - d7/d2d/f64 zero size 2026-03-10T14:07:32.496 INFO:tasks.workunit.client.1.vm04.stdout:5/365: dread - d7/f3c zero size 2026-03-10T14:07:32.497 INFO:tasks.workunit.client.1.vm04.stdout:2/390: fdatasync d0/d3/d4a/d66/f72 0 2026-03-10T14:07:32.500 INFO:tasks.workunit.client.1.vm04.stdout:2/391: dwrite d0/d3/d3a/d3e/f61 [0,4194304] 0 2026-03-10T14:07:32.501 INFO:tasks.workunit.client.1.vm04.stdout:2/392: write d0/d3/d8/d17/d35/f5f [484702,2754] 0 2026-03-10T14:07:32.504 INFO:tasks.workunit.client.1.vm04.stdout:2/393: read d0/d3/d4a/f57 [538964,97575] 0 2026-03-10T14:07:32.504 INFO:tasks.workunit.client.1.vm04.stdout:2/394: chown d0/c5c 0 1 2026-03-10T14:07:32.515 INFO:tasks.workunit.client.1.vm04.stdout:2/395: symlink d0/d3/d8/dd/d26/l77 0 2026-03-10T14:07:32.516 INFO:tasks.workunit.client.1.vm04.stdout:2/396: creat d0/d3/d8/d17/d4e/f78 x:0 0 0 2026-03-10T14:07:32.517 INFO:tasks.workunit.client.1.vm04.stdout:5/366: creat d7/d2d/f6d x:0 0 0 2026-03-10T14:07:32.518 INFO:tasks.workunit.client.1.vm04.stdout:5/367: chown d7/f24 36 1 2026-03-10T14:07:32.535 INFO:tasks.workunit.client.1.vm04.stdout:3/334: truncate da/dc/d35/f64 90002 0 2026-03-10T14:07:32.536 
INFO:tasks.workunit.client.1.vm04.stdout:3/335: chown da/dc/d35/d52/d53 360 1 2026-03-10T14:07:32.539 INFO:tasks.workunit.client.1.vm04.stdout:3/336: dread da/dc/d35/f50 [0,4194304] 0 2026-03-10T14:07:32.541 INFO:tasks.workunit.client.1.vm04.stdout:3/337: creat da/dc/d35/d37/f77 x:0 0 0 2026-03-10T14:07:32.549 INFO:tasks.workunit.client.1.vm04.stdout:3/338: dread da/f22 [0,4194304] 0 2026-03-10T14:07:32.551 INFO:tasks.workunit.client.1.vm04.stdout:3/339: mkdir da/dc/d35/d52/d53/d78 0 2026-03-10T14:07:32.553 INFO:tasks.workunit.client.1.vm04.stdout:3/340: link da/dc/f1d da/dc/d35/d52/f79 0 2026-03-10T14:07:32.554 INFO:tasks.workunit.client.1.vm04.stdout:3/341: symlink da/dc/d3f/d59/l7a 0 2026-03-10T14:07:32.556 INFO:tasks.workunit.client.1.vm04.stdout:3/342: unlink da/dc/d3f/d54/d66/c73 0 2026-03-10T14:07:32.560 INFO:tasks.workunit.client.1.vm04.stdout:3/343: dwrite da/dc/d35/d52/f79 [4194304,4194304] 0 2026-03-10T14:07:32.562 INFO:tasks.workunit.client.1.vm04.stdout:3/344: chown da/dc/d35/d52/f69 1018962 1 2026-03-10T14:07:32.562 INFO:tasks.workunit.client.1.vm04.stdout:3/345: stat da/d30/f32 0 2026-03-10T14:07:32.607 INFO:tasks.workunit.client.1.vm04.stdout:5/368: sync 2026-03-10T14:07:32.610 INFO:tasks.workunit.client.1.vm04.stdout:5/369: dwrite d7/d12/d2b/d3e/d3f/f52 [0,4194304] 0 2026-03-10T14:07:32.626 INFO:tasks.workunit.client.1.vm04.stdout:5/370: mkdir d7/d26/d6b/d6e 0 2026-03-10T14:07:32.663 INFO:tasks.workunit.client.1.vm04.stdout:6/274: dwrite d3/de/d35/d3f/d2d/f2e [0,4194304] 0 2026-03-10T14:07:32.672 INFO:tasks.workunit.client.1.vm04.stdout:4/321: dwrite d4/d14/d3c/f3e [0,4194304] 0 2026-03-10T14:07:32.678 INFO:tasks.workunit.client.1.vm04.stdout:1/301: dwrite d3/d22/f3b [0,4194304] 0 2026-03-10T14:07:32.690 INFO:tasks.workunit.client.1.vm04.stdout:5/371: link d7/d2d/d32/l37 d7/d12/d2b/d3e/d57/l6f 0 2026-03-10T14:07:32.696 INFO:tasks.workunit.client.1.vm04.stdout:7/337: write d2/dc/de/d21/f45 [551886,12051] 0 2026-03-10T14:07:32.697 
INFO:tasks.workunit.client.1.vm04.stdout:4/322: creat d4/f77 x:0 0 0 2026-03-10T14:07:32.697 INFO:tasks.workunit.client.1.vm04.stdout:4/323: chown d4/d14/d1b 347 1 2026-03-10T14:07:32.698 INFO:tasks.workunit.client.1.vm04.stdout:4/324: chown d4/d14/d3c/d5e/l61 443969048 1 2026-03-10T14:07:32.699 INFO:tasks.workunit.client.1.vm04.stdout:1/302: symlink d3/d22/l6e 0 2026-03-10T14:07:32.703 INFO:tasks.workunit.client.1.vm04.stdout:6/275: creat d3/de/d35/d3a/d43/f4b x:0 0 0 2026-03-10T14:07:32.703 INFO:tasks.workunit.client.1.vm04.stdout:5/372: unlink d7/d2d/l50 0 2026-03-10T14:07:32.703 INFO:tasks.workunit.client.1.vm04.stdout:5/373: write d7/d9/f28 [1736886,130926] 0 2026-03-10T14:07:32.706 INFO:tasks.workunit.client.1.vm04.stdout:4/325: symlink d4/df/d34/l78 0 2026-03-10T14:07:32.707 INFO:tasks.workunit.client.1.vm04.stdout:6/276: truncate d3/f19 5066653 0 2026-03-10T14:07:32.708 INFO:tasks.workunit.client.1.vm04.stdout:5/374: creat d7/d26/f70 x:0 0 0 2026-03-10T14:07:32.709 INFO:tasks.workunit.client.1.vm04.stdout:5/375: dread - d7/d2d/f6d zero size 2026-03-10T14:07:32.711 INFO:tasks.workunit.client.1.vm04.stdout:4/326: creat d4/d14/d3c/d5e/f79 x:0 0 0 2026-03-10T14:07:32.712 INFO:tasks.workunit.client.1.vm04.stdout:6/277: mkdir d3/de/d35/d3a/d43/d4c 0 2026-03-10T14:07:32.715 INFO:tasks.workunit.client.1.vm04.stdout:5/376: mknod d7/d26/d6b/c71 0 2026-03-10T14:07:32.718 INFO:tasks.workunit.client.1.vm04.stdout:7/338: getdents d2/dc/de/d2d/d60/d7c 0 2026-03-10T14:07:32.718 INFO:tasks.workunit.client.1.vm04.stdout:8/319: dread d0/d3/dd/d11/f57 [0,4194304] 0 2026-03-10T14:07:32.731 INFO:tasks.workunit.client.1.vm04.stdout:5/377: creat d7/d12/d2b/f72 x:0 0 0 2026-03-10T14:07:32.732 INFO:tasks.workunit.client.1.vm04.stdout:5/378: write d7/f21 [1767934,76661] 0 2026-03-10T14:07:32.733 INFO:tasks.workunit.client.1.vm04.stdout:7/339: dread d2/dc/de/d2d/f39 [0,4194304] 0 2026-03-10T14:07:32.735 INFO:tasks.workunit.client.1.vm04.stdout:7/340: write d2/dc/de/d2d/d60/d7c/f58 
[5033965,41792] 0 2026-03-10T14:07:32.736 INFO:tasks.workunit.client.1.vm04.stdout:6/278: creat d3/de/d35/d3a/d43/d4c/f4d x:0 0 0 2026-03-10T14:07:32.737 INFO:tasks.workunit.client.1.vm04.stdout:6/279: fdatasync d3/de/d35/d3f/f3b 0 2026-03-10T14:07:32.739 INFO:tasks.workunit.client.1.vm04.stdout:6/280: dread d3/de/d35/d3f/d2d/f2e [0,4194304] 0 2026-03-10T14:07:32.742 INFO:tasks.workunit.client.1.vm04.stdout:4/327: mknod d4/d14/c7a 0 2026-03-10T14:07:32.743 INFO:tasks.workunit.client.1.vm04.stdout:4/328: dread d4/df/d31/f55 [0,4194304] 0 2026-03-10T14:07:32.744 INFO:tasks.workunit.client.1.vm04.stdout:4/329: fdatasync d4/f57 0 2026-03-10T14:07:32.747 INFO:tasks.workunit.client.1.vm04.stdout:7/341: mknod d2/dc/de/d2d/d38/d50/c7d 0 2026-03-10T14:07:32.748 INFO:tasks.workunit.client.1.vm04.stdout:1/303: link f2 d3/d22/d63/f6f 0 2026-03-10T14:07:32.751 INFO:tasks.workunit.client.1.vm04.stdout:4/330: creat d4/d14/d3c/d5e/f7b x:0 0 0 2026-03-10T14:07:32.752 INFO:tasks.workunit.client.1.vm04.stdout:7/342: creat d2/d28/f7e x:0 0 0 2026-03-10T14:07:32.753 INFO:tasks.workunit.client.1.vm04.stdout:7/343: write d2/dc/f4a [1781317,74101] 0 2026-03-10T14:07:32.757 INFO:tasks.workunit.client.1.vm04.stdout:7/344: mkdir d2/dc/d4d/d7f 0 2026-03-10T14:07:32.757 INFO:tasks.workunit.client.1.vm04.stdout:8/320: sync 2026-03-10T14:07:32.757 INFO:tasks.workunit.client.1.vm04.stdout:5/379: sync 2026-03-10T14:07:32.757 INFO:tasks.workunit.client.1.vm04.stdout:6/281: sync 2026-03-10T14:07:32.764 INFO:tasks.workunit.client.1.vm04.stdout:6/282: dread d3/de/d35/d3f/f17 [0,4194304] 0 2026-03-10T14:07:32.766 INFO:tasks.workunit.client.1.vm04.stdout:8/321: dwrite d0/d3/dd/d11/d12/d51/f4f [0,4194304] 0 2026-03-10T14:07:32.771 INFO:tasks.workunit.client.1.vm04.stdout:8/322: sync 2026-03-10T14:07:32.776 INFO:tasks.workunit.client.1.vm04.stdout:1/304: symlink d3/d22/d6d/l70 0 2026-03-10T14:07:32.776 INFO:tasks.workunit.client.1.vm04.stdout:1/305: chown d3/f18 104031 1 2026-03-10T14:07:32.782 
INFO:tasks.workunit.client.1.vm04.stdout:9/298: dwrite f5 [8388608,4194304] 0 2026-03-10T14:07:32.787 INFO:tasks.workunit.client.1.vm04.stdout:9/299: dread - d9/d33/f45 zero size 2026-03-10T14:07:32.789 INFO:tasks.workunit.client.1.vm04.stdout:6/283: mkdir d3/de/d35/d3f/d2d/d32/d23/d4e 0 2026-03-10T14:07:32.792 INFO:tasks.workunit.client.1.vm04.stdout:8/323: unlink d0/d3/dd/c27 0 2026-03-10T14:07:32.793 INFO:tasks.workunit.client.1.vm04.stdout:4/331: getdents d4/df/d34/d6f 0 2026-03-10T14:07:32.805 INFO:tasks.workunit.client.1.vm04.stdout:7/345: rename d2/dc/de/c5b to d2/dc/de/d2d/c80 0 2026-03-10T14:07:32.809 INFO:tasks.workunit.client.1.vm04.stdout:7/346: dwrite d2/dc/de/d2d/d60/d7c/f78 [0,4194304] 0 2026-03-10T14:07:32.812 INFO:tasks.workunit.client.1.vm04.stdout:1/306: dread d3/f2c [0,4194304] 0 2026-03-10T14:07:32.813 INFO:tasks.workunit.client.1.vm04.stdout:9/300: mknod d9/d44/d4d/c68 0 2026-03-10T14:07:32.817 INFO:tasks.workunit.client.1.vm04.stdout:6/284: mknod d3/de/d35/d3f/d2d/d32/d23/d24/c4f 0 2026-03-10T14:07:32.818 INFO:tasks.workunit.client.1.vm04.stdout:4/332: truncate d4/df/d34/f1f 1764056 0 2026-03-10T14:07:32.822 INFO:tasks.workunit.client.1.vm04.stdout:5/380: rename d7/d9/l18 to d7/d2d/d32/l73 0 2026-03-10T14:07:32.826 INFO:tasks.workunit.client.1.vm04.stdout:5/381: dwrite d7/d2d/d32/d34/f4e [0,4194304] 0 2026-03-10T14:07:32.826 INFO:tasks.workunit.client.1.vm04.stdout:5/382: chown d7/d9/l68 125 1 2026-03-10T14:07:32.836 INFO:tasks.workunit.client.1.vm04.stdout:7/347: mkdir d2/dc/de/d2d/d60/d81 0 2026-03-10T14:07:32.836 INFO:tasks.workunit.client.1.vm04.stdout:7/348: chown d2/dc/de/d2d/d60/d7c/d44 7 1 2026-03-10T14:07:32.838 INFO:tasks.workunit.client.1.vm04.stdout:1/307: creat d3/d5c/f71 x:0 0 0 2026-03-10T14:07:32.841 INFO:tasks.workunit.client.1.vm04.stdout:6/285: write d3/de/d35/d3f/f3d [1277010,57944] 0 2026-03-10T14:07:32.847 INFO:tasks.workunit.client.1.vm04.stdout:5/383: mkdir d7/d2d/d32/d34/d74 0 2026-03-10T14:07:32.847 
INFO:tasks.workunit.client.1.vm04.stdout:5/384: write d7/d12/f51 [1941832,31846] 0 2026-03-10T14:07:32.847 INFO:tasks.workunit.client.1.vm04.stdout:7/349: mknod d2/dc/de/d2d/d60/d7c/d36/c82 0 2026-03-10T14:07:32.847 INFO:tasks.workunit.client.1.vm04.stdout:7/350: chown d2/dc/d4d/d7f 36149 1 2026-03-10T14:07:32.850 INFO:tasks.workunit.client.1.vm04.stdout:6/286: creat d3/de/d35/d3f/d2d/d38/f50 x:0 0 0 2026-03-10T14:07:32.851 INFO:tasks.workunit.client.1.vm04.stdout:1/308: dwrite d3/d5/d13/d1a/f4b [0,4194304] 0 2026-03-10T14:07:32.861 INFO:tasks.workunit.client.1.vm04.stdout:7/351: creat d2/dc/de/d2d/d38/f83 x:0 0 0 2026-03-10T14:07:32.863 INFO:tasks.workunit.client.1.vm04.stdout:7/352: creat d2/dc/de/d2d/d60/d7c/f84 x:0 0 0 2026-03-10T14:07:32.864 INFO:tasks.workunit.client.1.vm04.stdout:7/353: dread - d2/d28/f7e zero size 2026-03-10T14:07:32.864 INFO:tasks.workunit.client.1.vm04.stdout:7/354: stat d2/dc/de/d2d/f39 0 2026-03-10T14:07:32.866 INFO:tasks.workunit.client.1.vm04.stdout:1/309: creat d3/d22/d63/f72 x:0 0 0 2026-03-10T14:07:32.867 INFO:tasks.workunit.client.1.vm04.stdout:7/355: symlink d2/dc/de/d2d/d60/d7c/l85 0 2026-03-10T14:07:32.883 INFO:tasks.workunit.client.1.vm04.stdout:7/356: fsync d2/d2a/f35 0 2026-03-10T14:07:32.883 INFO:tasks.workunit.client.1.vm04.stdout:7/357: fdatasync d2/dc/de/d2d/d60/d7c/f84 0 2026-03-10T14:07:32.883 INFO:tasks.workunit.client.1.vm04.stdout:7/358: mkdir d2/d2a/d42/d86 0 2026-03-10T14:07:32.906 INFO:tasks.workunit.client.1.vm04.stdout:7/359: dread d2/dc/f26 [0,4194304] 0 2026-03-10T14:07:32.907 INFO:tasks.workunit.client.1.vm04.stdout:1/310: dread d3/d20/f27 [0,4194304] 0 2026-03-10T14:07:32.910 INFO:tasks.workunit.client.1.vm04.stdout:1/311: creat d3/d22/d63/f73 x:0 0 0 2026-03-10T14:07:32.913 INFO:tasks.workunit.client.1.vm04.stdout:1/312: dwrite d3/d22/d63/f2d [0,4194304] 0 2026-03-10T14:07:32.931 INFO:tasks.workunit.client.1.vm04.stdout:1/313: dread d3/f14 [0,4194304] 0 2026-03-10T14:07:32.934 
INFO:tasks.workunit.client.1.vm04.stdout:1/314: link d3/c2e d3/d20/c74 0 2026-03-10T14:07:32.934 INFO:tasks.workunit.client.1.vm04.stdout:1/315: write d3/f8 [1298214,43628] 0 2026-03-10T14:07:32.937 INFO:tasks.workunit.client.1.vm04.stdout:1/316: unlink d3/d22/d2f/f3a 0 2026-03-10T14:07:32.940 INFO:tasks.workunit.client.1.vm04.stdout:1/317: dwrite d3/d22/d2f/f5d [0,4194304] 0 2026-03-10T14:07:32.943 INFO:tasks.workunit.client.1.vm04.stdout:1/318: symlink d3/d22/d63/d35/l75 0 2026-03-10T14:07:32.957 INFO:tasks.workunit.client.1.vm04.stdout:8/324: rename d0/d3/dd/d11 to d0/d3/d63 0 2026-03-10T14:07:32.959 INFO:tasks.workunit.client.1.vm04.stdout:8/325: truncate d0/d3/d63/d29/f45 5062520 0 2026-03-10T14:07:32.973 INFO:tasks.workunit.client.1.vm04.stdout:8/326: write d0/d3/d63/f5b [650273,117893] 0 2026-03-10T14:07:32.973 INFO:tasks.workunit.client.1.vm04.stdout:5/385: rename d7/d12/d2b/d3e/f62 to d7/d9/f75 0 2026-03-10T14:07:32.973 INFO:tasks.workunit.client.1.vm04.stdout:8/327: chown d0/d3/d63/c25 57790838 1 2026-03-10T14:07:32.973 INFO:tasks.workunit.client.1.vm04.stdout:1/319: rename d3/d5/d13/d1a/l4d to d3/d22/l76 0 2026-03-10T14:07:32.973 INFO:tasks.workunit.client.1.vm04.stdout:8/328: rename d0/d3/d63/d29/f52 to d0/d3/d63/d12/d51/f64 0 2026-03-10T14:07:32.973 INFO:tasks.workunit.client.1.vm04.stdout:8/329: chown d0/d3/dd/f5d 54395782 1 2026-03-10T14:07:32.973 INFO:tasks.workunit.client.1.vm04.stdout:1/320: write d3/d20/f27 [2801870,5284] 0 2026-03-10T14:07:32.973 INFO:tasks.workunit.client.1.vm04.stdout:8/330: truncate d0/d3/d63/d12/f61 138211 0 2026-03-10T14:07:32.974 INFO:tasks.workunit.client.1.vm04.stdout:5/386: mkdir d7/d2d/d76 0 2026-03-10T14:07:32.978 INFO:tasks.workunit.client.1.vm04.stdout:5/387: dwrite d7/d12/d2b/f53 [0,4194304] 0 2026-03-10T14:07:32.982 INFO:tasks.workunit.client.1.vm04.stdout:5/388: dwrite d7/d12/d2b/d3e/f4a [0,4194304] 0 2026-03-10T14:07:32.994 INFO:tasks.workunit.client.1.vm04.stdout:1/321: dread d3/d5/fd [0,4194304] 0 
2026-03-10T14:07:32.998 INFO:tasks.workunit.client.1.vm04.stdout:1/322: dwrite d3/d22/f2b [0,4194304] 0 2026-03-10T14:07:33.001 INFO:tasks.workunit.client.1.vm04.stdout:8/331: write d0/d3/d63/d12/f47 [925128,125974] 0 2026-03-10T14:07:33.006 INFO:tasks.workunit.client.1.vm04.stdout:1/323: rmdir d3 39 2026-03-10T14:07:33.008 INFO:tasks.workunit.client.1.vm04.stdout:5/389: mkdir d7/d12/d2b/d3e/d57/d77 0 2026-03-10T14:07:33.011 INFO:tasks.workunit.client.1.vm04.stdout:8/332: symlink d0/d3/dd/d33/l65 0 2026-03-10T14:07:33.012 INFO:tasks.workunit.client.1.vm04.stdout:8/333: write d0/d3/d63/f5f [2939073,3131] 0 2026-03-10T14:07:33.013 INFO:tasks.workunit.client.1.vm04.stdout:5/390: symlink d7/d12/d2b/d3e/d3f/l78 0 2026-03-10T14:07:33.015 INFO:tasks.workunit.client.1.vm04.stdout:8/334: creat d0/d3/d5/f66 x:0 0 0 2026-03-10T14:07:33.016 INFO:tasks.workunit.client.1.vm04.stdout:8/335: read d0/d3/d63/f5f [4172341,66082] 0 2026-03-10T14:07:33.018 INFO:tasks.workunit.client.1.vm04.stdout:5/391: symlink d7/d59/l79 0 2026-03-10T14:07:33.019 INFO:tasks.workunit.client.1.vm04.stdout:1/324: creat d3/d22/d63/f77 x:0 0 0 2026-03-10T14:07:33.021 INFO:tasks.workunit.client.1.vm04.stdout:5/392: fdatasync d7/f11 0 2026-03-10T14:07:33.026 INFO:tasks.workunit.client.1.vm04.stdout:5/393: truncate f4 2312906 0 2026-03-10T14:07:33.027 INFO:tasks.workunit.client.1.vm04.stdout:5/394: dread - d7/d2d/f64 zero size 2026-03-10T14:07:33.033 INFO:tasks.workunit.client.1.vm04.stdout:5/395: mknod d7/d2d/d69/c7a 0 2026-03-10T14:07:33.039 INFO:tasks.workunit.client.1.vm04.stdout:5/396: stat d7/d12/d2b/d3e/l60 0 2026-03-10T14:07:33.039 INFO:tasks.workunit.client.1.vm04.stdout:5/397: chown d7/f24 55032 1 2026-03-10T14:07:33.039 INFO:tasks.workunit.client.1.vm04.stdout:5/398: chown d7/d12/d2b/d3e/l60 7878403 1 2026-03-10T14:07:33.039 INFO:tasks.workunit.client.1.vm04.stdout:5/399: symlink d7/d12/d2b/d3e/d57/l7b 0 2026-03-10T14:07:33.039 INFO:tasks.workunit.client.1.vm04.stdout:5/400: chown d7/d26/f54 273342 
1 2026-03-10T14:07:33.041 INFO:tasks.workunit.client.1.vm04.stdout:8/336: dread d0/d3/d63/d12/f2c [0,4194304] 0 2026-03-10T14:07:33.046 INFO:tasks.workunit.client.1.vm04.stdout:5/401: mknod d7/d12/d2b/d3e/d3f/c7c 0 2026-03-10T14:07:33.049 INFO:tasks.workunit.client.1.vm04.stdout:8/337: write d0/d3/d63/d29/f2b [814597,119822] 0 2026-03-10T14:07:33.050 INFO:tasks.workunit.client.1.vm04.stdout:5/402: mkdir d7/d59/d7d 0 2026-03-10T14:07:33.051 INFO:tasks.workunit.client.1.vm04.stdout:8/338: mkdir d0/d3/d63/d12/d51/d67 0 2026-03-10T14:07:33.054 INFO:tasks.workunit.client.1.vm04.stdout:5/403: rename d7/d2d/d32/d34/d74 to d7/d59/d7e 0 2026-03-10T14:07:33.059 INFO:tasks.workunit.client.1.vm04.stdout:8/339: rename d0/d3/c38 to d0/d3/d63/c68 0 2026-03-10T14:07:33.059 INFO:tasks.workunit.client.1.vm04.stdout:8/340: chown d0/l5e 381872 1 2026-03-10T14:07:33.062 INFO:tasks.workunit.client.1.vm04.stdout:8/341: dread d0/d3/d63/f5f [0,4194304] 0 2026-03-10T14:07:33.086 INFO:tasks.workunit.client.1.vm04.stdout:2/397: stat d0/d3/d8/dd/d26/d46/d4b/d56/c75 0 2026-03-10T14:07:33.087 INFO:tasks.workunit.client.1.vm04.stdout:2/398: dread - d0/d3/d8/d17/d4e/f78 zero size 2026-03-10T14:07:33.087 INFO:tasks.workunit.client.1.vm04.stdout:2/399: dread - d0/d3/d8/d17/f73 zero size 2026-03-10T14:07:33.091 INFO:tasks.workunit.client.1.vm04.stdout:2/400: dwrite d0/d3/d8/d17/d4e/f78 [0,4194304] 0 2026-03-10T14:07:33.099 INFO:tasks.workunit.client.1.vm04.stdout:2/401: readlink d0/l10 0 2026-03-10T14:07:33.105 INFO:tasks.workunit.client.1.vm04.stdout:2/402: rename d0/d14/d39/f6e to d0/d14/f79 0 2026-03-10T14:07:33.107 INFO:tasks.workunit.client.1.vm04.stdout:0/246: dread d0/d2/f17 [0,4194304] 0 2026-03-10T14:07:33.111 INFO:tasks.workunit.client.1.vm04.stdout:0/247: unlink d0/f35 0 2026-03-10T14:07:33.113 INFO:tasks.workunit.client.1.vm04.stdout:0/248: write d0/d2/d15/d22/f30 [3051670,97196] 0 2026-03-10T14:07:33.115 INFO:tasks.workunit.client.1.vm04.stdout:0/249: mkdir d0/d2/d15/d22/d38/d4e 0 
2026-03-10T14:07:33.117 INFO:tasks.workunit.client.1.vm04.stdout:0/250: dread d0/d2/f9 [0,4194304] 0 2026-03-10T14:07:33.118 INFO:tasks.workunit.client.1.vm04.stdout:0/251: symlink d0/d2/d15/d22/d38/l4f 0 2026-03-10T14:07:33.126 INFO:tasks.workunit.client.1.vm04.stdout:0/252: rmdir d0/d2/d15/d22/d38/d4c 0 2026-03-10T14:07:33.133 INFO:tasks.workunit.client.1.vm04.stdout:0/253: rename d0/d2/d15/d22/d38/d4e to d0/d2/d15/d49/d50 0 2026-03-10T14:07:33.134 INFO:tasks.workunit.client.1.vm04.stdout:0/254: dread - d0/d2/d15/f2f zero size 2026-03-10T14:07:33.146 INFO:tasks.workunit.client.1.vm04.stdout:3/346: getdents da/dc/d3f/d59 0 2026-03-10T14:07:33.152 INFO:tasks.workunit.client.1.vm04.stdout:3/347: mknod da/dc/d35/d52/d53/c7b 0 2026-03-10T14:07:33.158 INFO:tasks.workunit.client.1.vm04.stdout:2/403: dread d0/d3/d8/f6 [0,4194304] 0 2026-03-10T14:07:33.160 INFO:tasks.workunit.client.1.vm04.stdout:2/404: dread d0/d14/f6b [0,4194304] 0 2026-03-10T14:07:33.177 INFO:tasks.workunit.client.1.vm04.stdout:0/255: sync 2026-03-10T14:07:33.181 INFO:tasks.workunit.client.1.vm04.stdout:6/287: fsync d3/de/d35/d3f/f22 0 2026-03-10T14:07:33.188 INFO:tasks.workunit.client.1.vm04.stdout:2/405: symlink d0/d3/d8/dd/d26/d46/l7a 0 2026-03-10T14:07:33.194 INFO:tasks.workunit.client.1.vm04.stdout:0/256: creat d0/d1c/f51 x:0 0 0 2026-03-10T14:07:33.195 INFO:tasks.workunit.client.1.vm04.stdout:6/288: write d3/de/d35/d3f/d2d/d32/d23/f2f [4819421,28882] 0 2026-03-10T14:07:33.198 INFO:tasks.workunit.client.1.vm04.stdout:0/257: mknod d0/d2/d15/d22/c52 0 2026-03-10T14:07:33.204 INFO:tasks.workunit.client.1.vm04.stdout:6/289: rename d3/de/d35/d3f/f3d to d3/de/d35/d3a/f51 0 2026-03-10T14:07:33.216 INFO:tasks.workunit.client.1.vm04.stdout:0/258: dwrite d0/d2/f9 [0,4194304] 0 2026-03-10T14:07:33.283 INFO:tasks.workunit.client.1.vm04.stdout:4/333: dwrite d4/d14/d1b/f28 [0,4194304] 0 2026-03-10T14:07:33.284 INFO:tasks.workunit.client.1.vm04.stdout:4/334: chown d4/df/d22/d47/d70/d74 320 1 
2026-03-10T14:07:33.286 INFO:tasks.workunit.client.1.vm04.stdout:4/335: dread d4/f9 [0,4194304] 0 2026-03-10T14:07:33.304 INFO:tasks.workunit.client.1.vm04.stdout:9/301: truncate d9/d44/f49 407752 0 2026-03-10T14:07:33.307 INFO:tasks.workunit.client.1.vm04.stdout:9/302: symlink d9/da/d5d/l69 0 2026-03-10T14:07:33.340 INFO:tasks.workunit.client.1.vm04.stdout:7/360: truncate d2/dc/de/d21/f45 3048208 0 2026-03-10T14:07:33.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:33 vm03.local ceph-mon[49718]: pgmap v145: 65 pgs: 65 active+clean; 692 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 12 MiB/s rd, 90 MiB/s wr, 411 op/s 2026-03-10T14:07:33.408 INFO:tasks.workunit.client.1.vm04.stdout:1/325: dwrite d3/f2c [0,4194304] 0 2026-03-10T14:07:33.424 INFO:tasks.workunit.client.1.vm04.stdout:1/326: getdents d3/d5/d13/d38 0 2026-03-10T14:07:33.434 INFO:tasks.workunit.client.1.vm04.stdout:1/327: dread d3/f8 [0,4194304] 0 2026-03-10T14:07:33.443 INFO:tasks.workunit.client.1.vm04.stdout:1/328: truncate d3/d22/d2f/d57/f68 534186 0 2026-03-10T14:07:33.448 INFO:tasks.workunit.client.1.vm04.stdout:1/329: write d3/d22/d2f/d57/f68 [755989,14323] 0 2026-03-10T14:07:33.450 INFO:tasks.workunit.client.1.vm04.stdout:1/330: chown d3/d5/d13/l3f 27 1 2026-03-10T14:07:33.458 INFO:tasks.workunit.client.1.vm04.stdout:1/331: creat d3/d22/d2f/d57/f78 x:0 0 0 2026-03-10T14:07:33.473 INFO:tasks.workunit.client.1.vm04.stdout:5/404: rmdir d7/d2d 39 2026-03-10T14:07:33.477 INFO:tasks.workunit.client.1.vm04.stdout:5/405: rename d7/d12/d2b/d3e/d3f/c41 to d7/d26/d6b/c7f 0 2026-03-10T14:07:33.482 INFO:tasks.workunit.client.1.vm04.stdout:8/342: getdents d0/d3 0 2026-03-10T14:07:33.508 INFO:tasks.workunit.client.1.vm04.stdout:8/343: dread d0/d3/d63/d29/f45 [0,4194304] 0 2026-03-10T14:07:33.512 INFO:tasks.workunit.client.1.vm04.stdout:8/344: rename d0/d3/d63/d3b to d0/d3/d63/d12/d69 0 2026-03-10T14:07:33.512 INFO:tasks.workunit.client.1.vm04.stdout:8/345: write d0/f23 [4259542,84768] 0 
2026-03-10T14:07:33.515 INFO:tasks.workunit.client.1.vm04.stdout:8/346: rename d0/d3/d63/d12/d69/c58 to d0/d3/d63/d12/d69/c6a 0 2026-03-10T14:07:33.566 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:33 vm04.local ceph-mon[55966]: pgmap v145: 65 pgs: 65 active+clean; 692 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 12 MiB/s rd, 90 MiB/s wr, 411 op/s 2026-03-10T14:07:33.637 INFO:tasks.workunit.client.1.vm04.stdout:3/348: dwrite da/dc/d35/d52/f6f [0,4194304] 0 2026-03-10T14:07:33.647 INFO:tasks.workunit.client.1.vm04.stdout:3/349: dread da/dc/f2a [0,4194304] 0 2026-03-10T14:07:33.648 INFO:tasks.workunit.client.1.vm04.stdout:3/350: truncate da/d3e/f63 496628 0 2026-03-10T14:07:33.649 INFO:tasks.workunit.client.1.vm04.stdout:3/351: stat da/dc/c21 0 2026-03-10T14:07:33.675 INFO:tasks.workunit.client.1.vm04.stdout:3/352: link da/dc/d35/d37/l68 da/dc/d3f/d54/l7c 0 2026-03-10T14:07:33.675 INFO:tasks.workunit.client.1.vm04.stdout:2/406: write d0/d3/f24 [4538484,50220] 0 2026-03-10T14:07:33.680 INFO:tasks.workunit.client.1.vm04.stdout:2/407: rmdir d0/d3 39 2026-03-10T14:07:33.681 INFO:tasks.workunit.client.1.vm04.stdout:2/408: write d0/d3/d8/f42 [7634832,64724] 0 2026-03-10T14:07:33.693 INFO:tasks.workunit.client.1.vm04.stdout:2/409: creat d0/d3/d8/dd/d26/d46/d4b/d56/f7b x:0 0 0 2026-03-10T14:07:33.701 INFO:tasks.workunit.client.1.vm04.stdout:3/353: link da/d3e/c51 da/dc/d3f/d59/c7d 0 2026-03-10T14:07:33.702 INFO:tasks.workunit.client.1.vm04.stdout:2/410: creat d0/d3/d4a/f7c x:0 0 0 2026-03-10T14:07:33.702 INFO:tasks.workunit.client.1.vm04.stdout:3/354: chown da/dc/d35/d52/d53/c7b 18 1 2026-03-10T14:07:33.704 INFO:tasks.workunit.client.1.vm04.stdout:2/411: read d0/d14/d1b/f29 [2496237,97411] 0 2026-03-10T14:07:33.728 INFO:tasks.workunit.client.1.vm04.stdout:6/290: getdents d3/de/d35/d3a 0 2026-03-10T14:07:33.734 INFO:tasks.workunit.client.1.vm04.stdout:3/355: write da/dc/f2a [749735,77959] 0 2026-03-10T14:07:33.735 INFO:tasks.workunit.client.1.vm04.stdout:0/259: 
write d0/d2/fd [1419629,94062] 0 2026-03-10T14:07:33.739 INFO:tasks.workunit.client.1.vm04.stdout:3/356: creat da/dc/d3f/f7e x:0 0 0 2026-03-10T14:07:33.739 INFO:tasks.workunit.client.1.vm04.stdout:3/357: stat da/d30 0 2026-03-10T14:07:33.744 INFO:tasks.workunit.client.1.vm04.stdout:6/291: read d3/de/d35/d3f/d2d/d32/f1f [2237541,24526] 0 2026-03-10T14:07:33.744 INFO:tasks.workunit.client.1.vm04.stdout:3/358: symlink da/dc/l7f 0 2026-03-10T14:07:33.745 INFO:tasks.workunit.client.1.vm04.stdout:3/359: chown da/f22 5606 1 2026-03-10T14:07:33.750 INFO:tasks.workunit.client.1.vm04.stdout:3/360: creat da/dc/d3f/d54/d66/f80 x:0 0 0 2026-03-10T14:07:33.758 INFO:tasks.workunit.client.1.vm04.stdout:3/361: symlink da/dc/d3f/l81 0 2026-03-10T14:07:33.875 INFO:tasks.workunit.client.1.vm04.stdout:2/412: dread d0/d3/d8/f30 [0,4194304] 0 2026-03-10T14:07:33.877 INFO:tasks.workunit.client.1.vm04.stdout:2/413: creat d0/d3/d4a/d66/f7d x:0 0 0 2026-03-10T14:07:33.919 INFO:tasks.workunit.client.1.vm04.stdout:4/336: dwrite d4/df/d31/f55 [0,4194304] 0 2026-03-10T14:07:33.921 INFO:tasks.workunit.client.1.vm04.stdout:4/337: write d4/f2c [726049,100757] 0 2026-03-10T14:07:33.922 INFO:tasks.workunit.client.1.vm04.stdout:4/338: creat d4/df/d34/f7c x:0 0 0 2026-03-10T14:07:33.931 INFO:tasks.workunit.client.1.vm04.stdout:4/339: mknod d4/df/d31/c7d 0 2026-03-10T14:07:33.931 INFO:tasks.workunit.client.1.vm04.stdout:4/340: rmdir d4/df/d22/d47/d4f 39 2026-03-10T14:07:33.941 INFO:tasks.workunit.client.1.vm04.stdout:4/341: rename d4/df/d22/d47/l58 to d4/df/d22/d47/d70/l7e 0 2026-03-10T14:07:33.951 INFO:tasks.workunit.client.1.vm04.stdout:4/342: rename d4/d14/d64/c65 to d4/d14/d3c/d5e/c7f 0 2026-03-10T14:07:33.951 INFO:tasks.workunit.client.1.vm04.stdout:4/343: stat d4/d14/d3c/d5e/f79 0 2026-03-10T14:07:33.952 INFO:tasks.workunit.client.1.vm04.stdout:4/344: unlink d4/d14/c51 0 2026-03-10T14:07:33.953 INFO:tasks.workunit.client.1.vm04.stdout:4/345: chown d4/df/d22/d47/d70 1128 1 2026-03-10T14:07:33.958 
INFO:tasks.workunit.client.1.vm04.stdout:4/346: symlink d4/df/l80 0 2026-03-10T14:07:33.959 INFO:tasks.workunit.client.1.vm04.stdout:4/347: chown d4/df/l1c 110 1 2026-03-10T14:07:33.968 INFO:tasks.workunit.client.1.vm04.stdout:4/348: dwrite d4/d14/d3c/d5e/f7b [0,4194304] 0 2026-03-10T14:07:33.969 INFO:tasks.workunit.client.1.vm04.stdout:4/349: chown d4/d14/d1b/f26 2644508 1 2026-03-10T14:07:34.029 INFO:tasks.workunit.client.1.vm04.stdout:4/350: dread d4/d14/d1b/f2f [0,4194304] 0 2026-03-10T14:07:34.068 INFO:tasks.workunit.client.1.vm04.stdout:9/303: dwrite d9/da/dd/f31 [0,4194304] 0 2026-03-10T14:07:34.083 INFO:tasks.workunit.client.1.vm04.stdout:9/304: rmdir d9/d1d 39 2026-03-10T14:07:34.111 INFO:tasks.workunit.client.1.vm04.stdout:2/414: sync 2026-03-10T14:07:34.111 INFO:tasks.workunit.client.1.vm04.stdout:4/351: sync 2026-03-10T14:07:34.112 INFO:tasks.workunit.client.1.vm04.stdout:4/352: write d4/f5f [4882801,24770] 0 2026-03-10T14:07:34.132 INFO:tasks.workunit.client.1.vm04.stdout:7/361: dwrite d2/d2a/d42/f4e [0,4194304] 0 2026-03-10T14:07:34.142 INFO:tasks.workunit.client.1.vm04.stdout:4/353: dread d4/d14/d1b/f26 [0,4194304] 0 2026-03-10T14:07:34.148 INFO:tasks.workunit.client.1.vm04.stdout:7/362: dread d2/d2a/f35 [0,4194304] 0 2026-03-10T14:07:34.154 INFO:tasks.workunit.client.1.vm04.stdout:7/363: dwrite d2/dc/de/d2d/d60/d7c/d64/f65 [0,4194304] 0 2026-03-10T14:07:34.161 INFO:tasks.workunit.client.1.vm04.stdout:7/364: creat d2/dc/de/d2d/d60/d7c/d36/f87 x:0 0 0 2026-03-10T14:07:34.162 INFO:tasks.workunit.client.1.vm04.stdout:7/365: chown d2/dc/de/d2d/d60/d7c/d3b 116815856 1 2026-03-10T14:07:34.164 INFO:tasks.workunit.client.1.vm04.stdout:7/366: fdatasync d2/dc/de/f31 0 2026-03-10T14:07:34.170 INFO:tasks.workunit.client.1.vm04.stdout:7/367: dwrite d2/d28/f7e [0,4194304] 0 2026-03-10T14:07:34.197 INFO:tasks.workunit.client.1.vm04.stdout:1/332: dwrite f1 [0,4194304] 0 2026-03-10T14:07:34.200 INFO:tasks.workunit.client.1.vm04.stdout:1/333: mkdir d3/d5c/d79 0 
2026-03-10T14:07:34.207 INFO:tasks.workunit.client.1.vm04.stdout:1/334: symlink d3/d22/d2f/d57/l7a 0 2026-03-10T14:07:34.230 INFO:tasks.workunit.client.1.vm04.stdout:5/406: write d7/d26/f30 [1143673,78755] 0 2026-03-10T14:07:34.233 INFO:tasks.workunit.client.1.vm04.stdout:5/407: mknod d7/d2d/d32/d34/c80 0 2026-03-10T14:07:34.235 INFO:tasks.workunit.client.1.vm04.stdout:5/408: creat d7/d26/d6b/d6e/f81 x:0 0 0 2026-03-10T14:07:34.236 INFO:tasks.workunit.client.1.vm04.stdout:5/409: write d7/d2d/d32/d34/f4e [1707952,76549] 0 2026-03-10T14:07:34.240 INFO:tasks.workunit.client.1.vm04.stdout:5/410: mkdir d7/d26/d6b/d6e/d82 0 2026-03-10T14:07:34.240 INFO:tasks.workunit.client.1.vm04.stdout:5/411: dread - d7/d9/f75 zero size 2026-03-10T14:07:34.253 INFO:tasks.workunit.client.1.vm04.stdout:5/412: getdents d7/d12/d2b/d3e/d57/d77 0 2026-03-10T14:07:34.269 INFO:tasks.workunit.client.1.vm04.stdout:5/413: dread d7/d9/f28 [0,4194304] 0 2026-03-10T14:07:34.294 INFO:tasks.workunit.client.1.vm04.stdout:8/347: dwrite d0/f42 [0,4194304] 0 2026-03-10T14:07:34.298 INFO:tasks.workunit.client.1.vm04.stdout:8/348: creat d0/d3/f6b x:0 0 0 2026-03-10T14:07:34.300 INFO:tasks.workunit.client.1.vm04.stdout:8/349: write d0/d3/d63/f57 [2229323,26789] 0 2026-03-10T14:07:34.301 INFO:tasks.workunit.client.1.vm04.stdout:8/350: write d0/f42 [1346097,119934] 0 2026-03-10T14:07:34.302 INFO:tasks.workunit.client.1.vm04.stdout:8/351: chown d0/d3/d63/l59 11 1 2026-03-10T14:07:34.308 INFO:tasks.workunit.client.1.vm04.stdout:8/352: dread d0/d3/d5/f15 [0,4194304] 0 2026-03-10T14:07:34.321 INFO:tasks.workunit.client.1.vm04.stdout:8/353: link d0/l5e d0/d3/d63/d12/d51/d67/l6c 0 2026-03-10T14:07:34.325 INFO:tasks.workunit.client.1.vm04.stdout:8/354: rename d0/d3/d63/d29/f4d to d0/d3/d5/f6d 0 2026-03-10T14:07:34.326 INFO:tasks.workunit.client.1.vm04.stdout:5/414: sync 2026-03-10T14:07:34.330 INFO:tasks.workunit.client.1.vm04.stdout:5/415: write d7/f24 [334646,7958] 0 2026-03-10T14:07:34.332 
INFO:tasks.workunit.client.1.vm04.stdout:8/355: dread d0/f42 [0,4194304] 0 2026-03-10T14:07:34.333 INFO:tasks.workunit.client.1.vm04.stdout:8/356: fdatasync d0/d3/d63/d12/f47 0 2026-03-10T14:07:34.340 INFO:tasks.workunit.client.1.vm04.stdout:8/357: creat d0/d3/d63/d12/d51/d67/f6e x:0 0 0 2026-03-10T14:07:34.342 INFO:tasks.workunit.client.1.vm04.stdout:5/416: read d7/d12/d2b/f4d [3956236,46463] 0 2026-03-10T14:07:34.361 INFO:tasks.workunit.client.1.vm04.stdout:8/358: sync 2026-03-10T14:07:34.361 INFO:tasks.workunit.client.1.vm04.stdout:8/359: stat d0/l4e 0 2026-03-10T14:07:34.405 INFO:tasks.workunit.client.1.vm04.stdout:3/362: dread da/dc/d35/d52/f5a [0,4194304] 0 2026-03-10T14:07:34.422 INFO:tasks.workunit.client.1.vm04.stdout:3/363: creat da/dc/d3f/d54/f82 x:0 0 0 2026-03-10T14:07:34.422 INFO:tasks.workunit.client.1.vm04.stdout:3/364: write da/dc/d35/f6a [239780,109761] 0 2026-03-10T14:07:34.456 INFO:tasks.workunit.client.1.vm04.stdout:3/365: rename da/d30/f6c to da/dc/d3f/f83 0 2026-03-10T14:07:34.460 INFO:tasks.workunit.client.1.vm04.stdout:3/366: symlink da/dc/d35/d52/d6d/l84 0 2026-03-10T14:07:34.468 INFO:tasks.workunit.client.1.vm04.stdout:3/367: link da/dc/d35/d37/f5e da/dc/d3f/f85 0 2026-03-10T14:07:34.475 INFO:tasks.workunit.client.1.vm04.stdout:3/368: truncate da/dc/d35/f64 1115949 0 2026-03-10T14:07:34.496 INFO:tasks.workunit.client.1.vm04.stdout:3/369: link da/dc/d3f/l4b da/dc/d47/l86 0 2026-03-10T14:07:34.500 INFO:tasks.workunit.client.1.vm04.stdout:3/370: write da/f14 [4081230,104640] 0 2026-03-10T14:07:34.507 INFO:tasks.workunit.client.1.vm04.stdout:3/371: mknod da/dc/d3f/d61/c87 0 2026-03-10T14:07:34.508 INFO:tasks.workunit.client.1.vm04.stdout:3/372: stat da/dc/d3f/d54/d66 0 2026-03-10T14:07:34.634 INFO:tasks.workunit.client.1.vm04.stdout:3/373: sync 2026-03-10T14:07:34.695 INFO:tasks.workunit.client.1.vm04.stdout:3/374: rename da/dc/d35/d37/c48 to da/dc/d35/d52/d70/c88 0 2026-03-10T14:07:34.701 INFO:tasks.workunit.client.1.vm04.stdout:0/260: 
dwrite d0/d2/f13 [0,4194304] 0 2026-03-10T14:07:34.710 INFO:tasks.workunit.client.1.vm04.stdout:0/261: dread d0/d1c/f2c [0,4194304] 0 2026-03-10T14:07:34.731 INFO:tasks.workunit.client.1.vm04.stdout:0/262: unlink d0/d2/c4b 0 2026-03-10T14:07:34.737 INFO:tasks.workunit.client.1.vm04.stdout:0/263: creat d0/d2/d15/d49/d50/f53 x:0 0 0 2026-03-10T14:07:34.743 INFO:tasks.workunit.client.1.vm04.stdout:0/264: read - d0/d2/d25/f3f zero size 2026-03-10T14:07:34.744 INFO:tasks.workunit.client.1.vm04.stdout:0/265: chown d0/d2/d15/f2f 0 1 2026-03-10T14:07:34.746 INFO:tasks.workunit.client.1.vm04.stdout:0/266: write d0/d2/f13 [4421989,47301] 0 2026-03-10T14:07:34.752 INFO:tasks.workunit.client.1.vm04.stdout:0/267: creat d0/d1c/f54 x:0 0 0 2026-03-10T14:07:34.753 INFO:tasks.workunit.client.1.vm04.stdout:0/268: dread - d0/d2/d15/d49/d50/f53 zero size 2026-03-10T14:07:34.756 INFO:tasks.workunit.client.1.vm04.stdout:0/269: creat d0/d2/d15/d49/d50/f55 x:0 0 0 2026-03-10T14:07:34.757 INFO:tasks.workunit.client.1.vm04.stdout:0/270: readlink d0/d2/d15/d22/l36 0 2026-03-10T14:07:34.765 INFO:tasks.workunit.client.1.vm04.stdout:0/271: dwrite d0/d2/fa [4194304,4194304] 0 2026-03-10T14:07:34.788 INFO:tasks.workunit.client.1.vm04.stdout:0/272: chown d0/c23 12444465 1 2026-03-10T14:07:34.791 INFO:tasks.workunit.client.1.vm04.stdout:0/273: stat d0/d2/l39 0 2026-03-10T14:07:34.938 INFO:tasks.workunit.client.1.vm04.stdout:9/305: truncate d9/da/dd/f31 2962033 0 2026-03-10T14:07:34.939 INFO:tasks.workunit.client.1.vm04.stdout:9/306: symlink d9/da/dd/l6a 0 2026-03-10T14:07:34.942 INFO:tasks.workunit.client.1.vm04.stdout:9/307: creat d9/da/f6b x:0 0 0 2026-03-10T14:07:34.943 INFO:tasks.workunit.client.1.vm04.stdout:9/308: fsync f5 0 2026-03-10T14:07:34.943 INFO:tasks.workunit.client.1.vm04.stdout:9/309: chown d9/d44/d4d/c68 49742863 1 2026-03-10T14:07:34.945 INFO:tasks.workunit.client.1.vm04.stdout:9/310: creat d9/d5c/f6c x:0 0 0 2026-03-10T14:07:35.013 INFO:tasks.workunit.client.1.vm04.stdout:2/415: 
dwrite d0/d3/f9 [0,4194304] 0 2026-03-10T14:07:35.105 INFO:tasks.workunit.client.1.vm04.stdout:7/368: dwrite d2/dc/de/d2d/d60/d7c/d64/f65 [4194304,4194304] 0 2026-03-10T14:07:35.127 INFO:tasks.workunit.client.1.vm04.stdout:4/354: dwrite d4/fa [0,4194304] 0 2026-03-10T14:07:35.132 INFO:tasks.workunit.client.1.vm04.stdout:4/355: mknod d4/df/c81 0 2026-03-10T14:07:35.140 INFO:tasks.workunit.client.1.vm04.stdout:4/356: getdents d4/df/d22/d47/d4f 0 2026-03-10T14:07:35.303 INFO:tasks.workunit.client.1.vm04.stdout:7/369: sync 2026-03-10T14:07:35.305 INFO:tasks.workunit.client.1.vm04.stdout:7/370: symlink d2/dc/de/d2d/d60/d7c/d3b/l88 0 2026-03-10T14:07:35.315 INFO:tasks.workunit.client.1.vm04.stdout:7/371: dread d2/f4 [0,4194304] 0 2026-03-10T14:07:35.325 INFO:tasks.workunit.client.1.vm04.stdout:1/335: write d3/d22/d63/f69 [1411409,17756] 0 2026-03-10T14:07:35.338 INFO:tasks.workunit.client.1.vm04.stdout:7/372: dread d2/dc/de/d2d/d60/d7c/f70 [0,4194304] 0 2026-03-10T14:07:35.347 INFO:tasks.workunit.client.1.vm04.stdout:7/373: creat d2/dc/f89 x:0 0 0 2026-03-10T14:07:35.347 INFO:tasks.workunit.client.1.vm04.stdout:7/374: chown d2/dc/de/f73 3211 1 2026-03-10T14:07:35.373 INFO:tasks.workunit.client.1.vm04.stdout:8/360: getdents d0/d3/d63/d12/d51/d67 0 2026-03-10T14:07:35.376 INFO:tasks.workunit.client.1.vm04.stdout:8/361: unlink d0/d3/d63/c68 0 2026-03-10T14:07:35.379 INFO:tasks.workunit.client.1.vm04.stdout:8/362: rmdir d0/d3/dd/d33 39 2026-03-10T14:07:35.389 INFO:tasks.workunit.client.1.vm04.stdout:8/363: rename d0/d3/d5/c37 to d0/d3/d63/d12/d51/c6f 0 2026-03-10T14:07:35.476 INFO:tasks.workunit.client.1.vm04.stdout:8/364: sync 2026-03-10T14:07:35.478 INFO:tasks.workunit.client.1.vm04.stdout:8/365: write d0/d3/d63/d12/d51/d67/f6e [251970,4942] 0 2026-03-10T14:07:35.482 INFO:tasks.workunit.client.1.vm04.stdout:8/366: chown d0/d3 623110 1 2026-03-10T14:07:35.488 INFO:tasks.workunit.client.1.vm04.stdout:8/367: creat d0/d3/d5/f70 x:0 0 0 2026-03-10T14:07:35.494 
INFO:tasks.workunit.client.1.vm04.stdout:8/368: dread d0/d3/d63/f5b [0,4194304] 0 2026-03-10T14:07:35.495 INFO:tasks.workunit.client.1.vm04.stdout:8/369: chown d0/d3/d63/d12 3927 1 2026-03-10T14:07:35.495 INFO:tasks.workunit.client.1.vm04.stdout:8/370: chown d0/d3/d63/le 195971 1 2026-03-10T14:07:35.503 INFO:tasks.workunit.client.1.vm04.stdout:8/371: dwrite d0/d3/dd/f5d [0,4194304] 0 2026-03-10T14:07:35.511 INFO:tasks.workunit.client.1.vm04.stdout:8/372: link d0/d3/d63/d12/d51/f4f d0/d3/dd/d33/f71 0 2026-03-10T14:07:35.527 INFO:tasks.workunit.client.1.vm04.stdout:8/373: link d0/d3/d63/d12/l5c d0/d3/l72 0 2026-03-10T14:07:35.530 INFO:tasks.workunit.client.1.vm04.stdout:8/374: truncate d0/d3/d63/d29/f45 4446098 0 2026-03-10T14:07:35.577 INFO:tasks.workunit.client.1.vm04.stdout:6/292: write d3/ff [4794205,43714] 0 2026-03-10T14:07:35.583 INFO:tasks.workunit.client.1.vm04.stdout:3/375: read da/dc/d35/f64 [368169,95561] 0 2026-03-10T14:07:35.585 INFO:tasks.workunit.client.1.vm04.stdout:3/376: dread - da/dc/d35/d52/f69 zero size 2026-03-10T14:07:35.587 INFO:tasks.workunit.client.1.vm04.stdout:3/377: chown da/d30/f4e 866809 1 2026-03-10T14:07:35.592 INFO:tasks.workunit.client.1.vm04.stdout:1/336: truncate d3/d5/d13/f5a 3384761 0 2026-03-10T14:07:35.592 INFO:tasks.workunit.client.1.vm04.stdout:3/378: write da/dc/d35/d37/f77 [731298,69049] 0 2026-03-10T14:07:35.592 INFO:tasks.workunit.client.1.vm04.stdout:6/293: dwrite d3/de/d35/d3a/f51 [0,4194304] 0 2026-03-10T14:07:35.592 INFO:tasks.workunit.client.1.vm04.stdout:3/379: write da/dc/d47/f6b [268724,28340] 0 2026-03-10T14:07:35.596 INFO:tasks.workunit.client.1.vm04.stdout:6/294: mkdir d3/de/d35/d3a/d43/d52 0 2026-03-10T14:07:35.596 INFO:tasks.workunit.client.1.vm04.stdout:3/380: symlink da/dc/d3f/d61/l89 0 2026-03-10T14:07:35.601 INFO:tasks.workunit.client.1.vm04.stdout:6/295: write d3/f4 [4942005,116358] 0 2026-03-10T14:07:35.619 INFO:tasks.workunit.client.1.vm04.stdout:3/381: dread da/fd [0,4194304] 0 
2026-03-10T14:07:35.621 INFO:tasks.workunit.client.1.vm04.stdout:3/382: mkdir da/dc/d35/d52/d6d/d8a 0 2026-03-10T14:07:35.622 INFO:tasks.workunit.client.1.vm04.stdout:3/383: creat da/d3e/f8b x:0 0 0 2026-03-10T14:07:35.624 INFO:tasks.workunit.client.1.vm04.stdout:3/384: creat da/dc/d3f/d61/f8c x:0 0 0 2026-03-10T14:07:35.632 INFO:tasks.workunit.client.1.vm04.stdout:3/385: dread da/dc/f39 [0,4194304] 0 2026-03-10T14:07:35.763 INFO:tasks.workunit.client.1.vm04.stdout:1/337: sync 2026-03-10T14:07:35.763 INFO:tasks.workunit.client.1.vm04.stdout:3/386: sync 2026-03-10T14:07:35.768 INFO:tasks.workunit.client.1.vm04.stdout:1/338: chown d3/d22/d2f/f5d 1 1 2026-03-10T14:07:35.770 INFO:tasks.workunit.client.1.vm04.stdout:1/339: mkdir d3/d5/d13/d38/d58/d5b/d7b 0 2026-03-10T14:07:35.781 INFO:tasks.workunit.client.1.vm04.stdout:1/340: dwrite d3/d22/d2f/d57/f68 [0,4194304] 0 2026-03-10T14:07:35.812 INFO:tasks.workunit.client.1.vm04.stdout:3/387: sync 2026-03-10T14:07:35.841 INFO:tasks.workunit.client.1.vm04.stdout:3/388: getdents da/dc/d35 0 2026-03-10T14:07:35.847 INFO:tasks.workunit.client.1.vm04.stdout:3/389: mknod da/dc/d35/d52/d6d/d8a/c8d 0 2026-03-10T14:07:35.854 INFO:tasks.workunit.client.1.vm04.stdout:3/390: rename da/dc/d3f/d59 to da/d8e 0 2026-03-10T14:07:35.867 INFO:tasks.workunit.client.1.vm04.stdout:3/391: dwrite da/d30/f32 [0,4194304] 0 2026-03-10T14:07:35.875 INFO:tasks.workunit.client.1.vm04.stdout:3/392: dwrite da/dc/d3f/d61/f8c [0,4194304] 0 2026-03-10T14:07:35.879 INFO:tasks.workunit.client.1.vm04.stdout:3/393: chown da/dc/d3f/d54/l7c 15997 1 2026-03-10T14:07:35.884 INFO:tasks.workunit.client.1.vm04.stdout:3/394: creat da/dc/d35/d52/d70/f8f x:0 0 0 2026-03-10T14:07:35.885 INFO:tasks.workunit.client.1.vm04.stdout:3/395: write da/f14 [3124416,97902] 0 2026-03-10T14:07:35.891 INFO:tasks.workunit.client.1.vm04.stdout:3/396: dwrite da/dc/d35/d52/f72 [0,4194304] 0 2026-03-10T14:07:35.892 INFO:tasks.workunit.client.1.vm04.stdout:3/397: chown da/d3e/f63 450909100 1 
2026-03-10T14:07:36.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:35 vm04.local ceph-mon[55966]: pgmap v146: 65 pgs: 65 active+clean; 826 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 19 MiB/s rd, 109 MiB/s wr, 469 op/s 2026-03-10T14:07:36.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:35 vm03.local ceph-mon[49718]: pgmap v146: 65 pgs: 65 active+clean; 826 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 19 MiB/s rd, 109 MiB/s wr, 469 op/s 2026-03-10T14:07:36.121 INFO:tasks.workunit.client.1.vm04.stdout:9/311: dwrite d9/d1d/f1f [0,4194304] 0 2026-03-10T14:07:36.124 INFO:tasks.workunit.client.1.vm04.stdout:9/312: write d9/d5c/f6c [979287,29686] 0 2026-03-10T14:07:36.178 INFO:tasks.workunit.client.1.vm04.stdout:2/416: write d0/d3/d4a/f57 [1238436,5449] 0 2026-03-10T14:07:36.283 INFO:tasks.workunit.client.1.vm04.stdout:4/357: write d4/d14/d3c/f41 [155557,45636] 0 2026-03-10T14:07:36.288 INFO:tasks.workunit.client.1.vm04.stdout:4/358: dwrite d4/df/d34/f69 [0,4194304] 0 2026-03-10T14:07:36.289 INFO:tasks.workunit.client.1.vm04.stdout:4/359: readlink d4/l7 0 2026-03-10T14:07:36.291 INFO:tasks.workunit.client.1.vm04.stdout:4/360: mknod d4/d14/d64/c82 0 2026-03-10T14:07:36.292 INFO:tasks.workunit.client.1.vm04.stdout:4/361: chown d4/d14/d3c/c42 1023870 1 2026-03-10T14:07:36.300 INFO:tasks.workunit.client.1.vm04.stdout:4/362: fdatasync d4/df/d22/f2b 0 2026-03-10T14:07:36.300 INFO:tasks.workunit.client.1.vm04.stdout:4/363: chown d4/fe 4 1 2026-03-10T14:07:36.313 INFO:tasks.workunit.client.1.vm04.stdout:4/364: fdatasync d4/df/d22/f37 0 2026-03-10T14:07:36.316 INFO:tasks.workunit.client.1.vm04.stdout:4/365: fsync d4/df/f60 0 2026-03-10T14:07:36.317 INFO:tasks.workunit.client.1.vm04.stdout:4/366: symlink d4/df/d34/l83 0 2026-03-10T14:07:36.319 INFO:tasks.workunit.client.1.vm04.stdout:4/367: fsync d4/f9 0 2026-03-10T14:07:36.320 INFO:tasks.workunit.client.1.vm04.stdout:4/368: truncate d4/d14/d1b/f20 4181634 0 2026-03-10T14:07:36.605 
INFO:tasks.workunit.client.1.vm04.stdout:1/341: unlink d3/d22/d63/l51 0 2026-03-10T14:07:36.610 INFO:tasks.workunit.client.1.vm04.stdout:1/342: creat d3/d5/d13/d38/d58/d5b/f7c x:0 0 0 2026-03-10T14:07:36.621 INFO:tasks.workunit.client.1.vm04.stdout:1/343: dwrite d3/d22/d63/f72 [0,4194304] 0 2026-03-10T14:07:36.630 INFO:tasks.workunit.client.1.vm04.stdout:1/344: dwrite d3/d5c/f71 [0,4194304] 0 2026-03-10T14:07:36.633 INFO:tasks.workunit.client.1.vm04.stdout:1/345: creat d3/d22/d6d/f7d x:0 0 0 2026-03-10T14:07:36.636 INFO:tasks.workunit.client.1.vm04.stdout:1/346: write d3/d22/d63/f72 [3224405,58413] 0 2026-03-10T14:07:36.638 INFO:tasks.workunit.client.1.vm04.stdout:1/347: write d3/d20/f27 [4917150,28101] 0 2026-03-10T14:07:36.846 INFO:tasks.workunit.client.1.vm04.stdout:7/375: getdents d2/dc 0 2026-03-10T14:07:36.848 INFO:tasks.workunit.client.1.vm04.stdout:7/376: read - d2/dc/de/d2d/d38/f57 zero size 2026-03-10T14:07:36.857 INFO:tasks.workunit.client.1.vm04.stdout:7/377: creat d2/dc/de/d2d/d38/f8a x:0 0 0 2026-03-10T14:07:36.862 INFO:tasks.workunit.client.1.vm04.stdout:7/378: mkdir d2/dc/de/d2d/d60/d7c/d36/d8b 0 2026-03-10T14:07:36.981 INFO:tasks.workunit.client.1.vm04.stdout:8/375: truncate d0/d3/d63/f3c 20359 0 2026-03-10T14:07:36.983 INFO:tasks.workunit.client.1.vm04.stdout:8/376: mkdir d0/d3/d73 0 2026-03-10T14:07:36.985 INFO:tasks.workunit.client.1.vm04.stdout:8/377: write d0/d3/d5/f66 [428813,51240] 0 2026-03-10T14:07:36.989 INFO:tasks.workunit.client.1.vm04.stdout:8/378: fdatasync d0/d3/d63/f5b 0 2026-03-10T14:07:37.004 INFO:tasks.workunit.client.1.vm04.stdout:8/379: dread d0/d3/dd/f5d [0,4194304] 0 2026-03-10T14:07:37.005 INFO:tasks.workunit.client.1.vm04.stdout:8/380: chown d0/d3/d63/d12/f2c 1624 1 2026-03-10T14:07:37.011 INFO:tasks.workunit.client.1.vm04.stdout:8/381: link d0/d3/d63/c4c d0/d3/d63/c74 0 2026-03-10T14:07:37.012 INFO:tasks.workunit.client.1.vm04.stdout:8/382: write d0/d3/f6b [1047061,92027] 0 2026-03-10T14:07:37.013 
INFO:tasks.workunit.client.1.vm04.stdout:8/383: truncate d0/d3/d5/f66 916977 0 2026-03-10T14:07:37.014 INFO:tasks.workunit.client.1.vm04.stdout:8/384: chown d0/l3a 1 1 2026-03-10T14:07:37.019 INFO:tasks.workunit.client.1.vm04.stdout:8/385: mkdir d0/d75 0 2026-03-10T14:07:37.035 INFO:tasks.workunit.client.1.vm04.stdout:8/386: mkdir d0/d3/dd/d76 0 2026-03-10T14:07:37.047 INFO:tasks.workunit.client.1.vm04.stdout:0/274: rmdir d0/d2/d15/d22 39 2026-03-10T14:07:37.070 INFO:tasks.workunit.client.1.vm04.stdout:0/275: mkdir d0/d2/d15/d22/d38/d56 0 2026-03-10T14:07:37.075 INFO:tasks.workunit.client.1.vm04.stdout:0/276: link d0/d2/d25/f45 d0/d2/d15/f57 0 2026-03-10T14:07:37.077 INFO:tasks.workunit.client.1.vm04.stdout:0/277: creat d0/d2/d15/d22/d38/d56/f58 x:0 0 0 2026-03-10T14:07:37.086 INFO:tasks.workunit.client.1.vm04.stdout:0/278: chown d0/d2/c37 170568152 1 2026-03-10T14:07:37.096 INFO:tasks.workunit.client.1.vm04.stdout:6/296: write d3/f2a [125530,2976] 0 2026-03-10T14:07:37.099 INFO:tasks.workunit.client.1.vm04.stdout:6/297: write d3/de/d35/d3f/f3b [562278,109399] 0 2026-03-10T14:07:37.115 INFO:tasks.workunit.client.1.vm04.stdout:3/398: rename da/d3e/f8b to da/dc/f90 0 2026-03-10T14:07:37.117 INFO:tasks.workunit.client.1.vm04.stdout:3/399: mknod da/c91 0 2026-03-10T14:07:37.118 INFO:tasks.workunit.client.1.vm04.stdout:3/400: write da/dc/f1d [2209387,96244] 0 2026-03-10T14:07:37.127 INFO:tasks.workunit.client.1.vm04.stdout:3/401: dwrite da/dc/d35/d52/f79 [4194304,4194304] 0 2026-03-10T14:07:37.130 INFO:tasks.workunit.client.1.vm04.stdout:3/402: write da/dc/d35/f5b [1200299,94203] 0 2026-03-10T14:07:37.135 INFO:tasks.workunit.client.1.vm04.stdout:3/403: write da/fe [2531239,4943] 0 2026-03-10T14:07:37.143 INFO:tasks.workunit.client.1.vm04.stdout:3/404: dwrite da/d3e/f63 [0,4194304] 0 2026-03-10T14:07:37.201 INFO:tasks.workunit.client.1.vm04.stdout:2/417: truncate d0/d3/d8/f30 3607250 0 2026-03-10T14:07:37.227 INFO:tasks.workunit.client.1.vm04.stdout:9/313: rename 
d9/d44/l54 to d9/d1d/l6d 0 2026-03-10T14:07:37.240 INFO:tasks.workunit.client.1.vm04.stdout:4/369: rename d4/d14/d1b/f26 to d4/df/d22/d47/d4f/f84 0 2026-03-10T14:07:37.241 INFO:tasks.workunit.client.1.vm04.stdout:9/314: fdatasync d9/da/dd/d1c/f3b 0 2026-03-10T14:07:37.242 INFO:tasks.workunit.client.1.vm04.stdout:4/370: write d4/fa [941427,28828] 0 2026-03-10T14:07:37.254 INFO:tasks.workunit.client.1.vm04.stdout:9/315: dread - d9/d58/f5e zero size 2026-03-10T14:07:37.254 INFO:tasks.workunit.client.1.vm04.stdout:4/371: mkdir d4/d14/d3c/d85 0 2026-03-10T14:07:37.264 INFO:tasks.workunit.client.1.vm04.stdout:9/316: rename d9/d5c/f63 to d9/d5c/f6e 0 2026-03-10T14:07:37.276 INFO:tasks.workunit.client.1.vm04.stdout:7/379: rename d2/dc/de/d2d/d60/d7c/l6f to d2/dc/de/d2d/d5c/l8c 0 2026-03-10T14:07:37.281 INFO:tasks.workunit.client.1.vm04.stdout:8/387: rename d0/d3/d5/l62 to d0/d3/d63/d29/l77 0 2026-03-10T14:07:37.282 INFO:tasks.workunit.client.1.vm04.stdout:7/380: creat d2/dc/f8d x:0 0 0 2026-03-10T14:07:37.282 INFO:tasks.workunit.client.1.vm04.stdout:8/388: read d0/d3/d63/d12/f2c [224381,90289] 0 2026-03-10T14:07:37.283 INFO:tasks.workunit.client.1.vm04.stdout:8/389: fdatasync d0/d3/d63/d12/d51/d67/f6e 0 2026-03-10T14:07:37.283 INFO:tasks.workunit.client.1.vm04.stdout:8/390: fsync d0/f1b 0 2026-03-10T14:07:37.286 INFO:tasks.workunit.client.1.vm04.stdout:6/298: rename d3/de/f13 to d3/de/d35/d3a/d43/d4c/f53 0 2026-03-10T14:07:37.287 INFO:tasks.workunit.client.1.vm04.stdout:6/299: fsync d3/f19 0 2026-03-10T14:07:37.293 INFO:tasks.workunit.client.1.vm04.stdout:8/391: dwrite d0/d3/d63/d12/d51/d67/f6e [0,4194304] 0 2026-03-10T14:07:37.306 INFO:tasks.workunit.client.1.vm04.stdout:6/300: mknod d3/de/d35/d3a/c54 0 2026-03-10T14:07:37.314 INFO:tasks.workunit.client.1.vm04.stdout:7/381: creat d2/dc/de/d2d/d5c/f8e x:0 0 0 2026-03-10T14:07:37.317 INFO:tasks.workunit.client.1.vm04.stdout:6/301: creat d3/d1d/f55 x:0 0 0 2026-03-10T14:07:37.320 
INFO:tasks.workunit.client.1.vm04.stdout:6/302: dread - d3/de/f46 zero size 2026-03-10T14:07:37.332 INFO:tasks.workunit.client.1.vm04.stdout:6/303: rename d3/d8 to d3/de/d35/d3a/d43/d56 0 2026-03-10T14:07:37.358 INFO:tasks.workunit.client.1.vm04.stdout:6/304: creat d3/f57 x:0 0 0 2026-03-10T14:07:37.410 INFO:tasks.workunit.client.1.vm04.stdout:1/348: truncate d3/fc 121039 0 2026-03-10T14:07:37.412 INFO:tasks.workunit.client.1.vm04.stdout:1/349: symlink d3/d5/d13/d38/l7e 0 2026-03-10T14:07:37.414 INFO:tasks.workunit.client.1.vm04.stdout:1/350: truncate d3/d5/d13/d38/f3d 1005742 0 2026-03-10T14:07:37.416 INFO:tasks.workunit.client.1.vm04.stdout:1/351: creat d3/d22/d63/f7f x:0 0 0 2026-03-10T14:07:37.423 INFO:tasks.workunit.client.1.vm04.stdout:5/417: creat d7/d12/f83 x:0 0 0 2026-03-10T14:07:37.445 INFO:tasks.workunit.client.1.vm04.stdout:5/418: dwrite d7/d12/f42 [4194304,4194304] 0 2026-03-10T14:07:37.445 INFO:tasks.workunit.client.1.vm04.stdout:1/352: dread d3/f18 [0,4194304] 0 2026-03-10T14:07:37.473 INFO:tasks.workunit.client.1.vm04.stdout:5/419: read d7/d12/d2b/d3e/d3f/f52 [2699371,43570] 0 2026-03-10T14:07:37.481 INFO:tasks.workunit.client.1.vm04.stdout:1/353: rename d3/c2a to d3/d5/d13/d38/d58/c80 0 2026-03-10T14:07:37.481 INFO:tasks.workunit.client.1.vm04.stdout:0/279: write d0/d2/d25/f3f [79643,122560] 0 2026-03-10T14:07:37.482 INFO:tasks.workunit.client.1.vm04.stdout:0/280: chown d0/d1c/l3a 515901604 1 2026-03-10T14:07:37.486 INFO:tasks.workunit.client.1.vm04.stdout:5/420: write d7/d9/f75 [595124,64381] 0 2026-03-10T14:07:37.487 INFO:tasks.workunit.client.1.vm04.stdout:1/354: mknod d3/d5/d13/d38/d58/c81 0 2026-03-10T14:07:37.494 INFO:tasks.workunit.client.1.vm04.stdout:5/421: dwrite d7/d2d/f64 [0,4194304] 0 2026-03-10T14:07:37.495 INFO:tasks.workunit.client.1.vm04.stdout:0/281: unlink d0/d2/d15/d22/d38/l3b 0 2026-03-10T14:07:37.511 INFO:tasks.workunit.client.1.vm04.stdout:3/405: dwrite da/dc/d3f/f4d [0,4194304] 0 2026-03-10T14:07:37.541 
INFO:tasks.workunit.client.1.vm04.stdout:0/282: dread - d0/d2/f34 zero size 2026-03-10T14:07:37.541 INFO:tasks.workunit.client.1.vm04.stdout:3/406: link da/dc/d35/d37/c38 da/dc/d35/c92 0 2026-03-10T14:07:37.542 INFO:tasks.workunit.client.1.vm04.stdout:3/407: mknod da/dc/d35/c93 0 2026-03-10T14:07:37.542 INFO:tasks.workunit.client.1.vm04.stdout:3/408: creat da/dc/d3f/d61/f94 x:0 0 0 2026-03-10T14:07:37.542 INFO:tasks.workunit.client.1.vm04.stdout:8/392: truncate d0/d3/d63/d12/d51/d67/f6e 1440283 0 2026-03-10T14:07:37.542 INFO:tasks.workunit.client.1.vm04.stdout:8/393: mkdir d0/d3/dd/d78 0 2026-03-10T14:07:37.542 INFO:tasks.workunit.client.1.vm04.stdout:7/382: dwrite d2/dc/d4d/f74 [0,4194304] 0 2026-03-10T14:07:37.542 INFO:tasks.workunit.client.1.vm04.stdout:8/394: chown d0/d3/d5/f6d 124 1 2026-03-10T14:07:37.547 INFO:tasks.workunit.client.1.vm04.stdout:8/395: creat d0/d3/dd/d76/f79 x:0 0 0 2026-03-10T14:07:37.550 INFO:tasks.workunit.client.1.vm04.stdout:8/396: write d0/d3/dd/f5d [513414,115627] 0 2026-03-10T14:07:37.583 INFO:tasks.workunit.client.1.vm04.stdout:4/372: sync 2026-03-10T14:07:37.583 INFO:tasks.workunit.client.1.vm04.stdout:2/418: sync 2026-03-10T14:07:37.584 INFO:tasks.workunit.client.1.vm04.stdout:2/419: write d0/d14/d39/d47/d70/f74 [267591,116681] 0 2026-03-10T14:07:37.586 INFO:tasks.workunit.client.1.vm04.stdout:4/373: write d4/df/d22/d47/d4f/f84 [3516625,25713] 0 2026-03-10T14:07:37.590 INFO:tasks.workunit.client.1.vm04.stdout:2/420: write d0/d14/d1b/f55 [1091934,55700] 0 2026-03-10T14:07:37.593 INFO:tasks.workunit.client.1.vm04.stdout:2/421: creat d0/d14/d39/d47/f7e x:0 0 0 2026-03-10T14:07:37.594 INFO:tasks.workunit.client.1.vm04.stdout:2/422: read - d0/d3/d4a/d66/f7d zero size 2026-03-10T14:07:37.599 INFO:tasks.workunit.client.1.vm04.stdout:9/317: sync 2026-03-10T14:07:37.603 INFO:tasks.workunit.client.1.vm04.stdout:3/409: dread da/dc/f2a [0,4194304] 0 2026-03-10T14:07:37.607 INFO:tasks.workunit.client.1.vm04.stdout:3/410: dwrite 
da/dc/d35/d52/d6d/f75 [0,4194304] 0 2026-03-10T14:07:37.614 INFO:tasks.workunit.client.1.vm04.stdout:6/305: dwrite d3/de/d35/d3f/f17 [0,4194304] 0 2026-03-10T14:07:37.620 INFO:tasks.workunit.client.1.vm04.stdout:3/411: mknod da/dc/d35/c95 0 2026-03-10T14:07:37.620 INFO:tasks.workunit.client.1.vm04.stdout:6/306: symlink d3/de/d35/d3f/d2d/l58 0 2026-03-10T14:07:37.626 INFO:tasks.workunit.client.1.vm04.stdout:3/412: unlink da/f65 0 2026-03-10T14:07:37.629 INFO:tasks.workunit.client.1.vm04.stdout:6/307: truncate d3/de/d35/d3a/d43/d56/fc 1527490 0 2026-03-10T14:07:37.641 INFO:tasks.workunit.client.1.vm04.stdout:3/413: stat da/d30/f55 0 2026-03-10T14:07:37.642 INFO:tasks.workunit.client.1.vm04.stdout:6/308: mknod d3/de/d35/d3a/d43/d4c/c59 0 2026-03-10T14:07:37.676 INFO:tasks.workunit.client.1.vm04.stdout:3/414: dread da/f25 [0,4194304] 0 2026-03-10T14:07:37.683 INFO:tasks.workunit.client.1.vm04.stdout:3/415: creat da/dc/d35/d52/d53/f96 x:0 0 0 2026-03-10T14:07:37.689 INFO:tasks.workunit.client.1.vm04.stdout:3/416: creat da/dc/d3f/d54/f97 x:0 0 0 2026-03-10T14:07:37.689 INFO:tasks.workunit.client.1.vm04.stdout:1/355: rename d3/d5 to d3/d20/d60/d82 0 2026-03-10T14:07:37.691 INFO:tasks.workunit.client.1.vm04.stdout:3/417: truncate da/dc/d3f/d61/f94 716601 0 2026-03-10T14:07:37.694 INFO:tasks.workunit.client.1.vm04.stdout:1/356: readlink d3/d22/d2f/d57/l59 0 2026-03-10T14:07:37.697 INFO:tasks.workunit.client.1.vm04.stdout:5/422: rename d7/d26/f56 to d7/d2d/d76/f84 0 2026-03-10T14:07:37.704 INFO:tasks.workunit.client.1.vm04.stdout:1/357: stat d3/lf 0 2026-03-10T14:07:37.705 INFO:tasks.workunit.client.1.vm04.stdout:3/418: creat da/dc/d35/d52/d53/d78/f98 x:0 0 0 2026-03-10T14:07:37.709 INFO:tasks.workunit.client.1.vm04.stdout:5/423: symlink d7/d12/d45/l85 0 2026-03-10T14:07:37.711 INFO:tasks.workunit.client.1.vm04.stdout:5/424: truncate d7/d2d/f6d 391064 0 2026-03-10T14:07:37.717 INFO:tasks.workunit.client.1.vm04.stdout:0/283: dwrite d0/f19 [0,4194304] 0 2026-03-10T14:07:37.718 
INFO:tasks.workunit.client.1.vm04.stdout:8/397: rename d0/d3/d5/l53 to d0/d3/d63/d12/d69/l7a 0 2026-03-10T14:07:37.733 INFO:tasks.workunit.client.1.vm04.stdout:7/383: write d2/d28/f3d [4333726,114988] 0 2026-03-10T14:07:37.742 INFO:tasks.workunit.client.1.vm04.stdout:0/284: dread d0/d1c/f2b [0,4194304] 0 2026-03-10T14:07:37.747 INFO:tasks.workunit.client.1.vm04.stdout:9/318: dwrite d9/d58/f5e [0,4194304] 0 2026-03-10T14:07:37.753 INFO:tasks.workunit.client.1.vm04.stdout:7/384: fsync d2/dc/de/d21/f45 0 2026-03-10T14:07:37.759 INFO:tasks.workunit.client.1.vm04.stdout:0/285: dread - d0/d2/d25/f45 zero size 2026-03-10T14:07:37.759 INFO:tasks.workunit.client.1.vm04.stdout:6/309: rename d3/de/d35/d3f/f29 to d3/de/d35/d3f/d2d/d32/d23/f5a 0 2026-03-10T14:07:37.764 INFO:tasks.workunit.client.1.vm04.stdout:6/310: dread d3/de/d35/d3a/f51 [0,4194304] 0 2026-03-10T14:07:37.770 INFO:tasks.workunit.client.1.vm04.stdout:0/286: truncate d0/f1b 428067 0 2026-03-10T14:07:37.770 INFO:tasks.workunit.client.1.vm04.stdout:7/385: creat d2/dc/de/d2d/d60/d7c/f8f x:0 0 0 2026-03-10T14:07:37.782 INFO:tasks.workunit.client.1.vm04.stdout:5/425: dread d7/d12/d45/f61 [0,4194304] 0 2026-03-10T14:07:37.782 INFO:tasks.workunit.client.1.vm04.stdout:0/287: dread d0/d1c/f2c [0,4194304] 0 2026-03-10T14:07:37.782 INFO:tasks.workunit.client.1.vm04.stdout:5/426: readlink d7/d2d/d32/l40 0 2026-03-10T14:07:37.785 INFO:tasks.workunit.client.1.vm04.stdout:3/419: rename da/dc/d35/d37/f77 to da/dc/d3f/d54/d66/f99 0 2026-03-10T14:07:37.785 INFO:tasks.workunit.client.1.vm04.stdout:0/288: chown d0/c47 257604746 1 2026-03-10T14:07:37.786 INFO:tasks.workunit.client.1.vm04.stdout:3/420: chown da/d3e/c40 7 1 2026-03-10T14:07:37.786 INFO:tasks.workunit.client.1.vm04.stdout:3/421: write da/dc/d35/f5b [761125,18585] 0 2026-03-10T14:07:37.826 INFO:tasks.workunit.client.1.vm04.stdout:6/311: mknod d3/de/d35/d3f/d2d/d32/d23/d4e/c5b 0 2026-03-10T14:07:37.829 INFO:tasks.workunit.client.1.vm04.stdout:3/422: dwrite 
da/dc/d35/d37/f6e [0,4194304] 0 2026-03-10T14:07:37.833 INFO:tasks.workunit.client.1.vm04.stdout:7/386: truncate d2/dc/de/d2d/d60/d7c/d3b/f49 2374524 0 2026-03-10T14:07:37.846 INFO:tasks.workunit.client.1.vm04.stdout:3/423: mknod da/d8e/c9a 0 2026-03-10T14:07:37.852 INFO:tasks.workunit.client.1.vm04.stdout:5/427: link d7/d12/c15 d7/d12/d2b/c86 0 2026-03-10T14:07:37.868 INFO:tasks.workunit.client.1.vm04.stdout:7/387: symlink d2/d28/l90 0 2026-03-10T14:07:37.877 INFO:tasks.workunit.client.1.vm04.stdout:3/424: mkdir da/dc/d47/d9b 0 2026-03-10T14:07:37.892 INFO:tasks.workunit.client.1.vm04.stdout:0/289: rename d0/d2/f40 to d0/d2/d15/f59 0 2026-03-10T14:07:37.892 INFO:tasks.workunit.client.1.vm04.stdout:4/374: write d4/d14/d1b/f20 [555316,46794] 0 2026-03-10T14:07:37.892 INFO:tasks.workunit.client.1.vm04.stdout:0/290: write d0/f19 [2066074,53151] 0 2026-03-10T14:07:37.892 INFO:tasks.workunit.client.1.vm04.stdout:1/358: dwrite d3/d20/d60/d82/d13/d1a/f24 [0,4194304] 0 2026-03-10T14:07:37.896 INFO:tasks.workunit.client.1.vm04.stdout:6/312: getdents d3/de/d35 0 2026-03-10T14:07:37.902 INFO:tasks.workunit.client.1.vm04.stdout:8/398: write d0/d3/dd/d33/f71 [2734482,70355] 0 2026-03-10T14:07:37.915 INFO:tasks.workunit.client.1.vm04.stdout:5/428: sync 2026-03-10T14:07:37.916 INFO:tasks.workunit.client.1.vm04.stdout:5/429: write d7/d26/d6b/d6e/f81 [779766,112366] 0 2026-03-10T14:07:37.918 INFO:tasks.workunit.client.1.vm04.stdout:5/430: chown d7/d2d/f64 1150617 1 2026-03-10T14:07:37.929 INFO:tasks.workunit.client.1.vm04.stdout:8/399: fsync d0/d3/d63/d12/f2c 0 2026-03-10T14:07:37.940 INFO:tasks.workunit.client.1.vm04.stdout:5/431: mkdir d7/d59/d7e/d87 0 2026-03-10T14:07:37.947 INFO:tasks.workunit.client.1.vm04.stdout:7/388: getdents d2/dc/de/d2d/d60/d7c 0 2026-03-10T14:07:37.961 INFO:tasks.workunit.client.1.vm04.stdout:8/400: mknod d0/d75/c7b 0 2026-03-10T14:07:37.962 INFO:tasks.workunit.client.1.vm04.stdout:8/401: write d0/d3/d5/f15 [1953494,35971] 0 2026-03-10T14:07:37.969 
INFO:tasks.workunit.client.1.vm04.stdout:7/389: fdatasync d2/dc/de/d2d/d60/d7c/d3b/f49 0 2026-03-10T14:07:37.985 INFO:tasks.workunit.client.1.vm04.stdout:7/390: creat d2/dc/de/d2d/d60/f91 x:0 0 0 2026-03-10T14:07:37.988 INFO:tasks.workunit.client.1.vm04.stdout:7/391: write d2/dc/de/d2d/d60/d7c/f84 [231169,38996] 0 2026-03-10T14:07:37.991 INFO:tasks.workunit.client.1.vm04.stdout:7/392: getdents d2/dc/d4d/d7f 0 2026-03-10T14:07:37.999 INFO:tasks.workunit.client.1.vm04.stdout:8/402: dread d0/f23 [0,4194304] 0 2026-03-10T14:07:38.007 INFO:tasks.workunit.client.1.vm04.stdout:7/393: dwrite d2/dc/de/d2d/d60/d7c/d36/f52 [0,4194304] 0 2026-03-10T14:07:38.010 INFO:tasks.workunit.client.1.vm04.stdout:8/403: unlink d0/d3/dd/d76/f79 0 2026-03-10T14:07:38.019 INFO:tasks.workunit.client.1.vm04.stdout:8/404: mknod d0/d3/d5/c7c 0 2026-03-10T14:07:38.026 INFO:tasks.workunit.client.1.vm04.stdout:8/405: dwrite d0/d3/d5/f70 [0,4194304] 0 2026-03-10T14:07:38.032 INFO:tasks.workunit.client.1.vm04.stdout:3/425: dread da/dc/f1d [0,4194304] 0 2026-03-10T14:07:38.036 INFO:tasks.workunit.client.1.vm04.stdout:3/426: creat da/dc/d3f/d54/d66/f9c x:0 0 0 2026-03-10T14:07:38.040 INFO:tasks.workunit.client.1.vm04.stdout:3/427: write da/d3e/f44 [4332551,73813] 0 2026-03-10T14:07:38.058 INFO:tasks.workunit.client.1.vm04.stdout:6/313: read d3/de/d35/d3f/d2d/d32/d23/f33 [380566,34813] 0 2026-03-10T14:07:38.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:37 vm04.local ceph-mon[55966]: pgmap v147: 65 pgs: 65 active+clean; 891 MiB data, 3.5 GiB used, 117 GiB / 120 GiB avail; 21 MiB/s rd, 98 MiB/s wr, 394 op/s 2026-03-10T14:07:38.069 INFO:tasks.workunit.client.1.vm04.stdout:8/406: sync 2026-03-10T14:07:38.069 INFO:tasks.workunit.client.1.vm04.stdout:3/428: sync 2026-03-10T14:07:38.070 INFO:tasks.workunit.client.1.vm04.stdout:3/429: write da/f14 [5012134,123076] 0 2026-03-10T14:07:38.070 INFO:tasks.workunit.client.1.vm04.stdout:8/407: read d0/d3/d5/f15 [4004323,103565] 0 2026-03-10T14:07:38.071 
INFO:tasks.workunit.client.1.vm04.stdout:3/430: write da/dc/d35/f6a [1057270,61570] 0 2026-03-10T14:07:38.071 INFO:tasks.workunit.client.1.vm04.stdout:3/431: write da/fe [1092496,21692] 0 2026-03-10T14:07:38.072 INFO:tasks.workunit.client.1.vm04.stdout:3/432: write da/dc/d3f/d61/f94 [492765,79025] 0 2026-03-10T14:07:38.089 INFO:tasks.workunit.client.1.vm04.stdout:3/433: creat da/dc/d35/d52/d6d/f9d x:0 0 0 2026-03-10T14:07:38.090 INFO:tasks.workunit.client.1.vm04.stdout:3/434: readlink da/dc/d35/d52/d53/l58 0 2026-03-10T14:07:38.092 INFO:tasks.workunit.client.1.vm04.stdout:8/408: truncate d0/f1b 1544166 0 2026-03-10T14:07:38.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:37 vm03.local ceph-mon[49718]: pgmap v147: 65 pgs: 65 active+clean; 891 MiB data, 3.5 GiB used, 117 GiB / 120 GiB avail; 21 MiB/s rd, 98 MiB/s wr, 394 op/s 2026-03-10T14:07:38.112 INFO:tasks.workunit.client.1.vm04.stdout:8/409: link d0/c56 d0/d3/d63/d12/d69/c7d 0 2026-03-10T14:07:38.117 INFO:tasks.workunit.client.1.vm04.stdout:8/410: write d0/d3/d63/d12/d51/d67/f6e [2277999,17263] 0 2026-03-10T14:07:38.119 INFO:tasks.workunit.client.1.vm04.stdout:7/394: dread d2/dc/de/d2d/d60/d7c/f15 [0,4194304] 0 2026-03-10T14:07:38.132 INFO:tasks.workunit.client.1.vm04.stdout:9/319: read d9/da/dd/f19 [286168,12302] 0 2026-03-10T14:07:38.137 INFO:tasks.workunit.client.1.vm04.stdout:0/291: write d0/d2/d25/f45 [516284,91989] 0 2026-03-10T14:07:38.137 INFO:tasks.workunit.client.1.vm04.stdout:4/375: truncate d4/fe 2797271 0 2026-03-10T14:07:38.140 INFO:tasks.workunit.client.1.vm04.stdout:5/432: truncate d7/d12/d2b/f53 2465173 0 2026-03-10T14:07:38.154 INFO:tasks.workunit.client.1.vm04.stdout:0/292: symlink d0/d1c/l5a 0 2026-03-10T14:07:38.154 INFO:tasks.workunit.client.1.vm04.stdout:4/376: creat d4/df/d22/d47/d70/d74/f86 x:0 0 0 2026-03-10T14:07:38.162 INFO:tasks.workunit.client.1.vm04.stdout:9/320: getdents d9/d33 0 2026-03-10T14:07:38.162 INFO:tasks.workunit.client.1.vm04.stdout:4/377: link d4/d14/d1b/f30 
d4/df/d22/d47/d70/f87 0 2026-03-10T14:07:38.165 INFO:tasks.workunit.client.1.vm04.stdout:9/321: unlink d9/d33/f5f 0 2026-03-10T14:07:38.170 INFO:tasks.workunit.client.1.vm04.stdout:4/378: creat d4/d14/d64/f88 x:0 0 0 2026-03-10T14:07:38.170 INFO:tasks.workunit.client.1.vm04.stdout:5/433: sync 2026-03-10T14:07:38.179 INFO:tasks.workunit.client.1.vm04.stdout:9/322: mknod d9/da/c6f 0 2026-03-10T14:07:38.182 INFO:tasks.workunit.client.1.vm04.stdout:6/314: write d3/d1d/f4a [547597,40240] 0 2026-03-10T14:07:38.201 INFO:tasks.workunit.client.1.vm04.stdout:5/434: truncate d7/f3c 487329 0 2026-03-10T14:07:38.223 INFO:tasks.workunit.client.1.vm04.stdout:6/315: mkdir d3/de/d35/d3f/d2d/d32/d5c 0 2026-03-10T14:07:38.229 INFO:tasks.workunit.client.1.vm04.stdout:4/379: rename d4/df/c1a to d4/c89 0 2026-03-10T14:07:38.235 INFO:tasks.workunit.client.1.vm04.stdout:2/423: dread d0/d14/d1b/f32 [0,4194304] 0 2026-03-10T14:07:38.247 INFO:tasks.workunit.client.1.vm04.stdout:3/435: dwrite f8 [0,4194304] 0 2026-03-10T14:07:38.256 INFO:tasks.workunit.client.1.vm04.stdout:5/435: dread d7/d2d/d32/f5b [0,4194304] 0 2026-03-10T14:07:38.258 INFO:tasks.workunit.client.1.vm04.stdout:7/395: dwrite d2/dc/de/d21/f7a [0,4194304] 0 2026-03-10T14:07:38.262 INFO:tasks.workunit.client.1.vm04.stdout:2/424: rmdir d0/d3/d8/dd/d26/d46 39 2026-03-10T14:07:38.266 INFO:tasks.workunit.client.1.vm04.stdout:3/436: dwrite da/dc/d35/f5b [0,4194304] 0 2026-03-10T14:07:38.278 INFO:tasks.workunit.client.1.vm04.stdout:6/316: creat d3/de/d35/d3f/d2d/d38/d40/f5d x:0 0 0 2026-03-10T14:07:38.281 INFO:tasks.workunit.client.1.vm04.stdout:0/293: write d0/d2/d15/d22/d38/f3e [1313631,3494] 0 2026-03-10T14:07:38.284 INFO:tasks.workunit.client.1.vm04.stdout:2/425: dwrite d0/d3/d4a/f57 [0,4194304] 0 2026-03-10T14:07:38.287 INFO:tasks.workunit.client.1.vm04.stdout:0/294: dread d0/d2/d25/f3f [0,4194304] 0 2026-03-10T14:07:38.295 INFO:tasks.workunit.client.1.vm04.stdout:4/380: creat d4/d14/d6d/f8a x:0 0 0 2026-03-10T14:07:38.296 
INFO:tasks.workunit.client.1.vm04.stdout:9/323: write d9/da/f41 [1679447,94407] 0 2026-03-10T14:07:38.306 INFO:tasks.workunit.client.1.vm04.stdout:6/317: mkdir d3/de/d35/d3a/d43/d4c/d5e 0 2026-03-10T14:07:38.307 INFO:tasks.workunit.client.1.vm04.stdout:3/437: creat da/dc/d3f/d54/d66/f9e x:0 0 0 2026-03-10T14:07:38.307 INFO:tasks.workunit.client.1.vm04.stdout:6/318: write d3/de/d35/d3f/d2d/f2e [4777839,42046] 0 2026-03-10T14:07:38.314 INFO:tasks.workunit.client.1.vm04.stdout:4/381: mknod d4/d14/d1b/c8b 0 2026-03-10T14:07:38.321 INFO:tasks.workunit.client.1.vm04.stdout:3/438: mknod da/d3e/c9f 0 2026-03-10T14:07:38.321 INFO:tasks.workunit.client.1.vm04.stdout:2/426: read d0/d3/d8/d59/f67 [796584,36158] 0 2026-03-10T14:07:38.321 INFO:tasks.workunit.client.1.vm04.stdout:6/319: symlink d3/de/d35/d3a/d43/d4c/l5f 0 2026-03-10T14:07:38.322 INFO:tasks.workunit.client.1.vm04.stdout:6/320: dread - d3/d1d/f55 zero size 2026-03-10T14:07:38.323 INFO:tasks.workunit.client.1.vm04.stdout:4/382: dread - d4/d14/d64/f71 zero size 2026-03-10T14:07:38.329 INFO:tasks.workunit.client.1.vm04.stdout:3/439: creat da/dc/d35/d52/d6d/d8a/fa0 x:0 0 0 2026-03-10T14:07:38.333 INFO:tasks.workunit.client.1.vm04.stdout:2/427: creat d0/d14/d39/f7f x:0 0 0 2026-03-10T14:07:38.334 INFO:tasks.workunit.client.1.vm04.stdout:2/428: stat d0/d3/d4a/c50 0 2026-03-10T14:07:38.334 INFO:tasks.workunit.client.1.vm04.stdout:0/295: sync 2026-03-10T14:07:38.343 INFO:tasks.workunit.client.1.vm04.stdout:0/296: dread d0/f19 [0,4194304] 0 2026-03-10T14:07:38.352 INFO:tasks.workunit.client.1.vm04.stdout:0/297: creat d0/d2/d15/d22/d38/f5b x:0 0 0 2026-03-10T14:07:38.352 INFO:tasks.workunit.client.1.vm04.stdout:0/298: write d0/d2/fd [1584944,52725] 0 2026-03-10T14:07:38.361 INFO:tasks.workunit.client.1.vm04.stdout:0/299: mkdir d0/d2/d15/d49/d50/d5c 0 2026-03-10T14:07:38.363 INFO:tasks.workunit.client.1.vm04.stdout:0/300: dread d0/d1c/f2b [0,4194304] 0 2026-03-10T14:07:38.365 INFO:tasks.workunit.client.1.vm04.stdout:2/429: 
getdents d0 0 2026-03-10T14:07:38.368 INFO:tasks.workunit.client.1.vm04.stdout:0/301: mknod d0/d2/d15/d22/d38/c5d 0 2026-03-10T14:07:38.372 INFO:tasks.workunit.client.1.vm04.stdout:2/430: chown d0/d3/d8/dd/d26/d46/l7a 27 1 2026-03-10T14:07:38.390 INFO:tasks.workunit.client.1.vm04.stdout:1/359: dread d3/d20/f27 [0,4194304] 0 2026-03-10T14:07:38.391 INFO:tasks.workunit.client.1.vm04.stdout:1/360: write d3/d22/d63/f69 [244948,30376] 0 2026-03-10T14:07:38.392 INFO:tasks.workunit.client.1.vm04.stdout:7/396: dwrite d2/dc/de/d2d/d60/d7c/d3b/f43 [0,4194304] 0 2026-03-10T14:07:38.401 INFO:tasks.workunit.client.1.vm04.stdout:9/324: write d9/da/dd/d1c/f65 [1886935,85991] 0 2026-03-10T14:07:38.401 INFO:tasks.workunit.client.1.vm04.stdout:8/411: dread d0/d3/d63/d29/f45 [0,4194304] 0 2026-03-10T14:07:38.410 INFO:tasks.workunit.client.1.vm04.stdout:5/436: truncate d7/f24 5060691 0 2026-03-10T14:07:38.413 INFO:tasks.workunit.client.1.vm04.stdout:4/383: truncate d4/df/d31/f5d 3232761 0 2026-03-10T14:07:38.414 INFO:tasks.workunit.client.1.vm04.stdout:4/384: chown d4/df/f60 174436 1 2026-03-10T14:07:38.415 INFO:tasks.workunit.client.1.vm04.stdout:6/321: dwrite d3/de/f46 [0,4194304] 0 2026-03-10T14:07:38.416 INFO:tasks.workunit.client.1.vm04.stdout:4/385: write d4/df/d22/d47/d70/d74/f86 [774907,46456] 0 2026-03-10T14:07:38.421 INFO:tasks.workunit.client.1.vm04.stdout:3/440: truncate da/dc/d35/f5b 806066 0 2026-03-10T14:07:38.421 INFO:tasks.workunit.client.1.vm04.stdout:4/386: chown d4/df/d22/d47/d70/d74/f76 64 1 2026-03-10T14:07:38.437 INFO:tasks.workunit.client.1.vm04.stdout:2/431: mknod d0/d3/d4a/c80 0 2026-03-10T14:07:38.447 INFO:tasks.workunit.client.1.vm04.stdout:1/361: truncate d3/d20/f32 2591474 0 2026-03-10T14:07:38.458 INFO:tasks.workunit.client.1.vm04.stdout:9/325: mkdir d9/d44/d70 0 2026-03-10T14:07:38.459 INFO:tasks.workunit.client.1.vm04.stdout:8/412: mknod d0/d3/d63/d29/c7e 0 2026-03-10T14:07:38.460 INFO:tasks.workunit.client.1.vm04.stdout:3/441: sync 
2026-03-10T14:07:38.479 INFO:tasks.workunit.client.1.vm04.stdout:2/432: creat d0/d3/d8/d17/d35/f81 x:0 0 0 2026-03-10T14:07:38.483 INFO:tasks.workunit.client.1.vm04.stdout:4/387: dread d4/d14/d1b/f5c [0,4194304] 0 2026-03-10T14:07:38.484 INFO:tasks.workunit.client.1.vm04.stdout:4/388: chown d4/d14/d3c/f46 3689966 1 2026-03-10T14:07:38.489 INFO:tasks.workunit.client.1.vm04.stdout:8/413: dwrite d0/d3/d5/f6d [0,4194304] 0 2026-03-10T14:07:38.510 INFO:tasks.workunit.client.1.vm04.stdout:3/442: unlink da/dc/d3f/d54/d66/f9c 0 2026-03-10T14:07:38.518 INFO:tasks.workunit.client.1.vm04.stdout:0/302: link d0/d2/f2d d0/d2/d15/d22/d38/d56/f5e 0 2026-03-10T14:07:38.525 INFO:tasks.workunit.client.1.vm04.stdout:9/326: creat d9/d44/d70/f71 x:0 0 0 2026-03-10T14:07:38.533 INFO:tasks.workunit.client.1.vm04.stdout:5/437: dwrite d7/d12/d45/f5c [0,4194304] 0 2026-03-10T14:07:38.544 INFO:tasks.workunit.client.1.vm04.stdout:9/327: dread d9/da/dd/d1c/f65 [0,4194304] 0 2026-03-10T14:07:38.545 INFO:tasks.workunit.client.1.vm04.stdout:2/433: mknod d0/d3/d3a/d3e/c82 0 2026-03-10T14:07:38.554 INFO:tasks.workunit.client.1.vm04.stdout:6/322: dwrite d3/de/d35/d3a/d43/d56/f2b [0,4194304] 0 2026-03-10T14:07:38.564 INFO:tasks.workunit.client.1.vm04.stdout:6/323: write d3/f57 [569020,50799] 0 2026-03-10T14:07:38.566 INFO:tasks.workunit.client.1.vm04.stdout:8/414: mknod d0/d3/d63/d12/c7f 0 2026-03-10T14:07:38.579 INFO:tasks.workunit.client.1.vm04.stdout:3/443: unlink da/dc/d35/d52/d6d/d8a/c8d 0 2026-03-10T14:07:38.587 INFO:tasks.workunit.client.1.vm04.stdout:3/444: dwrite f8 [0,4194304] 0 2026-03-10T14:07:38.602 INFO:tasks.workunit.client.1.vm04.stdout:0/303: fsync d0/d2/d15/d22/d38/f4a 0 2026-03-10T14:07:38.602 INFO:tasks.workunit.client.1.vm04.stdout:1/362: creat d3/d22/d63/f83 x:0 0 0 2026-03-10T14:07:38.612 INFO:tasks.workunit.client.1.vm04.stdout:5/438: rename d7/d12/d2b/f55 to d7/d12/d2b/d3e/d3f/f88 0 2026-03-10T14:07:38.613 INFO:tasks.workunit.client.1.vm04.stdout:7/397: truncate d2/d28/f7e 
1520835 0 2026-03-10T14:07:38.630 INFO:tasks.workunit.client.1.vm04.stdout:9/328: symlink d9/d5c/l72 0 2026-03-10T14:07:38.653 INFO:tasks.workunit.client.1.vm04.stdout:7/398: dread d2/dc/de/d2d/d38/f37 [0,4194304] 0 2026-03-10T14:07:38.658 INFO:tasks.workunit.client.1.vm04.stdout:3/445: dread da/dc/d35/f46 [0,4194304] 0 2026-03-10T14:07:38.661 INFO:tasks.workunit.client.1.vm04.stdout:3/446: write da/dc/d3f/d54/d66/f9e [371560,96385] 0 2026-03-10T14:07:38.666 INFO:tasks.workunit.client.1.vm04.stdout:2/434: dread d0/d3/d4a/d66/f72 [4194304,4194304] 0 2026-03-10T14:07:38.671 INFO:tasks.workunit.client.1.vm04.stdout:8/415: rename d0/d3/d63/d12/f61 to d0/d3/f80 0 2026-03-10T14:07:38.677 INFO:tasks.workunit.client.1.vm04.stdout:6/324: getdents d3/de/d35/d3f/d2d/d32/d5c 0 2026-03-10T14:07:38.708 INFO:tasks.workunit.client.1.vm04.stdout:7/399: creat d2/d2a/f92 x:0 0 0 2026-03-10T14:07:38.708 INFO:tasks.workunit.client.1.vm04.stdout:1/363: mkdir d3/d20/d60/d82/d13/d38/d58/d5b/d7b/d84 0 2026-03-10T14:07:38.709 INFO:tasks.workunit.client.1.vm04.stdout:0/304: truncate d0/d2/f2d 5099889 0 2026-03-10T14:07:38.709 INFO:tasks.workunit.client.1.vm04.stdout:7/400: write d2/dc/de/d2d/d5c/f8e [523095,71253] 0 2026-03-10T14:07:38.712 INFO:tasks.workunit.client.1.vm04.stdout:0/305: read d0/d2/fa [1375079,96746] 0 2026-03-10T14:07:38.721 INFO:tasks.workunit.client.1.vm04.stdout:2/435: chown d0/d14/d1b/d45/c6f 12 1 2026-03-10T14:07:38.721 INFO:tasks.workunit.client.1.vm04.stdout:5/439: rename d7/d26/f70 to d7/d2d/d32/d34/f89 0 2026-03-10T14:07:38.726 INFO:tasks.workunit.client.1.vm04.stdout:8/416: creat d0/d3/d63/d12/d69/f81 x:0 0 0 2026-03-10T14:07:38.728 INFO:tasks.workunit.client.1.vm04.stdout:4/389: getdents d4/df/d22 0 2026-03-10T14:07:38.730 INFO:tasks.workunit.client.1.vm04.stdout:5/440: dread d7/d2d/f64 [0,4194304] 0 2026-03-10T14:07:38.731 INFO:tasks.workunit.client.1.vm04.stdout:5/441: chown d7/f11 457083 1 2026-03-10T14:07:38.744 INFO:tasks.workunit.client.1.vm04.stdout:4/390: 
dread d4/d14/d3c/f3e [0,4194304] 0 2026-03-10T14:07:38.744 INFO:tasks.workunit.client.1.vm04.stdout:1/364: dwrite d3/d22/d2f/d57/f78 [0,4194304] 0 2026-03-10T14:07:38.745 INFO:tasks.workunit.client.1.vm04.stdout:4/391: read d4/f6 [3135228,69105] 0 2026-03-10T14:07:38.754 INFO:tasks.workunit.client.1.vm04.stdout:4/392: dread d4/d14/d1b/f2f [0,4194304] 0 2026-03-10T14:07:38.761 INFO:tasks.workunit.client.1.vm04.stdout:7/401: creat d2/d2a/f93 x:0 0 0 2026-03-10T14:07:38.767 INFO:tasks.workunit.client.1.vm04.stdout:7/402: truncate d2/dc/de/d2d/d60/f91 986568 0 2026-03-10T14:07:38.767 INFO:tasks.workunit.client.1.vm04.stdout:3/447: truncate da/f19 2985315 0 2026-03-10T14:07:38.777 INFO:tasks.workunit.client.1.vm04.stdout:2/436: truncate d0/d3/d3a/d3e/f61 3257252 0 2026-03-10T14:07:38.779 INFO:tasks.workunit.client.1.vm04.stdout:2/437: read d0/d14/d39/d47/d70/f74 [356844,90730] 0 2026-03-10T14:07:38.783 INFO:tasks.workunit.client.1.vm04.stdout:8/417: fsync d0/f26 0 2026-03-10T14:07:38.784 INFO:tasks.workunit.client.1.vm04.stdout:8/418: readlink d0/d3/d63/d29/l5a 0 2026-03-10T14:07:38.789 INFO:tasks.workunit.client.1.vm04.stdout:8/419: fsync d0/d3/d5/f66 0 2026-03-10T14:07:38.792 INFO:tasks.workunit.client.1.vm04.stdout:8/420: write d0/d3/d63/f57 [451525,105106] 0 2026-03-10T14:07:38.792 INFO:tasks.workunit.client.1.vm04.stdout:8/421: readlink d0/d3/d63/d29/l5a 0 2026-03-10T14:07:38.799 INFO:tasks.workunit.client.1.vm04.stdout:9/329: link d9/d58/f5e d9/f73 0 2026-03-10T14:07:38.801 INFO:tasks.workunit.client.1.vm04.stdout:9/330: readlink d9/da/l12 0 2026-03-10T14:07:38.804 INFO:tasks.workunit.client.1.vm04.stdout:9/331: write d9/d5c/f6c [112287,61073] 0 2026-03-10T14:07:38.810 INFO:tasks.workunit.client.1.vm04.stdout:5/442: mkdir d7/d12/d2b/d3e/d57/d8a 0 2026-03-10T14:07:38.816 INFO:tasks.workunit.client.1.vm04.stdout:1/365: symlink d3/d22/d6d/l85 0 2026-03-10T14:07:38.816 INFO:tasks.workunit.client.1.vm04.stdout:1/366: chown d3/d22/l6e 88064168 1 2026-03-10T14:07:38.825 
INFO:tasks.workunit.client.1.vm04.stdout:0/306: mknod d0/d2/d15/d49/d50/d5c/c5f 0 2026-03-10T14:07:38.832 INFO:tasks.workunit.client.1.vm04.stdout:3/448: truncate da/d30/f55 4821669 0 2026-03-10T14:07:38.853 INFO:tasks.workunit.client.1.vm04.stdout:2/438: symlink d0/d14/d39/d47/d70/l83 0 2026-03-10T14:07:38.861 INFO:tasks.workunit.client.1.vm04.stdout:6/325: link d3/de/d35/d3f/d2d/d32/d23/d24/f36 d3/de/d35/d3f/d2d/d38/f60 0 2026-03-10T14:07:38.861 INFO:tasks.workunit.client.1.vm04.stdout:9/332: truncate d9/d44/d4d/f66 275830 0 2026-03-10T14:07:38.868 INFO:tasks.workunit.client.1.vm04.stdout:5/443: mknod d7/d2d/d32/c8b 0 2026-03-10T14:07:38.868 INFO:tasks.workunit.client.1.vm04.stdout:1/367: dread - d3/d22/d63/f65 zero size 2026-03-10T14:07:38.873 INFO:tasks.workunit.client.1.vm04.stdout:0/307: symlink d0/d2/d15/d22/d38/l60 0 2026-03-10T14:07:38.873 INFO:tasks.workunit.client.1.vm04.stdout:3/449: mknod da/d30/ca1 0 2026-03-10T14:07:38.898 INFO:tasks.workunit.client.1.vm04.stdout:2/439: mknod d0/d3/d8/dd/d26/d46/d4b/c84 0 2026-03-10T14:07:38.898 INFO:tasks.workunit.client.1.vm04.stdout:2/440: fdatasync d0/d3/d4a/d66/f7d 0 2026-03-10T14:07:38.929 INFO:tasks.workunit.client.1.vm04.stdout:4/393: dwrite d4/f9 [4194304,4194304] 0 2026-03-10T14:07:38.938 INFO:tasks.workunit.client.1.vm04.stdout:5/444: mkdir d7/d12/d2b/d8c 0 2026-03-10T14:07:38.939 INFO:tasks.workunit.client.1.vm04.stdout:5/445: write d7/d9/f75 [1555288,48920] 0 2026-03-10T14:07:38.939 INFO:tasks.workunit.client.1.vm04.stdout:0/308: fsync d0/d2/f13 0 2026-03-10T14:07:38.940 INFO:tasks.workunit.client.1.vm04.stdout:6/326: stat d3/de/d35/d3f/d2d/d38/f60 0 2026-03-10T14:07:38.941 INFO:tasks.workunit.client.1.vm04.stdout:4/394: mkdir d4/df/d22/d47/d4f/d8c 0 2026-03-10T14:07:38.941 INFO:tasks.workunit.client.1.vm04.stdout:6/327: chown d3/de/d35/d3a/d43/d4c/f4d 483216690 1 2026-03-10T14:07:38.945 INFO:tasks.workunit.client.1.vm04.stdout:9/333: sync 2026-03-10T14:07:38.955 
INFO:tasks.workunit.client.1.vm04.stdout:3/450: dread da/f28 [0,4194304] 0 2026-03-10T14:07:38.957 INFO:tasks.workunit.client.1.vm04.stdout:1/368: rename d3/d22/d63/d35/l53 to d3/d22/d63/d35/d6c/l86 0 2026-03-10T14:07:38.957 INFO:tasks.workunit.client.1.vm04.stdout:1/369: fsync d3/d22/d63/f7f 0 2026-03-10T14:07:38.963 INFO:tasks.workunit.client.1.vm04.stdout:8/422: getdents d0/d75 0 2026-03-10T14:07:38.964 INFO:tasks.workunit.client.1.vm04.stdout:0/309: mkdir d0/d2/d15/d49/d50/d61 0 2026-03-10T14:07:38.969 INFO:tasks.workunit.client.1.vm04.stdout:4/395: mknod d4/df/d34/d6f/c8d 0 2026-03-10T14:07:38.969 INFO:tasks.workunit.client.1.vm04.stdout:6/328: mkdir d3/de/d35/d3f/d2d/d32/d61 0 2026-03-10T14:07:38.973 INFO:tasks.workunit.client.1.vm04.stdout:0/310: chown d0/d2/d25/f2a 9749 1 2026-03-10T14:07:38.974 INFO:tasks.workunit.client.1.vm04.stdout:4/396: rename d4/df/d34/d6f to d4/df/d34/d6f/d8e 22 2026-03-10T14:07:38.975 INFO:tasks.workunit.client.1.vm04.stdout:0/311: dread - d0/d2/d15/f59 zero size 2026-03-10T14:07:38.977 INFO:tasks.workunit.client.1.vm04.stdout:1/370: mknod d3/d20/d60/d82/d13/d38/d58/d5b/c87 0 2026-03-10T14:07:38.978 INFO:tasks.workunit.client.1.vm04.stdout:1/371: chown d3/d22/c52 11727 1 2026-03-10T14:07:38.980 INFO:tasks.workunit.client.1.vm04.stdout:3/451: dwrite da/dc/d35/d52/f79 [4194304,4194304] 0 2026-03-10T14:07:38.980 INFO:tasks.workunit.client.1.vm04.stdout:8/423: fsync d0/d3/f80 0 2026-03-10T14:07:38.986 INFO:tasks.workunit.client.1.vm04.stdout:8/424: write d0/d3/d5/f70 [3050508,49335] 0 2026-03-10T14:07:39.006 INFO:tasks.workunit.client.1.vm04.stdout:4/397: unlink d4/fd 0 2026-03-10T14:07:39.006 INFO:tasks.workunit.client.1.vm04.stdout:4/398: chown d4/df/d22/l50 249134314 1 2026-03-10T14:07:39.014 INFO:tasks.workunit.client.1.vm04.stdout:6/329: sync 2026-03-10T14:07:39.024 INFO:tasks.workunit.client.1.vm04.stdout:9/334: dread d9/d44/f49 [0,4194304] 0 2026-03-10T14:07:39.028 INFO:tasks.workunit.client.1.vm04.stdout:3/452: fsync f4 0 
2026-03-10T14:07:39.028 INFO:tasks.workunit.client.1.vm04.stdout:1/372: dread d3/f8 [0,4194304] 0 2026-03-10T14:07:39.029 INFO:tasks.workunit.client.1.vm04.stdout:3/453: write da/d30/f42 [3857386,106406] 0 2026-03-10T14:07:39.044 INFO:tasks.workunit.client.1.vm04.stdout:6/330: dread - d3/de/d35/d3a/d43/f4b zero size 2026-03-10T14:07:39.045 INFO:tasks.workunit.client.1.vm04.stdout:7/403: dwrite d2/d28/f7e [0,4194304] 0 2026-03-10T14:07:39.048 INFO:tasks.workunit.client.1.vm04.stdout:6/331: write d3/de/d35/d3f/d2d/f2e [4725846,15749] 0 2026-03-10T14:07:39.050 INFO:tasks.workunit.client.1.vm04.stdout:7/404: chown d2/dc/de/d2d/d60/d7c 369910047 1 2026-03-10T14:07:39.050 INFO:tasks.workunit.client.1.vm04.stdout:6/332: chown d3/de/d35/d3f/d2d/d38/d40/f5d 38 1 2026-03-10T14:07:39.056 INFO:tasks.workunit.client.1.vm04.stdout:7/405: dwrite d2/d28/f3d [0,4194304] 0 2026-03-10T14:07:39.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:38 vm04.local ceph-mon[55966]: pgmap v148: 65 pgs: 65 active+clean; 948 MiB data, 3.8 GiB used, 116 GiB / 120 GiB avail; 23 MiB/s rd, 95 MiB/s wr, 449 op/s 2026-03-10T14:07:39.067 INFO:tasks.workunit.client.1.vm04.stdout:9/335: mkdir d9/da/dd/d74 0 2026-03-10T14:07:39.067 INFO:tasks.workunit.client.1.vm04.stdout:9/336: readlink d9/d5c/l60 0 2026-03-10T14:07:39.067 INFO:tasks.workunit.client.1.vm04.stdout:1/373: read - d3/d22/d63/f77 zero size 2026-03-10T14:07:39.068 INFO:tasks.workunit.client.1.vm04.stdout:1/374: stat d3/d20/l4e 0 2026-03-10T14:07:39.068 INFO:tasks.workunit.client.1.vm04.stdout:9/337: chown d9/da/d5d/c61 8944869 1 2026-03-10T14:07:39.072 INFO:tasks.workunit.client.1.vm04.stdout:9/338: dread d9/d1d/f1f [0,4194304] 0 2026-03-10T14:07:39.082 INFO:tasks.workunit.client.1.vm04.stdout:3/454: truncate da/dc/d35/d52/f6f 1512707 0 2026-03-10T14:07:39.082 INFO:tasks.workunit.client.1.vm04.stdout:2/441: fsync d0/d3/d3a/d3e/f61 0 2026-03-10T14:07:39.083 INFO:tasks.workunit.client.1.vm04.stdout:4/399: unlink d4/f6 0 
2026-03-10T14:07:39.086 INFO:tasks.workunit.client.1.vm04.stdout:8/425: mkdir d0/d82 0 2026-03-10T14:07:39.100 INFO:tasks.workunit.client.1.vm04.stdout:1/375: truncate d3/d20/d60/d82/f56 27246 0 2026-03-10T14:07:39.100 INFO:tasks.workunit.client.1.vm04.stdout:9/339: read - d9/d58/f62 zero size 2026-03-10T14:07:39.104 INFO:tasks.workunit.client.1.vm04.stdout:2/442: rmdir d0/d14/d54 39 2026-03-10T14:07:39.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:38 vm03.local ceph-mon[49718]: pgmap v148: 65 pgs: 65 active+clean; 948 MiB data, 3.8 GiB used, 116 GiB / 120 GiB avail; 23 MiB/s rd, 95 MiB/s wr, 449 op/s 2026-03-10T14:07:39.108 INFO:tasks.workunit.client.1.vm04.stdout:2/443: read d0/d14/d1b/f55 [752215,54400] 0 2026-03-10T14:07:39.109 INFO:tasks.workunit.client.1.vm04.stdout:2/444: chown d0/d3/d8/d17 5693 1 2026-03-10T14:07:39.111 INFO:tasks.workunit.client.1.vm04.stdout:7/406: dread d2/dc/de/d2d/d5c/f8e [0,4194304] 0 2026-03-10T14:07:39.127 INFO:tasks.workunit.client.1.vm04.stdout:6/333: creat d3/de/d35/d3f/d2d/d32/d23/d47/f62 x:0 0 0 2026-03-10T14:07:39.164 INFO:tasks.workunit.client.1.vm04.stdout:2/445: truncate d0/d3/d8/dd/d26/f2a 5098571 0 2026-03-10T14:07:39.164 INFO:tasks.workunit.client.1.vm04.stdout:5/446: dwrite d7/d2d/d32/f5b [0,4194304] 0 2026-03-10T14:07:39.171 INFO:tasks.workunit.client.1.vm04.stdout:1/376: dread d3/d22/d63/f2d [0,4194304] 0 2026-03-10T14:07:39.172 INFO:tasks.workunit.client.1.vm04.stdout:1/377: dread - d3/d22/d6d/f7d zero size 2026-03-10T14:07:39.188 INFO:tasks.workunit.client.1.vm04.stdout:9/340: creat d9/da/dd/d74/f75 x:0 0 0 2026-03-10T14:07:39.190 INFO:tasks.workunit.client.1.vm04.stdout:0/312: write d0/d1c/f51 [784605,3815] 0 2026-03-10T14:07:39.193 INFO:tasks.workunit.client.1.vm04.stdout:0/313: fdatasync d0/d2/d15/d22/d38/f3e 0 2026-03-10T14:07:39.215 INFO:tasks.workunit.client.1.vm04.stdout:4/400: rename d4/f9 to d4/df/d34/f8f 0 2026-03-10T14:07:39.215 INFO:tasks.workunit.client.1.vm04.stdout:3/455: rename da/dc/d35 
to da/dc/d35/d52/d70/da2 22 2026-03-10T14:07:39.216 INFO:tasks.workunit.client.1.vm04.stdout:4/401: readlink d4/df/d34/l78 0 2026-03-10T14:07:39.219 INFO:tasks.workunit.client.1.vm04.stdout:3/456: dread da/dc/d35/d52/d6d/f75 [0,4194304] 0 2026-03-10T14:07:39.235 INFO:tasks.workunit.client.1.vm04.stdout:8/426: dwrite d0/d3/d63/d12/f2c [4194304,4194304] 0 2026-03-10T14:07:39.263 INFO:tasks.workunit.client.1.vm04.stdout:8/427: dread d0/f26 [0,4194304] 0 2026-03-10T14:07:39.273 INFO:tasks.workunit.client.1.vm04.stdout:3/457: mkdir da/dc/d35/d52/da3 0 2026-03-10T14:07:39.290 INFO:tasks.workunit.client.1.vm04.stdout:7/407: rename d2/d28 to d2/d94 0 2026-03-10T14:07:39.291 INFO:tasks.workunit.client.1.vm04.stdout:6/334: dread d3/de/d35/d3f/d2d/d32/f1f [0,4194304] 0 2026-03-10T14:07:39.295 INFO:tasks.workunit.client.1.vm04.stdout:5/447: dwrite d7/d9/fd [4194304,4194304] 0 2026-03-10T14:07:39.299 INFO:tasks.workunit.client.1.vm04.stdout:5/448: dread d7/d9/fd [4194304,4194304] 0 2026-03-10T14:07:39.312 INFO:tasks.workunit.client.1.vm04.stdout:9/341: link d9/da/f41 d9/d33/f76 0 2026-03-10T14:07:39.323 INFO:tasks.workunit.client.1.vm04.stdout:1/378: rename d3/d20/d60/d82/d13/f5a to d3/d20/f88 0 2026-03-10T14:07:39.326 INFO:tasks.workunit.client.1.vm04.stdout:4/402: fdatasync d4/df/f2e 0 2026-03-10T14:07:39.336 INFO:tasks.workunit.client.1.vm04.stdout:2/446: dread d0/d3/d8/dd/d26/f36 [0,4194304] 0 2026-03-10T14:07:39.341 INFO:tasks.workunit.client.1.vm04.stdout:8/428: creat d0/d75/f83 x:0 0 0 2026-03-10T14:07:39.383 INFO:tasks.workunit.client.1.vm04.stdout:7/408: mknod d2/dc/de/d2d/d60/d7c/c95 0 2026-03-10T14:07:39.383 INFO:tasks.workunit.client.1.vm04.stdout:5/449: mknod d7/d2d/d32/c8d 0 2026-03-10T14:07:39.383 INFO:tasks.workunit.client.1.vm04.stdout:8/429: unlink d0/d3/d63/d29/f2b 0 2026-03-10T14:07:39.383 INFO:tasks.workunit.client.1.vm04.stdout:8/430: chown d0/d3/dd 227993 1 2026-03-10T14:07:39.383 INFO:tasks.workunit.client.1.vm04.stdout:8/431: chown d0/d3/c28 256364 1 
2026-03-10T14:07:39.390 INFO:tasks.workunit.client.1.vm04.stdout:7/409: dwrite d2/dc/f89 [0,4194304] 0 2026-03-10T14:07:39.395 INFO:tasks.workunit.client.1.vm04.stdout:0/314: getdents d0/d1c 0 2026-03-10T14:07:39.403 INFO:tasks.workunit.client.1.vm04.stdout:6/335: dread d3/f9 [0,4194304] 0 2026-03-10T14:07:39.412 INFO:tasks.workunit.client.1.vm04.stdout:5/450: dwrite d7/d9/f28 [0,4194304] 0 2026-03-10T14:07:39.440 INFO:tasks.workunit.client.1.vm04.stdout:5/451: creat d7/d26/d6b/d6e/f8e x:0 0 0 2026-03-10T14:07:39.441 INFO:tasks.workunit.client.1.vm04.stdout:5/452: chown d7/d2d/d32/l40 24 1 2026-03-10T14:07:39.450 INFO:tasks.workunit.client.1.vm04.stdout:3/458: link da/f19 da/dc/fa4 0 2026-03-10T14:07:39.456 INFO:tasks.workunit.client.1.vm04.stdout:7/410: creat d2/dc/d4d/d7f/f96 x:0 0 0 2026-03-10T14:07:39.463 INFO:tasks.workunit.client.1.vm04.stdout:5/453: write d7/d12/d2b/f4d [128393,124702] 0 2026-03-10T14:07:39.465 INFO:tasks.workunit.client.1.vm04.stdout:3/459: mknod da/d30/ca5 0 2026-03-10T14:07:39.468 INFO:tasks.workunit.client.1.vm04.stdout:4/403: getdents d4 0 2026-03-10T14:07:39.485 INFO:tasks.workunit.client.1.vm04.stdout:7/411: creat d2/dc/de/d2d/d60/d7c/f97 x:0 0 0 2026-03-10T14:07:39.495 INFO:tasks.workunit.client.1.vm04.stdout:4/404: symlink d4/df/d22/d47/d4f/l90 0 2026-03-10T14:07:39.504 INFO:tasks.workunit.client.1.vm04.stdout:0/315: dread d0/d2/f12 [0,4194304] 0 2026-03-10T14:07:39.504 INFO:tasks.workunit.client.1.vm04.stdout:0/316: stat d0/d2/f13 0 2026-03-10T14:07:39.504 INFO:tasks.workunit.client.1.vm04.stdout:4/405: symlink d4/df/d34/d6f/l91 0 2026-03-10T14:07:39.951 INFO:tasks.workunit.client.1.vm04.stdout:9/342: dwrite d9/da/f13 [0,4194304] 0 2026-03-10T14:07:39.955 INFO:tasks.workunit.client.1.vm04.stdout:9/343: dread d9/d44/f51 [0,4194304] 0 2026-03-10T14:07:39.961 INFO:tasks.workunit.client.1.vm04.stdout:1/379: truncate d3/d22/f42 4379931 0 2026-03-10T14:07:39.977 INFO:tasks.workunit.client.1.vm04.stdout:6/336: dwrite d3/f9 [0,4194304] 0 
2026-03-10T14:07:39.985 INFO:tasks.workunit.client.1.vm04.stdout:1/380: fsync d3/fc 0 2026-03-10T14:07:39.999 INFO:tasks.workunit.client.1.vm04.stdout:1/381: dread - d3/d22/d63/f73 zero size 2026-03-10T14:07:40.022 INFO:tasks.workunit.client.1.vm04.stdout:1/382: dread d3/d20/d60/d82/d13/d1a/f62 [4194304,4194304] 0 2026-03-10T14:07:40.029 INFO:tasks.workunit.client.1.vm04.stdout:1/383: rename d3/d22/d63/f73 to d3/d22/d63/f89 0 2026-03-10T14:07:40.033 INFO:tasks.workunit.client.1.vm04.stdout:1/384: symlink d3/d20/d60/d82/d13/d38/d58/l8a 0 2026-03-10T14:07:40.035 INFO:tasks.workunit.client.1.vm04.stdout:0/317: write d0/d2/f4d [91671,105106] 0 2026-03-10T14:07:40.037 INFO:tasks.workunit.client.1.vm04.stdout:0/318: truncate d0/d2/f4d 239370 0 2026-03-10T14:07:40.182 INFO:tasks.workunit.client.1.vm04.stdout:5/454: rmdir d7/d26/d6b/d6e 39 2026-03-10T14:07:40.184 INFO:tasks.workunit.client.1.vm04.stdout:6/337: symlink d3/de/l63 0 2026-03-10T14:07:40.193 INFO:tasks.workunit.client.1.vm04.stdout:5/455: creat d7/d2d/d76/f8f x:0 0 0 2026-03-10T14:07:40.194 INFO:tasks.workunit.client.1.vm04.stdout:5/456: symlink d7/d12/d2b/l90 0 2026-03-10T14:07:40.196 INFO:tasks.workunit.client.1.vm04.stdout:6/338: rmdir d3/d1d/d3e 0 2026-03-10T14:07:40.205 INFO:tasks.workunit.client.1.vm04.stdout:6/339: dwrite d3/f4 [4194304,4194304] 0 2026-03-10T14:07:40.217 INFO:tasks.workunit.client.1.vm04.stdout:0/319: unlink d0/d2/f4d 0 2026-03-10T14:07:40.218 INFO:tasks.workunit.client.1.vm04.stdout:8/432: unlink d0/d3/d63/c4b 0 2026-03-10T14:07:40.220 INFO:tasks.workunit.client.1.vm04.stdout:8/433: read d0/d3/d63/d29/f45 [803520,42154] 0 2026-03-10T14:07:40.223 INFO:tasks.workunit.client.1.vm04.stdout:6/340: mknod d3/de/d35/d3a/d43/d4c/c64 0 2026-03-10T14:07:40.223 INFO:tasks.workunit.client.1.vm04.stdout:7/412: link d2/dc/de/d2d/d38/f83 d2/dc/de/f98 0 2026-03-10T14:07:40.223 INFO:tasks.workunit.client.1.vm04.stdout:6/341: fdatasync d3/d1d/f4a 0 2026-03-10T14:07:40.235 
INFO:tasks.workunit.client.1.vm04.stdout:3/460: rename da/d30/l45 to da/dc/d3f/d54/la6 0 2026-03-10T14:07:40.235 INFO:tasks.workunit.client.1.vm04.stdout:2/447: unlink d0/d3/d4a/c50 0 2026-03-10T14:07:40.240 INFO:tasks.workunit.client.1.vm04.stdout:8/434: dwrite d0/d3/d63/d12/d69/f81 [0,4194304] 0 2026-03-10T14:07:40.242 INFO:tasks.workunit.client.1.vm04.stdout:8/435: chown d0/d3/d5/f66 231777333 1 2026-03-10T14:07:40.248 INFO:tasks.workunit.client.1.vm04.stdout:3/461: unlink da/dc/d35/f46 0 2026-03-10T14:07:40.248 INFO:tasks.workunit.client.1.vm04.stdout:7/413: link d2/dc/de/d2d/l3f d2/d94/l99 0 2026-03-10T14:07:40.248 INFO:tasks.workunit.client.1.vm04.stdout:2/448: mkdir d0/d3/d8/d17/d4e/d85 0 2026-03-10T14:07:40.250 INFO:tasks.workunit.client.1.vm04.stdout:8/436: unlink d0/d3/d5/f6d 0 2026-03-10T14:07:40.275 INFO:tasks.workunit.client.1.vm04.stdout:3/462: creat da/dc/d3f/d54/d66/fa7 x:0 0 0 2026-03-10T14:07:40.277 INFO:tasks.workunit.client.1.vm04.stdout:3/463: chown da/dc/d3f/d61/l89 209166 1 2026-03-10T14:07:40.287 INFO:tasks.workunit.client.1.vm04.stdout:2/449: mkdir d0/d3/d8/d17/d4e/d85/d86 0 2026-03-10T14:07:40.289 INFO:tasks.workunit.client.1.vm04.stdout:7/414: dwrite d2/dc/f8d [0,4194304] 0 2026-03-10T14:07:40.289 INFO:tasks.workunit.client.1.vm04.stdout:7/415: fdatasync d2/d94/f7e 0 2026-03-10T14:07:40.294 INFO:tasks.workunit.client.1.vm04.stdout:7/416: fsync f0 0 2026-03-10T14:07:40.315 INFO:tasks.workunit.client.1.vm04.stdout:0/320: dread d0/d1c/f2e [0,4194304] 0 2026-03-10T14:07:40.315 INFO:tasks.workunit.client.1.vm04.stdout:7/417: creat d2/d94/f9a x:0 0 0 2026-03-10T14:07:40.315 INFO:tasks.workunit.client.1.vm04.stdout:2/450: link d0/d3/d8/dd/c16 d0/d3/d8/dd/c87 0 2026-03-10T14:07:40.315 INFO:tasks.workunit.client.1.vm04.stdout:0/321: mkdir d0/d2/d15/d22/d62 0 2026-03-10T14:07:40.315 INFO:tasks.workunit.client.1.vm04.stdout:2/451: unlink d0/d3/la 0 2026-03-10T14:07:40.327 INFO:tasks.workunit.client.1.vm04.stdout:7/418: symlink d2/dc/de/d11/l9b 0 
2026-03-10T14:07:40.328 INFO:tasks.workunit.client.1.vm04.stdout:5/457: dread d7/f11 [0,4194304] 0 2026-03-10T14:07:40.334 INFO:tasks.workunit.client.1.vm04.stdout:0/322: unlink d0/d2/d15/d22/d38/f4a 0 2026-03-10T14:07:40.376 INFO:tasks.workunit.client.1.vm04.stdout:5/458: truncate d7/d12/f51 2296135 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:5/459: creat d7/d26/d6b/d6e/f91 x:0 0 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:0/323: getdents d0/d2/d15/d22/d38/d56 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:5/460: symlink d7/d12/d2b/d3e/d57/d77/l92 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:0/324: mknod d0/d2/d15/d49/c63 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:5/461: fdatasync d7/d12/d2b/d3e/d3f/f88 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:0/325: creat d0/d2/d25/f64 x:0 0 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:5/462: rmdir d7/d59 39 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:5/463: stat d7/f1d 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:0/326: mknod d0/d2/d15/d22/d38/d56/c65 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:0/327: dread d0/d1c/f2e [0,4194304] 0 2026-03-10T14:07:40.377 INFO:tasks.workunit.client.1.vm04.stdout:5/464: mkdir d7/d12/d2b/d93 0 2026-03-10T14:07:40.382 INFO:tasks.workunit.client.1.vm04.stdout:2/452: read d0/d14/f6b [3768752,59259] 0 2026-03-10T14:07:40.396 INFO:tasks.workunit.client.1.vm04.stdout:5/465: dread d7/d12/d2b/f53 [0,4194304] 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:2/453: creat d0/d3/d8/d17/d4e/d85/f88 x:0 0 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/466: unlink d7/d12/f83 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:2/454: creat d0/d3/d8/d17/d4e/d85/f89 x:0 0 0 2026-03-10T14:07:40.425 
INFO:tasks.workunit.client.1.vm04.stdout:5/467: rmdir d7/d12/d45 39 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/468: creat d7/d12/d2b/d3e/d57/d8a/f94 x:0 0 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/469: creat d7/d2d/d69/f95 x:0 0 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/470: truncate d7/d2d/d69/f95 557429 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/471: unlink d7/d12/d2b/c86 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/472: readlink d7/d26/l66 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/473: fdatasync d7/f11 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/474: symlink d7/d12/d2b/d3e/l96 0 2026-03-10T14:07:40.425 INFO:tasks.workunit.client.1.vm04.stdout:5/475: fdatasync d7/d2d/d32/d34/f89 0 2026-03-10T14:07:40.473 INFO:tasks.workunit.client.1.vm04.stdout:6/342: write d3/de/d35/d3f/d2d/d32/f1f [4027328,35919] 0 2026-03-10T14:07:40.480 INFO:tasks.workunit.client.1.vm04.stdout:4/406: rename d4/d14/d3c/f67 to d4/d14/d3c/d5e/f92 0 2026-03-10T14:07:40.483 INFO:tasks.workunit.client.1.vm04.stdout:9/344: rename d9/f73 to d9/d5c/f77 0 2026-03-10T14:07:40.484 INFO:tasks.workunit.client.1.vm04.stdout:6/343: getdents d3/de/d35/d3a/d43/d4c 0 2026-03-10T14:07:40.485 INFO:tasks.workunit.client.1.vm04.stdout:4/407: dread - d4/d14/d6d/f8a zero size 2026-03-10T14:07:40.486 INFO:tasks.workunit.client.1.vm04.stdout:1/385: rename d3/d22/c52 to d3/d20/d60/d82/d13/d38/d58/d5b/d7b/d84/c8b 0 2026-03-10T14:07:40.490 INFO:tasks.workunit.client.1.vm04.stdout:0/328: rename d0/d1c to d0/d2/d15/d22/d38/d56/d66 0 2026-03-10T14:07:40.493 INFO:tasks.workunit.client.1.vm04.stdout:1/386: read d3/d20/d60/d82/fd [2832373,115006] 0 2026-03-10T14:07:40.494 INFO:tasks.workunit.client.1.vm04.stdout:9/345: dwrite d9/da/dd/f47 [0,4194304] 0 2026-03-10T14:07:40.495 INFO:tasks.workunit.client.1.vm04.stdout:1/387: chown d3/d22/l37 41511462 1 
2026-03-10T14:07:40.498 INFO:tasks.workunit.client.1.vm04.stdout:0/329: link d0/d2/fd d0/d2/d15/d22/d38/d56/f67 0 2026-03-10T14:07:40.506 INFO:tasks.workunit.client.1.vm04.stdout:9/346: symlink d9/l78 0 2026-03-10T14:07:40.506 INFO:tasks.workunit.client.1.vm04.stdout:9/347: stat d9/da/dd/d1c/f2e 0 2026-03-10T14:07:40.508 INFO:tasks.workunit.client.1.vm04.stdout:1/388: truncate d3/d20/f88 4016068 0 2026-03-10T14:07:40.509 INFO:tasks.workunit.client.1.vm04.stdout:1/389: chown d3/d5c 54 1 2026-03-10T14:07:40.511 INFO:tasks.workunit.client.1.vm04.stdout:9/348: fsync d9/da/dd/d1c/f30 0 2026-03-10T14:07:40.514 INFO:tasks.workunit.client.1.vm04.stdout:1/390: symlink d3/d22/l8c 0 2026-03-10T14:07:40.517 INFO:tasks.workunit.client.1.vm04.stdout:9/349: fsync d9/d33/f4b 0 2026-03-10T14:07:40.521 INFO:tasks.workunit.client.1.vm04.stdout:9/350: dwrite f5 [8388608,4194304] 0 2026-03-10T14:07:40.532 INFO:tasks.workunit.client.1.vm04.stdout:9/351: unlink d9/da/dd/d1c/c42 0 2026-03-10T14:07:40.923 INFO:tasks.workunit.client.1.vm04.stdout:8/437: sync 2026-03-10T14:07:40.941 INFO:tasks.workunit.client.1.vm04.stdout:3/464: dwrite f4 [4194304,4194304] 0 2026-03-10T14:07:40.941 INFO:tasks.workunit.client.1.vm04.stdout:3/465: stat da/dc/l12 0 2026-03-10T14:07:40.945 INFO:tasks.workunit.client.1.vm04.stdout:3/466: rename da/dc/d35/d52/d53/f74 to da/dc/d35/d52/fa8 0 2026-03-10T14:07:40.983 INFO:tasks.workunit.client.1.vm04.stdout:7/419: dwrite d2/dc/de/d2d/d38/f57 [0,4194304] 0 2026-03-10T14:07:40.988 INFO:tasks.workunit.client.1.vm04.stdout:7/420: creat d2/d2a/f9c x:0 0 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:7/421: symlink d2/dc/de/d2d/d60/d7c/d36/l9d 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:7/422: truncate f0 4369250 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:7/423: mknod d2/dc/de/d2d/d5c/c9e 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:2/455: write d0/d3/d8/f30 [4497383,95460] 0 
2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:7/424: mkdir d2/d9f 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:7/425: fdatasync d2/f4 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:2/456: creat d0/d14/d1b/f8a x:0 0 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:7/426: write d2/dc/de/d2d/d60/d7c/f8f [962242,43917] 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:2/457: mkdir d0/d14/d39/d47/d70/d8b 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:7/427: readlink d2/dc/l33 0 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:2/458: chown d0/d14/d39/d47/l5b 18698 1 2026-03-10T14:07:41.032 INFO:tasks.workunit.client.1.vm04.stdout:2/459: chown d0/d14/d39/f7f 581438 1 2026-03-10T14:07:41.098 INFO:tasks.workunit.client.1.vm04.stdout:5/476: dwrite d7/d2d/d32/d34/f89 [0,4194304] 0 2026-03-10T14:07:41.099 INFO:tasks.workunit.client.1.vm04.stdout:5/477: write d7/d12/d2b/f4d [397588,23688] 0 2026-03-10T14:07:41.102 INFO:tasks.workunit.client.1.vm04.stdout:5/478: readlink d7/d12/l22 0 2026-03-10T14:07:41.102 INFO:tasks.workunit.client.1.vm04.stdout:6/344: truncate d3/de/d35/d3f/d2d/d32/d23/f2f 4366551 0 2026-03-10T14:07:41.102 INFO:tasks.workunit.client.1.vm04.stdout:6/345: fsync d3/de/d35/d3f/f17 0 2026-03-10T14:07:41.103 INFO:tasks.workunit.client.1.vm04.stdout:5/479: write d7/d12/d2b/f46 [2167152,23620] 0 2026-03-10T14:07:41.108 INFO:tasks.workunit.client.1.vm04.stdout:5/480: read d7/d12/d45/f5c [3176545,104374] 0 2026-03-10T14:07:41.109 INFO:tasks.workunit.client.1.vm04.stdout:0/330: write d0/d2/f12 [2401574,7194] 0 2026-03-10T14:07:41.109 INFO:tasks.workunit.client.1.vm04.stdout:4/408: dwrite d4/df/d22/d47/d4f/f6a [0,4194304] 0 2026-03-10T14:07:41.114 INFO:tasks.workunit.client.1.vm04.stdout:1/391: write d3/d22/d63/f2d [383141,102661] 0 2026-03-10T14:07:41.143 INFO:tasks.workunit.client.1.vm04.stdout:0/331: symlink d0/d2/d15/d22/d38/d56/d66/l68 
0 2026-03-10T14:07:41.143 INFO:tasks.workunit.client.1.vm04.stdout:4/409: rename d4/df/d34/d6f/c8d to d4/d14/d1b/c93 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/481: link d7/d26/d6b/c71 d7/d59/d7e/d87/c97 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:4/410: symlink d4/d14/d6d/l94 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/482: mknod d7/d12/d2b/d93/c98 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/483: chown d7/d2d/d32 37296450 1 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:4/411: creat d4/df/d34/f95 x:0 0 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/484: rename d7/d2d/d32/c6c to d7/d12/d2b/d3e/c99 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/485: chown d7/d26/l39 272 1 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/486: mkdir d7/d59/d7d/d9a 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/487: getdents d7/d12/d2b 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/488: symlink d7/d12/d2b/d3e/d57/d77/l9b 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/489: rename d7/c5a to d7/d59/c9c 0 2026-03-10T14:07:41.144 INFO:tasks.workunit.client.1.vm04.stdout:5/490: write d7/d9/f28 [4788201,17744] 0 2026-03-10T14:07:41.157 INFO:tasks.workunit.client.1.vm04.stdout:0/332: dread d0/f1b [0,4194304] 0 2026-03-10T14:07:41.167 INFO:tasks.workunit.client.1.vm04.stdout:0/333: dwrite d0/d2/d25/f45 [0,4194304] 0 2026-03-10T14:07:41.167 INFO:tasks.workunit.client.1.vm04.stdout:0/334: readlink d0/d2/d15/d22/l36 0 2026-03-10T14:07:41.186 INFO:tasks.workunit.client.1.vm04.stdout:0/335: read d0/d2/d15/d22/d38/d56/f5e [4542131,12652] 0 2026-03-10T14:07:41.186 INFO:tasks.workunit.client.1.vm04.stdout:0/336: chown d0/l43 7041 1 2026-03-10T14:07:41.193 INFO:tasks.workunit.client.1.vm04.stdout:0/337: dread d0/d2/f12 [0,4194304] 0 2026-03-10T14:07:41.200 
INFO:tasks.workunit.client.1.vm04.stdout:0/338: symlink d0/d2/d15/d22/l69 0 2026-03-10T14:07:41.206 INFO:tasks.workunit.client.1.vm04.stdout:0/339: symlink d0/d2/d15/d22/d38/d56/l6a 0 2026-03-10T14:07:41.210 INFO:tasks.workunit.client.1.vm04.stdout:9/352: dwrite d9/da/dd/d1c/f2e [0,4194304] 0 2026-03-10T14:07:41.212 INFO:tasks.workunit.client.1.vm04.stdout:7/428: sync 2026-03-10T14:07:41.213 INFO:tasks.workunit.client.1.vm04.stdout:1/392: sync 2026-03-10T14:07:41.218 INFO:tasks.workunit.client.1.vm04.stdout:1/393: symlink d3/d5c/l8d 0 2026-03-10T14:07:41.218 INFO:tasks.workunit.client.1.vm04.stdout:9/353: readlink d9/l3c 0 2026-03-10T14:07:41.219 INFO:tasks.workunit.client.1.vm04.stdout:9/354: stat d9/d1d 0 2026-03-10T14:07:41.223 INFO:tasks.workunit.client.1.vm04.stdout:1/394: getdents d3/d22/d2f 0 2026-03-10T14:07:41.226 INFO:tasks.workunit.client.1.vm04.stdout:1/395: getdents d3/d22/d2f/d57 0 2026-03-10T14:07:41.229 INFO:tasks.workunit.client.1.vm04.stdout:1/396: creat d3/d22/f8e x:0 0 0 2026-03-10T14:07:41.271 INFO:tasks.workunit.client.1.vm04.stdout:1/397: dread d3/f14 [0,4194304] 0 2026-03-10T14:07:41.333 INFO:tasks.workunit.client.1.vm04.stdout:4/412: sync 2026-03-10T14:07:41.336 INFO:tasks.workunit.client.1.vm04.stdout:4/413: creat d4/f96 x:0 0 0 2026-03-10T14:07:41.354 INFO:tasks.workunit.client.1.vm04.stdout:4/414: chown d4/df/d22 3112185 1 2026-03-10T14:07:41.363 INFO:tasks.workunit.client.1.vm04.stdout:4/415: dread d4/df/d22/f4b [0,4194304] 0 2026-03-10T14:07:41.363 INFO:tasks.workunit.client.1.vm04.stdout:4/416: readlink d4/df/l25 0 2026-03-10T14:07:41.365 INFO:tasks.workunit.client.1.vm04.stdout:4/417: creat d4/d14/d64/f97 x:0 0 0 2026-03-10T14:07:41.365 INFO:tasks.workunit.client.1.vm04.stdout:4/418: chown d4/df/f60 1203074 1 2026-03-10T14:07:41.368 INFO:tasks.workunit.client.1.vm04.stdout:4/419: creat d4/d14/f98 x:0 0 0 2026-03-10T14:07:41.372 INFO:tasks.workunit.client.1.vm04.stdout:4/420: dwrite d4/f96 [0,4194304] 0 2026-03-10T14:07:41.374 
INFO:tasks.workunit.client.1.vm04.stdout:4/421: dread - d4/d14/d1b/f6c zero size 2026-03-10T14:07:41.378 INFO:tasks.workunit.client.1.vm04.stdout:4/422: getdents d4/df 0 2026-03-10T14:07:41.379 INFO:tasks.workunit.client.1.vm04.stdout:4/423: creat d4/d14/d1b/f99 x:0 0 0 2026-03-10T14:07:41.387 INFO:tasks.workunit.client.1.vm04.stdout:0/340: sync 2026-03-10T14:07:41.388 INFO:tasks.workunit.client.1.vm04.stdout:0/341: symlink d0/d2/d15/d22/d38/d56/l6b 0 2026-03-10T14:07:41.392 INFO:tasks.workunit.client.1.vm04.stdout:0/342: dwrite d0/d2/f13 [0,4194304] 0 2026-03-10T14:07:41.419 INFO:tasks.workunit.client.1.vm04.stdout:1/398: dread d3/d20/d60/d82/d13/d1a/f28 [0,4194304] 0 2026-03-10T14:07:41.421 INFO:tasks.workunit.client.1.vm04.stdout:1/399: write d3/d20/f27 [2459485,66057] 0 2026-03-10T14:07:41.424 INFO:tasks.workunit.client.1.vm04.stdout:1/400: mkdir d3/d8f 0 2026-03-10T14:07:41.434 INFO:tasks.workunit.client.1.vm04.stdout:1/401: dread d3/d20/f32 [0,4194304] 0 2026-03-10T14:07:41.450 INFO:tasks.workunit.client.1.vm04.stdout:8/438: write d0/d3/d63/d12/f1d [1370917,91501] 0 2026-03-10T14:07:41.457 INFO:tasks.workunit.client.1.vm04.stdout:3/467: dwrite da/d30/f4e [0,4194304] 0 2026-03-10T14:07:41.458 INFO:tasks.workunit.client.1.vm04.stdout:8/439: mknod d0/d3/d63/d12/c84 0 2026-03-10T14:07:41.462 INFO:tasks.workunit.client.1.vm04.stdout:6/346: dwrite d3/de/d35/d3f/d2d/d32/d23/f33 [0,4194304] 0 2026-03-10T14:07:41.465 INFO:tasks.workunit.client.1.vm04.stdout:3/468: creat da/dc/d3f/d54/fa9 x:0 0 0 2026-03-10T14:07:41.465 INFO:tasks.workunit.client.1.vm04.stdout:8/440: creat d0/d3/d63/d12/d51/d67/f85 x:0 0 0 2026-03-10T14:07:41.468 INFO:tasks.workunit.client.1.vm04.stdout:8/441: truncate d0/f26 3057432 0 2026-03-10T14:07:41.476 INFO:tasks.workunit.client.1.vm04.stdout:3/469: write da/dc/d3f/f83 [140367,45181] 0 2026-03-10T14:07:41.476 INFO:tasks.workunit.client.1.vm04.stdout:8/442: creat d0/d3/dd/d76/f86 x:0 0 0 2026-03-10T14:07:41.476 
INFO:tasks.workunit.client.1.vm04.stdout:6/347: dwrite d3/de/d35/d3a/d43/d56/f2b [0,4194304] 0 2026-03-10T14:07:41.476 INFO:tasks.workunit.client.1.vm04.stdout:8/443: truncate d0/d75/f83 440486 0 2026-03-10T14:07:41.477 INFO:tasks.workunit.client.1.vm04.stdout:6/348: chown d3/de/d35/d3f/d2d/d38/f60 115 1 2026-03-10T14:07:41.477 INFO:tasks.workunit.client.1.vm04.stdout:8/444: chown d0/d3/d5/l2d 0 1 2026-03-10T14:07:41.486 INFO:tasks.workunit.client.1.vm04.stdout:3/470: mknod da/dc/d35/d52/da3/caa 0 2026-03-10T14:07:41.486 INFO:tasks.workunit.client.1.vm04.stdout:6/349: fdatasync d3/ff 0 2026-03-10T14:07:41.486 INFO:tasks.workunit.client.1.vm04.stdout:8/445: creat d0/d3/d63/d12/d51/d67/f87 x:0 0 0 2026-03-10T14:07:41.501 INFO:tasks.workunit.client.1.vm04.stdout:6/350: mknod d3/de/d35/d3f/c65 0 2026-03-10T14:07:41.537 INFO:tasks.workunit.client.1.vm04.stdout:6/351: getdents d3/de/d35/d3f/d2d/d38 0 2026-03-10T14:07:41.537 INFO:tasks.workunit.client.1.vm04.stdout:6/352: rmdir d3/de/d35/d3a/d43 39 2026-03-10T14:07:41.537 INFO:tasks.workunit.client.1.vm04.stdout:6/353: dread d3/f9 [0,4194304] 0 2026-03-10T14:07:41.612 INFO:tasks.workunit.client.1.vm04.stdout:6/354: sync 2026-03-10T14:07:41.614 INFO:tasks.workunit.client.1.vm04.stdout:6/355: chown d3/de/d35/d3a/d43/d4c/f4d 7 1 2026-03-10T14:07:41.615 INFO:tasks.workunit.client.1.vm04.stdout:6/356: chown d3/de 125 1 2026-03-10T14:07:41.618 INFO:tasks.workunit.client.1.vm04.stdout:6/357: mknod d3/de/d35/d3a/d43/d4c/d5e/c66 0 2026-03-10T14:07:41.618 INFO:tasks.workunit.client.1.vm04.stdout:6/358: fsync d3/f19 0 2026-03-10T14:07:41.624 INFO:tasks.workunit.client.1.vm04.stdout:6/359: rmdir d3/de/d35/d3a/d43/d52 0 2026-03-10T14:07:41.624 INFO:tasks.workunit.client.1.vm04.stdout:6/360: mknod d3/de/d35/d3f/d2d/d32/d23/d24/c67 0 2026-03-10T14:07:41.633 INFO:tasks.workunit.client.1.vm04.stdout:6/361: dwrite d3/de/d35/d3f/d2d/d32/f1f [0,4194304] 0 2026-03-10T14:07:41.651 INFO:tasks.workunit.client.1.vm04.stdout:5/491: rmdir d7/d12 39 
2026-03-10T14:07:41.656 INFO:tasks.workunit.client.1.vm04.stdout:7/429: write d2/dc/de/d2d/d38/f83 [1693,114846] 0 2026-03-10T14:07:41.664 INFO:tasks.workunit.client.1.vm04.stdout:9/355: dwrite d9/da/f41 [0,4194304] 0 2026-03-10T14:07:41.672 INFO:tasks.workunit.client.1.vm04.stdout:4/424: write d4/d14/d1b/f30 [102037,22493] 0 2026-03-10T14:07:41.681 INFO:tasks.workunit.client.1.vm04.stdout:6/362: symlink d3/de/d35/l68 0 2026-03-10T14:07:41.684 INFO:tasks.workunit.client.1.vm04.stdout:0/343: dwrite d0/d2/d15/d22/f30 [4194304,4194304] 0 2026-03-10T14:07:41.684 INFO:tasks.workunit.client.1.vm04.stdout:0/344: fsync d0/d2/d15/f57 0 2026-03-10T14:07:41.684 INFO:tasks.workunit.client.1.vm04.stdout:5/492: chown d7/d12/d2b/d3e/c5f 156540 1 2026-03-10T14:07:41.685 INFO:tasks.workunit.client.1.vm04.stdout:5/493: dread - d7/d26/d6b/d6e/f8e zero size 2026-03-10T14:07:41.689 INFO:tasks.workunit.client.1.vm04.stdout:7/430: unlink d2/dc/de/d2d/d38/d50/c7d 0 2026-03-10T14:07:41.689 INFO:tasks.workunit.client.1.vm04.stdout:9/356: symlink d9/d44/l79 0 2026-03-10T14:07:41.692 INFO:tasks.workunit.client.1.vm04.stdout:6/363: stat d3/de/d35/d3f/l12 0 2026-03-10T14:07:41.692 INFO:tasks.workunit.client.1.vm04.stdout:4/425: creat d4/df/d34/d6f/f9a x:0 0 0 2026-03-10T14:07:41.692 INFO:tasks.workunit.client.1.vm04.stdout:9/357: write d9/d33/f76 [3119411,8379] 0 2026-03-10T14:07:41.695 INFO:tasks.workunit.client.1.vm04.stdout:0/345: mknod d0/d2/d15/d49/d50/c6c 0 2026-03-10T14:07:41.699 INFO:tasks.workunit.client.1.vm04.stdout:5/494: dwrite d7/d2d/d32/f5b [0,4194304] 0 2026-03-10T14:07:41.701 INFO:tasks.workunit.client.1.vm04.stdout:7/431: mknod d2/dc/de/d11/ca0 0 2026-03-10T14:07:41.708 INFO:tasks.workunit.client.1.vm04.stdout:0/346: mknod d0/d2/d15/d22/c6d 0 2026-03-10T14:07:41.708 INFO:tasks.workunit.client.1.vm04.stdout:0/347: readlink d0/d2/d15/d22/d38/d56/l6b 0 2026-03-10T14:07:41.708 INFO:tasks.workunit.client.1.vm04.stdout:4/426: mknod d4/d14/c9b 0 2026-03-10T14:07:41.709 
INFO:tasks.workunit.client.1.vm04.stdout:6/364: truncate d3/de/d35/d3a/d43/d56/fc 1425202 0 2026-03-10T14:07:41.714 INFO:tasks.workunit.client.1.vm04.stdout:4/427: creat d4/d14/d6d/f9c x:0 0 0 2026-03-10T14:07:41.714 INFO:tasks.workunit.client.1.vm04.stdout:7/432: rename d2/dc/de/d2d/d5c/l8c to d2/dc/de/d11/la1 0 2026-03-10T14:07:41.718 INFO:tasks.workunit.client.1.vm04.stdout:7/433: dread d2/d2a/d42/f4e [0,4194304] 0 2026-03-10T14:07:41.722 INFO:tasks.workunit.client.1.vm04.stdout:7/434: chown d2/dc/de/d2d/d38/d50/l69 0 1 2026-03-10T14:07:41.723 INFO:tasks.workunit.client.1.vm04.stdout:0/348: mkdir d0/d6e 0 2026-03-10T14:07:41.723 INFO:tasks.workunit.client.1.vm04.stdout:4/428: creat d4/d14/d1b/f9d x:0 0 0 2026-03-10T14:07:41.730 INFO:tasks.workunit.client.1.vm04.stdout:7/435: dread - d2/dc/de/d2d/d60/d7c/d3b/f5e zero size 2026-03-10T14:07:41.730 INFO:tasks.workunit.client.1.vm04.stdout:4/429: symlink d4/d14/d6d/l9e 0 2026-03-10T14:07:41.730 INFO:tasks.workunit.client.1.vm04.stdout:0/349: unlink d0/d2/d25/l27 0 2026-03-10T14:07:41.732 INFO:tasks.workunit.client.1.vm04.stdout:4/430: read d4/d14/f3b [680000,108756] 0 2026-03-10T14:07:41.735 INFO:tasks.workunit.client.1.vm04.stdout:7/436: creat d2/dc/de/d2d/d38/d50/fa2 x:0 0 0 2026-03-10T14:07:41.736 INFO:tasks.workunit.client.1.vm04.stdout:7/437: truncate d2/dc/de/f98 439466 0 2026-03-10T14:07:41.747 INFO:tasks.workunit.client.1.vm04.stdout:0/350: creat d0/d2/d15/d49/d50/d61/f6f x:0 0 0 2026-03-10T14:07:41.748 INFO:tasks.workunit.client.1.vm04.stdout:0/351: dread - d0/d2/d15/f2f zero size 2026-03-10T14:07:41.750 INFO:tasks.workunit.client.1.vm04.stdout:1/402: dwrite d3/d22/f2b [0,4194304] 0 2026-03-10T14:07:41.755 INFO:tasks.workunit.client.1.vm04.stdout:1/403: dread d3/f8 [0,4194304] 0 2026-03-10T14:07:41.760 INFO:tasks.workunit.client.1.vm04.stdout:7/438: symlink d2/d2a/d42/d56/la3 0 2026-03-10T14:07:41.768 INFO:tasks.workunit.client.1.vm04.stdout:5/495: dread d7/f1d [0,4194304] 0 2026-03-10T14:07:41.775 
INFO:tasks.workunit.client.1.vm04.stdout:1/404: symlink d3/d8f/l90 0 2026-03-10T14:07:41.782 INFO:tasks.workunit.client.1.vm04.stdout:8/446: write d0/d3/d63/d12/f50 [7036281,57965] 0 2026-03-10T14:07:41.785 INFO:tasks.workunit.client.1.vm04.stdout:2/460: dwrite d0/d3/f1d [4194304,4194304] 0 2026-03-10T14:07:41.790 INFO:tasks.workunit.client.1.vm04.stdout:3/471: dwrite da/dc/f90 [0,4194304] 0 2026-03-10T14:07:41.805 INFO:tasks.workunit.client.1.vm04.stdout:5/496: creat d7/d2d/d32/f9d x:0 0 0 2026-03-10T14:07:41.805 INFO:tasks.workunit.client.1.vm04.stdout:1/405: mknod d3/d20/d60/d82/d13/d38/d58/d5b/d7b/d84/c91 0 2026-03-10T14:07:41.807 INFO:tasks.workunit.client.1.vm04.stdout:8/447: symlink d0/d3/d63/d12/d69/l88 0 2026-03-10T14:07:41.818 INFO:tasks.workunit.client.1.vm04.stdout:4/431: dread d4/d14/d1b/f28 [0,4194304] 0 2026-03-10T14:07:41.818 INFO:tasks.workunit.client.1.vm04.stdout:9/358: dwrite d9/d44/d4d/f66 [0,4194304] 0 2026-03-10T14:07:41.819 INFO:tasks.workunit.client.1.vm04.stdout:0/352: creat d0/f70 x:0 0 0 2026-03-10T14:07:41.819 INFO:tasks.workunit.client.1.vm04.stdout:3/472: creat da/dc/d35/d52/d6d/fab x:0 0 0 2026-03-10T14:07:41.819 INFO:tasks.workunit.client.1.vm04.stdout:9/359: dwrite d9/d5c/f6c [0,4194304] 0 2026-03-10T14:07:41.840 INFO:tasks.workunit.client.1.vm04.stdout:1/406: rmdir d3/d20/d60/d82/d13/d38/d58/d5b/d7b 39 2026-03-10T14:07:41.840 INFO:tasks.workunit.client.1.vm04.stdout:4/432: chown d4/df/d34/f8f 221415 1 2026-03-10T14:07:41.841 INFO:tasks.workunit.client.1.vm04.stdout:1/407: stat d3/d22/l37 0 2026-03-10T14:07:41.841 INFO:tasks.workunit.client.1.vm04.stdout:4/433: chown d4/d14/d1b/f20 25 1 2026-03-10T14:07:41.848 INFO:tasks.workunit.client.1.vm04.stdout:0/353: fdatasync d0/d2/d25/f33 0 2026-03-10T14:07:41.849 INFO:tasks.workunit.client.1.vm04.stdout:3/473: mkdir da/dc/d35/d52/da3/dac 0 2026-03-10T14:07:41.865 INFO:tasks.workunit.client.1.vm04.stdout:2/461: rename d0/d3/d8/d59 to d0/d3/d4a/d8c 0 2026-03-10T14:07:41.870 
INFO:tasks.workunit.client.1.vm04.stdout:9/360: symlink d9/d1d/l7a 0 2026-03-10T14:07:41.870 INFO:tasks.workunit.client.1.vm04.stdout:4/434: mknod d4/df/d31/c9f 0 2026-03-10T14:07:41.874 INFO:tasks.workunit.client.1.vm04.stdout:3/474: mknod da/dc/d3f/cad 0 2026-03-10T14:07:41.875 INFO:tasks.workunit.client.1.vm04.stdout:1/408: mkdir d3/d20/d60/d82/d13/d38/d58/d5b/d92 0 2026-03-10T14:07:41.882 INFO:tasks.workunit.client.1.vm04.stdout:9/361: mknod d9/d44/d59/c7b 0 2026-03-10T14:07:41.882 INFO:tasks.workunit.client.1.vm04.stdout:9/362: chown d9/da/dd 0 1 2026-03-10T14:07:41.891 INFO:tasks.workunit.client.1.vm04.stdout:2/462: link d0/d14/f79 d0/d14/d39/d47/d70/f8d 0 2026-03-10T14:07:41.899 INFO:tasks.workunit.client.1.vm04.stdout:1/409: creat d3/d20/d60/d82/d13/d38/d58/d5b/d92/f93 x:0 0 0 2026-03-10T14:07:41.900 INFO:tasks.workunit.client.1.vm04.stdout:2/463: unlink d0/d14/d39/f7f 0 2026-03-10T14:07:41.907 INFO:tasks.workunit.client.1.vm04.stdout:9/363: link d9/da/dd/f24 d9/d44/d59/f7c 0 2026-03-10T14:07:41.909 INFO:tasks.workunit.client.1.vm04.stdout:4/435: rename d4/c45 to d4/d14/ca0 0 2026-03-10T14:07:41.909 INFO:tasks.workunit.client.1.vm04.stdout:4/436: chown d4/f96 92527627 1 2026-03-10T14:07:41.915 INFO:tasks.workunit.client.1.vm04.stdout:6/365: write d3/de/d35/d3a/d43/d4c/f53 [746986,123384] 0 2026-03-10T14:07:41.931 INFO:tasks.workunit.client.1.vm04.stdout:2/464: fdatasync d0/d3/f9 0 2026-03-10T14:07:41.939 INFO:tasks.workunit.client.1.vm04.stdout:7/439: write d2/dc/de/d2d/d60/d7c/d44/f51 [1473715,1084] 0 2026-03-10T14:07:41.943 INFO:tasks.workunit.client.1.vm04.stdout:5/497: dwrite d7/d12/d2b/f72 [0,4194304] 0 2026-03-10T14:07:41.946 INFO:tasks.workunit.client.1.vm04.stdout:9/364: unlink d9/d33/l55 0 2026-03-10T14:07:41.955 INFO:tasks.workunit.client.1.vm04.stdout:5/498: dwrite d7/d26/d6b/d6e/f91 [0,4194304] 0 2026-03-10T14:07:41.955 INFO:tasks.workunit.client.1.vm04.stdout:4/437: rename d4/d14/d64/f88 to d4/df/fa1 0 2026-03-10T14:07:41.958 
INFO:tasks.workunit.client.1.vm04.stdout:6/366: truncate d3/ff 2578890 0 2026-03-10T14:07:41.962 INFO:tasks.workunit.client.1.vm04.stdout:7/440: symlink d2/dc/de/d2d/d60/d7c/d44/la4 0 2026-03-10T14:07:41.975 INFO:tasks.workunit.client.1.vm04.stdout:5/499: mkdir d7/d12/d2b/d93/d9e 0 2026-03-10T14:07:41.975 INFO:tasks.workunit.client.1.vm04.stdout:9/365: stat d9/da/dd/l1a 0 2026-03-10T14:07:41.975 INFO:tasks.workunit.client.1.vm04.stdout:5/500: readlink d7/d26/l39 0 2026-03-10T14:07:41.976 INFO:tasks.workunit.client.1.vm04.stdout:5/501: write d7/d9/fd [7451046,22536] 0 2026-03-10T14:07:41.979 INFO:tasks.workunit.client.1.vm04.stdout:5/502: readlink d7/d2d/l2f 0 2026-03-10T14:07:41.983 INFO:tasks.workunit.client.1.vm04.stdout:7/441: creat d2/dc/de/d2d/d38/fa5 x:0 0 0 2026-03-10T14:07:41.984 INFO:tasks.workunit.client.1.vm04.stdout:1/410: getdents d3/d22/d6d 0 2026-03-10T14:07:41.992 INFO:tasks.workunit.client.1.vm04.stdout:9/366: chown d9/d44/d59/f7c 0 1 2026-03-10T14:07:41.992 INFO:tasks.workunit.client.1.vm04.stdout:1/411: symlink d3/d22/d63/d35/l94 0 2026-03-10T14:07:41.992 INFO:tasks.workunit.client.1.vm04.stdout:7/442: rename d2/dc/f89 to d2/dc/de/d2d/d60/d81/fa6 0 2026-03-10T14:07:41.993 INFO:tasks.workunit.client.1.vm04.stdout:1/412: chown d3/d22/l49 115303 1 2026-03-10T14:07:41.993 INFO:tasks.workunit.client.1.vm04.stdout:1/413: stat d3/d22/l6e 0 2026-03-10T14:07:41.997 INFO:tasks.workunit.client.1.vm04.stdout:9/367: fsync d9/d1d/f1f 0 2026-03-10T14:07:42.005 INFO:tasks.workunit.client.1.vm04.stdout:9/368: mkdir d9/d44/d4d/d7d 0 2026-03-10T14:07:42.008 INFO:tasks.workunit.client.1.vm04.stdout:1/414: mknod d3/d20/d60/d82/d13/d38/d58/d5b/d92/c95 0 2026-03-10T14:07:42.013 INFO:tasks.workunit.client.1.vm04.stdout:9/369: link d9/d44/d4d/c68 d9/da/dd/c7e 0 2026-03-10T14:07:42.014 INFO:tasks.workunit.client.1.vm04.stdout:9/370: chown d9/d1d/c64 2011 1 2026-03-10T14:07:42.035 INFO:tasks.workunit.client.1.vm04.stdout:9/371: dwrite d9/ff [0,4194304] 0 
2026-03-10T14:07:42.035 INFO:tasks.workunit.client.1.vm04.stdout:0/354: write d0/d2/d15/d22/d38/d56/f5e [1410592,4997] 0 2026-03-10T14:07:42.046 INFO:tasks.workunit.client.1.vm04.stdout:3/475: dwrite da/dc/fa4 [0,4194304] 0 2026-03-10T14:07:42.057 INFO:tasks.workunit.client.1.vm04.stdout:3/476: getdents da/dc/d47/d9b 0 2026-03-10T14:07:42.062 INFO:tasks.workunit.client.1.vm04.stdout:3/477: dread da/dc/fa4 [0,4194304] 0 2026-03-10T14:07:42.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:41 vm04.local ceph-mon[55966]: pgmap v149: 65 pgs: 65 active+clean; 959 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 22 MiB/s rd, 72 MiB/s wr, 398 op/s 2026-03-10T14:07:42.103 INFO:tasks.workunit.client.1.vm04.stdout:4/438: symlink d4/df/d22/d47/d70/la2 0 2026-03-10T14:07:42.105 INFO:tasks.workunit.client.1.vm04.stdout:0/355: dread d0/d2/d25/f45 [0,4194304] 0 2026-03-10T14:07:42.106 INFO:tasks.workunit.client.1.vm04.stdout:4/439: write d4/df/d22/d47/d70/d74/f86 [939188,14130] 0 2026-03-10T14:07:42.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:41 vm03.local ceph-mon[49718]: pgmap v149: 65 pgs: 65 active+clean; 959 MiB data, 3.9 GiB used, 116 GiB / 120 GiB avail; 22 MiB/s rd, 72 MiB/s wr, 398 op/s 2026-03-10T14:07:42.110 INFO:tasks.workunit.client.1.vm04.stdout:0/356: creat d0/d2/d15/d22/d38/f71 x:0 0 0 2026-03-10T14:07:42.112 INFO:tasks.workunit.client.1.vm04.stdout:4/440: symlink d4/d14/d3c/la3 0 2026-03-10T14:07:42.114 INFO:tasks.workunit.client.1.vm04.stdout:0/357: fsync d0/d2/d15/d22/d38/d56/d66/f54 0 2026-03-10T14:07:42.126 INFO:tasks.workunit.client.1.vm04.stdout:0/358: unlink d0/d2/d15/d22/d38/d56/d66/l68 0 2026-03-10T14:07:42.126 INFO:tasks.workunit.client.1.vm04.stdout:2/465: dwrite d0/d3/f24 [4194304,4194304] 0 2026-03-10T14:07:42.136 INFO:tasks.workunit.client.1.vm04.stdout:0/359: dread d0/d2/d15/d22/d38/d56/d66/f51 [0,4194304] 0 2026-03-10T14:07:42.138 INFO:tasks.workunit.client.1.vm04.stdout:0/360: rename d0/f1b to d0/f72 0 
2026-03-10T14:07:42.138 INFO:tasks.workunit.client.1.vm04.stdout:0/361: chown d0/d2/d25/f45 1005187469 1 2026-03-10T14:07:42.140 INFO:tasks.workunit.client.1.vm04.stdout:0/362: fdatasync d0/d2/f9 0 2026-03-10T14:07:42.141 INFO:tasks.workunit.client.1.vm04.stdout:0/363: mknod d0/d2/d15/c73 0 2026-03-10T14:07:42.225 INFO:tasks.workunit.client.1.vm04.stdout:3/478: sync 2026-03-10T14:07:42.225 INFO:tasks.workunit.client.1.vm04.stdout:4/441: sync 2026-03-10T14:07:42.226 INFO:tasks.workunit.client.1.vm04.stdout:0/364: sync 2026-03-10T14:07:42.228 INFO:tasks.workunit.client.1.vm04.stdout:3/479: read - da/dc/d3f/f7e zero size 2026-03-10T14:07:42.235 INFO:tasks.workunit.client.1.vm04.stdout:0/365: sync 2026-03-10T14:07:42.236 INFO:tasks.workunit.client.1.vm04.stdout:0/366: read - d0/d2/d15/f59 zero size 2026-03-10T14:07:42.241 INFO:tasks.workunit.client.1.vm04.stdout:3/480: creat da/dc/d35/d52/da3/dac/fae x:0 0 0 2026-03-10T14:07:42.256 INFO:tasks.workunit.client.1.vm04.stdout:3/481: write da/dc/d35/d52/d6d/f75 [1758572,1115] 0 2026-03-10T14:07:42.257 INFO:tasks.workunit.client.1.vm04.stdout:3/482: chown da/dc/d3f/d54/d66/f80 38449408 1 2026-03-10T14:07:42.276 INFO:tasks.workunit.client.1.vm04.stdout:8/448: dread d0/d3/d5/f30 [0,4194304] 0 2026-03-10T14:07:42.278 INFO:tasks.workunit.client.1.vm04.stdout:6/367: dwrite d3/d1d/f41 [0,4194304] 0 2026-03-10T14:07:42.285 INFO:tasks.workunit.client.1.vm04.stdout:8/449: dwrite d0/d3/d5/f66 [0,4194304] 0 2026-03-10T14:07:42.287 INFO:tasks.workunit.client.1.vm04.stdout:8/450: readlink d0/d3/d63/d29/l5a 0 2026-03-10T14:07:42.302 INFO:tasks.workunit.client.1.vm04.stdout:3/483: getdents da/d30 0 2026-03-10T14:07:42.318 INFO:tasks.workunit.client.1.vm04.stdout:5/503: truncate d7/d2d/d69/f95 128435 0 2026-03-10T14:07:42.323 INFO:tasks.workunit.client.1.vm04.stdout:5/504: mkdir d7/d12/d2b/d3e/d57/d9f 0 2026-03-10T14:07:42.324 INFO:tasks.workunit.client.1.vm04.stdout:7/443: write d2/dc/de/d2d/d60/d7c/d3b/f5e [373218,73111] 0 
2026-03-10T14:07:42.335 INFO:tasks.workunit.client.1.vm04.stdout:5/505: creat d7/d2d/d69/fa0 x:0 0 0 2026-03-10T14:07:42.336 INFO:tasks.workunit.client.1.vm04.stdout:7/444: symlink d2/dc/de/d2d/d60/d7c/la7 0 2026-03-10T14:07:42.342 INFO:tasks.workunit.client.1.vm04.stdout:1/415: dwrite d3/d22/d2f/f34 [0,4194304] 0 2026-03-10T14:07:42.342 INFO:tasks.workunit.client.1.vm04.stdout:3/484: sync 2026-03-10T14:07:42.346 INFO:tasks.workunit.client.1.vm04.stdout:1/416: dread d3/d20/f32 [0,4194304] 0 2026-03-10T14:07:42.352 INFO:tasks.workunit.client.1.vm04.stdout:7/445: rename d2/dc/de/d11/l9b to d2/dc/d4d/la8 0 2026-03-10T14:07:42.356 INFO:tasks.workunit.client.1.vm04.stdout:9/372: dwrite d9/da/dd/f48 [0,4194304] 0 2026-03-10T14:07:42.370 INFO:tasks.workunit.client.1.vm04.stdout:1/417: symlink d3/d5c/l96 0 2026-03-10T14:07:42.370 INFO:tasks.workunit.client.1.vm04.stdout:1/418: chown d3/d20/d60/d82/d13/d1a 183 1 2026-03-10T14:07:42.384 INFO:tasks.workunit.client.1.vm04.stdout:9/373: rename d9/d1d/l7a to d9/d44/d4d/d7d/l7f 0 2026-03-10T14:07:42.393 INFO:tasks.workunit.client.1.vm04.stdout:1/419: truncate d3/d20/f32 1513200 0 2026-03-10T14:07:42.393 INFO:tasks.workunit.client.1.vm04.stdout:1/420: dread - d3/d22/d63/f7f zero size 2026-03-10T14:07:42.404 INFO:tasks.workunit.client.1.vm04.stdout:9/374: symlink d9/da/d5d/l80 0 2026-03-10T14:07:42.412 INFO:tasks.workunit.client.1.vm04.stdout:5/506: getdents d7/d2d 0 2026-03-10T14:07:42.413 INFO:tasks.workunit.client.1.vm04.stdout:5/507: chown d7/d12/d2b/d3e/l96 81746 1 2026-03-10T14:07:42.415 INFO:tasks.workunit.client.1.vm04.stdout:2/466: dwrite d0/d3/d4a/d8c/f67 [0,4194304] 0 2026-03-10T14:07:42.421 INFO:tasks.workunit.client.1.vm04.stdout:9/375: truncate d9/d58/f62 599902 0 2026-03-10T14:07:42.422 INFO:tasks.workunit.client.1.vm04.stdout:9/376: truncate d9/da/dd/f48 4405591 0 2026-03-10T14:07:42.422 INFO:tasks.workunit.client.1.vm04.stdout:9/377: readlink d9/d5c/l60 0 2026-03-10T14:07:42.437 
INFO:tasks.workunit.client.1.vm04.stdout:5/508: creat d7/d12/d2b/d93/fa1 x:0 0 0 2026-03-10T14:07:42.438 INFO:tasks.workunit.client.1.vm04.stdout:5/509: chown d7/d2d/l47 25521 1 2026-03-10T14:07:42.443 INFO:tasks.workunit.client.1.vm04.stdout:9/378: mkdir d9/da/d5d/d81 0 2026-03-10T14:07:42.450 INFO:tasks.workunit.client.1.vm04.stdout:5/510: creat d7/d12/d2b/d3e/d57/d77/fa2 x:0 0 0 2026-03-10T14:07:42.451 INFO:tasks.workunit.client.1.vm04.stdout:9/379: fsync d9/d5c/f77 0 2026-03-10T14:07:42.455 INFO:tasks.workunit.client.1.vm04.stdout:9/380: dwrite d9/ff [0,4194304] 0 2026-03-10T14:07:42.460 INFO:tasks.workunit.client.1.vm04.stdout:4/442: dwrite d4/f21 [0,4194304] 0 2026-03-10T14:07:42.471 INFO:tasks.workunit.client.1.vm04.stdout:9/381: creat d9/d44/d4d/d7d/f82 x:0 0 0 2026-03-10T14:07:42.480 INFO:tasks.workunit.client.1.vm04.stdout:9/382: symlink d9/d44/d70/l83 0 2026-03-10T14:07:42.484 INFO:tasks.workunit.client.1.vm04.stdout:6/368: dwrite d3/de/d35/d3f/d2d/d32/d23/f31 [4194304,4194304] 0 2026-03-10T14:07:42.498 INFO:tasks.workunit.client.1.vm04.stdout:5/511: dread d7/d9/f28 [0,4194304] 0 2026-03-10T14:07:42.498 INFO:tasks.workunit.client.1.vm04.stdout:9/383: mknod d9/da/dd/d74/c84 0 2026-03-10T14:07:42.499 INFO:tasks.workunit.client.1.vm04.stdout:5/512: stat d7/d12/d2b/d3e/d57/l7b 0 2026-03-10T14:07:42.505 INFO:tasks.workunit.client.1.vm04.stdout:5/513: write d7/d12/d2b/d3e/d57/d8a/f94 [492850,4046] 0 2026-03-10T14:07:42.509 INFO:tasks.workunit.client.1.vm04.stdout:4/443: truncate d4/d14/d3c/d5e/f92 595477 0 2026-03-10T14:07:42.515 INFO:tasks.workunit.client.1.vm04.stdout:9/384: creat d9/da/dd/f85 x:0 0 0 2026-03-10T14:07:42.515 INFO:tasks.workunit.client.1.vm04.stdout:5/514: truncate d7/d2d/f64 4534728 0 2026-03-10T14:07:42.523 INFO:tasks.workunit.client.1.vm04.stdout:5/515: creat d7/d26/d6b/d6e/fa3 x:0 0 0 2026-03-10T14:07:42.527 INFO:tasks.workunit.client.1.vm04.stdout:8/451: dwrite d0/f26 [0,4194304] 0 2026-03-10T14:07:42.539 
INFO:tasks.workunit.client.1.vm04.stdout:8/452: dread d0/d3/d63/d29/f45 [0,4194304] 0 2026-03-10T14:07:42.545 INFO:tasks.workunit.client.1.vm04.stdout:4/444: getdents d4/df/d31 0 2026-03-10T14:07:42.553 INFO:tasks.workunit.client.1.vm04.stdout:8/453: mkdir d0/d3/dd/d89 0 2026-03-10T14:07:42.560 INFO:tasks.workunit.client.1.vm04.stdout:8/454: mkdir d0/d75/d8a 0 2026-03-10T14:07:42.569 INFO:tasks.workunit.client.1.vm04.stdout:8/455: mkdir d0/d3/d63/d12/d51/d8b 0 2026-03-10T14:07:42.578 INFO:tasks.workunit.client.1.vm04.stdout:7/446: dread f0 [0,4194304] 0 2026-03-10T14:07:42.585 INFO:tasks.workunit.client.1.vm04.stdout:7/447: dread - d2/dc/de/d2d/d60/d7c/d44/f66 zero size 2026-03-10T14:07:42.593 INFO:tasks.workunit.client.1.vm04.stdout:7/448: mkdir d2/dc/de/d2d/d5c/da9 0 2026-03-10T14:07:42.598 INFO:tasks.workunit.client.1.vm04.stdout:7/449: mkdir d2/dc/de/d2d/d60/d7c/d36/daa 0 2026-03-10T14:07:42.605 INFO:tasks.workunit.client.1.vm04.stdout:7/450: dread d2/dc/de/d2d/d38/f37 [0,4194304] 0 2026-03-10T14:07:42.615 INFO:tasks.workunit.client.1.vm04.stdout:7/451: read - d2/dc/de/d2d/d60/d7c/d3b/f48 zero size 2026-03-10T14:07:42.644 INFO:tasks.workunit.client.1.vm04.stdout:3/485: dread da/dc/d35/d52/f6f [0,4194304] 0 2026-03-10T14:07:42.651 INFO:tasks.workunit.client.1.vm04.stdout:1/421: dwrite d3/d5c/f71 [4194304,4194304] 0 2026-03-10T14:07:42.658 INFO:tasks.workunit.client.1.vm04.stdout:0/367: dwrite d0/d2/f17 [0,4194304] 0 2026-03-10T14:07:42.666 INFO:tasks.workunit.client.1.vm04.stdout:3/486: symlink da/dc/d35/d52/laf 0 2026-03-10T14:07:42.666 INFO:tasks.workunit.client.1.vm04.stdout:1/422: readlink d3/l9 0 2026-03-10T14:07:42.670 INFO:tasks.workunit.client.1.vm04.stdout:1/423: rename f2 to d3/d22/d63/f97 0 2026-03-10T14:07:42.675 INFO:tasks.workunit.client.1.vm04.stdout:1/424: truncate d3/d22/d63/f89 87687 0 2026-03-10T14:07:42.676 INFO:tasks.workunit.client.1.vm04.stdout:1/425: write d3/d22/d63/f83 [99060,117427] 0 2026-03-10T14:07:42.677 
INFO:tasks.workunit.client.1.vm04.stdout:1/426: fdatasync d3/d22/d2f/f3c 0 2026-03-10T14:07:42.682 INFO:tasks.workunit.client.1.vm04.stdout:1/427: dwrite d3/d20/d60/d82/d13/d38/d58/d5b/d92/f93 [0,4194304] 0 2026-03-10T14:07:42.686 INFO:tasks.workunit.client.1.vm04.stdout:1/428: mkdir d3/d5c/d79/d98 0 2026-03-10T14:07:42.688 INFO:tasks.workunit.client.1.vm04.stdout:1/429: truncate d3/d22/d63/f83 924684 0 2026-03-10T14:07:42.703 INFO:tasks.workunit.client.1.vm04.stdout:2/467: dread d0/d3/d4a/d66/f72 [0,4194304] 0 2026-03-10T14:07:42.705 INFO:tasks.workunit.client.1.vm04.stdout:2/468: mkdir d0/d3/d8/d17/d4e/d8e 0 2026-03-10T14:07:42.707 INFO:tasks.workunit.client.1.vm04.stdout:2/469: unlink d0/d3/d4a/d8c/f67 0 2026-03-10T14:07:42.709 INFO:tasks.workunit.client.1.vm04.stdout:2/470: symlink d0/d14/d54/l8f 0 2026-03-10T14:07:42.717 INFO:tasks.workunit.client.1.vm04.stdout:0/368: dread d0/d2/f2d [0,4194304] 0 2026-03-10T14:07:42.720 INFO:tasks.workunit.client.1.vm04.stdout:0/369: dread d0/d2/d15/d22/d38/d56/d66/f2c [0,4194304] 0 2026-03-10T14:07:42.726 INFO:tasks.workunit.client.1.vm04.stdout:0/370: stat d0/d2/d15/d22/d38/l60 0 2026-03-10T14:07:42.729 INFO:tasks.workunit.client.1.vm04.stdout:0/371: link d0/l43 d0/d2/d15/l74 0 2026-03-10T14:07:42.739 INFO:tasks.workunit.client.1.vm04.stdout:6/369: truncate d3/de/d35/d3a/d43/d56/f2b 2440217 0 2026-03-10T14:07:42.743 INFO:tasks.workunit.client.1.vm04.stdout:6/370: truncate d3/de/d35/d3f/d2d/d32/d23/f33 4419774 0 2026-03-10T14:07:42.744 INFO:tasks.workunit.client.1.vm04.stdout:6/371: rmdir d3/de/d35/d3a/d43/d4c 39 2026-03-10T14:07:42.750 INFO:tasks.workunit.client.1.vm04.stdout:6/372: dwrite d3/de/d35/d3f/f17 [4194304,4194304] 0 2026-03-10T14:07:42.751 INFO:tasks.workunit.client.1.vm04.stdout:9/385: dwrite d9/da/f6b [0,4194304] 0 2026-03-10T14:07:42.756 INFO:tasks.workunit.client.1.vm04.stdout:7/452: sync 2026-03-10T14:07:42.758 INFO:tasks.workunit.client.1.vm04.stdout:1/430: sync 2026-03-10T14:07:42.764 
INFO:tasks.workunit.client.1.vm04.stdout:7/453: dwrite d2/dc/de/d2d/d60/d7c/d3b/f49 [0,4194304] 0 2026-03-10T14:07:42.766 INFO:tasks.workunit.client.1.vm04.stdout:6/373: unlink d3/de/l63 0 2026-03-10T14:07:42.767 INFO:tasks.workunit.client.1.vm04.stdout:1/431: dwrite d3/d22/f2b [4194304,4194304] 0 2026-03-10T14:07:42.767 INFO:tasks.workunit.client.1.vm04.stdout:9/386: creat d9/d44/d70/f86 x:0 0 0 2026-03-10T14:07:42.770 INFO:tasks.workunit.client.1.vm04.stdout:5/516: dwrite d7/f21 [0,4194304] 0 2026-03-10T14:07:42.771 INFO:tasks.workunit.client.1.vm04.stdout:7/454: truncate d2/d2a/f93 157282 0 2026-03-10T14:07:42.772 INFO:tasks.workunit.client.1.vm04.stdout:7/455: dread - d2/dc/de/d2d/d38/fa5 zero size 2026-03-10T14:07:42.776 INFO:tasks.workunit.client.1.vm04.stdout:9/387: rename d9/da/dd/d1c/c43 to d9/d44/d59/c87 0 2026-03-10T14:07:42.797 INFO:tasks.workunit.client.1.vm04.stdout:9/388: unlink d9/da/dd/d1c/f30 0 2026-03-10T14:07:42.798 INFO:tasks.workunit.client.1.vm04.stdout:7/456: getdents d2/d2a/d42/d86 0 2026-03-10T14:07:42.798 INFO:tasks.workunit.client.1.vm04.stdout:7/457: stat f0 0 2026-03-10T14:07:42.800 INFO:tasks.workunit.client.1.vm04.stdout:1/432: rename d3/d22/d63/d35/l75 to d3/d20/d60/d82/d13/d38/l99 0 2026-03-10T14:07:42.808 INFO:tasks.workunit.client.1.vm04.stdout:9/389: mknod d9/da/dd/c88 0 2026-03-10T14:07:42.810 INFO:tasks.workunit.client.1.vm04.stdout:7/458: symlink d2/dc/de/d2d/d60/d7c/d36/daa/lab 0 2026-03-10T14:07:42.810 INFO:tasks.workunit.client.1.vm04.stdout:1/433: rename d3/d22/d6d/f7d to d3/d22/d63/d35/f9a 0 2026-03-10T14:07:42.816 INFO:tasks.workunit.client.1.vm04.stdout:9/390: rmdir d9/da 39 2026-03-10T14:07:42.817 INFO:tasks.workunit.client.1.vm04.stdout:7/459: dread - d2/dc/de/d2d/d5c/f63 zero size 2026-03-10T14:07:42.819 INFO:tasks.workunit.client.1.vm04.stdout:9/391: truncate f8 5382045 0 2026-03-10T14:07:42.820 INFO:tasks.workunit.client.1.vm04.stdout:9/392: readlink d9/d33/l52 0 2026-03-10T14:07:42.821 
INFO:tasks.workunit.client.1.vm04.stdout:7/460: mkdir d2/dac 0 2026-03-10T14:07:42.825 INFO:tasks.workunit.client.1.vm04.stdout:1/434: rename d3/d20/c54 to d3/d20/d60/d82/c9b 0 2026-03-10T14:07:42.833 INFO:tasks.workunit.client.1.vm04.stdout:1/435: dwrite d3/d22/d2f/f34 [0,4194304] 0 2026-03-10T14:07:42.837 INFO:tasks.workunit.client.1.vm04.stdout:9/393: mknod d9/da/c89 0 2026-03-10T14:07:42.837 INFO:tasks.workunit.client.1.vm04.stdout:1/436: dread d3/fc [0,4194304] 0 2026-03-10T14:07:42.841 INFO:tasks.workunit.client.1.vm04.stdout:9/394: creat d9/da/dd/f8a x:0 0 0 2026-03-10T14:07:42.849 INFO:tasks.workunit.client.1.vm04.stdout:5/517: truncate d7/d12/d2b/d3e/d3f/f52 1338017 0 2026-03-10T14:07:42.854 INFO:tasks.workunit.client.1.vm04.stdout:8/456: dwrite d0/d3/d63/d12/f47 [0,4194304] 0 2026-03-10T14:07:42.854 INFO:tasks.workunit.client.1.vm04.stdout:4/445: dwrite d4/df/d34/f1f [0,4194304] 0 2026-03-10T14:07:42.865 INFO:tasks.workunit.client.1.vm04.stdout:3/487: truncate da/dc/f1d 6976879 0 2026-03-10T14:07:42.874 INFO:tasks.workunit.client.1.vm04.stdout:1/437: truncate d3/d20/f88 547672 0 2026-03-10T14:07:42.874 INFO:tasks.workunit.client.1.vm04.stdout:9/395: truncate d9/da/dd/d1c/f65 1570367 0 2026-03-10T14:07:42.875 INFO:tasks.workunit.client.1.vm04.stdout:2/471: dwrite d0/d14/d39/d47/d70/f8d [0,4194304] 0 2026-03-10T14:07:42.875 INFO:tasks.workunit.client.1.vm04.stdout:8/457: fdatasync d0/d3/d63/f5b 0 2026-03-10T14:07:42.876 INFO:tasks.workunit.client.1.vm04.stdout:8/458: readlink d0/d3/d63/d12/d69/l88 0 2026-03-10T14:07:42.876 INFO:tasks.workunit.client.1.vm04.stdout:2/472: readlink d0/d3/d8/l12 0 2026-03-10T14:07:42.877 INFO:tasks.workunit.client.1.vm04.stdout:2/473: stat d0/d3/d8/d17 0 2026-03-10T14:07:42.890 INFO:tasks.workunit.client.1.vm04.stdout:1/438: unlink d3/d5c/l96 0 2026-03-10T14:07:42.890 INFO:tasks.workunit.client.1.vm04.stdout:8/459: creat d0/d3/d63/d12/d69/f8c x:0 0 0 2026-03-10T14:07:42.898 INFO:tasks.workunit.client.1.vm04.stdout:2/474: rename 
d0/d3/d8/dd/d26/f2a to d0/d3/d8/d17/d4e/d85/f90 0 2026-03-10T14:07:42.901 INFO:tasks.workunit.client.1.vm04.stdout:5/518: getdents d7/d12/d2b/d3e/d57/d8a 0 2026-03-10T14:07:42.901 INFO:tasks.workunit.client.1.vm04.stdout:3/488: link da/d30/ca1 da/dc/d35/cb0 0 2026-03-10T14:07:42.901 INFO:tasks.workunit.client.1.vm04.stdout:8/460: creat d0/d3/dd/d78/f8d x:0 0 0 2026-03-10T14:07:42.908 INFO:tasks.workunit.client.1.vm04.stdout:5/519: dwrite d7/d12/f42 [0,4194304] 0 2026-03-10T14:07:42.913 INFO:tasks.workunit.client.1.vm04.stdout:1/439: rename d3/d22/c41 to d3/d20/d60/c9c 0 2026-03-10T14:07:42.917 INFO:tasks.workunit.client.1.vm04.stdout:8/461: read d0/d3/dd/fc [422053,102608] 0 2026-03-10T14:07:42.927 INFO:tasks.workunit.client.1.vm04.stdout:3/489: truncate da/d3e/f63 1987630 0 2026-03-10T14:07:42.928 INFO:tasks.workunit.client.1.vm04.stdout:8/462: creat d0/d3/d63/d12/d51/f8e x:0 0 0 2026-03-10T14:07:42.988 INFO:tasks.workunit.client.1.vm04.stdout:0/372: dwrite d0/d2/d15/f44 [0,4194304] 0 2026-03-10T14:07:42.991 INFO:tasks.workunit.client.1.vm04.stdout:0/373: unlink d0/d2/d15/d22/d38/f3e 0 2026-03-10T14:07:42.992 INFO:tasks.workunit.client.1.vm04.stdout:0/374: fsync d0/d2/d15/d22/f30 0 2026-03-10T14:07:42.998 INFO:tasks.workunit.client.1.vm04.stdout:0/375: mkdir d0/d2/d15/d49/d50/d61/d75 0 2026-03-10T14:07:42.999 INFO:tasks.workunit.client.1.vm04.stdout:0/376: truncate d0/d2/d15/f59 763298 0 2026-03-10T14:07:43.003 INFO:tasks.workunit.client.1.vm04.stdout:0/377: dread d0/d2/d25/f45 [0,4194304] 0 2026-03-10T14:07:43.011 INFO:tasks.workunit.client.0.vm03.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-10T14:07:43.027 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T14:07:43.027 INFO:tasks.workunit.client.0.vm03.stderr:+ make 2026-03-10T14:07:43.109 
INFO:tasks.workunit.client.1.vm04.stdout:9/396: read f8 [771953,119884] 0 2026-03-10T14:07:43.112 INFO:tasks.workunit.client.1.vm04.stdout:9/397: creat d9/da/d5d/f8b x:0 0 0 2026-03-10T14:07:43.112 INFO:tasks.workunit.client.1.vm04.stdout:9/398: write d9/d44/d70/f71 [1006512,68847] 0 2026-03-10T14:07:43.132 INFO:tasks.workunit.client.1.vm04.stdout:7/461: write d2/dc/de/f1e [1878604,75525] 0 2026-03-10T14:07:43.139 INFO:tasks.workunit.client.1.vm04.stdout:7/462: truncate d2/dc/f26 1870406 0 2026-03-10T14:07:43.140 INFO:tasks.workunit.client.1.vm04.stdout:4/446: dwrite d4/d14/f27 [0,4194304] 0 2026-03-10T14:07:43.167 INFO:tasks.workunit.client.1.vm04.stdout:2/475: rename d0/d3 to d0/d14/d91 0 2026-03-10T14:07:43.169 INFO:tasks.workunit.client.1.vm04.stdout:5/520: rename d7/d2d/d32/f5b to d7/d12/d2b/d93/fa4 0 2026-03-10T14:07:43.171 INFO:tasks.workunit.client.1.vm04.stdout:2/476: truncate d0/d14/d91/f9 3749630 0 2026-03-10T14:07:43.175 INFO:tasks.workunit.client.1.vm04.stdout:2/477: chown d0/d14/d91/d8/dd/d26/d46/d4b/d56/l60 104666387 1 2026-03-10T14:07:43.181 INFO:tasks.workunit.client.1.vm04.stdout:5/521: unlink d7/d2d/d32/c8d 0 2026-03-10T14:07:43.183 INFO:tasks.workunit.client.1.vm04.stdout:5/522: dread - d7/d2d/d76/f8f zero size 2026-03-10T14:07:43.184 INFO:tasks.workunit.client.1.vm04.stdout:5/523: write d7/d12/d2b/f46 [689374,6762] 0 2026-03-10T14:07:43.186 INFO:tasks.workunit.client.1.vm04.stdout:3/490: dwrite da/dc/d35/d37/f5e [0,4194304] 0 2026-03-10T14:07:43.191 INFO:tasks.workunit.client.1.vm04.stdout:3/491: write da/dc/d3f/d54/d66/fa7 [289557,42083] 0 2026-03-10T14:07:43.196 INFO:tasks.workunit.client.1.vm04.stdout:1/440: rename d3/d20/c6a to d3/d20/d60/d82/d13/d38/d58/d5b/d92/c9d 0 2026-03-10T14:07:43.201 INFO:tasks.workunit.client.1.vm04.stdout:3/492: creat da/dc/d35/d52/d53/d78/fb1 x:0 0 0 2026-03-10T14:07:43.205 INFO:tasks.workunit.client.1.vm04.stdout:7/463: rename d2/dc/de/d2d/d60/d7c/f70 to d2/dc/de/d2d/d38/d50/fad 0 2026-03-10T14:07:43.212 
INFO:tasks.workunit.client.1.vm04.stdout:3/493: dread - da/d3e/f4c zero size 2026-03-10T14:07:43.216 INFO:tasks.workunit.client.1.vm04.stdout:2/478: dread d0/d14/d91/d3a/d3e/f38 [0,4194304] 0 2026-03-10T14:07:43.217 INFO:tasks.workunit.client.1.vm04.stdout:5/524: dread d7/f4b [0,4194304] 0 2026-03-10T14:07:43.223 INFO:tasks.workunit.client.1.vm04.stdout:3/494: symlink da/dc/d35/d52/d6d/lb2 0 2026-03-10T14:07:43.227 INFO:tasks.workunit.client.1.vm04.stdout:0/378: getdents d0/d2/d15/d49/d50/d61 0 2026-03-10T14:07:43.228 INFO:tasks.workunit.client.1.vm04.stdout:0/379: chown d0/d2/d15/d22/d38/d56/d66/l42 6418 1 2026-03-10T14:07:43.234 INFO:tasks.workunit.client.1.vm04.stdout:2/479: fdatasync d0/d14/d91/d3a/d3e/f38 0 2026-03-10T14:07:43.240 INFO:tasks.workunit.client.1.vm04.stdout:3/495: mknod da/dc/cb3 0 2026-03-10T14:07:43.245 INFO:tasks.workunit.client.1.vm04.stdout:5/525: dread d7/d12/d2b/d3e/f4a [0,4194304] 0 2026-03-10T14:07:43.248 INFO:tasks.workunit.client.1.vm04.stdout:5/526: dwrite d7/d26/d6b/d6e/fa3 [0,4194304] 0 2026-03-10T14:07:43.252 INFO:tasks.workunit.client.1.vm04.stdout:0/380: creat d0/d2/d15/d49/d50/d5c/f76 x:0 0 0 2026-03-10T14:07:43.255 INFO:tasks.workunit.client.1.vm04.stdout:9/399: write d9/da/dd/f19 [1301863,81779] 0 2026-03-10T14:07:43.257 INFO:tasks.workunit.client.1.vm04.stdout:5/527: dread d7/d12/d2b/d3e/d57/d8a/f94 [0,4194304] 0 2026-03-10T14:07:43.263 INFO:tasks.workunit.client.1.vm04.stdout:5/528: dwrite d7/d26/d6b/d6e/f91 [0,4194304] 0 2026-03-10T14:07:43.274 INFO:tasks.workunit.client.1.vm04.stdout:3/496: creat da/dc/d35/d52/da3/dac/fb4 x:0 0 0 2026-03-10T14:07:43.276 INFO:tasks.workunit.client.1.vm04.stdout:5/529: dwrite d7/d2d/d69/fa0 [0,4194304] 0 2026-03-10T14:07:43.276 INFO:tasks.workunit.client.1.vm04.stdout:5/530: write d7/f21 [750489,7252] 0 2026-03-10T14:07:43.301 INFO:tasks.workunit.client.1.vm04.stdout:9/400: mkdir d9/da/d8c 0 2026-03-10T14:07:43.302 INFO:tasks.workunit.client.1.vm04.stdout:9/401: write d9/da/dd/f85 
[520879,37104] 0 2026-03-10T14:07:43.318 INFO:tasks.workunit.client.1.vm04.stdout:3/497: mkdir da/d8e/db5 0 2026-03-10T14:07:43.339 INFO:tasks.workunit.client.1.vm04.stdout:9/402: mkdir d9/d58/d8d 0 2026-03-10T14:07:43.340 INFO:tasks.workunit.client.1.vm04.stdout:9/403: chown d9/d1d/f23 64205 1 2026-03-10T14:07:43.343 INFO:tasks.workunit.client.1.vm04.stdout:9/404: rmdir d9/d58/d8d 0 2026-03-10T14:07:43.357 INFO:tasks.workunit.client.1.vm04.stdout:6/374: dread d3/de/d35/d3f/d2d/d32/d23/f31 [0,4194304] 0 2026-03-10T14:07:43.363 INFO:tasks.workunit.client.1.vm04.stdout:9/405: dread d9/da/dd/f47 [0,4194304] 0 2026-03-10T14:07:43.371 INFO:tasks.workunit.client.1.vm04.stdout:9/406: mknod d9/d44/d59/c8e 0 2026-03-10T14:07:43.378 INFO:tasks.workunit.client.0.vm03.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-10T14:07:43.378 INFO:tasks.workunit.client.1.vm04.stdout:8/463: dread d0/f42 [0,4194304] 0 2026-03-10T14:07:43.382 INFO:tasks.workunit.client.1.vm04.stdout:8/464: dwrite d0/d3/dd/d76/f86 [0,4194304] 0 2026-03-10T14:07:43.389 INFO:tasks.workunit.client.1.vm04.stdout:4/447: write d4/d14/d1b/f6c [783568,61109] 0 2026-03-10T14:07:43.402 INFO:tasks.workunit.client.1.vm04.stdout:1/441: dwrite d3/d20/d60/d82/d13/d38/d58/d5b/f7c [0,4194304] 0 2026-03-10T14:07:43.409 INFO:tasks.workunit.client.1.vm04.stdout:7/464: fdatasync d2/dc/de/d2d/d38/d50/fad 0 2026-03-10T14:07:43.414 INFO:tasks.workunit.client.1.vm04.stdout:8/465: creat d0/d3/d63/f8f x:0 0 0 2026-03-10T14:07:43.437 INFO:tasks.workunit.client.1.vm04.stdout:1/442: link d3/d22/l76 d3/d22/d2f/l9e 0 2026-03-10T14:07:43.439 INFO:tasks.workunit.client.1.vm04.stdout:8/466: mknod d0/d3/d63/d12/d51/d8b/c90 0 2026-03-10T14:07:43.440 INFO:tasks.workunit.client.1.vm04.stdout:8/467: write d0/d75/f83 [268424,21824] 0 
2026-03-10T14:07:43.442 INFO:tasks.workunit.client.1.vm04.stdout:2/480: write d0/d14/d91/d8/dd/d26/f36 [1578626,12778] 0 2026-03-10T14:07:43.444 INFO:tasks.workunit.client.1.vm04.stdout:8/468: read d0/d3/f35 [7345774,58695] 0 2026-03-10T14:07:43.445 INFO:tasks.workunit.client.1.vm04.stdout:8/469: chown d0/d3/d63/l59 5630083 1 2026-03-10T14:07:43.447 INFO:tasks.workunit.client.1.vm04.stdout:2/481: mkdir d0/d14/d91/d4a/d8c/d92 0 2026-03-10T14:07:43.459 INFO:tasks.workunit.client.1.vm04.stdout:2/482: dwrite d0/d14/f79 [0,4194304] 0 2026-03-10T14:07:43.459 INFO:tasks.workunit.client.1.vm04.stdout:0/381: write d0/d2/d15/d22/d38/d56/d66/f2e [1607056,129284] 0 2026-03-10T14:07:43.459 INFO:tasks.workunit.client.1.vm04.stdout:0/382: dread d0/d2/d15/f57 [0,4194304] 0 2026-03-10T14:07:43.471 INFO:tasks.workunit.client.1.vm04.stdout:0/383: dread d0/d2/d15/d22/d38/d56/d66/f29 [0,4194304] 0 2026-03-10T14:07:43.472 INFO:tasks.workunit.client.1.vm04.stdout:2/483: mkdir d0/d14/d39/d47/d93 0 2026-03-10T14:07:43.474 INFO:tasks.workunit.client.1.vm04.stdout:0/384: rename d0/d2/d15 to d0/d2/d15/d49/d50/d77 22 2026-03-10T14:07:43.478 INFO:tasks.workunit.client.1.vm04.stdout:2/484: dread d0/d14/d91/f24 [4194304,4194304] 0 2026-03-10T14:07:43.484 INFO:tasks.workunit.client.1.vm04.stdout:6/375: truncate d3/de/f46 1530163 0 2026-03-10T14:07:43.487 INFO:tasks.workunit.client.1.vm04.stdout:5/531: dwrite d7/d2d/d76/f84 [0,4194304] 0 2026-03-10T14:07:43.489 INFO:tasks.workunit.client.1.vm04.stdout:0/385: unlink d0/f48 0 2026-03-10T14:07:43.490 INFO:tasks.workunit.client.1.vm04.stdout:0/386: readlink d0/d2/d15/d22/d38/d56/l6b 0 2026-03-10T14:07:43.504 INFO:tasks.workunit.client.1.vm04.stdout:9/407: write d9/da/dd/f24 [3236728,5440] 0 2026-03-10T14:07:43.504 INFO:tasks.workunit.client.1.vm04.stdout:5/532: mkdir d7/d12/d2b/d3e/d57/d77/da5 0 2026-03-10T14:07:43.519 INFO:tasks.workunit.client.1.vm04.stdout:6/376: read d3/de/d35/d3f/d2d/d32/d23/f33 [1094725,46282] 0 2026-03-10T14:07:43.520 
INFO:tasks.workunit.client.1.vm04.stdout:3/498: write da/dc/d35/d52/f79 [5014684,80894] 0 2026-03-10T14:07:43.525 INFO:tasks.workunit.client.1.vm04.stdout:4/448: write d4/d14/d3c/d5e/f79 [699972,98553] 0 2026-03-10T14:07:43.529 INFO:tasks.workunit.client.1.vm04.stdout:9/408: read d9/d1d/f3d [354078,61056] 0 2026-03-10T14:07:43.529 INFO:tasks.workunit.client.1.vm04.stdout:5/533: mkdir d7/d12/d2b/d3e/d3f/da6 0 2026-03-10T14:07:43.536 INFO:tasks.workunit.client.1.vm04.stdout:2/485: link d0/d14/d91/d8/f6 d0/d14/d91/d8/d17/d35/f94 0 2026-03-10T14:07:43.566 INFO:tasks.workunit.client.1.vm04.stdout:3/499: fdatasync da/dc/d3f/d54/f82 0 2026-03-10T14:07:43.573 INFO:tasks.workunit.client.1.vm04.stdout:6/377: rename d3/de/d35/d3f/f3b to d3/de/d35/d3f/d2d/d32/d23/d4e/f69 0 2026-03-10T14:07:43.573 INFO:tasks.workunit.client.1.vm04.stdout:1/443: write d3/d22/f3b [2962500,56628] 0 2026-03-10T14:07:43.573 INFO:tasks.workunit.client.1.vm04.stdout:9/409: dwrite d9/d44/d59/f7c [0,4194304] 0 2026-03-10T14:07:43.586 INFO:tasks.workunit.client.1.vm04.stdout:3/500: dwrite da/dc/d35/d52/da3/dac/fae [0,4194304] 0 2026-03-10T14:07:43.588 INFO:tasks.workunit.client.1.vm04.stdout:1/444: write d3/d22/d63/f83 [1578421,15779] 0 2026-03-10T14:07:43.588 INFO:tasks.workunit.client.1.vm04.stdout:6/378: write d3/de/d35/d3f/d2d/d38/d40/f5d [452244,106490] 0 2026-03-10T14:07:43.588 INFO:tasks.workunit.client.1.vm04.stdout:4/449: creat d4/df/d22/d47/d4f/d8c/fa4 x:0 0 0 2026-03-10T14:07:43.588 INFO:tasks.workunit.client.1.vm04.stdout:0/387: dread d0/d2/fa [0,4194304] 0 2026-03-10T14:07:43.594 INFO:tasks.workunit.client.1.vm04.stdout:6/379: chown d3/de/d35/d3f/l1e 9222 1 2026-03-10T14:07:43.603 INFO:tasks.workunit.client.1.vm04.stdout:4/450: rmdir d4/d14 39 2026-03-10T14:07:43.606 INFO:tasks.workunit.client.1.vm04.stdout:5/534: dread d7/f11 [0,4194304] 0 2026-03-10T14:07:43.608 INFO:tasks.workunit.client.1.vm04.stdout:3/501: dread da/dc/d47/f6b [0,4194304] 0 2026-03-10T14:07:43.617 
INFO:tasks.workunit.client.1.vm04.stdout:6/380: unlink d3/de/d35/d3f/d2d/f27 0 2026-03-10T14:07:43.620 INFO:tasks.workunit.client.1.vm04.stdout:4/451: stat d4/df/d22/d47/d70/l7e 0 2026-03-10T14:07:43.623 INFO:tasks.workunit.client.1.vm04.stdout:0/388: write d0/d2/f12 [4680511,50609] 0 2026-03-10T14:07:43.623 INFO:tasks.workunit.client.1.vm04.stdout:0/389: readlink d0/d2/d15/d22/d38/l60 0 2026-03-10T14:07:43.627 INFO:tasks.workunit.client.1.vm04.stdout:1/445: truncate d3/d20/d60/d82/d13/d1a/f62 1800076 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:5/535: symlink d7/d12/d2b/d3e/d3f/da6/la7 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:0/390: fdatasync d0/d2/d15/f2f 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:0/391: chown d0/d2/d25 967 1 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:0/392: symlink d0/d6e/l78 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:6/381: link d3/c7 d3/de/d35/d3f/d2d/d32/c6a 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:4/452: symlink d4/d14/d3c/d85/la5 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:1/446: getdents d3/d20 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:1/447: write d3/d22/d63/f2d [805405,129269] 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:0/393: creat d0/d2/d15/d22/d38/d56/f79 x:0 0 0 2026-03-10T14:07:43.650 INFO:tasks.workunit.client.1.vm04.stdout:0/394: creat d0/d2/d15/d22/d38/d56/d66/f7a x:0 0 0 2026-03-10T14:07:43.657 INFO:tasks.workunit.client.1.vm04.stdout:1/448: rmdir d3/d20/d60/d82/d13/d38/d58/d5b/d7b/d84 39 2026-03-10T14:07:43.659 INFO:tasks.workunit.client.1.vm04.stdout:1/449: truncate d3/fa 112984 0 2026-03-10T14:07:43.660 INFO:tasks.workunit.client.1.vm04.stdout:1/450: dread - d3/d22/f8e zero size 2026-03-10T14:07:43.660 INFO:tasks.workunit.client.1.vm04.stdout:4/453: dread d4/df/d31/f55 [0,4194304] 0 2026-03-10T14:07:43.663 
INFO:tasks.workunit.client.1.vm04.stdout:0/395: creat d0/d2/d15/d22/d62/f7b x:0 0 0
2026-03-10T14:07:43.666 INFO:tasks.workunit.client.1.vm04.stdout:4/454: rmdir d4/df/d34/d6f 39
2026-03-10T14:07:43.667 INFO:tasks.workunit.client.1.vm04.stdout:4/455: stat d4/d14/d64/c82 0
2026-03-10T14:07:43.669 INFO:tasks.workunit.client.1.vm04.stdout:0/396: rename d0/d2/d15/d49/d50/d61/f6f to d0/d2/d15/d49/f7c 0
2026-03-10T14:07:43.671 INFO:tasks.workunit.client.1.vm04.stdout:1/451: dread d3/d20/d60/d82/fd [0,4194304] 0
2026-03-10T14:07:43.672 INFO:tasks.workunit.client.1.vm04.stdout:1/452: chown d3/d20/d60/d82/d13/d1a/c64 8659 1
2026-03-10T14:07:43.674 INFO:tasks.workunit.client.1.vm04.stdout:0/397: creat d0/d2/d15/d22/d38/f7d x:0 0 0
2026-03-10T14:07:43.675 INFO:tasks.workunit.client.1.vm04.stdout:1/453: rmdir d3/d20/d60/d82/d13/d38/d58/d5b/d7b 39
2026-03-10T14:07:43.677 INFO:tasks.workunit.client.1.vm04.stdout:1/454: rmdir d3/d22/d63/d35/d6c 39
2026-03-10T14:07:43.704 INFO:tasks.workunit.client.0.vm03.stderr:++ readlink -f fsstress
2026-03-10T14:07:43.714 INFO:tasks.workunit.client.0.vm03.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress
2026-03-10T14:07:43.714 INFO:tasks.workunit.client.0.vm03.stderr:+ popd
2026-03-10T14:07:43.714 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp
2026-03-10T14:07:43.715 INFO:tasks.workunit.client.0.vm03.stderr:+ popd
2026-03-10T14:07:43.715 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp
2026-03-10T14:07:43.715 INFO:tasks.workunit.client.0.vm03.stderr:++ mktemp -d -p .
2026-03-10T14:07:43.715 INFO:tasks.workunit.client.0.vm03.stderr:+ T=./tmp.Ji2F8ZhDoI
2026-03-10T14:07:43.715 INFO:tasks.workunit.client.0.vm03.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.Ji2F8ZhDoI -l 1 -n 1000 -p 10 -v
2026-03-10T14:07:43.715 INFO:tasks.workunit.client.0.vm03.stdout:seed = 1773848718
2026-03-10T14:07:43.717 INFO:tasks.workunit.client.0.vm03.stdout:0/0: write - no filename
2026-03-10T14:07:43.717 INFO:tasks.workunit.client.0.vm03.stdout:0/1: truncate - no filename
2026-03-10T14:07:43.717 INFO:tasks.workunit.client.0.vm03.stdout:0/2: chown . 259 1
2026-03-10T14:07:43.717 INFO:tasks.workunit.client.0.vm03.stdout:0/3: link - no file
2026-03-10T14:07:43.717 INFO:tasks.workunit.client.0.vm03.stdout:0/4: fsync - no filename
2026-03-10T14:07:43.717 INFO:tasks.workunit.client.0.vm03.stdout:0/5: truncate - no filename
2026-03-10T14:07:43.717 INFO:tasks.workunit.client.0.vm03.stdout:0/6: chown . 1018957041 1
2026-03-10T14:07:43.717 INFO:tasks.workunit.client.0.vm03.stdout:0/7: dwrite - no filename
2026-03-10T14:07:43.718 INFO:tasks.workunit.client.0.vm03.stdout:1/0: chown . 8 1
2026-03-10T14:07:43.718 INFO:tasks.workunit.client.0.vm03.stdout:1/1: write - no filename
2026-03-10T14:07:43.718 INFO:tasks.workunit.client.0.vm03.stdout:0/8: chown .
3358832 1 2026-03-10T14:07:43.718 INFO:tasks.workunit.client.0.vm03.stdout:0/9: dread - no filename 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:1/2: mkdir d0 0 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:1/3: fsync - no filename 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:1/4: dwrite - no filename 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:1/5: dwrite - no filename 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:1/6: dwrite - no filename 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:1/7: truncate - no filename 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:1/8: dread - no filename 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:2/0: dwrite - no filename 2026-03-10T14:07:43.720 INFO:tasks.workunit.client.0.vm03.stdout:2/1: dread - no filename 2026-03-10T14:07:43.721 INFO:tasks.workunit.client.0.vm03.stdout:0/10: creat f0 x:0 0 0 2026-03-10T14:07:43.723 INFO:tasks.workunit.client.0.vm03.stdout:0/11: rename f0 to f1 0 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:3/0: chown . 
2011166 1 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:2/2: creat f0 x:0 0 0 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:1/9: mkdir d0/d1 0 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:2/3: fdatasync f0 0 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:4/0: unlink - no file 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:4/1: fsync - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:4/2: rename - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:4/3: dread - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:4/4: truncate - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:4/5: readlink - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:1/10: rmdir d0 39 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:1/11: write - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:1/12: dwrite - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:3/1: mknod c0 0 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:3/2: fsync - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:2/4: dread - f0 zero size 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:5/0: truncate - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:5/1: dwrite - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:5/2: write - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:5/3: write - no filename 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:3/3: unlink c0 0 2026-03-10T14:07:43.728 INFO:tasks.workunit.client.0.vm03.stdout:3/4: stat - no entries 2026-03-10T14:07:43.737 INFO:tasks.workunit.client.0.vm03.stdout:4/6: mknod c0 0 2026-03-10T14:07:43.737 
INFO:tasks.workunit.client.0.vm03.stdout:0/12: dwrite f1 [0,4194304] 0 2026-03-10T14:07:43.737 INFO:tasks.workunit.client.0.vm03.stdout:0/13: dread f1 [0,4194304] 0 2026-03-10T14:07:43.737 INFO:tasks.workunit.client.0.vm03.stdout:5/4: getdents . 0 2026-03-10T14:07:43.745 INFO:tasks.workunit.client.0.vm03.stdout:7/0: dwrite - no filename 2026-03-10T14:07:43.746 INFO:tasks.workunit.client.0.vm03.stdout:7/1: write - no filename 2026-03-10T14:07:43.746 INFO:tasks.workunit.client.0.vm03.stdout:7/2: readlink - no filename 2026-03-10T14:07:43.746 INFO:tasks.workunit.client.0.vm03.stdout:7/3: read - no filename 2026-03-10T14:07:43.747 INFO:tasks.workunit.client.0.vm03.stdout:7/4: rename - no filename 2026-03-10T14:07:43.747 INFO:tasks.workunit.client.0.vm03.stdout:7/5: dread - no filename 2026-03-10T14:07:43.747 INFO:tasks.workunit.client.0.vm03.stdout:7/6: rename - no filename 2026-03-10T14:07:43.747 INFO:tasks.workunit.client.0.vm03.stdout:7/7: dread - no filename 2026-03-10T14:07:43.747 INFO:tasks.workunit.client.0.vm03.stdout:7/8: dread - no filename 2026-03-10T14:07:43.747 INFO:tasks.workunit.client.0.vm03.stdout:7/9: rmdir - no directory 2026-03-10T14:07:43.747 INFO:tasks.workunit.client.0.vm03.stdout:7/10: dwrite - no filename 2026-03-10T14:07:43.747 INFO:tasks.workunit.client.0.vm03.stdout:0/14: creat f2 x:0 0 0 2026-03-10T14:07:43.748 INFO:tasks.workunit.client.0.vm03.stdout:0/15: chown f1 5978 1 2026-03-10T14:07:43.749 INFO:tasks.workunit.client.1.vm04.stdout:8/470: write d0/d3/dd/fc [1447136,121856] 0 2026-03-10T14:07:43.751 INFO:tasks.workunit.client.1.vm04.stdout:8/471: creat d0/d3/d73/f91 x:0 0 0 2026-03-10T14:07:43.753 INFO:tasks.workunit.client.1.vm04.stdout:8/472: rename d0/d3/dd/d76/f86 to d0/d3/dd/d76/f92 0 2026-03-10T14:07:43.755 INFO:tasks.workunit.client.1.vm04.stdout:8/473: mknod d0/d3/d63/c93 0 2026-03-10T14:07:43.784 INFO:tasks.workunit.client.1.vm04.stdout:8/474: creat d0/d3/d73/f94 x:0 0 0 2026-03-10T14:07:43.784 
INFO:tasks.workunit.client.1.vm04.stdout:8/475: dwrite d0/d3/d63/d12/d51/d67/f85 [0,4194304] 0 2026-03-10T14:07:43.784 INFO:tasks.workunit.client.1.vm04.stdout:8/476: mknod d0/d75/c95 0 2026-03-10T14:07:43.784 INFO:tasks.workunit.client.1.vm04.stdout:8/477: dwrite d0/d3/d63/d12/f47 [0,4194304] 0 2026-03-10T14:07:43.784 INFO:tasks.workunit.client.1.vm04.stdout:8/478: rename d0/d3/dd/d33 to d0/d3/d63/d12/d51/d67/d96 0 2026-03-10T14:07:43.784 INFO:tasks.workunit.client.1.vm04.stdout:8/479: chown d0 4333 1 2026-03-10T14:07:43.784 INFO:tasks.workunit.client.1.vm04.stdout:8/480: dwrite d0/d3/d63/d12/d51/f64 [0,4194304] 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/0: dwrite - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/1: fdatasync - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/2: dwrite - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/3: rename - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/4: write - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/5: unlink - no file 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:5/5: symlink l0 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:2/5: dwrite f0 [0,4194304] 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:6/0: creat f0 x:0 0 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:0/16: mkdir d3 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:6/1: read - f0 zero size 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:7/11: getdents . 
0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:7/12: fdatasync - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:5/6: rename l0 to l1 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:0/17: read - f2 zero size 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:9/0: stat - no entries 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/6: creat f0 x:0 0 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:5/7: mknod c2 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:5/8: rmdir - no directory 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:5/9: dwrite - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/7: symlink l1 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:5/10: rename c2 to c3 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:9/1: mknod c0 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:9/2: write - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:9/3: chown c0 631 1 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:0/18: getdents d3 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:5/11: mkdir d4 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:9/4: symlink l1 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:8/8: link f0 f2 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:9/5: mkdir d2 0 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:9/6: read - no filename 2026-03-10T14:07:43.785 INFO:tasks.workunit.client.0.vm03.stdout:9/7: dwrite - no filename 2026-03-10T14:07:43.786 INFO:tasks.workunit.client.0.vm03.stdout:9/8: chown c0 23 1 2026-03-10T14:07:43.786 INFO:tasks.workunit.client.0.vm03.stdout:8/9: write f2 [596806,98486] 0 2026-03-10T14:07:43.800 INFO:tasks.workunit.client.0.vm03.stdout:0/19: dwrite f1 
[0,4194304] 0 2026-03-10T14:07:43.801 INFO:tasks.workunit.client.0.vm03.stdout:0/20: read - f2 zero size 2026-03-10T14:07:43.805 INFO:tasks.workunit.client.0.vm03.stdout:9/9: creat d2/f3 x:0 0 0 2026-03-10T14:07:43.805 INFO:tasks.workunit.client.0.vm03.stdout:9/10: truncate d2/f3 325041 0 2026-03-10T14:07:43.807 INFO:tasks.workunit.client.0.vm03.stdout:9/11: mkdir d2/d4 0 2026-03-10T14:07:43.808 INFO:tasks.workunit.client.0.vm03.stdout:0/21: creat d3/f4 x:0 0 0 2026-03-10T14:07:43.813 INFO:tasks.workunit.client.0.vm03.stdout:9/12: mknod d2/c5 0 2026-03-10T14:07:43.819 INFO:tasks.workunit.client.0.vm03.stdout:0/22: dwrite f1 [0,4194304] 0 2026-03-10T14:07:43.819 INFO:tasks.workunit.client.0.vm03.stdout:0/23: truncate d3/f4 512038 0 2026-03-10T14:07:43.819 INFO:tasks.workunit.client.0.vm03.stdout:0/24: stat f1 0 2026-03-10T14:07:43.826 INFO:tasks.workunit.client.0.vm03.stdout:0/25: dwrite d3/f4 [0,4194304] 0 2026-03-10T14:07:43.829 INFO:tasks.workunit.client.0.vm03.stdout:0/26: symlink d3/l5 0 2026-03-10T14:07:43.836 INFO:tasks.workunit.client.1.vm04.stdout:2/486: sync 2026-03-10T14:07:43.837 INFO:tasks.workunit.client.1.vm04.stdout:5/536: sync 2026-03-10T14:07:43.844 INFO:tasks.workunit.client.1.vm04.stdout:2/487: fsync d0/d14/d91/d8/d17/f73 0 2026-03-10T14:07:43.844 INFO:tasks.workunit.client.1.vm04.stdout:5/537: mknod d7/d26/d6b/d6e/d82/ca8 0 2026-03-10T14:07:43.847 INFO:tasks.workunit.client.1.vm04.stdout:5/538: unlink d7/d2d/d69/fa0 0 2026-03-10T14:07:43.848 INFO:tasks.workunit.client.1.vm04.stdout:2/488: mkdir d0/d14/d91/d8/dd/d26/d95 0 2026-03-10T14:07:43.850 INFO:tasks.workunit.client.1.vm04.stdout:2/489: chown d0/d14/d91/d3a/c4c 790 1 2026-03-10T14:07:43.855 INFO:tasks.workunit.client.1.vm04.stdout:5/539: rename d7/d12/d2b/d93/c98 to d7/d12/d2b/d8c/ca9 0 2026-03-10T14:07:43.860 INFO:tasks.workunit.client.1.vm04.stdout:2/490: chown d0/d14/d91/d8/dd/d26/d46/c52 5911791 1 2026-03-10T14:07:43.870 INFO:tasks.workunit.client.1.vm04.stdout:2/491: dwrite 
d0/d14/d1b/f32 [4194304,4194304] 0 2026-03-10T14:07:43.873 INFO:tasks.workunit.client.1.vm04.stdout:2/492: fsync d0/d14/d91/d3a/d3e/f38 0 2026-03-10T14:07:43.879 INFO:tasks.workunit.client.1.vm04.stdout:2/493: mkdir d0/d14/d91/d8/d17/d4e/d85/d86/d96 0 2026-03-10T14:07:43.881 INFO:tasks.workunit.client.1.vm04.stdout:8/481: dread d0/d3/dd/fc [0,4194304] 0 2026-03-10T14:07:43.898 INFO:tasks.workunit.client.1.vm04.stdout:2/494: rename d0/d14/d1b/l48 to d0/d14/d39/l97 0 2026-03-10T14:07:43.902 INFO:tasks.workunit.client.1.vm04.stdout:2/495: symlink d0/d14/d91/d8/dd/d26/d46/d4b/d56/l98 0 2026-03-10T14:07:43.933 INFO:tasks.workunit.client.1.vm04.stdout:7/465: chown d2/dc/f26 15772 1 2026-03-10T14:07:43.937 INFO:tasks.workunit.client.1.vm04.stdout:9/410: dwrite d9/da/dd/d1c/f22 [0,4194304] 0 2026-03-10T14:07:43.939 INFO:tasks.workunit.client.1.vm04.stdout:9/411: readlink d9/da/l12 0 2026-03-10T14:07:43.940 INFO:tasks.workunit.client.1.vm04.stdout:9/412: stat d9/d44 0 2026-03-10T14:07:43.941 INFO:tasks.workunit.client.1.vm04.stdout:3/502: dwrite da/d30/f32 [0,4194304] 0 2026-03-10T14:07:43.946 INFO:tasks.workunit.client.1.vm04.stdout:6/382: rmdir d3/de/d35/d3f/d2d 39 2026-03-10T14:07:43.970 INFO:tasks.workunit.client.1.vm04.stdout:6/383: rmdir d3/de 39 2026-03-10T14:07:43.970 INFO:tasks.workunit.client.1.vm04.stdout:7/466: mkdir d2/dc/de/dae 0 2026-03-10T14:07:43.970 INFO:tasks.workunit.client.1.vm04.stdout:3/503: dread da/dc/d3f/d54/d66/fa7 [0,4194304] 0 2026-03-10T14:07:43.970 INFO:tasks.workunit.client.1.vm04.stdout:7/467: rename d2/dc/de/d2d/d60/d7c/d36/f52 to d2/dc/de/d2d/d60/faf 0 2026-03-10T14:07:43.970 INFO:tasks.workunit.client.1.vm04.stdout:4/456: write d4/df/d34/f7c [302729,29558] 0 2026-03-10T14:07:43.973 INFO:tasks.workunit.client.1.vm04.stdout:3/504: readlink da/dc/d3f/l4b 0 2026-03-10T14:07:43.974 INFO:tasks.workunit.client.1.vm04.stdout:0/398: write d0/d2/d15/d49/d50/f53 [898414,73748] 0 2026-03-10T14:07:43.976 INFO:tasks.workunit.client.1.vm04.stdout:4/457: 
creat d4/df/d22/d47/d4f/fa6 x:0 0 0 2026-03-10T14:07:43.979 INFO:tasks.workunit.client.1.vm04.stdout:0/399: creat d0/d2/d15/d22/d38/d56/d66/f7e x:0 0 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:0/400: creat d0/d2/d25/f7f x:0 0 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:3/505: dwrite da/dc/d3f/f83 [0,4194304] 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:0/401: getdents d0/d2/d15/d49/d50 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:0/402: readlink d0/d2/d15/d22/d38/d56/d66/l3a 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:3/506: write da/f19 [4285973,28906] 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:0/403: chown d0/d2/d15/d22/d38/d56/d66/f54 828 1 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:0/404: creat d0/d2/d15/d49/d50/d61/d75/f80 x:0 0 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:0/405: creat d0/d2/d15/d22/f81 x:0 0 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.1.vm04.stdout:0/406: write d0/d2/d15/d22/d38/d56/f79 [398279,77643] 0 2026-03-10T14:07:44.025 INFO:tasks.workunit.client.0.vm03.stdout:8/10: dread f2 [0,4194304] 0 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/11: dwrite f2 [0,4194304] 0 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/12: chown l1 0 1 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/13: symlink l3 0 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/14: chown l3 393 1 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/15: read f0 [708304,57542] 0 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/16: creat f4 x:0 0 0 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/17: creat f5 x:0 0 0 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/18: dwrite f4 [0,4194304] 0 2026-03-10T14:07:44.026 
INFO:tasks.workunit.client.0.vm03.stdout:8/19: dwrite f0 [0,4194304] 0 2026-03-10T14:07:44.026 INFO:tasks.workunit.client.0.vm03.stdout:8/20: write f4 [2082642,42235] 0 2026-03-10T14:07:44.027 INFO:tasks.workunit.client.1.vm04.stdout:0/407: dwrite d0/d2/d25/f7f [0,4194304] 0 2026-03-10T14:07:44.028 INFO:tasks.workunit.client.1.vm04.stdout:0/408: chown d0/d2/d15/d22/c6d 5140 1 2026-03-10T14:07:44.034 INFO:tasks.workunit.client.1.vm04.stdout:0/409: dread - d0/d2/d15/d22/d38/f5b zero size 2026-03-10T14:07:44.039 INFO:tasks.workunit.client.1.vm04.stdout:0/410: dwrite d0/d2/d15/d22/d38/d56/f79 [0,4194304] 0 2026-03-10T14:07:44.046 INFO:tasks.workunit.client.1.vm04.stdout:0/411: dwrite d0/d2/d15/d22/d38/d56/d66/f2e [0,4194304] 0 2026-03-10T14:07:44.046 INFO:tasks.workunit.client.1.vm04.stdout:0/412: chown d0/d2/d15/d22/d38/d56/d66/l42 28 1 2026-03-10T14:07:44.140 INFO:tasks.workunit.client.1.vm04.stdout:1/455: dwrite d3/d20/d60/d82/d13/d1a/f4b [0,4194304] 0 2026-03-10T14:07:44.254 INFO:tasks.workunit.client.1.vm04.stdout:4/458: sync 2026-03-10T14:07:44.256 INFO:tasks.workunit.client.1.vm04.stdout:4/459: creat d4/df/d31/fa7 x:0 0 0 2026-03-10T14:07:44.259 INFO:tasks.workunit.client.1.vm04.stdout:4/460: creat d4/d14/fa8 x:0 0 0 2026-03-10T14:07:44.260 INFO:tasks.workunit.client.1.vm04.stdout:4/461: rmdir d4/d14/d3c 39 2026-03-10T14:07:44.266 INFO:tasks.workunit.client.1.vm04.stdout:4/462: mkdir d4/d14/d6d/da9 0 2026-03-10T14:07:44.271 INFO:tasks.workunit.client.1.vm04.stdout:4/463: rename d4/d14/d3c/d5e/c7f to d4/d14/d1b/caa 0 2026-03-10T14:07:44.272 INFO:tasks.workunit.client.1.vm04.stdout:4/464: write d4/d14/d1b/f9d [755808,42486] 0 2026-03-10T14:07:44.273 INFO:tasks.workunit.client.1.vm04.stdout:4/465: write d4/df/d22/d47/d70/d74/f86 [1494017,1012] 0 2026-03-10T14:07:44.277 INFO:tasks.workunit.client.1.vm04.stdout:4/466: fsync d4/d14/d3c/d5e/f7b 0 2026-03-10T14:07:44.386 INFO:tasks.workunit.client.0.vm03.stdout:8/21: sync 2026-03-10T14:07:44.386 
INFO:tasks.workunit.client.0.vm03.stdout:6/2: sync 2026-03-10T14:07:44.386 INFO:tasks.workunit.client.0.vm03.stdout:7/13: sync 2026-03-10T14:07:44.386 INFO:tasks.workunit.client.0.vm03.stdout:4/7: sync 2026-03-10T14:07:44.388 INFO:tasks.workunit.client.0.vm03.stdout:6/3: creat f1 x:0 0 0 2026-03-10T14:07:44.388 INFO:tasks.workunit.client.0.vm03.stdout:6/4: dread - f0 zero size 2026-03-10T14:07:44.388 INFO:tasks.workunit.client.0.vm03.stdout:4/8: symlink l1 0 2026-03-10T14:07:44.388 INFO:tasks.workunit.client.0.vm03.stdout:4/9: dread - no filename 2026-03-10T14:07:44.390 INFO:tasks.workunit.client.0.vm03.stdout:7/14: creat f0 x:0 0 0 2026-03-10T14:07:44.390 INFO:tasks.workunit.client.0.vm03.stdout:6/5: truncate f0 923607 0 2026-03-10T14:07:44.393 INFO:tasks.workunit.client.0.vm03.stdout:7/15: mknod c1 0 2026-03-10T14:07:44.393 INFO:tasks.workunit.client.0.vm03.stdout:7/16: rmdir - no directory 2026-03-10T14:07:44.393 INFO:tasks.workunit.client.0.vm03.stdout:7/17: rmdir - no directory 2026-03-10T14:07:44.394 INFO:tasks.workunit.client.0.vm03.stdout:7/18: write f0 [384392,63546] 0 2026-03-10T14:07:44.394 INFO:tasks.workunit.client.0.vm03.stdout:7/19: truncate f0 463197 0 2026-03-10T14:07:44.397 INFO:tasks.workunit.client.0.vm03.stdout:6/6: dwrite f1 [0,4194304] 0 2026-03-10T14:07:44.405 INFO:tasks.workunit.client.0.vm03.stdout:6/7: dread f1 [0,4194304] 0 2026-03-10T14:07:44.412 INFO:tasks.workunit.client.0.vm03.stdout:6/8: rename f1 to f2 0 2026-03-10T14:07:44.414 INFO:tasks.workunit.client.0.vm03.stdout:6/9: link f0 f3 0 2026-03-10T14:07:44.454 INFO:tasks.workunit.client.0.vm03.stdout:4/10: sync 2026-03-10T14:07:44.455 INFO:tasks.workunit.client.0.vm03.stdout:4/11: dwrite - no filename 2026-03-10T14:07:44.455 INFO:tasks.workunit.client.0.vm03.stdout:4/12: dread - no filename 2026-03-10T14:07:44.455 INFO:tasks.workunit.client.0.vm03.stdout:4/13: chown c0 2 1 2026-03-10T14:07:44.457 INFO:tasks.workunit.client.0.vm03.stdout:7/20: fdatasync f0 0 2026-03-10T14:07:44.458 
INFO:tasks.workunit.client.0.vm03.stdout:7/21: write f0 [94376,49264] 0 2026-03-10T14:07:44.460 INFO:tasks.workunit.client.0.vm03.stdout:7/22: creat f2 x:0 0 0 2026-03-10T14:07:44.461 INFO:tasks.workunit.client.0.vm03.stdout:7/23: rename f2 to f3 0 2026-03-10T14:07:44.471 INFO:tasks.workunit.client.0.vm03.stdout:4/14: sync 2026-03-10T14:07:44.473 INFO:tasks.workunit.client.0.vm03.stdout:4/15: creat f2 x:0 0 0 2026-03-10T14:07:44.474 INFO:tasks.workunit.client.0.vm03.stdout:4/16: write f2 [947096,3678] 0 2026-03-10T14:07:44.474 INFO:tasks.workunit.client.0.vm03.stdout:4/17: write f2 [40865,9349] 0 2026-03-10T14:07:44.540 INFO:tasks.workunit.client.1.vm04.stdout:5/540: dwrite d7/f24 [4194304,4194304] 0 2026-03-10T14:07:44.552 INFO:tasks.workunit.client.1.vm04.stdout:5/541: creat d7/d12/d2b/d3e/d57/d77/da5/faa x:0 0 0 2026-03-10T14:07:44.575 INFO:tasks.workunit.client.1.vm04.stdout:2/496: truncate d0/d14/d91/d3a/d3e/f38 5160112 0 2026-03-10T14:07:44.587 INFO:tasks.workunit.client.1.vm04.stdout:8/482: write d0/d3/f35 [1194065,3308] 0 2026-03-10T14:07:44.588 INFO:tasks.workunit.client.1.vm04.stdout:8/483: write d0/d3/f80 [163254,120554] 0 2026-03-10T14:07:44.593 INFO:tasks.workunit.client.1.vm04.stdout:2/497: dwrite d0/d14/d91/d4a/d66/f72 [8388608,4194304] 0 2026-03-10T14:07:44.597 INFO:tasks.workunit.client.1.vm04.stdout:2/498: read d0/d14/f79 [3945162,32936] 0 2026-03-10T14:07:44.610 INFO:tasks.workunit.client.1.vm04.stdout:8/484: dread d0/d3/d63/d12/d51/d67/d96/f71 [0,4194304] 0 2026-03-10T14:07:44.619 INFO:tasks.workunit.client.1.vm04.stdout:9/413: write d9/d5c/f6e [114498,123402] 0 2026-03-10T14:07:44.619 INFO:tasks.workunit.client.1.vm04.stdout:8/485: stat d0/d3/d63/d29/l77 0 2026-03-10T14:07:44.621 INFO:tasks.workunit.client.1.vm04.stdout:8/486: write d0/d3/d63/d12/f50 [8124941,84209] 0 2026-03-10T14:07:44.625 INFO:tasks.workunit.client.1.vm04.stdout:8/487: creat d0/d3/d63/d12/d51/f97 x:0 0 0 2026-03-10T14:07:44.629 INFO:tasks.workunit.client.1.vm04.stdout:8/488: 
creat d0/d3/d73/f98 x:0 0 0 2026-03-10T14:07:44.635 INFO:tasks.workunit.client.1.vm04.stdout:8/489: symlink d0/d3/dd/l99 0 2026-03-10T14:07:44.653 INFO:tasks.workunit.client.1.vm04.stdout:6/384: dwrite d3/de/d35/d3a/f51 [0,4194304] 0 2026-03-10T14:07:44.661 INFO:tasks.workunit.client.1.vm04.stdout:7/468: write d2/dc/de/d2d/d60/d7c/d3b/f48 [279206,64945] 0 2026-03-10T14:07:44.663 INFO:tasks.workunit.client.1.vm04.stdout:0/413: rmdir d0/d2/d15/d22/d38/d56 39 2026-03-10T14:07:44.664 INFO:tasks.workunit.client.1.vm04.stdout:0/414: write d0/d2/f12 [1234331,107935] 0 2026-03-10T14:07:44.664 INFO:tasks.workunit.client.1.vm04.stdout:0/415: write d0/d2/d15/f57 [3858674,14397] 0 2026-03-10T14:07:44.678 INFO:tasks.workunit.client.1.vm04.stdout:3/507: write da/dc/d3f/f4d [1378059,69537] 0 2026-03-10T14:07:44.680 INFO:tasks.workunit.client.0.vm03.stdout:1/13: stat d0/d1 0 2026-03-10T14:07:44.681 INFO:tasks.workunit.client.0.vm03.stdout:1/14: dread - no filename 2026-03-10T14:07:44.682 INFO:tasks.workunit.client.0.vm03.stdout:1/15: rmdir d0/d1 0 2026-03-10T14:07:44.683 INFO:tasks.workunit.client.0.vm03.stdout:1/16: mkdir d0/d2 0 2026-03-10T14:07:44.685 INFO:tasks.workunit.client.0.vm03.stdout:1/17: creat d0/d2/f3 x:0 0 0 2026-03-10T14:07:44.688 INFO:tasks.workunit.client.0.vm03.stdout:3/5: getdents . 
0 2026-03-10T14:07:44.688 INFO:tasks.workunit.client.0.vm03.stdout:3/6: truncate - no filename 2026-03-10T14:07:44.688 INFO:tasks.workunit.client.0.vm03.stdout:3/7: write - no filename 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/8: mkdir d1 0 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/9: fdatasync - no filename 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/10: dwrite - no filename 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/11: read - no filename 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/12: dwrite - no filename 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/13: dwrite - no filename 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/14: dread - no filename 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/15: write - no filename 2026-03-10T14:07:44.689 INFO:tasks.workunit.client.0.vm03.stdout:3/16: rename d1 to d1/d2 22 2026-03-10T14:07:44.706 INFO:tasks.workunit.client.0.vm03.stdout:3/17: sync 2026-03-10T14:07:44.708 INFO:tasks.workunit.client.0.vm03.stdout:3/18: sync 2026-03-10T14:07:44.709 INFO:tasks.workunit.client.0.vm03.stdout:3/19: rmdir d1 0 2026-03-10T14:07:44.709 INFO:tasks.workunit.client.0.vm03.stdout:3/20: fdatasync - no filename 2026-03-10T14:07:44.710 INFO:tasks.workunit.client.0.vm03.stdout:3/21: symlink l3 0 2026-03-10T14:07:44.710 INFO:tasks.workunit.client.0.vm03.stdout:3/22: write - no filename 2026-03-10T14:07:44.710 INFO:tasks.workunit.client.0.vm03.stdout:3/23: dwrite - no filename 2026-03-10T14:07:44.711 INFO:tasks.workunit.client.0.vm03.stdout:3/24: mknod c4 0 2026-03-10T14:07:44.712 INFO:tasks.workunit.client.0.vm03.stdout:3/25: symlink l5 0 2026-03-10T14:07:44.712 INFO:tasks.workunit.client.0.vm03.stdout:3/26: chown l3 121 1 2026-03-10T14:07:44.712 INFO:tasks.workunit.client.0.vm03.stdout:3/27: dread - no filename 2026-03-10T14:07:44.713 
INFO:tasks.workunit.client.0.vm03.stdout:3/28: rename c4 to c6 0 2026-03-10T14:07:44.713 INFO:tasks.workunit.client.0.vm03.stdout:3/29: write - no filename 2026-03-10T14:07:44.713 INFO:tasks.workunit.client.0.vm03.stdout:3/30: rmdir - no directory 2026-03-10T14:07:44.713 INFO:tasks.workunit.client.0.vm03.stdout:3/31: write - no filename 2026-03-10T14:07:44.714 INFO:tasks.workunit.client.0.vm03.stdout:3/32: mknod c7 0 2026-03-10T14:07:44.715 INFO:tasks.workunit.client.0.vm03.stdout:3/33: symlink l8 0 2026-03-10T14:07:44.718 INFO:tasks.workunit.client.1.vm04.stdout:3/508: creat da/d30/fb6 x:0 0 0 2026-03-10T14:07:44.721 INFO:tasks.workunit.client.1.vm04.stdout:4/467: dwrite d4/d14/d3c/d5e/f92 [0,4194304] 0 2026-03-10T14:07:44.723 INFO:tasks.workunit.client.1.vm04.stdout:6/385: getdents d3/de/d35/d3f/d2d/d38/d40 0 2026-03-10T14:07:44.730 INFO:tasks.workunit.client.1.vm04.stdout:4/468: readlink d4/l19 0 2026-03-10T14:07:44.732 INFO:tasks.workunit.client.1.vm04.stdout:7/469: getdents d2/dc/de/d2d/d60/d7c/d36/daa 0 2026-03-10T14:07:44.734 INFO:tasks.workunit.client.1.vm04.stdout:4/469: creat d4/d14/d64/fab x:0 0 0 2026-03-10T14:07:44.735 INFO:tasks.workunit.client.1.vm04.stdout:4/470: write d4/df/d22/d47/d4f/f6a [1097111,17279] 0 2026-03-10T14:07:44.740 INFO:tasks.workunit.client.1.vm04.stdout:7/470: creat d2/dc/de/d21/fb0 x:0 0 0 2026-03-10T14:07:44.749 INFO:tasks.workunit.client.1.vm04.stdout:4/471: mkdir d4/d14/dac 0 2026-03-10T14:07:44.750 INFO:tasks.workunit.client.1.vm04.stdout:4/472: dread - d4/d14/d64/f97 zero size 2026-03-10T14:07:44.758 INFO:tasks.workunit.client.1.vm04.stdout:7/471: mknod d2/dc/de/d2d/d60/d7c/d36/d8b/cb1 0 2026-03-10T14:07:44.763 INFO:tasks.workunit.client.1.vm04.stdout:8/490: dread d0/d3/d5/f66 [0,4194304] 0 2026-03-10T14:07:44.765 INFO:tasks.workunit.client.1.vm04.stdout:8/491: truncate d0/d3/d73/f98 938393 0 2026-03-10T14:07:44.766 INFO:tasks.workunit.client.1.vm04.stdout:8/492: readlink d0/d3/d63/le 0 2026-03-10T14:07:44.774 
INFO:tasks.workunit.client.1.vm04.stdout:8/493: creat d0/d3/d73/f9a x:0 0 0 2026-03-10T14:07:44.782 INFO:tasks.workunit.client.1.vm04.stdout:8/494: dwrite d0/d75/f83 [0,4194304] 0 2026-03-10T14:07:44.793 INFO:tasks.workunit.client.1.vm04.stdout:8/495: creat d0/d3/d63/d29/f9b x:0 0 0 2026-03-10T14:07:44.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:44 vm04.local ceph-mon[55966]: pgmap v150: 65 pgs: 65 active+clean; 1023 MiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 23 MiB/s rd, 67 MiB/s wr, 416 op/s 2026-03-10T14:07:44.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:44 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:07:44.818 INFO:tasks.workunit.client.1.vm04.stdout:8/496: dwrite d0/d3/dd/d76/f92 [0,4194304] 0 2026-03-10T14:07:44.827 INFO:tasks.workunit.client.1.vm04.stdout:8/497: link d0/f42 d0/d3/dd/d76/f9c 0 2026-03-10T14:07:44.834 INFO:tasks.workunit.client.1.vm04.stdout:8/498: dread d0/d3/d63/d12/d51/f4f [0,4194304] 0 2026-03-10T14:07:44.843 INFO:tasks.workunit.client.1.vm04.stdout:8/499: creat d0/d3/d63/d12/d69/f9d x:0 0 0 2026-03-10T14:07:44.847 INFO:tasks.workunit.client.1.vm04.stdout:8/500: rmdir d0/d3/dd 39 2026-03-10T14:07:44.854 INFO:tasks.workunit.client.1.vm04.stdout:8/501: creat d0/d75/d8a/f9e x:0 0 0 2026-03-10T14:07:44.855 INFO:tasks.workunit.client.1.vm04.stdout:8/502: chown d0/d3/d63/f8f 6 1 2026-03-10T14:07:44.856 INFO:tasks.workunit.client.1.vm04.stdout:8/503: stat d0/d3/d63/d12/f2c 0 2026-03-10T14:07:44.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:44 vm03.local ceph-mon[49718]: pgmap v150: 65 pgs: 65 active+clean; 1023 MiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 23 MiB/s rd, 67 MiB/s wr, 416 op/s 2026-03-10T14:07:44.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:44 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": 
"osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:07:44.859 INFO:tasks.workunit.client.1.vm04.stdout:8/504: write d0/d3/d5/f70 [4550037,125392] 0 2026-03-10T14:07:44.867 INFO:tasks.workunit.client.1.vm04.stdout:8/505: dwrite d0/d75/f83 [0,4194304] 0 2026-03-10T14:07:44.870 INFO:tasks.workunit.client.1.vm04.stdout:8/506: creat d0/d3/dd/d76/f9f x:0 0 0 2026-03-10T14:07:44.872 INFO:tasks.workunit.client.1.vm04.stdout:8/507: rmdir d0/d3/d63/d12/d51/d67 39 2026-03-10T14:07:44.873 INFO:tasks.workunit.client.1.vm04.stdout:8/508: stat d0/d3/d63/d12/d51/f64 0 2026-03-10T14:07:44.875 INFO:tasks.workunit.client.1.vm04.stdout:8/509: write d0/d3/d73/f91 [841214,33009] 0 2026-03-10T14:07:44.883 INFO:tasks.workunit.client.1.vm04.stdout:5/542: dwrite d7/d2d/d69/f95 [0,4194304] 0 2026-03-10T14:07:44.884 INFO:tasks.workunit.client.1.vm04.stdout:8/510: dwrite d0/d3/f80 [0,4194304] 0 2026-03-10T14:07:44.889 INFO:tasks.workunit.client.1.vm04.stdout:2/499: rename d0/d14/d91/c34 to d0/d14/d91/d8/dd/d26/c99 0 2026-03-10T14:07:44.895 INFO:tasks.workunit.client.1.vm04.stdout:2/500: symlink d0/d14/d91/d8/d17/d4e/l9a 0 2026-03-10T14:07:44.907 INFO:tasks.workunit.client.1.vm04.stdout:9/414: rename d9/da/dd/d1c/f65 to d9/d33/f8f 0 2026-03-10T14:07:45.097 INFO:tasks.workunit.client.1.vm04.stdout:9/415: sync 2026-03-10T14:07:45.097 INFO:tasks.workunit.client.1.vm04.stdout:2/501: sync 2026-03-10T14:07:45.110 INFO:tasks.workunit.client.1.vm04.stdout:2/502: symlink d0/d14/d91/d8/d17/d4e/d85/d86/l9b 0 2026-03-10T14:07:45.116 INFO:tasks.workunit.client.1.vm04.stdout:2/503: creat d0/d14/d54/f9c x:0 0 0 2026-03-10T14:07:45.121 INFO:tasks.workunit.client.1.vm04.stdout:2/504: getdents d0/d14/d91/d8/dd/d26/d46/d4b 0 2026-03-10T14:07:45.132 INFO:tasks.workunit.client.0.vm03.stdout:5/12: readlink l1 0 2026-03-10T14:07:45.132 INFO:tasks.workunit.client.0.vm03.stdout:5/13: dwrite - no filename 2026-03-10T14:07:45.135 INFO:tasks.workunit.client.0.vm03.stdout:2/6: truncate f0 3888415 0 
2026-03-10T14:07:45.135 INFO:tasks.workunit.client.0.vm03.stdout:2/7: rmdir - no directory 2026-03-10T14:07:45.135 INFO:tasks.workunit.client.1.vm04.stdout:1/456: stat d3/fa 0 2026-03-10T14:07:45.139 INFO:tasks.workunit.client.1.vm04.stdout:1/457: unlink d3/d20/d60/d82/d13/d1a/f24 0 2026-03-10T14:07:45.143 INFO:tasks.workunit.client.1.vm04.stdout:0/416: write d0/d2/fd [401187,91355] 0 2026-03-10T14:07:45.147 INFO:tasks.workunit.client.1.vm04.stdout:0/417: chown d0/d2/d15/d49/d50/d61/d75/f80 27 1 2026-03-10T14:07:45.148 INFO:tasks.workunit.client.1.vm04.stdout:3/509: dwrite da/dc/d3f/d54/f97 [0,4194304] 0 2026-03-10T14:07:45.154 INFO:tasks.workunit.client.1.vm04.stdout:3/510: write da/dc/d35/d52/da3/dac/fb4 [798707,116041] 0 2026-03-10T14:07:45.156 INFO:tasks.workunit.client.1.vm04.stdout:4/473: dwrite d4/df/d22/f4b [0,4194304] 0 2026-03-10T14:07:45.168 INFO:tasks.workunit.client.1.vm04.stdout:0/418: dread - d0/d2/f34 zero size 2026-03-10T14:07:45.168 INFO:tasks.workunit.client.1.vm04.stdout:3/511: truncate da/dc/d35/f50 1101109 0 2026-03-10T14:07:45.169 INFO:tasks.workunit.client.1.vm04.stdout:3/512: fdatasync da/dc/f90 0 2026-03-10T14:07:45.173 INFO:tasks.workunit.client.1.vm04.stdout:7/472: dwrite d2/dc/de/f73 [0,4194304] 0 2026-03-10T14:07:45.176 INFO:tasks.workunit.client.1.vm04.stdout:0/419: chown d0/d2/d15/d22/d38/d56/d66/f7e 27787 1 2026-03-10T14:07:45.185 INFO:tasks.workunit.client.1.vm04.stdout:7/473: creat d2/dc/de/dae/fb2 x:0 0 0 2026-03-10T14:07:45.190 INFO:tasks.workunit.client.1.vm04.stdout:3/513: creat da/fb7 x:0 0 0 2026-03-10T14:07:45.192 INFO:tasks.workunit.client.1.vm04.stdout:7/474: chown d2/dc/f25 1937503 1 2026-03-10T14:07:45.193 INFO:tasks.workunit.client.1.vm04.stdout:3/514: mknod da/dc/d35/d52/d6d/cb8 0 2026-03-10T14:07:45.194 INFO:tasks.workunit.client.1.vm04.stdout:1/458: sync 2026-03-10T14:07:45.194 INFO:tasks.workunit.client.1.vm04.stdout:0/420: sync 2026-03-10T14:07:45.195 INFO:tasks.workunit.client.1.vm04.stdout:3/515: chown 
da/dc/d35/d52/d6d/cb8 81448 1 2026-03-10T14:07:45.202 INFO:tasks.workunit.client.1.vm04.stdout:3/516: truncate da/d8e/f71 1034237 0 2026-03-10T14:07:45.207 INFO:tasks.workunit.client.1.vm04.stdout:1/459: mknod d3/d5c/d79/d98/c9f 0 2026-03-10T14:07:45.213 INFO:tasks.workunit.client.1.vm04.stdout:3/517: unlink da/dc/d35/d52/d6d/l84 0 2026-03-10T14:07:45.216 INFO:tasks.workunit.client.1.vm04.stdout:0/421: dread d0/d2/d15/d22/f30 [0,4194304] 0 2026-03-10T14:07:45.231 INFO:tasks.workunit.client.1.vm04.stdout:0/422: mknod d0/d2/d25/c82 0 2026-03-10T14:07:45.242 INFO:tasks.workunit.client.1.vm04.stdout:1/460: dread d3/f2c [0,4194304] 0 2026-03-10T14:07:45.250 INFO:tasks.workunit.client.1.vm04.stdout:1/461: mkdir d3/d20/d60/d82/d13/da0 0 2026-03-10T14:07:45.253 INFO:tasks.workunit.client.1.vm04.stdout:8/511: truncate d0/d3/d63/d12/d69/f81 2545207 0 2026-03-10T14:07:45.253 INFO:tasks.workunit.client.1.vm04.stdout:5/543: write d7/d12/d2b/d3e/f4a [4722782,41383] 0 2026-03-10T14:07:45.254 INFO:tasks.workunit.client.1.vm04.stdout:8/512: stat d0/d3/d63/d12/f47 0 2026-03-10T14:07:45.255 INFO:tasks.workunit.client.1.vm04.stdout:8/513: fsync d0/d3/dd/d76/f92 0 2026-03-10T14:07:45.259 INFO:tasks.workunit.client.1.vm04.stdout:0/423: dread d0/d2/f2d [0,4194304] 0 2026-03-10T14:07:45.263 INFO:tasks.workunit.client.1.vm04.stdout:5/544: mknod d7/d59/d7d/d9a/cab 0 2026-03-10T14:07:45.267 INFO:tasks.workunit.client.1.vm04.stdout:1/462: sync 2026-03-10T14:07:45.267 INFO:tasks.workunit.client.1.vm04.stdout:0/424: dwrite d0/d2/d15/f44 [0,4194304] 0 2026-03-10T14:07:45.269 INFO:tasks.workunit.client.1.vm04.stdout:1/463: stat d3/d20/d60/d82/d13/d1a 0 2026-03-10T14:07:45.274 INFO:tasks.workunit.client.1.vm04.stdout:5/545: rename d7/d12/l22 to d7/d12/d2b/d3e/d57/d9f/lac 0 2026-03-10T14:07:45.286 INFO:tasks.workunit.client.1.vm04.stdout:5/546: creat d7/d59/fad x:0 0 0 2026-03-10T14:07:45.308 INFO:tasks.workunit.client.1.vm04.stdout:6/386: symlink d3/de/d35/d3f/d2d/d32/l6b 0 2026-03-10T14:07:45.312 
INFO:tasks.workunit.client.1.vm04.stdout:6/387: creat d3/d1d/f6c x:0 0 0 2026-03-10T14:07:45.318 INFO:tasks.workunit.client.0.vm03.stdout:9/13: rmdir d2 39 2026-03-10T14:07:45.320 INFO:tasks.workunit.client.1.vm04.stdout:6/388: creat d3/de/f6d x:0 0 0 2026-03-10T14:07:45.329 INFO:tasks.workunit.client.1.vm04.stdout:9/416: write d9/d1d/f3d [22557,115790] 0 2026-03-10T14:07:45.332 INFO:tasks.workunit.client.1.vm04.stdout:9/417: chown d9/d58 30101 1 2026-03-10T14:07:45.333 INFO:tasks.workunit.client.1.vm04.stdout:9/418: readlink d9/da/l37 0 2026-03-10T14:07:45.334 INFO:tasks.workunit.client.1.vm04.stdout:9/419: stat d9/da/c3f 0 2026-03-10T14:07:45.336 INFO:tasks.workunit.client.0.vm03.stdout:0/27: rmdir d3 39 2026-03-10T14:07:45.339 INFO:tasks.workunit.client.1.vm04.stdout:2/505: dwrite d0/d14/d91/d8/d17/f73 [0,4194304] 0 2026-03-10T14:07:45.342 INFO:tasks.workunit.client.1.vm04.stdout:9/420: dwrite d9/da/dd/f8a [0,4194304] 0 2026-03-10T14:07:45.350 INFO:tasks.workunit.client.0.vm03.stdout:8/22: fsync f5 0 2026-03-10T14:07:45.351 INFO:tasks.workunit.client.1.vm04.stdout:3/518: truncate da/dc/f90 1362167 0 2026-03-10T14:07:45.356 INFO:tasks.workunit.client.1.vm04.stdout:3/519: write da/dc/fa4 [1356967,78546] 0 2026-03-10T14:07:45.357 INFO:tasks.workunit.client.1.vm04.stdout:9/421: rename d9/da/dd/f19 to d9/da/d5d/f90 0 2026-03-10T14:07:45.359 INFO:tasks.workunit.client.1.vm04.stdout:4/474: dwrite d4/d14/f3b [0,4194304] 0 2026-03-10T14:07:45.361 INFO:tasks.workunit.client.1.vm04.stdout:2/506: unlink d0/d14/d91/d8/dd/c87 0 2026-03-10T14:07:45.364 INFO:tasks.workunit.client.1.vm04.stdout:7/475: write d2/dc/de/f62 [817917,101811] 0 2026-03-10T14:07:45.367 INFO:tasks.workunit.client.1.vm04.stdout:3/520: creat da/d3e/fb9 x:0 0 0 2026-03-10T14:07:45.368 INFO:tasks.workunit.client.1.vm04.stdout:9/422: read d9/d44/d4d/f4e [1152081,122674] 0 2026-03-10T14:07:45.377 INFO:tasks.workunit.client.1.vm04.stdout:2/507: symlink d0/l9d 0 2026-03-10T14:07:45.377 
INFO:tasks.workunit.client.1.vm04.stdout:7/476: fdatasync d2/dc/de/d2d/d60/d7c/f84 0 2026-03-10T14:07:45.380 INFO:tasks.workunit.client.1.vm04.stdout:3/521: unlink da/dc/d35/d52/d53/d78/fb1 0 2026-03-10T14:07:45.382 INFO:tasks.workunit.client.1.vm04.stdout:3/522: write da/dc/d3f/d54/f97 [2136797,55710] 0 2026-03-10T14:07:45.383 INFO:tasks.workunit.client.1.vm04.stdout:2/508: read d0/d14/d91/d8/dd/d26/f33 [937168,91511] 0 2026-03-10T14:07:45.387 INFO:tasks.workunit.client.1.vm04.stdout:9/423: creat d9/da/dd/f91 x:0 0 0 2026-03-10T14:07:45.388 INFO:tasks.workunit.client.1.vm04.stdout:7/477: mkdir d2/dc/de/d2d/d60/d81/db3 0 2026-03-10T14:07:45.393 INFO:tasks.workunit.client.1.vm04.stdout:3/523: creat da/d3e/fba x:0 0 0 2026-03-10T14:07:45.400 INFO:tasks.workunit.client.1.vm04.stdout:2/509: unlink d0/d14/d91/d8/dd/d26/d46/l7a 0 2026-03-10T14:07:45.400 INFO:tasks.workunit.client.1.vm04.stdout:4/475: creat d4/d14/fad x:0 0 0 2026-03-10T14:07:45.404 INFO:tasks.workunit.client.0.vm03.stdout:9/14: write d2/f3 [1035984,121156] 0 2026-03-10T14:07:45.407 INFO:tasks.workunit.client.0.vm03.stdout:8/23: creat f6 x:0 0 0 2026-03-10T14:07:45.408 INFO:tasks.workunit.client.0.vm03.stdout:5/14: link c3 d4/c5 0 2026-03-10T14:07:45.410 INFO:tasks.workunit.client.1.vm04.stdout:2/510: dwrite d0/d14/d91/d8/dd/d26/f36 [0,4194304] 0 2026-03-10T14:07:45.412 INFO:tasks.workunit.client.0.vm03.stdout:8/24: dwrite f6 [0,4194304] 0 2026-03-10T14:07:45.416 INFO:tasks.workunit.client.0.vm03.stdout:9/15: mknod d2/c6 0 2026-03-10T14:07:45.416 INFO:tasks.workunit.client.0.vm03.stdout:0/28: symlink d3/l6 0 2026-03-10T14:07:45.421 INFO:tasks.workunit.client.0.vm03.stdout:8/25: creat f7 x:0 0 0 2026-03-10T14:07:45.422 INFO:tasks.workunit.client.0.vm03.stdout:9/16: symlink d2/l7 0 2026-03-10T14:07:45.422 INFO:tasks.workunit.client.0.vm03.stdout:9/17: rename d2 to d2/d4/d8 22 2026-03-10T14:07:45.423 INFO:tasks.workunit.client.0.vm03.stdout:9/18: write d2/f3 [876077,16905] 0 2026-03-10T14:07:45.423 
INFO:tasks.workunit.client.0.vm03.stdout:9/19: stat d2 0 2026-03-10T14:07:45.427 INFO:tasks.workunit.client.0.vm03.stdout:9/20: dwrite d2/f3 [0,4194304] 0 2026-03-10T14:07:45.427 INFO:tasks.workunit.client.0.vm03.stdout:9/21: chown l1 16321 1 2026-03-10T14:07:45.428 INFO:tasks.workunit.client.0.vm03.stdout:9/22: dread d2/f3 [0,4194304] 0 2026-03-10T14:07:45.434 INFO:tasks.workunit.client.1.vm04.stdout:9/424: creat d9/da/dd/d74/f92 x:0 0 0 2026-03-10T14:07:45.435 INFO:tasks.workunit.client.0.vm03.stdout:0/29: creat d3/f7 x:0 0 0 2026-03-10T14:07:45.435 INFO:tasks.workunit.client.0.vm03.stdout:5/15: mkdir d4/d6 0 2026-03-10T14:07:45.437 INFO:tasks.workunit.client.0.vm03.stdout:8/26: mknod c8 0 2026-03-10T14:07:45.437 INFO:tasks.workunit.client.0.vm03.stdout:8/27: stat l1 0 2026-03-10T14:07:45.438 INFO:tasks.workunit.client.0.vm03.stdout:8/28: chown f6 1004 1 2026-03-10T14:07:45.440 INFO:tasks.workunit.client.0.vm03.stdout:0/30: dwrite f1 [0,4194304] 0 2026-03-10T14:07:45.452 INFO:tasks.workunit.client.1.vm04.stdout:4/476: rmdir d4/df/d22/d47/d70/d74 39 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.1.vm04.stdout:2/511: chown d0/d14/d91/d8/f6 24817939 1 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.1.vm04.stdout:2/512: chown d0/d14 2 1 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.0.vm03.stdout:8/29: dwrite f0 [0,4194304] 0 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.0.vm03.stdout:8/30: chown c8 1650 1 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.0.vm03.stdout:8/31: readlink l3 0 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.0.vm03.stdout:9/23: mknod d2/c9 0 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.0.vm03.stdout:0/31: mknod d3/c8 0 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.0.vm03.stdout:8/32: creat f9 x:0 0 0 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.0.vm03.stdout:8/33: rmdir - no directory 2026-03-10T14:07:45.461 INFO:tasks.workunit.client.0.vm03.stdout:5/16: mknod d4/d6/c7 0 2026-03-10T14:07:45.462 
INFO:tasks.workunit.client.0.vm03.stdout:8/34: readlink l1 0 2026-03-10T14:07:45.462 INFO:tasks.workunit.client.0.vm03.stdout:0/32: dread f1 [0,4194304] 0 2026-03-10T14:07:45.462 INFO:tasks.workunit.client.0.vm03.stdout:8/35: dwrite f2 [0,4194304] 0 2026-03-10T14:07:45.462 INFO:tasks.workunit.client.0.vm03.stdout:8/36: readlink l3 0 2026-03-10T14:07:45.462 INFO:tasks.workunit.client.0.vm03.stdout:8/37: read - f9 zero size 2026-03-10T14:07:45.462 INFO:tasks.workunit.client.1.vm04.stdout:8/514: dwrite d0/d3/dd/d76/f9c [0,4194304] 0 2026-03-10T14:07:45.462 INFO:tasks.workunit.client.1.vm04.stdout:9/425: mkdir d9/d5c/d93 0 2026-03-10T14:07:45.463 INFO:tasks.workunit.client.0.vm03.stdout:5/17: readlink l1 0 2026-03-10T14:07:45.465 INFO:tasks.workunit.client.1.vm04.stdout:8/515: dread d0/d3/d63/d12/d51/f4f [0,4194304] 0 2026-03-10T14:07:45.466 INFO:tasks.workunit.client.0.vm03.stdout:0/33: creat d3/f9 x:0 0 0 2026-03-10T14:07:45.466 INFO:tasks.workunit.client.1.vm04.stdout:7/478: dread d2/dc/de/d2d/f39 [4194304,4194304] 0 2026-03-10T14:07:45.506 INFO:tasks.workunit.client.1.vm04.stdout:7/479: chown d2/dc/de/d2d/d38 503 1 2026-03-10T14:07:45.506 INFO:tasks.workunit.client.1.vm04.stdout:9/426: mknod d9/d33/c94 0 2026-03-10T14:07:45.506 INFO:tasks.workunit.client.1.vm04.stdout:8/516: dread d0/d75/f83 [0,4194304] 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.1.vm04.stdout:9/427: dwrite d9/da/f41 [0,4194304] 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.1.vm04.stdout:9/428: chown d9/da/f6b 2 1 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.1.vm04.stdout:9/429: truncate d9/da/dd/f91 468604 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.1.vm04.stdout:7/480: link d2/dc/de/d2d/d60/d7c/d36/l5d d2/d6b/lb4 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.1.vm04.stdout:8/517: creat d0/d82/fa0 x:0 0 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.1.vm04.stdout:4/477: getdents d4/df/d34/d6f 0 2026-03-10T14:07:45.507 
INFO:tasks.workunit.client.0.vm03.stdout:8/38: mkdir da 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:5/18: rmdir d4/d6 39 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/34: mknod d3/ca 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:8/39: creat da/fb x:0 0 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:8/40: readlink l3 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:8/41: write f2 [4438799,70772] 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/35: dwrite d3/f7 [0,4194304] 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/36: dread - d3/f9 zero size 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:5/19: creat d4/f8 x:0 0 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:5/20: chown l1 513955 1 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/37: rename d3/c8 to d3/cb 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/38: creat d3/fc x:0 0 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:5/21: creat d4/f9 x:0 0 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:5/22: write d4/f8 [775651,29624] 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:5/23: chown d4 53691 1 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/39: unlink d3/f4 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/40: fdatasync d3/f9 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/41: fsync f2 0 2026-03-10T14:07:45.507 INFO:tasks.workunit.client.0.vm03.stdout:0/42: creat d3/fd x:0 0 0 2026-03-10T14:07:45.508 INFO:tasks.workunit.client.0.vm03.stdout:0/43: unlink d3/l6 0 2026-03-10T14:07:45.508 INFO:tasks.workunit.client.0.vm03.stdout:0/44: dwrite d3/f7 [0,4194304] 0 2026-03-10T14:07:45.508 INFO:tasks.workunit.client.0.vm03.stdout:0/45: link d3/fd d3/fe 0 2026-03-10T14:07:45.508 
INFO:tasks.workunit.client.0.vm03.stdout:0/46: write f1 [1664364,53081] 0 2026-03-10T14:07:45.509 INFO:tasks.workunit.client.1.vm04.stdout:7/481: fdatasync d2/dc/de/d2d/d60/d7c/d36/f87 0 2026-03-10T14:07:45.509 INFO:tasks.workunit.client.1.vm04.stdout:4/478: creat d4/df/d31/fae x:0 0 0 2026-03-10T14:07:45.509 INFO:tasks.workunit.client.1.vm04.stdout:8/518: rename d0/d3/d73/f94 to d0/d75/d8a/fa1 0 2026-03-10T14:07:45.517 INFO:tasks.workunit.client.1.vm04.stdout:8/519: chown d0/d3/d63/c55 309 1 2026-03-10T14:07:45.543 INFO:tasks.workunit.client.1.vm04.stdout:7/482: rename d2/dc/d4d/c71 to d2/d94/cb5 0 2026-03-10T14:07:45.543 INFO:tasks.workunit.client.1.vm04.stdout:4/479: mknod d4/df/d22/d47/d70/caf 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:8/520: creat d0/d75/fa2 x:0 0 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:7/483: mknod d2/dc/de/d2d/d60/d7c/cb6 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:4/480: stat d4/df/c63 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:7/484: symlink d2/dc/de/d2d/d60/d7c/lb7 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:8/521: creat d0/d3/fa3 x:0 0 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:4/481: rename d4/c1d to d4/df/d34/d6f/cb0 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:4/482: chown d4/df/d31/c7d 12 1 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:8/522: truncate d0/d3/f6b 1072524 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:7/485: getdents d2/dc/de/d2d/d60/d7c/d64 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:4/483: mkdir d4/df/d22/d47/db1 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:8/523: symlink d0/d3/d63/la4 0 2026-03-10T14:07:45.544 INFO:tasks.workunit.client.1.vm04.stdout:7/486: creat d2/dc/de/d2d/d60/d7c/d36/d8b/fb8 x:0 0 0 2026-03-10T14:07:45.544 
INFO:tasks.workunit.client.1.vm04.stdout:4/484: mkdir d4/df/db2 0 2026-03-10T14:07:45.545 INFO:tasks.workunit.client.1.vm04.stdout:4/485: readlink d4/lc 0 2026-03-10T14:07:45.546 INFO:tasks.workunit.client.1.vm04.stdout:7/487: chown d2/dc/de/d2d/d60 25160592 1 2026-03-10T14:07:45.549 INFO:tasks.workunit.client.1.vm04.stdout:7/488: stat d2/dc 0 2026-03-10T14:07:45.549 INFO:tasks.workunit.client.1.vm04.stdout:8/524: dread d0/d3/d73/f91 [0,4194304] 0 2026-03-10T14:07:45.550 INFO:tasks.workunit.client.1.vm04.stdout:7/489: creat d2/dc/de/d2d/d60/d81/db3/fb9 x:0 0 0 2026-03-10T14:07:45.552 INFO:tasks.workunit.client.1.vm04.stdout:8/525: dread d0/f42 [0,4194304] 0 2026-03-10T14:07:45.683 INFO:tasks.workunit.client.1.vm04.stdout:8/526: sync 2026-03-10T14:07:45.686 INFO:tasks.workunit.client.1.vm04.stdout:8/527: rmdir d0/d3/d5 39 2026-03-10T14:07:45.722 INFO:tasks.workunit.client.0.vm03.stdout:5/24: fdatasync d4/f8 0 2026-03-10T14:07:45.723 INFO:tasks.workunit.client.1.vm04.stdout:8/528: dread d0/d3/d73/f98 [0,4194304] 0 2026-03-10T14:07:45.726 INFO:tasks.workunit.client.0.vm03.stdout:5/25: dwrite d4/f9 [0,4194304] 0 2026-03-10T14:07:45.726 INFO:tasks.workunit.client.1.vm04.stdout:8/529: creat d0/d3/dd/d89/fa5 x:0 0 0 2026-03-10T14:07:45.777 INFO:tasks.workunit.client.1.vm04.stdout:5/547: getdents d7/d59/d7d/d9a 0 2026-03-10T14:07:45.781 INFO:tasks.workunit.client.1.vm04.stdout:1/464: write d3/d22/d2f/f39 [5357515,17559] 0 2026-03-10T14:07:45.793 INFO:tasks.workunit.client.1.vm04.stdout:1/465: symlink d3/d5c/la1 0 2026-03-10T14:07:45.793 INFO:tasks.workunit.client.1.vm04.stdout:0/425: dwrite d0/f72 [0,4194304] 0 2026-03-10T14:07:45.794 INFO:tasks.workunit.client.1.vm04.stdout:1/466: readlink d3/l9 0 2026-03-10T14:07:45.798 INFO:tasks.workunit.client.1.vm04.stdout:5/548: mkdir d7/d12/d2b/d3e/dae 0 2026-03-10T14:07:45.799 INFO:tasks.workunit.client.1.vm04.stdout:6/389: dwrite d3/de/d35/d3a/d43/f4b [0,4194304] 0 2026-03-10T14:07:45.800 
INFO:tasks.workunit.client.1.vm04.stdout:6/390: dread - d3/de/f6d zero size 2026-03-10T14:07:45.805 INFO:tasks.workunit.client.1.vm04.stdout:1/467: creat d3/d20/d60/d82/d13/d38/d58/d5b/d92/fa2 x:0 0 0 2026-03-10T14:07:45.816 INFO:tasks.workunit.client.1.vm04.stdout:5/549: dwrite d7/d2d/d32/d34/f89 [0,4194304] 0 2026-03-10T14:07:45.830 INFO:tasks.workunit.client.1.vm04.stdout:1/468: creat d3/d20/fa3 x:0 0 0 2026-03-10T14:07:45.848 INFO:tasks.workunit.client.1.vm04.stdout:0/426: truncate d0/d2/d15/d22/d38/d56/d66/f2b 1143352 0 2026-03-10T14:07:45.862 INFO:tasks.workunit.client.1.vm04.stdout:5/550: fdatasync d7/f1d 0 2026-03-10T14:07:45.877 INFO:tasks.workunit.client.1.vm04.stdout:1/469: truncate d3/d20/f32 1530496 0 2026-03-10T14:07:45.879 INFO:tasks.workunit.client.0.vm03.stdout:6/10: getdents . 0 2026-03-10T14:07:45.881 INFO:tasks.workunit.client.0.vm03.stdout:6/11: symlink l4 0 2026-03-10T14:07:45.882 INFO:tasks.workunit.client.1.vm04.stdout:0/427: mknod d0/d2/d15/d22/d62/c83 0 2026-03-10T14:07:45.891 INFO:tasks.workunit.client.1.vm04.stdout:5/551: fdatasync d7/f11 0 2026-03-10T14:07:45.902 INFO:tasks.workunit.client.1.vm04.stdout:0/428: rename d0/d2/d15/d22/d62/f7b to d0/d2/d15/d22/d38/d56/f84 0 2026-03-10T14:07:45.903 INFO:tasks.workunit.client.0.vm03.stdout:7/24: getdents . 
0 2026-03-10T14:07:45.903 INFO:tasks.workunit.client.0.vm03.stdout:7/25: fsync f0 0 2026-03-10T14:07:45.904 INFO:tasks.workunit.client.1.vm04.stdout:0/429: read d0/d2/d15/d22/d38/d56/f79 [2693553,81261] 0 2026-03-10T14:07:45.905 INFO:tasks.workunit.client.0.vm03.stdout:7/26: unlink f0 0 2026-03-10T14:07:45.905 INFO:tasks.workunit.client.1.vm04.stdout:0/430: chown d0/d2/d15/d49/d50/d5c/f76 12 1 2026-03-10T14:07:45.906 INFO:tasks.workunit.client.0.vm03.stdout:7/27: write f3 [528177,66592] 0 2026-03-10T14:07:45.907 INFO:tasks.workunit.client.0.vm03.stdout:7/28: mknod c4 0 2026-03-10T14:07:45.908 INFO:tasks.workunit.client.0.vm03.stdout:7/29: mkdir d5 0 2026-03-10T14:07:45.910 INFO:tasks.workunit.client.0.vm03.stdout:7/30: rename f3 to d5/f6 0 2026-03-10T14:07:45.911 INFO:tasks.workunit.client.1.vm04.stdout:5/552: fdatasync d7/d2d/d32/d34/f4e 0 2026-03-10T14:07:45.912 INFO:tasks.workunit.client.0.vm03.stdout:7/31: creat d5/f7 x:0 0 0 2026-03-10T14:07:45.912 INFO:tasks.workunit.client.1.vm04.stdout:1/470: mknod d3/ca4 0 2026-03-10T14:07:45.912 INFO:tasks.workunit.client.1.vm04.stdout:1/471: chown d3/d22/d2f/f3c 13 1 2026-03-10T14:07:45.913 INFO:tasks.workunit.client.1.vm04.stdout:5/553: readlink d7/d12/d2b/d3e/d3f/da6/la7 0 2026-03-10T14:07:45.913 INFO:tasks.workunit.client.0.vm03.stdout:7/32: symlink d5/l8 0 2026-03-10T14:07:45.913 INFO:tasks.workunit.client.1.vm04.stdout:8/530: dread d0/d3/d63/d29/f45 [0,4194304] 0 2026-03-10T14:07:45.914 INFO:tasks.workunit.client.0.vm03.stdout:7/33: write d5/f6 [786419,65690] 0 2026-03-10T14:07:45.917 INFO:tasks.workunit.client.1.vm04.stdout:5/554: dread - d7/d12/d2b/d3e/d57/d77/da5/faa zero size 2026-03-10T14:07:45.919 INFO:tasks.workunit.client.1.vm04.stdout:5/555: fsync d7/d12/d2b/f4d 0 2026-03-10T14:07:45.920 INFO:tasks.workunit.client.0.vm03.stdout:7/34: dwrite d5/f6 [0,4194304] 0 2026-03-10T14:07:45.921 INFO:tasks.workunit.client.0.vm03.stdout:7/35: stat c1 0 2026-03-10T14:07:45.921 
INFO:tasks.workunit.client.0.vm03.stdout:7/36: truncate d5/f7 435836 0 2026-03-10T14:07:45.925 INFO:tasks.workunit.client.0.vm03.stdout:7/37: mkdir d5/d9 0 2026-03-10T14:07:45.926 INFO:tasks.workunit.client.0.vm03.stdout:7/38: mknod d5/ca 0 2026-03-10T14:07:45.931 INFO:tasks.workunit.client.0.vm03.stdout:7/39: dwrite d5/f7 [0,4194304] 0 2026-03-10T14:07:45.932 INFO:tasks.workunit.client.0.vm03.stdout:7/40: stat d5/f7 0 2026-03-10T14:07:45.941 INFO:tasks.workunit.client.0.vm03.stdout:7/41: dread d5/f6 [0,4194304] 0 2026-03-10T14:07:45.945 INFO:tasks.workunit.client.1.vm04.stdout:0/431: symlink d0/d2/d15/d22/d62/l85 0 2026-03-10T14:07:45.946 INFO:tasks.workunit.client.1.vm04.stdout:1/472: symlink d3/d20/la5 0 2026-03-10T14:07:45.952 INFO:tasks.workunit.client.0.vm03.stdout:7/42: creat d5/fb x:0 0 0 2026-03-10T14:07:45.952 INFO:tasks.workunit.client.0.vm03.stdout:7/43: readlink d5/l8 0 2026-03-10T14:07:45.953 INFO:tasks.workunit.client.0.vm03.stdout:7/44: mkdir d5/d9/dc 0 2026-03-10T14:07:45.954 INFO:tasks.workunit.client.0.vm03.stdout:7/45: mkdir d5/d9/dd 0 2026-03-10T14:07:45.954 INFO:tasks.workunit.client.1.vm04.stdout:9/430: getdents d9/da/dd/d74 0 2026-03-10T14:07:45.956 INFO:tasks.workunit.client.0.vm03.stdout:7/46: dread d5/f7 [0,4194304] 0 2026-03-10T14:07:45.958 INFO:tasks.workunit.client.1.vm04.stdout:1/473: mknod d3/d20/d60/d82/d13/d38/ca6 0 2026-03-10T14:07:45.985 INFO:tasks.workunit.client.0.vm03.stdout:7/47: dwrite d5/f7 [0,4194304] 0 2026-03-10T14:07:45.986 INFO:tasks.workunit.client.0.vm03.stdout:7/48: write d5/fb [743273,91944] 0 2026-03-10T14:07:45.986 INFO:tasks.workunit.client.0.vm03.stdout:7/49: dwrite d5/fb [0,4194304] 0 2026-03-10T14:07:45.986 INFO:tasks.workunit.client.1.vm04.stdout:1/474: chown d3/d5c/d79/d98/c9f 2136762 1 2026-03-10T14:07:45.986 INFO:tasks.workunit.client.1.vm04.stdout:0/432: symlink d0/d2/d15/d49/d50/d61/l86 0 2026-03-10T14:07:45.986 INFO:tasks.workunit.client.1.vm04.stdout:8/531: fdatasync d0/d3/d63/d12/d51/d67/f6e 0 
2026-03-10T14:07:45.986 INFO:tasks.workunit.client.1.vm04.stdout:1/475: mkdir d3/d20/d60/d82/d13/da7 0 2026-03-10T14:07:45.986 INFO:tasks.workunit.client.1.vm04.stdout:3/524: dwrite da/d3e/f4c [0,4194304] 0 2026-03-10T14:07:45.986 INFO:tasks.workunit.client.1.vm04.stdout:2/513: dwrite d0/d14/d91/d8/dd/d26/d46/d4b/d56/f7b [0,4194304] 0 2026-03-10T14:07:45.988 INFO:tasks.workunit.client.1.vm04.stdout:2/514: dread d0/d14/d91/d8/dd/d26/d46/d4b/d56/f7b [0,4194304] 0 2026-03-10T14:07:45.995 INFO:tasks.workunit.client.1.vm04.stdout:3/525: symlink da/dc/d3f/d54/d66/lbb 0 2026-03-10T14:07:45.998 INFO:tasks.workunit.client.1.vm04.stdout:3/526: write da/dc/fa4 [1780012,29220] 0 2026-03-10T14:07:45.998 INFO:tasks.workunit.client.1.vm04.stdout:8/532: symlink d0/d3/d5/la6 0 2026-03-10T14:07:45.998 INFO:tasks.workunit.client.1.vm04.stdout:8/533: chown d0/d3/d5/c7c 1882 1 2026-03-10T14:07:46.003 INFO:tasks.workunit.client.1.vm04.stdout:8/534: dwrite d0/d3/d63/d12/d51/f8e [0,4194304] 0 2026-03-10T14:07:46.006 INFO:tasks.workunit.client.1.vm04.stdout:9/431: getdents d9 0 2026-03-10T14:07:46.020 INFO:tasks.workunit.client.1.vm04.stdout:5/556: sync 2026-03-10T14:07:46.021 INFO:tasks.workunit.client.1.vm04.stdout:3/527: chown da/dc/c21 941228085 1 2026-03-10T14:07:46.023 INFO:tasks.workunit.client.0.vm03.stdout:4/18: fsync f2 0 2026-03-10T14:07:46.026 INFO:tasks.workunit.client.0.vm03.stdout:4/19: truncate f2 950461 0 2026-03-10T14:07:46.028 INFO:tasks.workunit.client.0.vm03.stdout:4/20: fdatasync f2 0 2026-03-10T14:07:46.031 INFO:tasks.workunit.client.0.vm03.stdout:4/21: creat f3 x:0 0 0 2026-03-10T14:07:46.031 INFO:tasks.workunit.client.0.vm03.stdout:4/22: chown l1 478995 1 2026-03-10T14:07:46.051 INFO:tasks.workunit.client.1.vm04.stdout:4/486: write d4/df/d31/f55 [2306291,89179] 0 2026-03-10T14:07:46.053 INFO:tasks.workunit.client.1.vm04.stdout:3/528: mknod da/dc/d35/d37/cbc 0 2026-03-10T14:07:46.056 INFO:tasks.workunit.client.1.vm04.stdout:7/490: write d2/dc/de/d2d/d38/d50/fad 
[739430,2896] 0 2026-03-10T14:07:46.056 INFO:tasks.workunit.client.1.vm04.stdout:1/476: link d3/d22/d63/f77 d3/d20/d60/d82/d13/d38/d58/d5b/fa8 0 2026-03-10T14:07:46.063 INFO:tasks.workunit.client.1.vm04.stdout:4/487: readlink d4/df/d22/d47/d70/la2 0 2026-03-10T14:07:46.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:45 vm04.local ceph-mon[55966]: pgmap v151: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 116 GiB / 120 GiB avail; 26 MiB/s rd, 69 MiB/s wr, 448 op/s 2026-03-10T14:07:46.065 INFO:tasks.workunit.client.1.vm04.stdout:3/529: rename da/fb7 to da/dc/d47/fbd 0 2026-03-10T14:07:46.065 INFO:tasks.workunit.client.1.vm04.stdout:9/432: link d9/da/c3f d9/da/d5d/c95 0 2026-03-10T14:07:46.071 INFO:tasks.workunit.client.1.vm04.stdout:7/491: mknod d2/d9f/cba 0 2026-03-10T14:07:46.071 INFO:tasks.workunit.client.1.vm04.stdout:9/433: write d9/d44/d70/f86 [1031306,69160] 0 2026-03-10T14:07:46.072 INFO:tasks.workunit.client.1.vm04.stdout:0/433: dread d0/d2/f9 [0,4194304] 0 2026-03-10T14:07:46.081 INFO:tasks.workunit.client.1.vm04.stdout:0/434: mkdir d0/d6e/d87 0 2026-03-10T14:07:46.086 INFO:tasks.workunit.client.1.vm04.stdout:4/488: rename d4/d14/d1b/f2f to d4/fb3 0 2026-03-10T14:07:46.090 INFO:tasks.workunit.client.1.vm04.stdout:6/391: dwrite d3/de/d35/d3a/d43/d4c/f4d [0,4194304] 0 2026-03-10T14:07:46.096 INFO:tasks.workunit.client.1.vm04.stdout:6/392: chown d3/de/d35/d3f/d2d/d32/d23/f33 26691 1 2026-03-10T14:07:46.102 INFO:tasks.workunit.client.1.vm04.stdout:9/434: mkdir d9/d5c/d93/d96 0 2026-03-10T14:07:46.108 INFO:tasks.workunit.client.1.vm04.stdout:7/492: creat d2/d2a/d42/d86/fbb x:0 0 0 2026-03-10T14:07:46.117 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:45 vm03.local ceph-mon[49718]: pgmap v151: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 116 GiB / 120 GiB avail; 26 MiB/s rd, 69 MiB/s wr, 448 op/s 2026-03-10T14:07:46.118 INFO:tasks.workunit.client.1.vm04.stdout:0/435: unlink d0/d2/d15/d22/d62/c83 0 2026-03-10T14:07:46.118 
INFO:tasks.workunit.client.1.vm04.stdout:6/393: fdatasync d3/de/d35/d3f/f22 0 2026-03-10T14:07:46.118 INFO:tasks.workunit.client.1.vm04.stdout:9/435: mknod d9/d1d/c97 0 2026-03-10T14:07:46.118 INFO:tasks.workunit.client.1.vm04.stdout:7/493: creat d2/dc/de/d2d/d38/d50/fbc x:0 0 0 2026-03-10T14:07:46.118 INFO:tasks.workunit.client.1.vm04.stdout:0/436: fdatasync d0/d2/d25/f33 0 2026-03-10T14:07:46.122 INFO:tasks.workunit.client.1.vm04.stdout:3/530: dread da/dc/d3f/f4d [0,4194304] 0 2026-03-10T14:07:46.124 INFO:tasks.workunit.client.1.vm04.stdout:6/394: rename d3/d1d/f4a to d3/de/d35/d3f/d2d/d32/d23/d47/f6e 0 2026-03-10T14:07:46.128 INFO:tasks.workunit.client.1.vm04.stdout:6/395: dwrite d3/d1d/f41 [4194304,4194304] 0 2026-03-10T14:07:46.134 INFO:tasks.workunit.client.1.vm04.stdout:7/494: creat d2/dc/de/d2d/d60/fbd x:0 0 0 2026-03-10T14:07:46.135 INFO:tasks.workunit.client.1.vm04.stdout:7/495: dread - d2/dc/de/d2d/d60/d81/db3/fb9 zero size 2026-03-10T14:07:46.136 INFO:tasks.workunit.client.1.vm04.stdout:7/496: truncate d2/dc/de/d21/fb0 191322 0 2026-03-10T14:07:46.137 INFO:tasks.workunit.client.1.vm04.stdout:6/396: dread d3/de/d35/d3f/d2d/d38/d40/f5d [0,4194304] 0 2026-03-10T14:07:46.138 INFO:tasks.workunit.client.1.vm04.stdout:6/397: write d3/de/d35/d3f/d2d/d38/d40/f5d [1330667,24257] 0 2026-03-10T14:07:46.139 INFO:tasks.workunit.client.1.vm04.stdout:6/398: stat d3/de/d35/d3f/d2d/d32/d23/d24/c4f 0 2026-03-10T14:07:46.151 INFO:tasks.workunit.client.1.vm04.stdout:4/489: sync 2026-03-10T14:07:46.152 INFO:tasks.workunit.client.1.vm04.stdout:3/531: sync 2026-03-10T14:07:46.152 INFO:tasks.workunit.client.1.vm04.stdout:7/497: fsync d2/dc/de/d2d/d60/d7c/f78 0 2026-03-10T14:07:46.152 INFO:tasks.workunit.client.1.vm04.stdout:6/399: mkdir d3/de/d35/d3f/d2d/d32/d23/d24/d6f 0 2026-03-10T14:07:46.157 INFO:tasks.workunit.client.0.vm03.stdout:1/18: getdents d0 0 2026-03-10T14:07:46.172 INFO:tasks.workunit.client.1.vm04.stdout:7/498: creat d2/fbe x:0 0 0 2026-03-10T14:07:46.172 
INFO:tasks.workunit.client.1.vm04.stdout:6/400: read d3/de/d35/d3f/d2d/d32/d23/d24/f25 [45734,60126] 0 2026-03-10T14:07:46.172 INFO:tasks.workunit.client.1.vm04.stdout:7/499: dread d2/dc/de/f62 [0,4194304] 0 2026-03-10T14:07:46.172 INFO:tasks.workunit.client.0.vm03.stdout:1/19: truncate d0/d2/f3 267764 0 2026-03-10T14:07:46.172 INFO:tasks.workunit.client.0.vm03.stdout:1/20: creat d0/f4 x:0 0 0 2026-03-10T14:07:46.172 INFO:tasks.workunit.client.0.vm03.stdout:1/21: link d0/f4 d0/f5 0 2026-03-10T14:07:46.172 INFO:tasks.workunit.client.0.vm03.stdout:1/22: dwrite d0/f5 [0,4194304] 0 2026-03-10T14:07:46.172 INFO:tasks.workunit.client.0.vm03.stdout:1/23: symlink d0/d2/l6 0 2026-03-10T14:07:46.172 INFO:tasks.workunit.client.1.vm04.stdout:6/401: dwrite d3/de/f6d [0,4194304] 0 2026-03-10T14:07:46.173 INFO:tasks.workunit.client.1.vm04.stdout:3/532: sync 2026-03-10T14:07:46.174 INFO:tasks.workunit.client.1.vm04.stdout:4/490: sync 2026-03-10T14:07:46.174 INFO:tasks.workunit.client.0.vm03.stdout:1/24: creat d0/d2/f7 x:0 0 0 2026-03-10T14:07:46.178 INFO:tasks.workunit.client.0.vm03.stdout:1/25: creat d0/f8 x:0 0 0 2026-03-10T14:07:46.179 INFO:tasks.workunit.client.0.vm03.stdout:1/26: write d0/f4 [1174216,24321] 0 2026-03-10T14:07:46.180 INFO:tasks.workunit.client.1.vm04.stdout:7/500: dread - d2/d2a/f92 zero size 2026-03-10T14:07:46.182 INFO:tasks.workunit.client.1.vm04.stdout:4/491: dread d4/df/d31/f55 [0,4194304] 0 2026-03-10T14:07:46.184 INFO:tasks.workunit.client.0.vm03.stdout:1/27: rename d0/f5 to d0/d2/f9 0 2026-03-10T14:07:46.187 INFO:tasks.workunit.client.1.vm04.stdout:7/501: mkdir d2/dc/de/d2d/d60/d7c/d64/dbf 0 2026-03-10T14:07:46.187 INFO:tasks.workunit.client.1.vm04.stdout:6/402: write d3/de/d35/d3f/d2d/d38/d40/f5d [1084546,120742] 0 2026-03-10T14:07:46.187 INFO:tasks.workunit.client.1.vm04.stdout:4/492: dread - d4/f57 zero size 2026-03-10T14:07:46.189 INFO:tasks.workunit.client.0.vm03.stdout:1/28: creat d0/fa x:0 0 0 2026-03-10T14:07:46.192 
INFO:tasks.workunit.client.0.vm03.stdout:1/29: symlink d0/lb 0 2026-03-10T14:07:46.196 INFO:tasks.workunit.client.1.vm04.stdout:4/493: rename d4/df/d22 to d4/df/db2/db4 0 2026-03-10T14:07:46.203 INFO:tasks.workunit.client.1.vm04.stdout:6/403: mknod d3/de/d35/d3f/d2d/d32/d23/d47/c70 0 2026-03-10T14:07:46.204 INFO:tasks.workunit.client.1.vm04.stdout:7/502: mkdir d2/dc/de/d2d/d60/d7c/d44/dc0 0 2026-03-10T14:07:46.204 INFO:tasks.workunit.client.1.vm04.stdout:4/494: unlink d4/d14/d1b/f30 0 2026-03-10T14:07:46.210 INFO:tasks.workunit.client.1.vm04.stdout:1/477: dread d3/d22/f2b [4194304,4194304] 0 2026-03-10T14:07:46.211 INFO:tasks.workunit.client.1.vm04.stdout:6/404: mkdir d3/de/d35/d3f/d2d/d32/d23/d24/d6f/d71 0 2026-03-10T14:07:46.212 INFO:tasks.workunit.client.1.vm04.stdout:7/503: dwrite d2/dc/de/d2d/d60/d7c/d36/d8b/fb8 [0,4194304] 0 2026-03-10T14:07:46.219 INFO:tasks.workunit.client.1.vm04.stdout:1/478: write d3/d22/d63/d35/f9a [362783,42124] 0 2026-03-10T14:07:46.219 INFO:tasks.workunit.client.1.vm04.stdout:4/495: creat d4/df/db2/db4/d47/d70/d74/fb5 x:0 0 0 2026-03-10T14:07:46.225 INFO:tasks.workunit.client.1.vm04.stdout:4/496: mkdir d4/df/db2/db6 0 2026-03-10T14:07:46.228 INFO:tasks.workunit.client.1.vm04.stdout:1/479: dread d3/f8 [0,4194304] 0 2026-03-10T14:07:46.229 INFO:tasks.workunit.client.1.vm04.stdout:6/405: truncate d3/de/d35/d3f/d2d/d38/f50 321699 0 2026-03-10T14:07:46.238 INFO:tasks.workunit.client.1.vm04.stdout:7/504: rename d2/dc/de/d2d/d5c/c6d to d2/dc/de/d2d/d60/d7c/d3b/cc1 0 2026-03-10T14:07:46.253 INFO:tasks.workunit.client.0.vm03.stdout:3/34: getdents . 
0 2026-03-10T14:07:46.253 INFO:tasks.workunit.client.0.vm03.stdout:3/35: readlink l8 0 2026-03-10T14:07:46.264 INFO:tasks.workunit.client.1.vm04.stdout:6/406: dread d3/de/d35/d3f/d2d/d32/d23/d4e/f69 [0,4194304] 0 2026-03-10T14:07:46.265 INFO:tasks.workunit.client.1.vm04.stdout:7/505: sync 2026-03-10T14:07:46.268 INFO:tasks.workunit.client.1.vm04.stdout:7/506: rename f0 to d2/dc/d4d/d7f/fc2 0 2026-03-10T14:07:46.281 INFO:tasks.workunit.client.1.vm04.stdout:7/507: dwrite d2/dc/de/d2d/d38/d50/fad [4194304,4194304] 0 2026-03-10T14:07:46.289 INFO:tasks.workunit.client.1.vm04.stdout:2/515: dwrite d0/d14/d91/d8/d17/f1f [0,4194304] 0 2026-03-10T14:07:46.291 INFO:tasks.workunit.client.1.vm04.stdout:5/557: write d7/d26/f30 [566053,27112] 0 2026-03-10T14:07:46.302 INFO:tasks.workunit.client.1.vm04.stdout:7/508: truncate d2/dc/d4d/d7f/f96 196649 0 2026-03-10T14:07:46.302 INFO:tasks.workunit.client.1.vm04.stdout:8/535: truncate d0/f23 2334838 0 2026-03-10T14:07:46.304 INFO:tasks.workunit.client.1.vm04.stdout:8/536: write d0/d3/d63/d12/f50 [6218429,42830] 0 2026-03-10T14:07:46.305 INFO:tasks.workunit.client.1.vm04.stdout:5/558: rmdir d7/d12/d2b/d8c 39 2026-03-10T14:07:46.306 INFO:tasks.workunit.client.1.vm04.stdout:8/537: fdatasync d0/d3/d63/d12/d69/f9d 0 2026-03-10T14:07:46.313 INFO:tasks.workunit.client.0.vm03.stdout:2/8: dread f0 [0,4194304] 0 2026-03-10T14:07:46.321 INFO:tasks.workunit.client.1.vm04.stdout:7/509: fsync d2/dc/de/d2d/d60/d7c/d64/f6a 0 2026-03-10T14:07:46.321 INFO:tasks.workunit.client.1.vm04.stdout:2/516: rename d0/d14/d91/cf to d0/d14/d91/d4a/d66/c9e 0 2026-03-10T14:07:46.321 INFO:tasks.workunit.client.1.vm04.stdout:9/436: write d9/da/dd/d1c/f3b [573766,4151] 0 2026-03-10T14:07:46.331 INFO:tasks.workunit.client.1.vm04.stdout:5/559: creat d7/d12/d2b/d93/d9e/faf x:0 0 0 2026-03-10T14:07:46.332 INFO:tasks.workunit.client.1.vm04.stdout:9/437: unlink d9/d44/f49 0 2026-03-10T14:07:46.337 INFO:tasks.workunit.client.1.vm04.stdout:7/510: creat d2/dc/de/d2d/d5c/da9/fc3 
x:0 0 0 2026-03-10T14:07:46.340 INFO:tasks.workunit.client.1.vm04.stdout:3/533: dwrite da/dc/d3f/d54/d66/f9e [0,4194304] 0 2026-03-10T14:07:46.340 INFO:tasks.workunit.client.1.vm04.stdout:5/560: creat d7/d12/d2b/d3e/d57/d8a/fb0 x:0 0 0 2026-03-10T14:07:46.341 INFO:tasks.workunit.client.1.vm04.stdout:0/437: dwrite d0/d2/d15/d22/d38/d56/f58 [0,4194304] 0 2026-03-10T14:07:46.344 INFO:tasks.workunit.client.1.vm04.stdout:5/561: readlink d7/d12/d2b/l90 0 2026-03-10T14:07:46.344 INFO:tasks.workunit.client.1.vm04.stdout:5/562: chown d7/d26 1 1 2026-03-10T14:07:46.344 INFO:tasks.workunit.client.1.vm04.stdout:3/534: read da/d30/f4e [3617820,11626] 0 2026-03-10T14:07:46.352 INFO:tasks.workunit.client.1.vm04.stdout:0/438: fsync d0/d2/d15/f44 0 2026-03-10T14:07:46.362 INFO:tasks.workunit.client.1.vm04.stdout:9/438: creat d9/da/d5d/d81/f98 x:0 0 0 2026-03-10T14:07:46.364 INFO:tasks.workunit.client.1.vm04.stdout:3/535: rename da/dc/d3f/f7e to da/dc/d47/d9b/fbe 0 2026-03-10T14:07:46.373 INFO:tasks.workunit.client.1.vm04.stdout:7/511: sync 2026-03-10T14:07:46.383 INFO:tasks.workunit.client.1.vm04.stdout:5/563: mknod d7/d59/d7e/d87/cb1 0 2026-03-10T14:07:46.387 INFO:tasks.workunit.client.1.vm04.stdout:9/439: creat d9/d44/d4d/f99 x:0 0 0 2026-03-10T14:07:46.395 INFO:tasks.workunit.client.1.vm04.stdout:6/407: dread d3/de/d35/d3f/d2d/d38/f50 [0,4194304] 0 2026-03-10T14:07:46.395 INFO:tasks.workunit.client.1.vm04.stdout:4/497: write d4/f5f [4516943,27145] 0 2026-03-10T14:07:46.402 INFO:tasks.workunit.client.1.vm04.stdout:9/440: rename d9/d44/d70/l83 to d9/l9a 0 2026-03-10T14:07:46.407 INFO:tasks.workunit.client.1.vm04.stdout:4/498: fsync d4/f57 0 2026-03-10T14:07:46.413 INFO:tasks.workunit.client.1.vm04.stdout:5/564: rmdir d7/d12/d2b/d8c 39 2026-03-10T14:07:46.413 INFO:tasks.workunit.client.1.vm04.stdout:1/480: write d3/d20/d60/d82/d13/d1a/f62 [1404982,38370] 0 2026-03-10T14:07:46.422 INFO:tasks.workunit.client.0.vm03.stdout:2/9: write f0 [3664608,2667] 0 2026-03-10T14:07:46.423 
INFO:tasks.workunit.client.0.vm03.stdout:3/36: getdents . 0 2026-03-10T14:07:46.423 INFO:tasks.workunit.client.0.vm03.stdout:3/37: write - no filename 2026-03-10T14:07:46.423 INFO:tasks.workunit.client.1.vm04.stdout:8/538: read d0/d3/f6b [845998,80266] 0 2026-03-10T14:07:46.423 INFO:tasks.workunit.client.1.vm04.stdout:4/499: readlink d4/df/l25 0 2026-03-10T14:07:46.424 INFO:tasks.workunit.client.1.vm04.stdout:8/539: write d0/d3/dd/d89/fa5 [344282,17587] 0 2026-03-10T14:07:46.425 INFO:tasks.workunit.client.0.vm03.stdout:2/10: dread f0 [0,4194304] 0 2026-03-10T14:07:46.430 INFO:tasks.workunit.client.0.vm03.stdout:3/38: rename l8 to l9 0 2026-03-10T14:07:46.431 INFO:tasks.workunit.client.0.vm03.stdout:2/11: creat f1 x:0 0 0 2026-03-10T14:07:46.431 INFO:tasks.workunit.client.0.vm03.stdout:2/12: dread - f1 zero size 2026-03-10T14:07:46.434 INFO:tasks.workunit.client.0.vm03.stdout:2/13: symlink l2 0 2026-03-10T14:07:46.439 INFO:tasks.workunit.client.0.vm03.stdout:2/14: link l2 l3 0 2026-03-10T14:07:46.439 INFO:tasks.workunit.client.1.vm04.stdout:6/408: mknod d3/de/d35/d3f/d2d/d32/d61/c72 0 2026-03-10T14:07:46.441 INFO:tasks.workunit.client.0.vm03.stdout:2/15: chown l3 127 1 2026-03-10T14:07:46.442 INFO:tasks.workunit.client.0.vm03.stdout:2/16: creat f4 x:0 0 0 2026-03-10T14:07:46.443 INFO:tasks.workunit.client.0.vm03.stdout:2/17: dread - f4 zero size 2026-03-10T14:07:46.446 INFO:tasks.workunit.client.1.vm04.stdout:5/565: creat d7/d12/d45/fb2 x:0 0 0 2026-03-10T14:07:46.447 INFO:tasks.workunit.client.1.vm04.stdout:2/517: write d0/d14/d1b/f55 [993625,30154] 0 2026-03-10T14:07:46.449 INFO:tasks.workunit.client.1.vm04.stdout:2/518: write d0/d14/d91/d8/f30 [2492463,92312] 0 2026-03-10T14:07:46.450 INFO:tasks.workunit.client.1.vm04.stdout:2/519: read d0/d14/d91/f1d [10026811,26614] 0 2026-03-10T14:07:46.457 INFO:tasks.workunit.client.1.vm04.stdout:3/536: getdents da/d8e 0 2026-03-10T14:07:46.465 INFO:tasks.workunit.client.1.vm04.stdout:8/540: write d0/d3/d73/f91 [490859,20826] 
0 2026-03-10T14:07:46.466 INFO:tasks.workunit.client.1.vm04.stdout:8/541: read - d0/d3/d63/d12/d69/f8c zero size 2026-03-10T14:07:46.466 INFO:tasks.workunit.client.0.vm03.stdout:2/18: fdatasync f0 0 2026-03-10T14:07:46.467 INFO:tasks.workunit.client.0.vm03.stdout:2/19: write f0 [2448629,52806] 0 2026-03-10T14:07:46.469 INFO:tasks.workunit.client.0.vm03.stdout:2/20: dread f0 [0,4194304] 0 2026-03-10T14:07:46.472 INFO:tasks.workunit.client.0.vm03.stdout:2/21: mkdir d5 0 2026-03-10T14:07:46.473 INFO:tasks.workunit.client.0.vm03.stdout:2/22: symlink d5/l6 0 2026-03-10T14:07:46.478 INFO:tasks.workunit.client.0.vm03.stdout:2/23: dwrite f1 [0,4194304] 0 2026-03-10T14:07:46.478 INFO:tasks.workunit.client.0.vm03.stdout:2/24: truncate f0 4293329 0 2026-03-10T14:07:46.479 INFO:tasks.workunit.client.1.vm04.stdout:6/409: rename d3/de/d35/d3a/d43/d56 to d3/d1d/d73 0 2026-03-10T14:07:46.480 INFO:tasks.workunit.client.0.vm03.stdout:2/25: symlink d5/l7 0 2026-03-10T14:07:46.481 INFO:tasks.workunit.client.1.vm04.stdout:0/439: write d0/d2/d15/f2f [894912,128492] 0 2026-03-10T14:07:46.484 INFO:tasks.workunit.client.0.vm03.stdout:2/26: rename l3 to d5/l8 0 2026-03-10T14:07:46.485 INFO:tasks.workunit.client.0.vm03.stdout:2/27: creat d5/f9 x:0 0 0 2026-03-10T14:07:46.486 INFO:tasks.workunit.client.1.vm04.stdout:2/520: symlink d0/d14/d91/d8/d17/d4e/l9f 0 2026-03-10T14:07:46.491 INFO:tasks.workunit.client.1.vm04.stdout:1/481: symlink d3/d22/d63/d35/la9 0 2026-03-10T14:07:46.493 INFO:tasks.workunit.client.1.vm04.stdout:3/537: chown da/dc/d47/d9b/fbe 0 1 2026-03-10T14:07:46.498 INFO:tasks.workunit.client.1.vm04.stdout:4/500: unlink d4/df/c63 0 2026-03-10T14:07:46.503 INFO:tasks.workunit.client.0.vm03.stdout:5/26: rename d4/f8 to d4/d6/fa 0 2026-03-10T14:07:46.504 INFO:tasks.workunit.client.1.vm04.stdout:5/566: getdents d7/d59/d7d/d9a 0 2026-03-10T14:07:46.506 INFO:tasks.workunit.client.0.vm03.stdout:9/24: truncate d2/f3 3077082 0 2026-03-10T14:07:46.508 
INFO:tasks.workunit.client.0.vm03.stdout:9/25: rmdir d2/d4 0 2026-03-10T14:07:46.510 INFO:tasks.workunit.client.1.vm04.stdout:5/567: dread d7/d2d/d69/f95 [0,4194304] 0 2026-03-10T14:07:46.512 INFO:tasks.workunit.client.0.vm03.stdout:8/42: rmdir da 39 2026-03-10T14:07:46.512 INFO:tasks.workunit.client.1.vm04.stdout:4/501: read d4/d14/d3c/d5e/f79 [530343,48559] 0 2026-03-10T14:07:46.512 INFO:tasks.workunit.client.1.vm04.stdout:0/440: creat d0/d2/d15/d22/f88 x:0 0 0 2026-03-10T14:07:46.513 INFO:tasks.workunit.client.1.vm04.stdout:4/502: fdatasync d4/df/d31/fae 0 2026-03-10T14:07:46.535 INFO:tasks.workunit.client.0.vm03.stdout:0/47: rename d3/cb to d3/cf 0 2026-03-10T14:07:46.536 INFO:tasks.workunit.client.0.vm03.stdout:0/48: chown d3/fd 699117 1 2026-03-10T14:07:46.543 INFO:tasks.workunit.client.0.vm03.stdout:0/49: dwrite d3/fd [0,4194304] 0 2026-03-10T14:07:46.547 INFO:tasks.workunit.client.1.vm04.stdout:0/441: unlink d0/d2/f34 0 2026-03-10T14:07:46.550 INFO:tasks.workunit.client.0.vm03.stdout:0/50: dread f1 [0,4194304] 0 2026-03-10T14:07:46.553 INFO:tasks.workunit.client.1.vm04.stdout:7/512: write d2/dc/de/d2d/d38/f41 [5092627,101423] 0 2026-03-10T14:07:46.556 INFO:tasks.workunit.client.1.vm04.stdout:2/521: symlink d0/d14/d91/d8/dd/d26/la0 0 2026-03-10T14:07:46.557 INFO:tasks.workunit.client.0.vm03.stdout:0/51: dwrite d3/f9 [0,4194304] 0 2026-03-10T14:07:46.564 INFO:tasks.workunit.client.0.vm03.stdout:0/52: unlink d3/fd 0 2026-03-10T14:07:46.583 INFO:tasks.workunit.client.0.vm03.stdout:0/53: stat d3 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.1.vm04.stdout:8/542: creat d0/d3/dd/fa7 x:0 0 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.1.vm04.stdout:8/543: write d0/d3/d5/f70 [1711747,27045] 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/54: creat d3/f10 x:0 0 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/55: truncate d3/fc 517641 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/56: chown 
d3/f9 1112 1 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/57: unlink f2 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/58: mkdir d3/d11 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/59: rename d3/fc to d3/d11/f12 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/60: write d3/f7 [376451,63326] 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/61: write d3/f10 [901601,22650] 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/62: write d3/f7 [3965165,98010] 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/63: unlink f1 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/64: readlink d3/l5 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.0.vm03.stdout:0/65: write d3/f7 [3080855,113605] 0 2026-03-10T14:07:46.584 INFO:tasks.workunit.client.1.vm04.stdout:6/410: rename d3/l49 to d3/de/d35/d3f/l74 0 2026-03-10T14:07:46.595 INFO:tasks.workunit.client.1.vm04.stdout:2/522: dread - d0/d14/d91/d8/d17/d35/f81 zero size 2026-03-10T14:07:46.601 INFO:tasks.workunit.client.1.vm04.stdout:1/482: rename d3/d22/d63/f83 to d3/d22/d63/faa 0 2026-03-10T14:07:46.606 INFO:tasks.workunit.client.1.vm04.stdout:1/483: dread d3/fa [0,4194304] 0 2026-03-10T14:07:46.607 INFO:tasks.workunit.client.1.vm04.stdout:1/484: readlink d3/l9 0 2026-03-10T14:07:46.609 INFO:tasks.workunit.client.1.vm04.stdout:1/485: fsync d3/d20/d60/d82/d13/d1a/f62 0 2026-03-10T14:07:46.611 INFO:tasks.workunit.client.1.vm04.stdout:5/568: rename d7/d12/d2b/d3e/l96 to d7/d26/lb3 0 2026-03-10T14:07:46.616 INFO:tasks.workunit.client.0.vm03.stdout:5/27: dwrite d4/f9 [4194304,4194304] 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.1.vm04.stdout:0/442: symlink d0/d2/d15/d22/d38/l89 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.1.vm04.stdout:0/443: dwrite d0/d2/d15/d22/d38/d56/f58 [0,4194304] 0 2026-03-10T14:07:46.652 
INFO:tasks.workunit.client.1.vm04.stdout:5/569: dread f4 [0,4194304] 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.1.vm04.stdout:3/538: getdents da/dc/d35/d52/d6d/d8a 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.1.vm04.stdout:5/570: dwrite d7/d2d/d32/f9d [0,4194304] 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/28: read d4/f9 [1159653,21575] 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/29: truncate d4/d6/fa 1737010 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/30: stat l1 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/31: creat d4/d6/fb x:0 0 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/32: write d4/d6/fb [690421,8003] 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/33: mknod d4/d6/cc 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/34: dwrite d4/f9 [4194304,4194304] 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/35: creat d4/fd x:0 0 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/36: dread - d4/fd zero size 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/37: dread d4/f9 [0,4194304] 0 2026-03-10T14:07:46.652 INFO:tasks.workunit.client.0.vm03.stdout:5/38: mkdir d4/d6/de 0 2026-03-10T14:07:46.661 INFO:tasks.workunit.client.0.vm03.stdout:5/39: dread d4/d6/fa [0,4194304] 0 2026-03-10T14:07:46.661 INFO:tasks.workunit.client.1.vm04.stdout:6/411: rename d3/de/d35/d3f/d2d/f1c to d3/de/d35/d3f/d2d/d32/d5c/f75 0 2026-03-10T14:07:46.662 INFO:tasks.workunit.client.0.vm03.stdout:5/40: symlink d4/lf 0 2026-03-10T14:07:46.662 INFO:tasks.workunit.client.1.vm04.stdout:6/412: chown d3/de/f6d 10 1 2026-03-10T14:07:46.663 INFO:tasks.workunit.client.0.vm03.stdout:5/41: rename d4/f9 to d4/d6/f10 0 2026-03-10T14:07:46.665 INFO:tasks.workunit.client.1.vm04.stdout:4/503: getdents d4/d14/d3c 0 2026-03-10T14:07:46.667 
INFO:tasks.workunit.client.1.vm04.stdout:2/523: symlink d0/d14/d91/d8/dd/d26/d95/la1 0 2026-03-10T14:07:46.670 INFO:tasks.workunit.client.1.vm04.stdout:0/444: creat d0/d2/d15/d49/d50/f8a x:0 0 0 2026-03-10T14:07:46.675 INFO:tasks.workunit.client.1.vm04.stdout:5/571: mknod d7/d12/d2b/d3e/d3f/cb4 0 2026-03-10T14:07:46.690 INFO:tasks.workunit.client.1.vm04.stdout:6/413: unlink d3/de/d35/d3f/d2d/d32/d23/l42 0 2026-03-10T14:07:46.698 INFO:tasks.workunit.client.1.vm04.stdout:2/524: unlink d0/d14/f79 0 2026-03-10T14:07:46.700 INFO:tasks.workunit.client.1.vm04.stdout:4/504: mkdir d4/d14/dac/db7 0 2026-03-10T14:07:46.701 INFO:tasks.workunit.client.1.vm04.stdout:6/414: mkdir d3/de/d35/d3a/d43/d4c/d5e/d76 0 2026-03-10T14:07:46.701 INFO:tasks.workunit.client.1.vm04.stdout:2/525: rmdir d0/d14/d91/d8/d17/d4e/d85/d86 39 2026-03-10T14:07:46.704 INFO:tasks.workunit.client.1.vm04.stdout:5/572: dread d7/d12/d2b/d3e/d57/d8a/f94 [0,4194304] 0 2026-03-10T14:07:46.705 INFO:tasks.workunit.client.1.vm04.stdout:5/573: fdatasync d7/d26/d6b/d6e/fa3 0 2026-03-10T14:07:46.705 INFO:tasks.workunit.client.1.vm04.stdout:2/526: fsync d0/d14/d91/d4a/f57 0 2026-03-10T14:07:46.707 INFO:tasks.workunit.client.1.vm04.stdout:5/574: unlink d7/d12/d2b/d93/fa4 0 2026-03-10T14:07:46.708 INFO:tasks.workunit.client.1.vm04.stdout:4/505: rmdir d4/df/db2/db4/d47/db1 0 2026-03-10T14:07:46.708 INFO:tasks.workunit.client.1.vm04.stdout:4/506: chown d4/d14/d1b 4797 1 2026-03-10T14:07:46.709 INFO:tasks.workunit.client.1.vm04.stdout:6/415: creat d3/de/d35/d3a/d43/d4c/d5e/d76/f77 x:0 0 0 2026-03-10T14:07:46.710 INFO:tasks.workunit.client.1.vm04.stdout:5/575: mkdir d7/d9/db5 0 2026-03-10T14:07:46.710 INFO:tasks.workunit.client.0.vm03.stdout:8/43: sync 2026-03-10T14:07:46.712 INFO:tasks.workunit.client.1.vm04.stdout:4/507: rename d4/df/d31/f55 to d4/df/db2/db4/fb8 0 2026-03-10T14:07:46.713 INFO:tasks.workunit.client.0.vm03.stdout:8/44: symlink da/lc 0 2026-03-10T14:07:46.718 INFO:tasks.workunit.client.1.vm04.stdout:4/508: 
read d4/d14/d3c/d5e/f79 [180513,95859] 0 2026-03-10T14:07:46.721 INFO:tasks.workunit.client.1.vm04.stdout:5/576: link d7/f4b d7/d59/d7e/fb6 0 2026-03-10T14:07:46.725 INFO:tasks.workunit.client.1.vm04.stdout:5/577: truncate d7/f1d 2687987 0 2026-03-10T14:07:46.736 INFO:tasks.workunit.client.1.vm04.stdout:4/509: rename d4/d14/d6d/da9 to d4/db9 0 2026-03-10T14:07:46.736 INFO:tasks.workunit.client.1.vm04.stdout:2/527: dread d0/d14/d91/d4a/d66/f72 [4194304,4194304] 0 2026-03-10T14:07:46.736 INFO:tasks.workunit.client.1.vm04.stdout:2/528: chown d0/d14/d91/d4a/d66/f7d 122779477 1 2026-03-10T14:07:46.736 INFO:tasks.workunit.client.1.vm04.stdout:6/416: dread d3/de/d35/d3f/d2d/f2e [0,4194304] 0 2026-03-10T14:07:46.736 INFO:tasks.workunit.client.1.vm04.stdout:5/578: mkdir d7/db7 0 2026-03-10T14:07:46.748 INFO:tasks.workunit.client.1.vm04.stdout:9/441: dread d9/d44/d70/f71 [0,4194304] 0 2026-03-10T14:07:46.749 INFO:tasks.workunit.client.1.vm04.stdout:5/579: dread d7/d26/d6b/d6e/f81 [0,4194304] 0 2026-03-10T14:07:46.750 INFO:tasks.workunit.client.1.vm04.stdout:5/580: chown d7/d12/d2b/d3e/d57/d77/da5 19726 1 2026-03-10T14:07:46.756 INFO:tasks.workunit.client.1.vm04.stdout:4/510: getdents d4/d14 0 2026-03-10T14:07:46.759 INFO:tasks.workunit.client.1.vm04.stdout:3/539: sync 2026-03-10T14:07:46.761 INFO:tasks.workunit.client.1.vm04.stdout:9/442: dread d9/da/f41 [0,4194304] 0 2026-03-10T14:07:46.762 INFO:tasks.workunit.client.1.vm04.stdout:9/443: readlink d9/da/l12 0 2026-03-10T14:07:46.762 INFO:tasks.workunit.client.0.vm03.stdout:6/12: dwrite f0 [0,4194304] 0 2026-03-10T14:07:46.762 INFO:tasks.workunit.client.0.vm03.stdout:6/13: rmdir - no directory 2026-03-10T14:07:46.762 INFO:tasks.workunit.client.1.vm04.stdout:5/581: mkdir d7/d2d/d69/db8 0 2026-03-10T14:07:46.765 INFO:tasks.workunit.client.0.vm03.stdout:6/14: dread f2 [0,4194304] 0 2026-03-10T14:07:46.765 INFO:tasks.workunit.client.0.vm03.stdout:6/15: write f0 [3338041,6746] 0 2026-03-10T14:07:46.766 
INFO:tasks.workunit.client.0.vm03.stdout:6/16: creat f5 x:0 0 0 2026-03-10T14:07:46.767 INFO:tasks.workunit.client.0.vm03.stdout:6/17: mknod c6 0 2026-03-10T14:07:46.767 INFO:tasks.workunit.client.1.vm04.stdout:7/513: write d2/dc/de/d2d/d60/d7c/f97 [646304,29794] 0 2026-03-10T14:07:46.768 INFO:tasks.workunit.client.0.vm03.stdout:6/18: mknod c7 0 2026-03-10T14:07:46.769 INFO:tasks.workunit.client.0.vm03.stdout:6/19: write f5 [790512,30222] 0 2026-03-10T14:07:46.774 INFO:tasks.workunit.client.1.vm04.stdout:0/445: rmdir d0/d2/d15/d22 39 2026-03-10T14:07:46.775 INFO:tasks.workunit.client.1.vm04.stdout:8/544: truncate d0/f42 3030444 0 2026-03-10T14:07:46.778 INFO:tasks.workunit.client.1.vm04.stdout:1/486: write d3/d20/d60/d82/d13/d1a/f28 [3053109,43920] 0 2026-03-10T14:07:46.783 INFO:tasks.workunit.client.1.vm04.stdout:8/545: dwrite d0/d3/dd/fa7 [0,4194304] 0 2026-03-10T14:07:46.799 INFO:tasks.workunit.client.1.vm04.stdout:4/511: chown d4/df/db2/db4/d47/d70 399836643 1 2026-03-10T14:07:46.806 INFO:tasks.workunit.client.1.vm04.stdout:3/540: unlink da/dc/d35/d52/da3/dac/fae 0 2026-03-10T14:07:46.811 INFO:tasks.workunit.client.1.vm04.stdout:3/541: truncate da/dc/d35/d37/f5e 4914136 0 2026-03-10T14:07:46.811 INFO:tasks.workunit.client.1.vm04.stdout:3/542: chown da/fb 82 1 2026-03-10T14:07:46.811 INFO:tasks.workunit.client.1.vm04.stdout:5/582: creat d7/d59/d7d/fb9 x:0 0 0 2026-03-10T14:07:46.811 INFO:tasks.workunit.client.1.vm04.stdout:0/446: write d0/d2/d15/f59 [1752442,69687] 0 2026-03-10T14:07:46.817 INFO:tasks.workunit.client.1.vm04.stdout:8/546: symlink d0/d3/d63/d29/la8 0 2026-03-10T14:07:46.818 INFO:tasks.workunit.client.1.vm04.stdout:8/547: stat d0/l4e 0 2026-03-10T14:07:46.822 INFO:tasks.workunit.client.1.vm04.stdout:8/548: dwrite d0/d75/d8a/f9e [0,4194304] 0 2026-03-10T14:07:46.825 INFO:tasks.workunit.client.1.vm04.stdout:4/512: dread - d4/df/db2/db4/d47/d70/d74/fb5 zero size 2026-03-10T14:07:46.826 INFO:tasks.workunit.client.1.vm04.stdout:4/513: write d4/f5f 
[4318635,44087] 0 2026-03-10T14:07:46.826 INFO:tasks.workunit.client.1.vm04.stdout:8/549: truncate d0/d3/dd/d78/f8d 275451 0 2026-03-10T14:07:46.837 INFO:tasks.workunit.client.1.vm04.stdout:9/444: rename d9/da/f41 to d9/da/d5d/f9b 0 2026-03-10T14:07:46.848 INFO:tasks.workunit.client.1.vm04.stdout:7/514: symlink d2/lc4 0 2026-03-10T14:07:46.863 INFO:tasks.workunit.client.1.vm04.stdout:0/447: dwrite d0/d2/fa [0,4194304] 0 2026-03-10T14:07:46.890 INFO:tasks.workunit.client.1.vm04.stdout:8/550: rename d0/f1b to d0/d3/dd/d89/fa9 0 2026-03-10T14:07:46.900 INFO:tasks.workunit.client.1.vm04.stdout:7/515: creat d2/dc/de/d21/fc5 x:0 0 0 2026-03-10T14:07:46.906 INFO:tasks.workunit.client.1.vm04.stdout:9/445: dread d9/da/dd/d1c/f2e [0,4194304] 0 2026-03-10T14:07:46.908 INFO:tasks.workunit.client.0.vm03.stdout:7/50: dwrite d5/fb [4194304,4194304] 0 2026-03-10T14:07:46.908 INFO:tasks.workunit.client.1.vm04.stdout:5/583: mknod d7/d12/d2b/d3e/cba 0 2026-03-10T14:07:46.909 INFO:tasks.workunit.client.0.vm03.stdout:7/51: mknod d5/d9/ce 0 2026-03-10T14:07:46.910 INFO:tasks.workunit.client.0.vm03.stdout:7/52: stat d5/f6 0 2026-03-10T14:07:46.910 INFO:tasks.workunit.client.1.vm04.stdout:0/448: creat d0/d6e/f8b x:0 0 0 2026-03-10T14:07:46.911 INFO:tasks.workunit.client.0.vm03.stdout:7/53: unlink d5/ca 0 2026-03-10T14:07:46.929 INFO:tasks.workunit.client.1.vm04.stdout:7/516: mkdir d2/dc/de/d21/dc6 0 2026-03-10T14:07:46.939 INFO:tasks.workunit.client.1.vm04.stdout:0/449: readlink d0/d2/d15/d22/d38/l60 0 2026-03-10T14:07:46.942 INFO:tasks.workunit.client.0.vm03.stdout:7/54: sync 2026-03-10T14:07:46.942 INFO:tasks.workunit.client.1.vm04.stdout:4/514: creat d4/fba x:0 0 0 2026-03-10T14:07:46.943 INFO:tasks.workunit.client.0.vm03.stdout:7/55: chown d5/fb 1917274 1 2026-03-10T14:07:46.947 INFO:tasks.workunit.client.1.vm04.stdout:2/529: write d0/d14/d91/d8/d17/d35/f94 [2269602,30438] 0 2026-03-10T14:07:46.951 INFO:tasks.workunit.client.0.vm03.stdout:7/56: sync 2026-03-10T14:07:46.951 
INFO:tasks.workunit.client.1.vm04.stdout:6/417: dwrite d3/f57 [0,4194304] 0 2026-03-10T14:07:46.952 INFO:tasks.workunit.client.0.vm03.stdout:7/57: write d5/f6 [1637430,12641] 0 2026-03-10T14:07:46.953 INFO:tasks.workunit.client.1.vm04.stdout:6/418: read d3/de/d35/d3a/d43/d4c/f4d [2538069,101426] 0 2026-03-10T14:07:46.954 INFO:tasks.workunit.client.0.vm03.stdout:7/58: rmdir d5/d9/dc 0 2026-03-10T14:07:46.955 INFO:tasks.workunit.client.0.vm03.stdout:7/59: write d5/fb [7291103,113175] 0 2026-03-10T14:07:46.956 INFO:tasks.workunit.client.0.vm03.stdout:7/60: chown d5 66637 1 2026-03-10T14:07:46.957 INFO:tasks.workunit.client.0.vm03.stdout:7/61: creat d5/d9/dd/ff x:0 0 0 2026-03-10T14:07:46.958 INFO:tasks.workunit.client.0.vm03.stdout:7/62: read d5/f7 [3013983,99592] 0 2026-03-10T14:07:46.960 INFO:tasks.workunit.client.0.vm03.stdout:7/63: dread d5/f7 [0,4194304] 0 2026-03-10T14:07:46.963 INFO:tasks.workunit.client.0.vm03.stdout:7/64: dread d5/f6 [0,4194304] 0 2026-03-10T14:07:46.965 INFO:tasks.workunit.client.1.vm04.stdout:0/450: symlink d0/d2/d15/d49/d50/d61/l8c 0 2026-03-10T14:07:46.965 INFO:tasks.workunit.client.1.vm04.stdout:0/451: readlink d0/d2/d15/d22/l69 0 2026-03-10T14:07:46.967 INFO:tasks.workunit.client.0.vm03.stdout:7/65: creat d5/d9/f10 x:0 0 0 2026-03-10T14:07:46.968 INFO:tasks.workunit.client.0.vm03.stdout:7/66: read d5/f6 [2994061,82961] 0 2026-03-10T14:07:46.968 INFO:tasks.workunit.client.1.vm04.stdout:8/551: rename d0/d3/d63/d12/l31 to d0/laa 0 2026-03-10T14:07:46.968 INFO:tasks.workunit.client.0.vm03.stdout:7/67: dread - d5/d9/f10 zero size 2026-03-10T14:07:46.977 INFO:tasks.workunit.client.0.vm03.stdout:7/68: dwrite d5/f7 [4194304,4194304] 0 2026-03-10T14:07:46.977 INFO:tasks.workunit.client.0.vm03.stdout:7/69: rename d5 to d5/d11 22 2026-03-10T14:07:46.977 INFO:tasks.workunit.client.1.vm04.stdout:0/452: write d0/d2/f9 [1304515,65100] 0 2026-03-10T14:07:46.978 INFO:tasks.workunit.client.1.vm04.stdout:1/487: truncate d3/d22/d63/d35/f9a 373551 0 
2026-03-10T14:07:46.979 INFO:tasks.workunit.client.0.vm03.stdout:7/70: write d5/f6 [4363516,88549] 0 2026-03-10T14:07:46.982 INFO:tasks.workunit.client.1.vm04.stdout:7/517: creat d2/fc7 x:0 0 0 2026-03-10T14:07:46.984 INFO:tasks.workunit.client.1.vm04.stdout:7/518: chown d2/dc/de/d11/c30 25668 1 2026-03-10T14:07:46.989 INFO:tasks.workunit.client.1.vm04.stdout:0/453: rename d0/d2/d25/f45 to d0/d2/d15/d49/d50/d61/f8d 0 2026-03-10T14:07:46.999 INFO:tasks.workunit.client.1.vm04.stdout:2/530: dread d0/d14/d91/d8/f6 [0,4194304] 0 2026-03-10T14:07:47.000 INFO:tasks.workunit.client.1.vm04.stdout:2/531: stat d0/l9d 0 2026-03-10T14:07:47.000 INFO:tasks.workunit.client.1.vm04.stdout:2/532: stat d0/d14/f6b 0 2026-03-10T14:07:47.000 INFO:tasks.workunit.client.1.vm04.stdout:1/488: mknod d3/d20/d60/d82/d13/da0/cab 0 2026-03-10T14:07:47.002 INFO:tasks.workunit.client.1.vm04.stdout:7/519: rename d2/d2a/d42/d56 to d2/dc/de/d2d/d38/d50/dc8 0 2026-03-10T14:07:47.005 INFO:tasks.workunit.client.1.vm04.stdout:0/454: symlink d0/d2/d15/d49/d50/l8e 0 2026-03-10T14:07:47.008 INFO:tasks.workunit.client.1.vm04.stdout:8/552: getdents d0/d3/d5 0 2026-03-10T14:07:47.008 INFO:tasks.workunit.client.1.vm04.stdout:8/553: write d0/d3/d5/f70 [3473325,114630] 0 2026-03-10T14:07:47.016 INFO:tasks.workunit.client.1.vm04.stdout:0/455: symlink d0/d2/d15/d22/d62/l8f 0 2026-03-10T14:07:47.016 INFO:tasks.workunit.client.1.vm04.stdout:0/456: write d0/d2/f12 [1416277,17890] 0 2026-03-10T14:07:47.018 INFO:tasks.workunit.client.1.vm04.stdout:8/554: creat d0/d3/dd/d78/fab x:0 0 0 2026-03-10T14:07:47.024 INFO:tasks.workunit.client.1.vm04.stdout:0/457: read d0/d2/d25/f3f [128512,65153] 0 2026-03-10T14:07:47.025 INFO:tasks.workunit.client.1.vm04.stdout:0/458: chown d0/d2/d15/f2f 4634296 1 2026-03-10T14:07:47.026 INFO:tasks.workunit.client.1.vm04.stdout:2/533: link d0/d14/d1b/f8a d0/d14/d39/fa2 0 2026-03-10T14:07:47.034 INFO:tasks.workunit.client.1.vm04.stdout:8/555: getdents d0/d82 0 2026-03-10T14:07:47.036 
INFO:tasks.workunit.client.1.vm04.stdout:2/534: rename d0/d14/d91/d8/l41 to d0/d14/d91/d8/d17/la3 0 2026-03-10T14:07:47.037 INFO:tasks.workunit.client.1.vm04.stdout:2/535: chown d0/d14/d91/d8/dd/d26/d46/d4b/d56 285 1 2026-03-10T14:07:47.040 INFO:tasks.workunit.client.1.vm04.stdout:8/556: rename d0/d3/dd/l99 to d0/d3/dd/d76/lac 0 2026-03-10T14:07:47.043 INFO:tasks.workunit.client.1.vm04.stdout:2/536: rename d0/d14/d91/d8/dd/d26/la0 to d0/d14/d91/d4a/d8c/la4 0 2026-03-10T14:07:47.044 INFO:tasks.workunit.client.1.vm04.stdout:2/537: stat d0/d14/d91/d8/l62 0 2026-03-10T14:07:47.044 INFO:tasks.workunit.client.1.vm04.stdout:2/538: chown d0/d14/d91/d8/l18 209797 1 2026-03-10T14:07:47.045 INFO:tasks.workunit.client.1.vm04.stdout:2/539: write d0/d14/d1b/f55 [1218735,64417] 0 2026-03-10T14:07:47.048 INFO:tasks.workunit.client.1.vm04.stdout:8/557: creat d0/d3/d63/d12/d51/d67/d96/fad x:0 0 0 2026-03-10T14:07:47.050 INFO:tasks.workunit.client.1.vm04.stdout:2/540: write d0/d14/d91/d8/f30 [1911711,126346] 0 2026-03-10T14:07:47.050 INFO:tasks.workunit.client.1.vm04.stdout:8/558: fdatasync d0/d3/d63/d12/f2c 0 2026-03-10T14:07:47.050 INFO:tasks.workunit.client.1.vm04.stdout:6/419: sync 2026-03-10T14:07:47.051 INFO:tasks.workunit.client.1.vm04.stdout:7/520: sync 2026-03-10T14:07:47.053 INFO:tasks.workunit.client.0.vm03.stdout:4/23: dread f2 [0,4194304] 0 2026-03-10T14:07:47.053 INFO:tasks.workunit.client.0.vm03.stdout:4/24: fdatasync f3 0 2026-03-10T14:07:47.055 INFO:tasks.workunit.client.0.vm03.stdout:4/25: creat f4 x:0 0 0 2026-03-10T14:07:47.057 INFO:tasks.workunit.client.0.vm03.stdout:4/26: mkdir d5 0 2026-03-10T14:07:47.060 INFO:tasks.workunit.client.0.vm03.stdout:4/27: symlink d5/l6 0 2026-03-10T14:07:47.061 INFO:tasks.workunit.client.1.vm04.stdout:3/543: dwrite da/dc/f90 [0,4194304] 0 2026-03-10T14:07:47.062 INFO:tasks.workunit.client.0.vm03.stdout:4/28: creat d5/f7 x:0 0 0 2026-03-10T14:07:47.062 INFO:tasks.workunit.client.0.vm03.stdout:4/29: dread - f4 zero size 
2026-03-10T14:07:47.065 INFO:tasks.workunit.client.0.vm03.stdout:4/30: creat d5/f8 x:0 0 0 2026-03-10T14:07:47.071 INFO:tasks.workunit.client.1.vm04.stdout:2/541: mkdir d0/d14/d1b/d45/da5 0 2026-03-10T14:07:47.075 INFO:tasks.workunit.client.1.vm04.stdout:8/559: creat d0/d3/dd/d78/fae x:0 0 0 2026-03-10T14:07:47.087 INFO:tasks.workunit.client.1.vm04.stdout:8/560: truncate d0/d3/d63/d12/d69/f9d 176175 0 2026-03-10T14:07:47.088 INFO:tasks.workunit.client.1.vm04.stdout:9/446: dwrite d9/da/dd/d74/f75 [0,4194304] 0 2026-03-10T14:07:47.093 INFO:tasks.workunit.client.1.vm04.stdout:9/447: chown d9/da/c89 23 1 2026-03-10T14:07:47.093 INFO:tasks.workunit.client.1.vm04.stdout:2/542: creat d0/d14/d91/d4a/fa6 x:0 0 0 2026-03-10T14:07:47.093 INFO:tasks.workunit.client.1.vm04.stdout:9/448: chown d9/f4a 6200722 1 2026-03-10T14:07:47.102 INFO:tasks.workunit.client.1.vm04.stdout:9/449: getdents d9/d33 0 2026-03-10T14:07:47.121 INFO:tasks.workunit.client.1.vm04.stdout:9/450: read - d9/d1d/f67 zero size 2026-03-10T14:07:47.121 INFO:tasks.workunit.client.1.vm04.stdout:9/451: write d9/da/dd/f48 [4102961,96452] 0 2026-03-10T14:07:47.124 INFO:tasks.workunit.client.1.vm04.stdout:8/561: sync 2026-03-10T14:07:47.143 INFO:tasks.workunit.client.0.vm03.stdout:1/30: rmdir d0 39 2026-03-10T14:07:47.144 INFO:tasks.workunit.client.0.vm03.stdout:1/31: dread - d0/d2/f7 zero size 2026-03-10T14:07:47.145 INFO:tasks.workunit.client.0.vm03.stdout:1/32: creat d0/fc x:0 0 0 2026-03-10T14:07:47.146 INFO:tasks.workunit.client.0.vm03.stdout:1/33: dread - d0/d2/f7 zero size 2026-03-10T14:07:47.146 INFO:tasks.workunit.client.0.vm03.stdout:1/34: fdatasync d0/fa 0 2026-03-10T14:07:47.152 INFO:tasks.workunit.client.0.vm03.stdout:1/35: dwrite d0/fc [0,4194304] 0 2026-03-10T14:07:47.155 INFO:tasks.workunit.client.0.vm03.stdout:1/36: mknod d0/d2/cd 0 2026-03-10T14:07:47.161 INFO:tasks.workunit.client.1.vm04.stdout:5/584: write d7/f3c [174724,42806] 0 2026-03-10T14:07:47.166 
INFO:tasks.workunit.client.1.vm04.stdout:5/585: mknod d7/d12/d2b/cbb 0
2026-03-10T14:07:47.178 INFO:tasks.workunit.client.1.vm04.stdout:5/586: read d7/d9/f20 [206952,124248] 0
2026-03-10T14:07:47.178 INFO:tasks.workunit.client.1.vm04.stdout:4/515: write d4/df/db2/db4/f2d [1895615,96071] 0
2026-03-10T14:07:47.188 INFO:tasks.workunit.client.1.vm04.stdout:5/587: creat d7/d26/fbc x:0 0 0
2026-03-10T14:07:47.217 INFO:tasks.workunit.client.1.vm04.stdout:1/489: dwrite d3/d20/f27 [4194304,4194304] 0
2026-03-10T14:07:47.246 INFO:tasks.workunit.client.1.vm04.stdout:1/490: creat d3/d22/d2f/d57/fac x:0 0 0
2026-03-10T14:07:47.252 INFO:tasks.workunit.client.1.vm04.stdout:2/543: getdents d0/d14/d91/d8/dd/d26 0
2026-03-10T14:07:47.252 INFO:tasks.workunit.client.1.vm04.stdout:0/459: dwrite d0/d2/d15/d22/d38/d56/f79 [0,4194304] 0
2026-03-10T14:07:47.256 INFO:tasks.workunit.client.1.vm04.stdout:1/491: rmdir d3/d20/d60 39
2026-03-10T14:07:47.259 INFO:tasks.workunit.client.1.vm04.stdout:1/492: write d3/d22/d2f/d57/fac [1033540,111928] 0
2026-03-10T14:07:47.262 INFO:tasks.workunit.client.1.vm04.stdout:0/460: fdatasync d0/d2/d15/d22/d38/d56/f5e 0
2026-03-10T14:07:47.263 INFO:tasks.workunit.client.1.vm04.stdout:3/544: write da/d30/f4e [2721100,7215] 0
2026-03-10T14:07:47.265 INFO:tasks.workunit.client.1.vm04.stdout:6/420: dwrite d3/ff [0,4194304] 0
2026-03-10T14:07:47.267 INFO:tasks.workunit.client.1.vm04.stdout:7/521: dwrite d2/d94/f7e [0,4194304] 0
2026-03-10T14:07:47.272 INFO:tasks.workunit.client.1.vm04.stdout:6/421: dread - d3/de/d35/d3a/d43/d4c/d5e/d76/f77 zero size
2026-03-10T14:07:47.273 INFO:tasks.workunit.client.1.vm04.stdout:0/461: chown d0/d2/d25/f2a 58769 1
2026-03-10T14:07:47.283 INFO:tasks.workunit.client.1.vm04.stdout:0/462: truncate d0/d2/d15/d22/f88 1040749 0
2026-03-10T14:07:47.286 INFO:tasks.workunit.client.1.vm04.stdout:0/463: stat d0/d2/d15/d22/d38/f5b 0
2026-03-10T14:07:47.287 INFO:tasks.workunit.client.1.vm04.stdout:7/522: mknod d2/dc/de/d2d/d60/d7c/d36/daa/cc9 0
2026-03-10T14:07:47.288 INFO:tasks.workunit.client.1.vm04.stdout:7/523: write d2/dc/de/d2d/d38/f41 [5750337,42093] 0
2026-03-10T14:07:47.300 INFO:tasks.workunit.client.1.vm04.stdout:3/545: mknod da/dc/d3f/d61/cbf 0
2026-03-10T14:07:47.304 INFO:tasks.workunit.client.1.vm04.stdout:3/546: read da/f19 [2017213,16083] 0
2026-03-10T14:07:47.308 INFO:tasks.workunit.client.1.vm04.stdout:2/544: rename d0/d14/d39/d47/d70/l83 to d0/d14/d91/d8/d17/d4e/d85/d86/la7 0
2026-03-10T14:07:47.311 INFO:tasks.workunit.client.1.vm04.stdout:1/493: mknod d3/d20/d60/d82/cad 0
2026-03-10T14:07:47.311 INFO:tasks.workunit.client.1.vm04.stdout:1/494: chown d3/d22/d2f/f39 38341250 1
2026-03-10T14:07:47.331 INFO:tasks.workunit.client.1.vm04.stdout:3/547: rmdir da/dc/d3f/d54/d66 39
2026-03-10T14:07:47.332 INFO:tasks.workunit.client.1.vm04.stdout:1/495: symlink d3/d5c/d79/lae 0
2026-03-10T14:07:47.341 INFO:tasks.workunit.client.1.vm04.stdout:6/422: creat d3/f78 x:0 0 0
2026-03-10T14:07:47.349 INFO:tasks.workunit.client.1.vm04.stdout:0/464: dread d0/d2/d15/d22/d38/d56/d66/f2e [0,4194304] 0
2026-03-10T14:07:47.352 INFO:tasks.workunit.client.1.vm04.stdout:0/465: read - d0/d2/d15/d49/d50/d5c/f76 zero size
2026-03-10T14:07:47.357 INFO:tasks.workunit.client.1.vm04.stdout:0/466: chown d0/d2/d15/d22/d38/f71 7553516 1
2026-03-10T14:07:47.358 INFO:tasks.workunit.client.1.vm04.stdout:7/524: rename d2/c3a to d2/dc/de/dae/cca 0
2026-03-10T14:07:47.359 INFO:tasks.workunit.client.1.vm04.stdout:7/525: write d2/dc/de/f98 [73268,102435] 0
2026-03-10T14:07:47.359 INFO:tasks.workunit.client.1.vm04.stdout:7/526: read - d2/d2a/f92 zero size
2026-03-10T14:07:47.366 INFO:tasks.workunit.client.1.vm04.stdout:6/423: truncate d3/f9 3712180 0
2026-03-10T14:07:47.375 INFO:tasks.workunit.client.1.vm04.stdout:0/467: truncate d0/d2/d15/d22/d38/f5b 720495 0
2026-03-10T14:07:47.388 INFO:tasks.workunit.client.1.vm04.stdout:7/527: unlink d2/lc4 0
2026-03-10T14:07:47.388 INFO:tasks.workunit.client.1.vm04.stdout:7/528: write d2/dc/de/f1e [2844218,39701] 0
2026-03-10T14:07:47.388 INFO:tasks.workunit.client.1.vm04.stdout:7/529: creat d2/dc/de/d21/dc6/fcb x:0 0 0
2026-03-10T14:07:47.388 INFO:tasks.workunit.client.1.vm04.stdout:7/530: mknod d2/d2a/ccc 0
2026-03-10T14:07:47.394 INFO:tasks.workunit.client.1.vm04.stdout:1/496: sync
2026-03-10T14:07:47.398 INFO:tasks.workunit.client.1.vm04.stdout:0/468: dread d0/d2/fd [0,4194304] 0
2026-03-10T14:07:47.408 INFO:tasks.workunit.client.1.vm04.stdout:9/452: dwrite d9/d1d/f23 [0,4194304] 0
2026-03-10T14:07:47.408 INFO:tasks.workunit.client.0.vm03.stdout:3/39: getdents . 0
2026-03-10T14:07:47.409 INFO:tasks.workunit.client.0.vm03.stdout:3/40: write - no filename
2026-03-10T14:07:47.409 INFO:tasks.workunit.client.0.vm03.stdout:3/41: rmdir - no directory
2026-03-10T14:07:47.410 INFO:tasks.workunit.client.0.vm03.stdout:3/42: creat fa x:0 0 0
2026-03-10T14:07:47.410 INFO:tasks.workunit.client.0.vm03.stdout:3/43: chown c6 2 1
2026-03-10T14:07:47.411 INFO:tasks.workunit.client.1.vm04.stdout:0/469: mkdir d0/d2/d90 0
2026-03-10T14:07:47.411 INFO:tasks.workunit.client.0.vm03.stdout:3/44: read - fa zero size
2026-03-10T14:07:47.415 INFO:tasks.workunit.client.0.vm03.stdout:2/28: rmdir d5 39
2026-03-10T14:07:47.415 INFO:tasks.workunit.client.1.vm04.stdout:8/562: dwrite d0/f23 [0,4194304] 0
2026-03-10T14:07:47.416 INFO:tasks.workunit.client.1.vm04.stdout:8/563: dread - d0/d75/fa2 zero size
2026-03-10T14:07:47.417 INFO:tasks.workunit.client.0.vm03.stdout:2/29: rmdir d5 39
2026-03-10T14:07:47.422 INFO:tasks.workunit.client.0.vm03.stdout:2/30: dread - d5/f9 zero size
2026-03-10T14:07:47.423 INFO:tasks.workunit.client.0.vm03.stdout:9/26: write d2/f3 [2620789,46812] 0
2026-03-10T14:07:47.424 INFO:tasks.workunit.client.0.vm03.stdout:2/31: readlink l2 0
2026-03-10T14:07:47.425 INFO:tasks.workunit.client.1.vm04.stdout:8/564: rmdir d0/d3/d63/d12/d51 39
2026-03-10T14:07:47.425 INFO:tasks.workunit.client.1.vm04.stdout:0/470: write d0/d2/d15/d22/d38/d56/f67 [1729408,99099] 0
2026-03-10T14:07:47.426 INFO:tasks.workunit.client.1.vm04.stdout:8/565: write d0/d3/dd/d76/f92 [1186373,50330] 0
2026-03-10T14:07:47.432 INFO:tasks.workunit.client.0.vm03.stdout:2/32: creat d5/fa x:0 0 0
2026-03-10T14:07:47.435 INFO:tasks.workunit.client.0.vm03.stdout:2/33: dwrite f4 [0,4194304] 0
2026-03-10T14:07:47.440 INFO:tasks.workunit.client.0.vm03.stdout:2/34: read f4 [1713229,48864] 0
2026-03-10T14:07:47.446 INFO:tasks.workunit.client.1.vm04.stdout:0/471: rmdir d0/d6e 39
2026-03-10T14:07:47.446 INFO:tasks.workunit.client.0.vm03.stdout:9/27: link d2/c9 d2/ca 0
2026-03-10T14:07:47.446 INFO:tasks.workunit.client.1.vm04.stdout:8/566: mknod d0/d75/caf 0
2026-03-10T14:07:47.448 INFO:tasks.workunit.client.0.vm03.stdout:2/35: creat d5/fb x:0 0 0
2026-03-10T14:07:47.448 INFO:tasks.workunit.client.0.vm03.stdout:2/36: readlink d5/l8 0
2026-03-10T14:07:47.450 INFO:tasks.workunit.client.0.vm03.stdout:9/28: mknod d2/cb 0
2026-03-10T14:07:47.452 INFO:tasks.workunit.client.1.vm04.stdout:4/516: dwrite d4/df/d34/f95 [0,4194304] 0
2026-03-10T14:07:47.453 INFO:tasks.workunit.client.0.vm03.stdout:9/29: creat d2/fc x:0 0 0
2026-03-10T14:07:47.454 INFO:tasks.workunit.client.0.vm03.stdout:9/30: unlink d2/l7 0
2026-03-10T14:07:47.460 INFO:tasks.workunit.client.1.vm04.stdout:4/517: unlink d4/df/db2/db4/d47/d70/d74/fb5 0
2026-03-10T14:07:47.460 INFO:tasks.workunit.client.0.vm03.stdout:9/31: symlink d2/ld 0
2026-03-10T14:07:47.462 INFO:tasks.workunit.client.0.vm03.stdout:9/32: mknod d2/ce 0
2026-03-10T14:07:47.462 INFO:tasks.workunit.client.0.vm03.stdout:9/33: chown d2/c5 0 1
2026-03-10T14:07:47.463 INFO:tasks.workunit.client.1.vm04.stdout:4/518: write d4/d14/d64/fab [987337,84003] 0
2026-03-10T14:07:47.465 INFO:tasks.workunit.client.0.vm03.stdout:0/66: rmdir d3 39
2026-03-10T14:07:47.465 INFO:tasks.workunit.client.1.vm04.stdout:4/519: rmdir d4/df/d34/d6f 39
2026-03-10T14:07:47.466 INFO:tasks.workunit.client.0.vm03.stdout:9/34: unlink d2/cb 0
2026-03-10T14:07:47.466 INFO:tasks.workunit.client.0.vm03.stdout:9/35: write d2/fc [234619,12196] 0
2026-03-10T14:07:47.467 INFO:tasks.workunit.client.0.vm03.stdout:9/36: chown d2/c6 0 1
2026-03-10T14:07:47.467 INFO:tasks.workunit.client.0.vm03.stdout:9/37: stat c0 0
2026-03-10T14:07:47.471 INFO:tasks.workunit.client.0.vm03.stdout:9/38: creat d2/ff x:0 0 0
2026-03-10T14:07:47.488 INFO:tasks.workunit.client.1.vm04.stdout:4/520: getdents d4/df/db2/db4 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.1.vm04.stdout:4/521: creat d4/d14/dac/db7/fbb x:0 0 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.1.vm04.stdout:4/522: symlink d4/df/d31/lbc 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.1.vm04.stdout:4/523: link d4/df/l80 d4/df/db2/db4/d47/d70/d74/lbd 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.1.vm04.stdout:4/524: read - d4/f77 zero size
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:0/67: creat d3/d11/f13 x:0 0 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:5/42: fsync d4/fd 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:5/43: chown d4/d6/de 2180 1
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:8/45: getdents da 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:6/20: getdents . 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:8/46: rename da/fb to da/fd 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:8/47: dread - f5 zero size
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:8/48: chown f9 13300 1
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:5/44: creat d4/d6/de/f11 x:0 0 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:7/71: fsync d5/d9/f10 0
2026-03-10T14:07:47.489 INFO:tasks.workunit.client.0.vm03.stdout:7/72: write d5/f6 [3078486,113320] 0
2026-03-10T14:07:47.490 INFO:tasks.workunit.client.1.vm04.stdout:4/525: symlink d4/d14/dac/lbe 0
2026-03-10T14:07:47.491 INFO:tasks.workunit.client.0.vm03.stdout:7/73: dread d5/f6 [0,4194304] 0
2026-03-10T14:07:47.491 INFO:tasks.workunit.client.1.vm04.stdout:4/526: dread - d4/d14/d6d/f9c zero size
2026-03-10T14:07:47.491 INFO:tasks.workunit.client.0.vm03.stdout:8/49: creat da/fe x:0 0 0
2026-03-10T14:07:47.492 INFO:tasks.workunit.client.0.vm03.stdout:7/74: write d5/f7 [9134632,84622] 0
2026-03-10T14:07:47.492 INFO:tasks.workunit.client.0.vm03.stdout:5/45: mknod d4/d6/c12 0
2026-03-10T14:07:47.492 INFO:tasks.workunit.client.1.vm04.stdout:4/527: read - d4/d14/d6d/f8a zero size
2026-03-10T14:07:47.493 INFO:tasks.workunit.client.0.vm03.stdout:5/46: write d4/d6/de/f11 [109702,54762] 0
2026-03-10T14:07:47.494 INFO:tasks.workunit.client.0.vm03.stdout:5/47: truncate d4/d6/fa 2508697 0
2026-03-10T14:07:47.499 INFO:tasks.workunit.client.1.vm04.stdout:4/528: creat d4/df/db2/db4/d47/d70/fbf x:0 0 0
2026-03-10T14:07:47.500 INFO:tasks.workunit.client.0.vm03.stdout:8/50: creat da/ff x:0 0 0
2026-03-10T14:07:47.500 INFO:tasks.workunit.client.0.vm03.stdout:4/31: rmdir d5 39
2026-03-10T14:07:47.500 INFO:tasks.workunit.client.0.vm03.stdout:4/32: readlink l1 0
2026-03-10T14:07:47.500 INFO:tasks.workunit.client.0.vm03.stdout:4/33: chown f2 238 1
2026-03-10T14:07:47.501 INFO:tasks.workunit.client.0.vm03.stdout:1/37: truncate d0/d2/f9 3262034 0
2026-03-10T14:07:47.502 INFO:tasks.workunit.client.0.vm03.stdout:7/75: rename c4 to d5/d9/c12 0
2026-03-10T14:07:47.503 INFO:tasks.workunit.client.0.vm03.stdout:7/76: dread - d5/d9/f10 zero size
2026-03-10T14:07:47.503 INFO:tasks.workunit.client.0.vm03.stdout:5/48: write d4/d6/fa [2837467,72878] 0
2026-03-10T14:07:47.504 INFO:tasks.workunit.client.1.vm04.stdout:4/529: dwrite d4/fba [0,4194304] 0
2026-03-10T14:07:47.508 INFO:tasks.workunit.client.0.vm03.stdout:7/77: mknod d5/d9/dd/c13 0
2026-03-10T14:07:47.508 INFO:tasks.workunit.client.0.vm03.stdout:4/34: mkdir d5/d9 0
2026-03-10T14:07:47.508 INFO:tasks.workunit.client.0.vm03.stdout:4/35: readlink d5/l6 0
2026-03-10T14:07:47.509 INFO:tasks.workunit.client.0.vm03.stdout:1/38: chown d0/f4 2506 1
2026-03-10T14:07:47.512 INFO:tasks.workunit.client.0.vm03.stdout:5/49: mkdir d4/d13 0
2026-03-10T14:07:47.524 INFO:tasks.workunit.client.0.vm03.stdout:5/50: creat d4/d6/de/f14 x:0 0 0
2026-03-10T14:07:47.524 INFO:tasks.workunit.client.0.vm03.stdout:4/36: link f3 d5/d9/fa 0
2026-03-10T14:07:47.524 INFO:tasks.workunit.client.0.vm03.stdout:5/51: mknod d4/d6/c15 0
2026-03-10T14:07:47.524 INFO:tasks.workunit.client.0.vm03.stdout:4/37: fdatasync d5/d9/fa 0
2026-03-10T14:07:47.536 INFO:tasks.workunit.client.1.vm04.stdout:5/588: write d7/f4b [468101,104233] 0
2026-03-10T14:07:47.541 INFO:tasks.workunit.client.1.vm04.stdout:5/589: mknod d7/d12/d2b/d3e/d57/d9f/cbd 0
2026-03-10T14:07:47.552 INFO:tasks.workunit.client.1.vm04.stdout:5/590: dwrite d7/d2d/d76/f84 [0,4194304] 0
2026-03-10T14:07:47.554 INFO:tasks.workunit.client.0.vm03.stdout:9/39: sync
2026-03-10T14:07:47.554 INFO:tasks.workunit.client.0.vm03.stdout:6/21: sync
2026-03-10T14:07:47.554 INFO:tasks.workunit.client.0.vm03.stdout:4/38: sync
2026-03-10T14:07:47.554 INFO:tasks.workunit.client.0.vm03.stdout:6/22: stat c7 0
2026-03-10T14:07:47.554 INFO:tasks.workunit.client.0.vm03.stdout:5/52: sync
2026-03-10T14:07:47.556 INFO:tasks.workunit.client.1.vm04.stdout:5/591: symlink d7/d26/d6b/d6e/lbe 0
2026-03-10T14:07:47.557 INFO:tasks.workunit.client.1.vm04.stdout:5/592: chown d7/d26/f54 35 1
2026-03-10T14:07:47.557 INFO:tasks.workunit.client.1.vm04.stdout:5/593: chown d7/d12/d45 3127326 1
2026-03-10T14:07:47.560 INFO:tasks.workunit.client.1.vm04.stdout:5/594: creat d7/d2d/fbf x:0 0 0
2026-03-10T14:07:47.564 INFO:tasks.workunit.client.0.vm03.stdout:9/40: sync
2026-03-10T14:07:47.567 INFO:tasks.workunit.client.1.vm04.stdout:5/595: mkdir d7/d12/d2b/d3e/d3f/dc0 0
2026-03-10T14:07:47.569 INFO:tasks.workunit.client.1.vm04.stdout:5/596: chown d7/d26/l3d 386231641 1
2026-03-10T14:07:47.574 INFO:tasks.workunit.client.0.vm03.stdout:6/23: mkdir d8 0
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:4/39: mkdir d5/d9/db 0
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:9/41: symlink d2/l10 0
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:5/53: mkdir d4/d16 0
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:9/42: dread d2/f3 [0,4194304] 0
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:9/43: stat d2/l10 0
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:4/40: symlink d5/lc 0
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:6/24: symlink d8/l9 0
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:4/41: read - f4 zero size
2026-03-10T14:07:47.584 INFO:tasks.workunit.client.0.vm03.stdout:4/42: dwrite f4 [0,4194304] 0
2026-03-10T14:07:47.588 INFO:tasks.workunit.client.0.vm03.stdout:2/37: fdatasync f4 0
2026-03-10T14:07:47.590 INFO:tasks.workunit.client.0.vm03.stdout:2/38: dread f1 [0,4194304] 0
2026-03-10T14:07:47.591 INFO:tasks.workunit.client.0.vm03.stdout:9/44: creat d2/f11 x:0 0 0
2026-03-10T14:07:47.597 INFO:tasks.workunit.client.0.vm03.stdout:6/25: dwrite f2 [0,4194304] 0
2026-03-10T14:07:47.598 INFO:tasks.workunit.client.0.vm03.stdout:6/26: fdatasync f0 0
2026-03-10T14:07:47.600 INFO:tasks.workunit.client.0.vm03.stdout:4/43: creat d5/fd x:0 0 0
2026-03-10T14:07:47.614 INFO:tasks.workunit.client.0.vm03.stdout:2/39: unlink l2 0
2026-03-10T14:07:47.622 INFO:tasks.workunit.client.0.vm03.stdout:5/54: rename d4/d6/de/f11 to d4/f17 0
2026-03-10T14:07:47.624 INFO:tasks.workunit.client.0.vm03.stdout:2/40: rename d5/l6 to d5/lc 0
2026-03-10T14:07:47.627 INFO:tasks.workunit.client.0.vm03.stdout:5/55: mkdir d4/d6/d18 0
2026-03-10T14:07:47.628 INFO:tasks.workunit.client.0.vm03.stdout:6/27: sync
2026-03-10T14:07:47.633 INFO:tasks.workunit.client.0.vm03.stdout:4/44: rename f4 to d5/fe 0
2026-03-10T14:07:47.637 INFO:tasks.workunit.client.0.vm03.stdout:6/28: mknod d8/ca 0
2026-03-10T14:07:47.637 INFO:tasks.workunit.client.0.vm03.stdout:6/29: truncate f5 970741 0
2026-03-10T14:07:47.641 INFO:tasks.workunit.client.0.vm03.stdout:5/56: rename d4/d6/d18 to d4/d16/d19 0
2026-03-10T14:07:47.643 INFO:tasks.workunit.client.0.vm03.stdout:5/57: dread d4/d6/f10 [0,4194304] 0
2026-03-10T14:07:47.655 INFO:tasks.workunit.client.1.vm04.stdout:2/545: write d0/d14/d91/d8/d17/d4e/d85/f89 [720629,76295] 0
2026-03-10T14:07:47.656 INFO:tasks.workunit.client.0.vm03.stdout:2/41: rename d5/l8 to d5/ld 0
2026-03-10T14:07:47.658 INFO:tasks.workunit.client.0.vm03.stdout:6/30: rmdir d8 39
2026-03-10T14:07:47.662 INFO:tasks.workunit.client.0.vm03.stdout:4/45: creat d5/d9/db/ff x:0 0 0
2026-03-10T14:07:47.663 INFO:tasks.workunit.client.0.vm03.stdout:5/58: mknod d4/d13/c1a 0
2026-03-10T14:07:47.663 INFO:tasks.workunit.client.0.vm03.stdout:2/42: sync
2026-03-10T14:07:47.663 INFO:tasks.workunit.client.1.vm04.stdout:3/548: truncate da/dc/d35/d52/f79 867009 0
2026-03-10T14:07:47.668 INFO:tasks.workunit.client.1.vm04.stdout:0/472: dread d0/d2/d15/d22/d38/f5b [0,4194304] 0
2026-03-10T14:07:47.672 INFO:tasks.workunit.client.0.vm03.stdout:3/45: creat fb x:0 0 0
2026-03-10T14:07:47.672 INFO:tasks.workunit.client.0.vm03.stdout:3/46: write fa [1029477,23187] 0
2026-03-10T14:07:47.673 INFO:tasks.workunit.client.0.vm03.stdout:4/46: unlink l1 0
2026-03-10T14:07:47.674 INFO:tasks.workunit.client.1.vm04.stdout:0/473: dread d0/d2/d15/d22/d38/d56/f79 [0,4194304] 0
2026-03-10T14:07:47.677 INFO:tasks.workunit.client.0.vm03.stdout:5/59: creat d4/d16/f1b x:0 0 0
2026-03-10T14:07:47.678 INFO:tasks.workunit.client.0.vm03.stdout:2/43: symlink d5/le 0
2026-03-10T14:07:47.678 INFO:tasks.workunit.client.0.vm03.stdout:2/44: truncate d5/fb 308323 0
2026-03-10T14:07:47.678 INFO:tasks.workunit.client.0.vm03.stdout:2/45: stat f4 0
2026-03-10T14:07:47.679 INFO:tasks.workunit.client.0.vm03.stdout:2/46: fdatasync f4 0
2026-03-10T14:07:47.679 INFO:tasks.workunit.client.1.vm04.stdout:6/424: dwrite d3/de/d35/d3f/d2d/d32/d23/d4e/f69 [0,4194304] 0
2026-03-10T14:07:47.680 INFO:tasks.workunit.client.1.vm04.stdout:7/531: rename d2/dc/de/d21 to d2/dc/d4d/dcd 0
2026-03-10T14:07:47.682 INFO:tasks.workunit.client.1.vm04.stdout:6/425: fdatasync d3/f57 0
2026-03-10T14:07:47.683 INFO:tasks.workunit.client.0.vm03.stdout:4/47: creat d5/d9/db/f10 x:0 0 0
2026-03-10T14:07:47.688 INFO:tasks.workunit.client.1.vm04.stdout:9/453: truncate d9/da/dd/f85 316553 0
2026-03-10T14:07:47.688 INFO:tasks.workunit.client.1.vm04.stdout:9/454: readlink d9/d44/l79 0
2026-03-10T14:07:47.689 INFO:tasks.workunit.client.0.vm03.stdout:4/48: dwrite d5/fd [0,4194304] 0
2026-03-10T14:07:47.691 INFO:tasks.workunit.client.0.vm03.stdout:5/60: creat d4/d16/f1c x:0 0 0
2026-03-10T14:07:47.692 INFO:tasks.workunit.client.0.vm03.stdout:5/61: write d4/d6/de/f14 [397912,60316] 0
2026-03-10T14:07:47.698 INFO:tasks.workunit.client.0.vm03.stdout:4/49: creat d5/d9/f11 x:0 0 0
2026-03-10T14:07:47.704 INFO:tasks.workunit.client.0.vm03.stdout:2/47: link d5/fb d5/ff 0
2026-03-10T14:07:47.704 INFO:tasks.workunit.client.0.vm03.stdout:2/48: read - d5/f9 zero size
2026-03-10T14:07:47.707 INFO:tasks.workunit.client.0.vm03.stdout:8/51: truncate f0 3823453 0
2026-03-10T14:07:47.710 INFO:tasks.workunit.client.0.vm03.stdout:8/52: read - f7 zero size
2026-03-10T14:07:47.710 INFO:tasks.workunit.client.0.vm03.stdout:3/47: creat fc x:0 0 0
2026-03-10T14:07:47.710 INFO:tasks.workunit.client.0.vm03.stdout:5/62: dread d4/d6/fa [0,4194304] 0
2026-03-10T14:07:47.711 INFO:tasks.workunit.client.0.vm03.stdout:7/78: rename d5/d9/dd to d5/d9/d14 0
2026-03-10T14:07:47.713 INFO:tasks.workunit.client.0.vm03.stdout:2/49: mkdir d5/d10 0
2026-03-10T14:07:47.714 INFO:tasks.workunit.client.0.vm03.stdout:2/50: stat d5/l7 0
2026-03-10T14:07:47.715 INFO:tasks.workunit.client.0.vm03.stdout:8/53: mknod da/c10 0
2026-03-10T14:07:47.716 INFO:tasks.workunit.client.0.vm03.stdout:8/54: chown da/c10 42131063 1
2026-03-10T14:07:47.716 INFO:tasks.workunit.client.0.vm03.stdout:8/55: chown c8 86740573 1
2026-03-10T14:07:47.716 INFO:tasks.workunit.client.1.vm04.stdout:0/474: mknod d0/d2/d15/d49/d50/d61/c91 0
2026-03-10T14:07:47.718 INFO:tasks.workunit.client.0.vm03.stdout:7/79: creat d5/d9/d14/f15 x:0 0 0
2026-03-10T14:07:47.722 INFO:tasks.workunit.client.0.vm03.stdout:7/80: readlink d5/l8 0
2026-03-10T14:07:47.722 INFO:tasks.workunit.client.1.vm04.stdout:7/532: write d2/dc/de/d2d/d60/d81/fa6 [2341626,79864] 0
2026-03-10T14:07:47.723 INFO:tasks.workunit.client.1.vm04.stdout:7/533: stat d2/dc/de/d2d/d60/d7c/d36/daa/cc9 0
2026-03-10T14:07:47.723 INFO:tasks.workunit.client.1.vm04.stdout:1/497: rename d3/d20/f88 to d3/d20/d60/d82/d13/d38/d58/faf 0
2026-03-10T14:07:47.723 INFO:tasks.workunit.client.1.vm04.stdout:1/498: chown d3/d20/d60/d82/d13/d1a/l30 56835 1
2026-03-10T14:07:47.724 INFO:tasks.workunit.client.0.vm03.stdout:5/63: dwrite d4/d6/fa [0,4194304] 0
2026-03-10T14:07:47.729 INFO:tasks.workunit.client.0.vm03.stdout:5/64: dread d4/d6/f10 [4194304,4194304] 0
2026-03-10T14:07:47.731 INFO:tasks.workunit.client.0.vm03.stdout:5/65: stat d4/d16 0
2026-03-10T14:07:47.732 INFO:tasks.workunit.client.0.vm03.stdout:5/66: truncate d4/d16/f1b 470004 0
2026-03-10T14:07:47.732 INFO:tasks.workunit.client.0.vm03.stdout:5/67: dread - d4/fd zero size
2026-03-10T14:07:47.732 INFO:tasks.workunit.client.0.vm03.stdout:5/68: rename d4/d16/d19 to d4/d16/d19/d1d 22
2026-03-10T14:07:47.738 INFO:tasks.workunit.client.1.vm04.stdout:6/426: symlink d3/l79 0
2026-03-10T14:07:47.741 INFO:tasks.workunit.client.0.vm03.stdout:5/69: dwrite d4/d6/fb [0,4194304] 0
2026-03-10T14:07:47.741 INFO:tasks.workunit.client.1.vm04.stdout:7/534: readlink d2/dc/d4d/la8 0
2026-03-10T14:07:47.741 INFO:tasks.workunit.client.1.vm04.stdout:1/499: dread d3/d22/d63/f6f [0,4194304] 0
2026-03-10T14:07:47.744 INFO:tasks.workunit.client.1.vm04.stdout:0/475: mknod d0/d2/d15/c92 0
2026-03-10T14:07:47.745 INFO:tasks.workunit.client.0.vm03.stdout:2/51: symlink d5/l11 0
2026-03-10T14:07:47.745 INFO:tasks.workunit.client.1.vm04.stdout:6/427: dread d3/de/d35/d3f/d2d/f21 [0,4194304] 0
2026-03-10T14:07:47.746 INFO:tasks.workunit.client.1.vm04.stdout:9/455: creat d9/d5c/d93/d96/f9c x:0 0 0
2026-03-10T14:07:47.748 INFO:tasks.workunit.client.1.vm04.stdout:8/567: rename d0/d75/c95 to d0/d3/dd/cb0 0
2026-03-10T14:07:47.750 INFO:tasks.workunit.client.1.vm04.stdout:1/500: truncate d3/d20/d60/d82/fd 2853801 0
2026-03-10T14:07:47.752 INFO:tasks.workunit.client.1.vm04.stdout:1/501: fdatasync d3/d22/d63/f6f 0
2026-03-10T14:07:47.754 INFO:tasks.workunit.client.1.vm04.stdout:9/456: unlink d9/d33/l52 0
2026-03-10T14:07:47.754 INFO:tasks.workunit.client.1.vm04.stdout:3/549: rename da/f19 to da/dc/fc0 0
2026-03-10T14:07:47.754 INFO:tasks.workunit.client.0.vm03.stdout:7/81: creat d5/f16 x:0 0 0
2026-03-10T14:07:47.755 INFO:tasks.workunit.client.1.vm04.stdout:1/502: mknod d3/d20/d60/d82/d13/d38/cb0 0
2026-03-10T14:07:47.759 INFO:tasks.workunit.client.1.vm04.stdout:6/428: symlink d3/de/d35/d3a/d43/l7a 0
2026-03-10T14:07:47.762 INFO:tasks.workunit.client.1.vm04.stdout:1/503: symlink d3/d5c/d79/lb1 0
2026-03-10T14:07:47.763 INFO:tasks.workunit.client.0.vm03.stdout:9/45: getdents d2 0
2026-03-10T14:07:47.763 INFO:tasks.workunit.client.0.vm03.stdout:0/68: getdents d3/d11 0
2026-03-10T14:07:47.763 INFO:tasks.workunit.client.1.vm04.stdout:3/550: dread - da/dc/d35/d52/fa8 zero size
2026-03-10T14:07:47.763 INFO:tasks.workunit.client.1.vm04.stdout:9/457: chown d9/da/d5d/c95 20398 1
2026-03-10T14:07:47.764 INFO:tasks.workunit.client.1.vm04.stdout:1/504: rename d3/d22/f42 to d3/d5c/fb2 0
2026-03-10T14:07:47.765 INFO:tasks.workunit.client.1.vm04.stdout:9/458: mkdir d9/d5c/d93/d96/d9d 0
2026-03-10T14:07:47.765 INFO:tasks.workunit.client.1.vm04.stdout:8/568: getdents d0 0
2026-03-10T14:07:47.769 INFO:tasks.workunit.client.0.vm03.stdout:3/48: creat fd x:0 0 0
2026-03-10T14:07:47.769 INFO:tasks.workunit.client.1.vm04.stdout:9/459: dread d9/d5c/f6e [0,4194304] 0
2026-03-10T14:07:47.770 INFO:tasks.workunit.client.0.vm03.stdout:7/82: creat d5/d9/f17 x:0 0 0
2026-03-10T14:07:47.772 INFO:tasks.workunit.client.1.vm04.stdout:8/569: symlink d0/d3/d73/lb1 0
2026-03-10T14:07:47.773 INFO:tasks.workunit.client.1.vm04.stdout:1/505: creat d3/d22/d63/fb3 x:0 0 0
2026-03-10T14:07:47.774 INFO:tasks.workunit.client.0.vm03.stdout:9/46: symlink d2/l12 0
2026-03-10T14:07:47.776 INFO:tasks.workunit.client.1.vm04.stdout:9/460: getdents d9/d5c/d93/d96/d9d 0
2026-03-10T14:07:47.778 INFO:tasks.workunit.client.1.vm04.stdout:1/506: unlink d3/d20/d60/d82/d13/l3f 0
2026-03-10T14:07:47.779 INFO:tasks.workunit.client.1.vm04.stdout:9/461: rmdir d9 39
2026-03-10T14:07:47.779 INFO:tasks.workunit.client.0.vm03.stdout:2/52: creat d5/d10/f12 x:0 0 0
2026-03-10T14:07:47.783 INFO:tasks.workunit.client.1.vm04.stdout:9/462: unlink d9/c40 0
2026-03-10T14:07:47.784 INFO:tasks.workunit.client.0.vm03.stdout:9/47: write d2/ff [557955,25896] 0
2026-03-10T14:07:47.788 INFO:tasks.workunit.client.1.vm04.stdout:1/507: dread d3/f2c [0,4194304] 0
2026-03-10T14:07:47.791 INFO:tasks.workunit.client.1.vm04.stdout:4/530: write d4/df/f2e [1480590,75547] 0
2026-03-10T14:07:47.792 INFO:tasks.workunit.client.0.vm03.stdout:2/53: creat d5/d10/f13 x:0 0 0
2026-03-10T14:07:47.792 INFO:tasks.workunit.client.1.vm04.stdout:1/508: truncate d3/d22/d63/f97 2154287 0
2026-03-10T14:07:47.798 INFO:tasks.workunit.client.1.vm04.stdout:9/463: dwrite d9/d44/d59/f5a [0,4194304] 0
2026-03-10T14:07:47.802 INFO:tasks.workunit.client.0.vm03.stdout:6/31: getdents d8 0
2026-03-10T14:07:47.802 INFO:tasks.workunit.client.0.vm03.stdout:3/49: creat fe x:0 0 0
2026-03-10T14:07:47.811 INFO:tasks.workunit.client.0.vm03.stdout:1/39: dwrite d0/f4 [0,4194304] 0
2026-03-10T14:07:47.815 INFO:tasks.workunit.client.1.vm04.stdout:4/531: creat d4/d14/d1b/fc0 x:0 0 0
2026-03-10T14:07:47.815 INFO:tasks.workunit.client.1.vm04.stdout:9/464: fsync d9/d44/d4d/d7d/f82 0
2026-03-10T14:07:47.815 INFO:tasks.workunit.client.0.vm03.stdout:6/32: mkdir d8/db 0
2026-03-10T14:07:47.816 INFO:tasks.workunit.client.1.vm04.stdout:9/465: chown d9/d5c/l72 19308 1
2026-03-10T14:07:47.816 INFO:tasks.workunit.client.0.vm03.stdout:4/50: write f2 [1120550,85818] 0
2026-03-10T14:07:47.820 INFO:tasks.workunit.client.0.vm03.stdout:8/56: dwrite f2 [0,4194304] 0
2026-03-10T14:07:47.826 INFO:tasks.workunit.client.1.vm04.stdout:5/597: truncate d7/d2d/d76/f8f 505830 0
2026-03-10T14:07:47.827 INFO:tasks.workunit.client.1.vm04.stdout:9/466: stat d9/d44/d4d/d7d/l7f 0
2026-03-10T14:07:47.827 INFO:tasks.workunit.client.0.vm03.stdout:6/33: dwrite f2 [0,4194304] 0
2026-03-10T14:07:47.827 INFO:tasks.workunit.client.0.vm03.stdout:4/51: dwrite d5/f8 [0,4194304] 0
2026-03-10T14:07:47.828 INFO:tasks.workunit.client.0.vm03.stdout:6/34: read f2 [560269,29106] 0
2026-03-10T14:07:47.828 INFO:tasks.workunit.client.0.vm03.stdout:6/35: write f3 [3138700,21608] 0
2026-03-10T14:07:47.831 INFO:tasks.workunit.client.0.vm03.stdout:3/50: sync
2026-03-10T14:07:47.832 INFO:tasks.workunit.client.0.vm03.stdout:3/51: read - fd zero size
2026-03-10T14:07:47.832 INFO:tasks.workunit.client.0.vm03.stdout:3/52: write fc [467103,129594] 0
2026-03-10T14:07:47.833 INFO:tasks.workunit.client.0.vm03.stdout:3/53: fdatasync fd 0
2026-03-10T14:07:47.833 INFO:tasks.workunit.client.1.vm04.stdout:4/532: stat d4/df/d34/d6f/cb0 0
2026-03-10T14:07:47.833 INFO:tasks.workunit.client.0.vm03.stdout:4/52: dread d5/f8 [0,4194304] 0
2026-03-10T14:07:47.833 INFO:tasks.workunit.client.1.vm04.stdout:1/509: link d3/d20/la5 d3/d22/d2f/d57/lb4 0
2026-03-10T14:07:47.837 INFO:tasks.workunit.client.1.vm04.stdout:5/598: dwrite d7/d9/fd [0,4194304] 0
2026-03-10T14:07:47.839 INFO:tasks.workunit.client.0.vm03.stdout:3/54: stat fc 0
2026-03-10T14:07:47.839 INFO:tasks.workunit.client.0.vm03.stdout:3/55: write fa [513705,26347] 0
2026-03-10T14:07:47.843 INFO:tasks.workunit.client.0.vm03.stdout:4/53: dwrite d5/d9/db/ff [0,4194304] 0
2026-03-10T14:07:47.845 INFO:tasks.workunit.client.1.vm04.stdout:4/533: mknod d4/df/db2/db4/d47/d4f/cc1 0
2026-03-10T14:07:47.845 INFO:tasks.workunit.client.1.vm04.stdout:9/467: creat d9/da/f9e x:0 0 0
2026-03-10T14:07:47.846 INFO:tasks.workunit.client.0.vm03.stdout:4/54: write d5/d9/f11 [522750,56137] 0
2026-03-10T14:07:47.846 INFO:tasks.workunit.client.0.vm03.stdout:4/55: chown d5/d9/db 7420252 1
2026-03-10T14:07:47.853 INFO:tasks.workunit.client.1.vm04.stdout:1/510: mkdir d3/d20/d60/d82/d13/d38/db5 0
2026-03-10T14:07:47.867 INFO:tasks.workunit.client.0.vm03.stdout:1/40: rmdir d0/d2 39
2026-03-10T14:07:47.867 INFO:tasks.workunit.client.0.vm03.stdout:9/48: rename d2/ld to d2/l13 0
2026-03-10T14:07:47.867 INFO:tasks.workunit.client.0.vm03.stdout:1/41: stat d0/f8 0
2026-03-10T14:07:47.868 INFO:tasks.workunit.client.0.vm03.stdout:9/49: truncate d2/ff 1285245 0
2026-03-10T14:07:47.881 INFO:tasks.workunit.client.0.vm03.stdout:8/57: rmdir da 39
2026-03-10T14:07:47.894 INFO:tasks.workunit.client.0.vm03.stdout:6/36: rename d8/ca to d8/db/cc 0
2026-03-10T14:07:47.894 INFO:tasks.workunit.client.0.vm03.stdout:5/70: truncate d4/d6/fb 1294901 0
2026-03-10T14:07:47.894 INFO:tasks.workunit.client.0.vm03.stdout:7/83: truncate d5/fb 2847700 0
2026-03-10T14:07:47.894 INFO:tasks.workunit.client.0.vm03.stdout:7/84: write d5/d9/f10 [968982,22909] 0
2026-03-10T14:07:47.894 INFO:tasks.workunit.client.0.vm03.stdout:7/85: write d5/f16 [64585,113028] 0
2026-03-10T14:07:47.894 INFO:tasks.workunit.client.0.vm03.stdout:0/69: truncate d3/fe 1366969 0
2026-03-10T14:07:47.894 INFO:tasks.workunit.client.0.vm03.stdout:4/56: creat d5/d9/db/f12 x:0 0 0
2026-03-10T14:07:47.894 INFO:tasks.workunit.client.1.vm04.stdout:9/468: dread d9/da/dd/f48 [0,4194304] 0
2026-03-10T14:07:47.896 INFO:tasks.workunit.client.0.vm03.stdout:9/50: read d2/f3 [2649003,129176] 0
2026-03-10T14:07:47.896 INFO:tasks.workunit.client.0.vm03.stdout:8/58: chown da/ff 171 1
2026-03-10T14:07:47.900 INFO:tasks.workunit.client.0.vm03.stdout:1/42: dwrite d0/d2/f7 [0,4194304] 0
2026-03-10T14:07:47.901 INFO:tasks.workunit.client.1.vm04.stdout:9/469: link d9/d1d/f1f d9/da/dd/f9f 0
2026-03-10T14:07:47.902 INFO:tasks.workunit.client.1.vm04.stdout:9/470: stat d9/d5c/l60 0
2026-03-10T14:07:47.902 INFO:tasks.workunit.client.1.vm04.stdout:9/471: stat d9/da/f57 0
2026-03-10T14:07:47.903 INFO:tasks.workunit.client.0.vm03.stdout:5/71: mknod d4/d16/c1e 0
2026-03-10T14:07:47.903 INFO:tasks.workunit.client.0.vm03.stdout:5/72: readlink d4/lf 0
2026-03-10T14:07:47.904 INFO:tasks.workunit.client.1.vm04.stdout:9/472: read d9/d44/d59/f5a [1255546,130382] 0
2026-03-10T14:07:47.905 INFO:tasks.workunit.client.0.vm03.stdout:5/73: dread d4/d6/f10 [4194304,4194304] 0
2026-03-10T14:07:47.906 INFO:tasks.workunit.client.1.vm04.stdout:9/473: rename d9/d44/d4d/d7d/f82 to d9/da/dd/d74/fa0 0
2026-03-10T14:07:47.907 INFO:tasks.workunit.client.0.vm03.stdout:3/56: symlink lf 0
2026-03-10T14:07:47.909 INFO:tasks.workunit.client.1.vm04.stdout:9/474: symlink d9/d1d/la1 0
2026-03-10T14:07:47.910 INFO:tasks.workunit.client.1.vm04.stdout:4/534: dread d4/f2c [0,4194304] 0
2026-03-10T14:07:47.911 INFO:tasks.workunit.client.0.vm03.stdout:4/57: creat d5/f13 x:0 0 0
2026-03-10T14:07:47.911 INFO:tasks.workunit.client.0.vm03.stdout:1/43: creat d0/d2/fe x:0 0 0
2026-03-10T14:07:47.912 INFO:tasks.workunit.client.1.vm04.stdout:4/535: write d4/df/db2/db4/d47/d4f/f6a [4974181,130848] 0
2026-03-10T14:07:47.916 INFO:tasks.workunit.client.1.vm04.stdout:4/536: mknod d4/df/db2/db4/d47/d4f/d8c/cc2 0
2026-03-10T14:07:47.926 INFO:tasks.workunit.client.1.vm04.stdout:4/537: dwrite d4/fba [0,4194304] 0
2026-03-10T14:07:47.926 INFO:tasks.workunit.client.0.vm03.stdout:5/74: dwrite d4/d6/de/f14 [0,4194304] 0
2026-03-10T14:07:47.926 INFO:tasks.workunit.client.0.vm03.stdout:3/57: dwrite fd [0,4194304] 0
2026-03-10T14:07:47.926 INFO:tasks.workunit.client.0.vm03.stdout:0/70: rename d3/ca to d3/d11/c14 0
2026-03-10T14:07:47.926 INFO:tasks.workunit.client.0.vm03.stdout:4/58: fsync d5/fe 0
2026-03-10T14:07:47.926 INFO:tasks.workunit.client.0.vm03.stdout:3/58: dread - fe zero size
2026-03-10T14:07:47.926 INFO:tasks.workunit.client.0.vm03.stdout:6/37: dread f5 [0,4194304] 0
2026-03-10T14:07:47.926 INFO:tasks.workunit.client.0.vm03.stdout:9/51: dread d2/fc [0,4194304] 0
2026-03-10T14:07:47.929 INFO:tasks.workunit.client.0.vm03.stdout:1/44: mkdir d0/d2/df 0
2026-03-10T14:07:47.930 INFO:tasks.workunit.client.0.vm03.stdout:5/75: mkdir d4/d13/d1f 0
2026-03-10T14:07:47.931 INFO:tasks.workunit.client.0.vm03.stdout:0/71: chown d3/cf 1308207 1
2026-03-10T14:07:47.931 INFO:tasks.workunit.client.1.vm04.stdout:4/538: dread d4/d14/d1b/f6c [0,4194304] 0
2026-03-10T14:07:47.932 INFO:tasks.workunit.client.0.vm03.stdout:4/59: creat d5/f14 x:0 0 0
2026-03-10T14:07:47.932 INFO:tasks.workunit.client.0.vm03.stdout:4/60: write d5/f14 [222486,14418] 0
2026-03-10T14:07:47.933 INFO:tasks.workunit.client.0.vm03.stdout:6/38: fdatasync f5 0
2026-03-10T14:07:47.934 INFO:tasks.workunit.client.0.vm03.stdout:6/39: truncate f0 5111013 0
2026-03-10T14:07:47.934 INFO:tasks.workunit.client.0.vm03.stdout:7/86: getdents d5 0
2026-03-10T14:07:47.935 INFO:tasks.workunit.client.1.vm04.stdout:4/539: creat d4/df/d34/fc3 x:0 0 0
2026-03-10T14:07:47.938 INFO:tasks.workunit.client.0.vm03.stdout:9/52: mkdir d2/d14 0
2026-03-10T14:07:47.941 INFO:tasks.workunit.client.0.vm03.stdout:9/53: write d2/f3 [3563353,70907] 0
2026-03-10T14:07:47.942 INFO:tasks.workunit.client.0.vm03.stdout:9/54: write d2/f3 [2145451,113733] 0
2026-03-10T14:07:47.945 INFO:tasks.workunit.client.0.vm03.stdout:0/72: dread d3/f9 [0,4194304] 0
2026-03-10T14:07:47.958 INFO:tasks.workunit.client.0.vm03.stdout:5/76: dwrite d4/d6/f10 [4194304,4194304] 0
2026-03-10T14:07:47.958 INFO:tasks.workunit.client.0.vm03.stdout:7/87: dwrite d5/d9/d14/f15 [0,4194304] 0
2026-03-10T14:07:47.958 INFO:tasks.workunit.client.0.vm03.stdout:1/45: creat d0/f10 x:0 0 0
2026-03-10T14:07:47.958 INFO:tasks.workunit.client.0.vm03.stdout:1/46: dread d0/d2/f3 [0,4194304] 0
2026-03-10T14:07:47.962 INFO:tasks.workunit.client.0.vm03.stdout:5/77: link d4/d6/de/f14 d4/d13/d1f/f20 0
2026-03-10T14:07:47.963 INFO:tasks.workunit.client.0.vm03.stdout:5/78: chown d4/d13/c1a 35288 1
2026-03-10T14:07:47.964 INFO:tasks.workunit.client.0.vm03.stdout:0/73: unlink d3/f7 0
2026-03-10T14:07:47.965 INFO:tasks.workunit.client.0.vm03.stdout:0/74: truncate d3/d11/f13 542622 0
2026-03-10T14:07:47.967 INFO:tasks.workunit.client.0.vm03.stdout:1/47: rename d0/d2/f3 to d0/f11 0
2026-03-10T14:07:47.968 INFO:tasks.workunit.client.0.vm03.stdout:1/48: chown d0 187672 1
2026-03-10T14:07:47.968 INFO:tasks.workunit.client.0.vm03.stdout:1/49: write d0/f10 [968781,105661] 0
2026-03-10T14:07:47.969 INFO:tasks.workunit.client.0.vm03.stdout:1/50: chown d0/d2/cd 203756049 1
2026-03-10T14:07:47.969 INFO:tasks.workunit.client.0.vm03.stdout:5/79: creat d4/d13/d1f/f21 x:0 0 0
2026-03-10T14:07:47.969 INFO:tasks.workunit.client.0.vm03.stdout:0/75: mknod d3/c15 0
2026-03-10T14:07:47.971 INFO:tasks.workunit.client.0.vm03.stdout:1/51: symlink d0/l12 0
2026-03-10T14:07:47.979 INFO:tasks.workunit.client.0.vm03.stdout:1/52: rename d0 to d0/d2/df/d13 22
2026-03-10T14:07:47.979 INFO:tasks.workunit.client.0.vm03.stdout:5/80: mknod d4/c22 0
2026-03-10T14:07:47.979 INFO:tasks.workunit.client.0.vm03.stdout:0/76: mkdir d3/d16 0
2026-03-10T14:07:47.979 INFO:tasks.workunit.client.0.vm03.stdout:0/77: write d3/d11/f13 [1104528,42080] 0
2026-03-10T14:07:47.979 INFO:tasks.workunit.client.0.vm03.stdout:1/53: creat d0/d2/f14 x:0 0 0
2026-03-10T14:07:48.012 INFO:tasks.workunit.client.0.vm03.stdout:4/61: sync
2026-03-10T14:07:48.017 INFO:tasks.workunit.client.0.vm03.stdout:4/62: mknod d5/d9/db/c15 0
2026-03-10T14:07:48.018 INFO:tasks.workunit.client.0.vm03.stdout:4/63: write d5/d9/db/f10 [671745,13361] 0
2026-03-10T14:07:48.021 INFO:tasks.workunit.client.0.vm03.stdout:4/64: dread d5/f8 [0,4194304] 0
2026-03-10T14:07:48.021 INFO:tasks.workunit.client.0.vm03.stdout:4/65: write d5/d9/f11 [959652,10505] 0
2026-03-10T14:07:48.024 INFO:tasks.workunit.client.0.vm03.stdout:4/66: creat d5/d9/f16 x:0 0 0
2026-03-10T14:07:48.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:47 vm04.local ceph-mon[55966]: pgmap v152: 65 pgs: 65 active+clean; 1.2 GiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 22 MiB/s rd, 57 MiB/s wr, 368 op/s
2026-03-10T14:07:48.105 INFO:tasks.workunit.client.1.vm04.stdout:1/511: sync
2026-03-10T14:07:48.105 INFO:tasks.workunit.client.1.vm04.stdout:9/475: sync
2026-03-10T14:07:48.105 INFO:tasks.workunit.client.1.vm04.stdout:4/540: sync
2026-03-10T14:07:48.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:47 vm03.local ceph-mon[49718]: pgmap v152: 65 pgs: 65 active+clean; 1.2 GiB data, 4.6 GiB used, 115 GiB / 120 GiB avail; 22 MiB/s rd, 57 MiB/s wr, 368 op/s
2026-03-10T14:07:48.114 INFO:tasks.workunit.client.1.vm04.stdout:9/476: creat d9/d5c/d93/d96/d9d/fa2 x:0 0 0
2026-03-10T14:07:48.126 INFO:tasks.workunit.client.0.vm03.stdout:8/59: fsync f0 0
2026-03-10T14:07:48.127 INFO:tasks.workunit.client.1.vm04.stdout:3/551: dread da/dc/f1d [0,4194304] 0
2026-03-10T14:07:48.127 INFO:tasks.workunit.client.0.vm03.stdout:6/40: getdents d8 0
2026-03-10T14:07:48.128 INFO:tasks.workunit.client.0.vm03.stdout:6/41: fsync f5 0 2026-03-10T14:07:48.130 INFO:tasks.workunit.client.1.vm04.stdout:3/552: mkdir da/dc/d3f/d61/dc1 0 2026-03-10T14:07:48.130 INFO:tasks.workunit.client.0.vm03.stdout:7/88: read d5/fb [2682053,18953] 0 2026-03-10T14:07:48.133 INFO:tasks.workunit.client.1.vm04.stdout:3/553: creat da/dc/fc2 x:0 0 0 2026-03-10T14:07:48.136 INFO:tasks.workunit.client.0.vm03.stdout:8/60: mknod da/c11 0 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.1.vm04.stdout:3/554: chown da/dc/d3f/d54/d66/lbb 1643351091 1 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.1.vm04.stdout:3/555: fsync da/dc/d47/f6b 0 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.0.vm03.stdout:9/55: getdents d2 0 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.0.vm03.stdout:9/56: chown d2/l10 29 1 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.0.vm03.stdout:9/57: fdatasync d2/ff 0 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.0.vm03.stdout:7/89: symlink d5/d9/d14/l18 0 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.0.vm03.stdout:7/90: read - d5/d9/f17 zero size 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.0.vm03.stdout:7/91: dread - d5/d9/d14/ff zero size 2026-03-10T14:07:48.143 INFO:tasks.workunit.client.0.vm03.stdout:6/42: dwrite f2 [0,4194304] 0 2026-03-10T14:07:48.145 INFO:tasks.workunit.client.0.vm03.stdout:2/54: mknod d5/d10/c14 0 2026-03-10T14:07:48.145 INFO:tasks.workunit.client.0.vm03.stdout:0/78: chown d3/d11/c14 209 1 2026-03-10T14:07:48.146 INFO:tasks.workunit.client.0.vm03.stdout:8/61: creat da/f12 x:0 0 0 2026-03-10T14:07:48.147 INFO:tasks.workunit.client.0.vm03.stdout:9/58: creat d2/f15 x:0 0 0 2026-03-10T14:07:48.149 INFO:tasks.workunit.client.0.vm03.stdout:5/81: truncate d4/d13/d1f/f20 2726729 0 2026-03-10T14:07:48.149 INFO:tasks.workunit.client.0.vm03.stdout:6/43: creat d8/fd x:0 0 0 2026-03-10T14:07:48.153 INFO:tasks.workunit.client.0.vm03.stdout:8/62: unlink f0 0 2026-03-10T14:07:48.157 
INFO:tasks.workunit.client.0.vm03.stdout:8/63: dread - da/f12 zero size 2026-03-10T14:07:48.159 INFO:tasks.workunit.client.0.vm03.stdout:5/82: dwrite d4/d16/f1c [0,4194304] 0 2026-03-10T14:07:48.159 INFO:tasks.workunit.client.0.vm03.stdout:0/79: mkdir d3/d17 0 2026-03-10T14:07:48.160 INFO:tasks.workunit.client.0.vm03.stdout:9/59: unlink d2/l13 0 2026-03-10T14:07:48.160 INFO:tasks.workunit.client.0.vm03.stdout:8/64: fsync da/fd 0 2026-03-10T14:07:48.162 INFO:tasks.workunit.client.0.vm03.stdout:2/55: dwrite d5/d10/f13 [0,4194304] 0 2026-03-10T14:07:48.171 INFO:tasks.workunit.client.0.vm03.stdout:0/80: readlink d3/l5 0 2026-03-10T14:07:48.171 INFO:tasks.workunit.client.0.vm03.stdout:0/81: chown d3/d11/f12 162019777 1 2026-03-10T14:07:48.173 INFO:tasks.workunit.client.0.vm03.stdout:8/65: mknod da/c13 0 2026-03-10T14:07:48.174 INFO:tasks.workunit.client.0.vm03.stdout:2/56: mknod d5/d10/c15 0 2026-03-10T14:07:48.175 INFO:tasks.workunit.client.0.vm03.stdout:0/82: unlink d3/cf 0 2026-03-10T14:07:48.175 INFO:tasks.workunit.client.0.vm03.stdout:8/66: mknod da/c14 0 2026-03-10T14:07:48.176 INFO:tasks.workunit.client.0.vm03.stdout:2/57: creat d5/d10/f16 x:0 0 0 2026-03-10T14:07:48.176 INFO:tasks.workunit.client.0.vm03.stdout:2/58: chown d5/d10/f16 1 1 2026-03-10T14:07:48.177 INFO:tasks.workunit.client.0.vm03.stdout:0/83: creat d3/d11/f18 x:0 0 0 2026-03-10T14:07:48.178 INFO:tasks.workunit.client.0.vm03.stdout:8/67: creat da/f15 x:0 0 0 2026-03-10T14:07:48.178 INFO:tasks.workunit.client.0.vm03.stdout:8/68: readlink da/lc 0 2026-03-10T14:07:48.179 INFO:tasks.workunit.client.0.vm03.stdout:2/59: dread d5/d10/f13 [0,4194304] 0 2026-03-10T14:07:48.180 INFO:tasks.workunit.client.0.vm03.stdout:8/69: creat da/f16 x:0 0 0 2026-03-10T14:07:48.180 INFO:tasks.workunit.client.0.vm03.stdout:8/70: fdatasync da/f15 0 2026-03-10T14:07:48.181 INFO:tasks.workunit.client.0.vm03.stdout:2/60: chown d5/fa 196541 1 2026-03-10T14:07:48.181 INFO:tasks.workunit.client.0.vm03.stdout:0/84: creat d3/f19 x:0 
0 0 2026-03-10T14:07:48.184 INFO:tasks.workunit.client.0.vm03.stdout:8/71: link da/f16 da/f17 0 2026-03-10T14:07:48.189 INFO:tasks.workunit.client.0.vm03.stdout:2/61: dwrite d5/f9 [0,4194304] 0 2026-03-10T14:07:48.189 INFO:tasks.workunit.client.0.vm03.stdout:8/72: dread f6 [0,4194304] 0 2026-03-10T14:07:48.195 INFO:tasks.workunit.client.0.vm03.stdout:7/92: sync 2026-03-10T14:07:48.196 INFO:tasks.workunit.client.1.vm04.stdout:9/477: sync 2026-03-10T14:07:48.198 INFO:tasks.workunit.client.0.vm03.stdout:7/93: dread - d5/d9/d14/ff zero size 2026-03-10T14:07:48.198 INFO:tasks.workunit.client.0.vm03.stdout:7/94: readlink d5/l8 0 2026-03-10T14:07:48.199 INFO:tasks.workunit.client.0.vm03.stdout:7/95: dwrite d5/d9/f17 [0,4194304] 0 2026-03-10T14:07:48.203 INFO:tasks.workunit.client.0.vm03.stdout:5/83: sync 2026-03-10T14:07:48.215 INFO:tasks.workunit.client.0.vm03.stdout:2/62: dread d5/ff [0,4194304] 0 2026-03-10T14:07:48.218 INFO:tasks.workunit.client.0.vm03.stdout:5/84: unlink d4/d6/c15 0 2026-03-10T14:07:48.218 INFO:tasks.workunit.client.0.vm03.stdout:8/73: dwrite da/f16 [0,4194304] 0 2026-03-10T14:07:48.226 INFO:tasks.workunit.client.0.vm03.stdout:7/96: rename d5/d9/d14/f15 to d5/d9/f19 0 2026-03-10T14:07:48.226 INFO:tasks.workunit.client.0.vm03.stdout:7/97: chown d5/d9/f17 900123 1 2026-03-10T14:07:48.229 INFO:tasks.workunit.client.0.vm03.stdout:2/63: mkdir d5/d10/d17 0 2026-03-10T14:07:48.230 INFO:tasks.workunit.client.0.vm03.stdout:4/67: dread d5/f14 [0,4194304] 0 2026-03-10T14:07:48.234 INFO:tasks.workunit.client.0.vm03.stdout:8/74: symlink da/l18 0 2026-03-10T14:07:48.234 INFO:tasks.workunit.client.0.vm03.stdout:8/75: dread - f5 zero size 2026-03-10T14:07:48.235 INFO:tasks.workunit.client.0.vm03.stdout:8/76: write da/fd [499999,41125] 0 2026-03-10T14:07:48.235 INFO:tasks.workunit.client.0.vm03.stdout:5/85: mkdir d4/d16/d19/d23 0 2026-03-10T14:07:48.236 INFO:tasks.workunit.client.0.vm03.stdout:8/77: write f4 [300902,70219] 0 2026-03-10T14:07:48.242 
INFO:tasks.workunit.client.0.vm03.stdout:2/64: dwrite f0 [0,4194304] 0 2026-03-10T14:07:48.243 INFO:tasks.workunit.client.0.vm03.stdout:2/65: read d5/f9 [546397,73010] 0 2026-03-10T14:07:48.247 INFO:tasks.workunit.client.0.vm03.stdout:4/68: creat d5/d9/f17 x:0 0 0 2026-03-10T14:07:48.247 INFO:tasks.workunit.client.0.vm03.stdout:4/69: dread - d5/f7 zero size 2026-03-10T14:07:48.248 INFO:tasks.workunit.client.0.vm03.stdout:1/54: truncate d0/fc 2798243 0 2026-03-10T14:07:48.251 INFO:tasks.workunit.client.0.vm03.stdout:6/44: getdents d8 0 2026-03-10T14:07:48.259 INFO:tasks.workunit.client.0.vm03.stdout:5/86: creat d4/d13/f24 x:0 0 0 2026-03-10T14:07:48.262 INFO:tasks.workunit.client.0.vm03.stdout:6/45: dread f2 [0,4194304] 0 2026-03-10T14:07:48.266 INFO:tasks.workunit.client.0.vm03.stdout:5/87: dwrite d4/d6/f10 [0,4194304] 0 2026-03-10T14:07:48.277 INFO:tasks.workunit.client.0.vm03.stdout:3/59: dread fc [0,4194304] 0 2026-03-10T14:07:48.285 INFO:tasks.workunit.client.0.vm03.stdout:9/60: rmdir d2 39 2026-03-10T14:07:48.292 INFO:tasks.workunit.client.1.vm04.stdout:2/546: dwrite d0/d14/d91/d3a/d3e/f38 [0,4194304] 0 2026-03-10T14:07:48.292 INFO:tasks.workunit.client.1.vm04.stdout:2/547: mkdir d0/d14/d39/da8 0 2026-03-10T14:07:48.292 INFO:tasks.workunit.client.1.vm04.stdout:2/548: mknod d0/d14/d91/d3a/d3e/ca9 0 2026-03-10T14:07:48.296 INFO:tasks.workunit.client.0.vm03.stdout:4/70: dread d5/d9/f11 [0,4194304] 0 2026-03-10T14:07:48.302 INFO:tasks.workunit.client.1.vm04.stdout:2/549: dread d0/d14/d91/d8/d17/d35/f94 [0,4194304] 0 2026-03-10T14:07:48.304 INFO:tasks.workunit.client.1.vm04.stdout:2/550: chown d0/d14/d39/d47/f5d 0 1 2026-03-10T14:07:48.343 INFO:tasks.workunit.client.0.vm03.stdout:8/78: fdatasync da/f17 0 2026-03-10T14:07:48.348 INFO:tasks.workunit.client.1.vm04.stdout:0/476: rmdir d0/d2 39 2026-03-10T14:07:48.359 INFO:tasks.workunit.client.1.vm04.stdout:7/535: dwrite d2/d94/f3d [4194304,4194304] 0 2026-03-10T14:07:48.360 
INFO:tasks.workunit.client.1.vm04.stdout:0/477: dread d0/d2/f17 [0,4194304] 0 2026-03-10T14:07:48.371 INFO:tasks.workunit.client.1.vm04.stdout:0/478: creat d0/d2/d15/d22/d38/f93 x:0 0 0 2026-03-10T14:07:48.376 INFO:tasks.workunit.client.0.vm03.stdout:7/98: getdents d5 0 2026-03-10T14:07:48.376 INFO:tasks.workunit.client.0.vm03.stdout:7/99: write d5/d9/d14/ff [231718,71995] 0 2026-03-10T14:07:48.377 INFO:tasks.workunit.client.0.vm03.stdout:7/100: truncate d5/f6 5048220 0 2026-03-10T14:07:48.382 INFO:tasks.workunit.client.1.vm04.stdout:0/479: rename d0/d2/d15/d22/d38/d56/d66/f51 to d0/d2/d15/d49/f94 0 2026-03-10T14:07:48.382 INFO:tasks.workunit.client.0.vm03.stdout:6/46: rmdir d8 39 2026-03-10T14:07:48.384 INFO:tasks.workunit.client.1.vm04.stdout:7/536: dread d2/dc/d4d/dcd/f77 [0,4194304] 0 2026-03-10T14:07:48.386 INFO:tasks.workunit.client.0.vm03.stdout:6/47: dwrite f3 [4194304,4194304] 0 2026-03-10T14:07:48.388 INFO:tasks.workunit.client.0.vm03.stdout:2/66: creat d5/d10/d17/f18 x:0 0 0 2026-03-10T14:07:48.397 INFO:tasks.workunit.client.1.vm04.stdout:7/537: dread - d2/dc/de/d2d/d38/f8a zero size 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.1.vm04.stdout:7/538: truncate d2/dc/de/f1e 3891146 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.1.vm04.stdout:7/539: fdatasync d2/dc/de/d2d/d38/d50/dc8/f7b 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:1/55: mknod d0/c15 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:1/56: chown d0/lb 2843 1 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:1/57: write d0/d2/f14 [787226,8703] 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:5/88: creat d4/d16/d19/f25 x:0 0 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:5/89: dwrite d4/d16/f1c [0,4194304] 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:4/71: write d5/f8 [2813212,37745] 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:4/72: fsync d5/d9/fa 0 
2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:8/79: chown l1 6859888 1 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:4/73: truncate d5/f7 927156 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:7/101: fsync d5/d9/f19 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:7/102: chown c1 22 1 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:4/74: dread - d5/d9/f17 zero size 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:4/75: readlink d5/lc 0 2026-03-10T14:07:48.398 INFO:tasks.workunit.client.0.vm03.stdout:4/76: read d5/d9/db/ff [205809,119450] 0 2026-03-10T14:07:48.399 INFO:tasks.workunit.client.1.vm04.stdout:7/540: fsync d2/dc/de/d2d/d5c/da9/fc3 0 2026-03-10T14:07:48.401 INFO:tasks.workunit.client.1.vm04.stdout:7/541: chown d2/d94/f29 146 1 2026-03-10T14:07:48.404 INFO:tasks.workunit.client.0.vm03.stdout:4/77: dwrite d5/d9/db/f10 [0,4194304] 0 2026-03-10T14:07:48.406 INFO:tasks.workunit.client.1.vm04.stdout:0/480: dread d0/d2/d15/d49/d50/f53 [0,4194304] 0 2026-03-10T14:07:48.407 INFO:tasks.workunit.client.1.vm04.stdout:0/481: write d0/d2/d15/f59 [1930976,5701] 0 2026-03-10T14:07:48.408 INFO:tasks.workunit.client.0.vm03.stdout:4/78: dread d5/fd [0,4194304] 0 2026-03-10T14:07:48.408 INFO:tasks.workunit.client.0.vm03.stdout:4/79: read d5/fe [2632671,82004] 0 2026-03-10T14:07:48.410 INFO:tasks.workunit.client.1.vm04.stdout:0/482: stat d0/d2/d15/d49/d50/d61 0 2026-03-10T14:07:48.414 INFO:tasks.workunit.client.1.vm04.stdout:0/483: chown d0/d2/d15/d22/d38/l89 66 1 2026-03-10T14:07:48.415 INFO:tasks.workunit.client.1.vm04.stdout:0/484: read d0/d2/f17 [1541971,115453] 0 2026-03-10T14:07:48.416 INFO:tasks.workunit.client.0.vm03.stdout:6/48: write f2 [4032835,93399] 0 2026-03-10T14:07:48.416 INFO:tasks.workunit.client.0.vm03.stdout:6/49: read f2 [586536,126565] 0 2026-03-10T14:07:48.418 INFO:tasks.workunit.client.0.vm03.stdout:6/50: dread f3 [0,4194304] 0 
2026-03-10T14:07:48.427 INFO:tasks.workunit.client.1.vm04.stdout:7/542: rename d2/dc/d4d/dcd/c46 to d2/dc/de/d2d/d5c/da9/cce 0 2026-03-10T14:07:48.427 INFO:tasks.workunit.client.0.vm03.stdout:1/58: unlink d0/f8 0 2026-03-10T14:07:48.427 INFO:tasks.workunit.client.0.vm03.stdout:9/61: fsync d2/f3 0 2026-03-10T14:07:48.427 INFO:tasks.workunit.client.0.vm03.stdout:5/90: symlink d4/d16/d19/l26 0 2026-03-10T14:07:48.429 INFO:tasks.workunit.client.1.vm04.stdout:1/512: fsync d3/d20/d60/d82/fd 0 2026-03-10T14:07:48.430 INFO:tasks.workunit.client.0.vm03.stdout:5/91: dwrite d4/fd [0,4194304] 0 2026-03-10T14:07:48.430 INFO:tasks.workunit.client.1.vm04.stdout:7/543: fdatasync d2/dc/de/f73 0 2026-03-10T14:07:48.433 INFO:tasks.workunit.client.0.vm03.stdout:7/103: symlink d5/l1a 0 2026-03-10T14:07:48.434 INFO:tasks.workunit.client.0.vm03.stdout:7/104: chown d5/d9/ce 59 1 2026-03-10T14:07:48.436 INFO:tasks.workunit.client.0.vm03.stdout:4/80: mknod d5/d9/db/c18 0 2026-03-10T14:07:48.442 INFO:tasks.workunit.client.1.vm04.stdout:0/485: write d0/d2/d15/d49/d50/f53 [938764,102499] 0 2026-03-10T14:07:48.442 INFO:tasks.workunit.client.0.vm03.stdout:4/81: dwrite d5/f8 [0,4194304] 0 2026-03-10T14:07:48.457 INFO:tasks.workunit.client.0.vm03.stdout:3/60: rename fa to f10 0 2026-03-10T14:07:48.462 INFO:tasks.workunit.client.0.vm03.stdout:4/82: unlink d5/l6 0 2026-03-10T14:07:48.462 INFO:tasks.workunit.client.0.vm03.stdout:4/83: read f2 [211187,109282] 0 2026-03-10T14:07:48.463 INFO:tasks.workunit.client.0.vm03.stdout:4/84: write f2 [1693206,101295] 0 2026-03-10T14:07:48.469 INFO:tasks.workunit.client.0.vm03.stdout:6/51: creat d8/fe x:0 0 0 2026-03-10T14:07:48.475 INFO:tasks.workunit.client.0.vm03.stdout:1/59: mkdir d0/d2/df/d16 0 2026-03-10T14:07:48.475 INFO:tasks.workunit.client.0.vm03.stdout:4/85: mkdir d5/d9/db/d19 0 2026-03-10T14:07:48.475 INFO:tasks.workunit.client.0.vm03.stdout:6/52: chown d8/db/cc 126467 1 2026-03-10T14:07:48.476 INFO:tasks.workunit.client.0.vm03.stdout:3/61: symlink 
l11 0 2026-03-10T14:07:48.477 INFO:tasks.workunit.client.0.vm03.stdout:8/80: truncate da/f16 1400727 0 2026-03-10T14:07:48.479 INFO:tasks.workunit.client.0.vm03.stdout:6/53: mkdir d8/db/df 0 2026-03-10T14:07:48.484 INFO:tasks.workunit.client.1.vm04.stdout:0/486: rename d0/d2/d15/l74 to d0/d2/d15/d22/l95 0 2026-03-10T14:07:48.484 INFO:tasks.workunit.client.0.vm03.stdout:9/62: getdents d2 0 2026-03-10T14:07:48.487 INFO:tasks.workunit.client.1.vm04.stdout:1/513: dread d3/d20/d60/d82/d13/d38/d58/faf [0,4194304] 0 2026-03-10T14:07:48.488 INFO:tasks.workunit.client.0.vm03.stdout:3/62: dwrite fc [0,4194304] 0 2026-03-10T14:07:48.490 INFO:tasks.workunit.client.0.vm03.stdout:3/63: truncate fd 4508326 0 2026-03-10T14:07:48.495 INFO:tasks.workunit.client.1.vm04.stdout:1/514: read d3/f14 [4100069,34439] 0 2026-03-10T14:07:48.496 INFO:tasks.workunit.client.0.vm03.stdout:8/81: mknod da/c19 0 2026-03-10T14:07:48.496 INFO:tasks.workunit.client.0.vm03.stdout:4/86: symlink d5/d9/db/d19/l1a 0 2026-03-10T14:07:48.496 INFO:tasks.workunit.client.0.vm03.stdout:1/60: rename d0/d2/f7 to d0/f17 0 2026-03-10T14:07:48.498 INFO:tasks.workunit.client.0.vm03.stdout:8/82: dread f6 [0,4194304] 0 2026-03-10T14:07:48.499 INFO:tasks.workunit.client.1.vm04.stdout:1/515: unlink d3/d22/d6d/l85 0 2026-03-10T14:07:48.500 INFO:tasks.workunit.client.0.vm03.stdout:9/63: creat d2/d14/f16 x:0 0 0 2026-03-10T14:07:48.501 INFO:tasks.workunit.client.1.vm04.stdout:0/487: link d0/d2/d25/f3f d0/d2/d15/d49/d50/d61/f96 0 2026-03-10T14:07:48.501 INFO:tasks.workunit.client.0.vm03.stdout:1/61: mkdir d0/d18 0 2026-03-10T14:07:48.502 INFO:tasks.workunit.client.0.vm03.stdout:1/62: truncate d0/d2/f14 1596059 0 2026-03-10T14:07:48.502 INFO:tasks.workunit.client.0.vm03.stdout:9/64: mknod d2/d14/c17 0 2026-03-10T14:07:48.504 INFO:tasks.workunit.client.0.vm03.stdout:9/65: rename d2/c5 to d2/c18 0 2026-03-10T14:07:48.505 INFO:tasks.workunit.client.0.vm03.stdout:1/63: symlink d0/d2/df/d16/l19 0 2026-03-10T14:07:48.506 
INFO:tasks.workunit.client.0.vm03.stdout:3/64: link l3 l12 0 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.1.vm04.stdout:1/516: unlink d3/d5c/c67 0 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.1.vm04.stdout:1/517: dread - d3/d20/fa3 zero size 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.1.vm04.stdout:1/518: write d3/d20/d60/d82/d13/d1a/f4b [3984253,76572] 0 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.0.vm03.stdout:9/66: dread - d2/d14/f16 zero size 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.0.vm03.stdout:1/64: dwrite d0/f4 [0,4194304] 0 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.0.vm03.stdout:9/67: mknod d2/d14/c19 0 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.0.vm03.stdout:3/65: unlink l3 0 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.0.vm03.stdout:1/65: dread d0/f10 [0,4194304] 0 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.0.vm03.stdout:1/66: chown d0/lb 107299 1 2026-03-10T14:07:48.523 INFO:tasks.workunit.client.0.vm03.stdout:9/68: dwrite d2/fc [0,4194304] 0 2026-03-10T14:07:48.527 INFO:tasks.workunit.client.1.vm04.stdout:6/429: truncate d3/de/d35/d3f/f17 7401974 0 2026-03-10T14:07:48.532 INFO:tasks.workunit.client.0.vm03.stdout:3/66: symlink l13 0 2026-03-10T14:07:48.533 INFO:tasks.workunit.client.0.vm03.stdout:1/67: creat d0/d2/f1a x:0 0 0 2026-03-10T14:07:48.535 INFO:tasks.workunit.client.0.vm03.stdout:9/69: creat d2/d14/f1a x:0 0 0 2026-03-10T14:07:48.536 INFO:tasks.workunit.client.0.vm03.stdout:3/67: stat l5 0 2026-03-10T14:07:48.536 INFO:tasks.workunit.client.0.vm03.stdout:3/68: truncate fb 933081 0 2026-03-10T14:07:48.537 INFO:tasks.workunit.client.0.vm03.stdout:9/70: dread - d2/f11 zero size 2026-03-10T14:07:48.539 INFO:tasks.workunit.client.0.vm03.stdout:9/71: write d2/f3 [113070,93871] 0 2026-03-10T14:07:48.548 INFO:tasks.workunit.client.1.vm04.stdout:0/488: link d0/d2/d15/d49/d50/l8e d0/d2/d15/d22/d38/d56/l97 0 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.1.vm04.stdout:0/489: dread - 
d0/d2/d15/d22/d38/d56/d66/f7a zero size 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.1.vm04.stdout:6/430: symlink d3/de/d35/d3f/d2d/d32/d23/d24/d6f/d71/l7b 0 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.0.vm03.stdout:9/72: write d2/d14/f1a [795101,19933] 0 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.0.vm03.stdout:9/73: read d2/fc [1159167,33876] 0 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.0.vm03.stdout:3/69: symlink l14 0 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.0.vm03.stdout:3/70: unlink fd 0 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.0.vm03.stdout:3/71: chown l14 6 1 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.0.vm03.stdout:9/74: link d2/f11 d2/d14/f1b 0 2026-03-10T14:07:48.549 INFO:tasks.workunit.client.1.vm04.stdout:8/570: truncate d0/d3/d63/f5f 1455241 0 2026-03-10T14:07:48.550 INFO:tasks.workunit.client.0.vm03.stdout:3/72: creat f15 x:0 0 0 2026-03-10T14:07:48.550 INFO:tasks.workunit.client.0.vm03.stdout:3/73: chown lf 912608 1 2026-03-10T14:07:48.556 INFO:tasks.workunit.client.0.vm03.stdout:6/54: sync 2026-03-10T14:07:48.568 INFO:tasks.workunit.client.1.vm04.stdout:8/571: dwrite d0/d75/f83 [0,4194304] 0 2026-03-10T14:07:48.568 INFO:tasks.workunit.client.0.vm03.stdout:6/55: dread f3 [0,4194304] 0 2026-03-10T14:07:48.568 INFO:tasks.workunit.client.0.vm03.stdout:6/56: link f2 d8/db/df/f10 0 2026-03-10T14:07:48.568 INFO:tasks.workunit.client.0.vm03.stdout:6/57: chown d8/fd 789903 1 2026-03-10T14:07:48.568 INFO:tasks.workunit.client.0.vm03.stdout:6/58: dread f2 [0,4194304] 0 2026-03-10T14:07:48.568 INFO:tasks.workunit.client.0.vm03.stdout:6/59: rmdir d8/db/df 39 2026-03-10T14:07:48.569 INFO:tasks.workunit.client.1.vm04.stdout:8/572: readlink d0/d3/d63/d29/l5a 0 2026-03-10T14:07:48.585 INFO:tasks.workunit.client.1.vm04.stdout:8/573: chown d0/d3/d63/d12/d69/f81 58 1 2026-03-10T14:07:48.593 INFO:tasks.workunit.client.1.vm04.stdout:8/574: creat d0/d3/d63/d12/d51/d67/fb2 x:0 0 0 2026-03-10T14:07:48.595 
INFO:tasks.workunit.client.1.vm04.stdout:8/575: mknod d0/d3/d63/d12/d51/d67/d96/cb3 0 2026-03-10T14:07:48.596 INFO:tasks.workunit.client.1.vm04.stdout:8/576: write d0/d3/f80 [1690888,73830] 0 2026-03-10T14:07:48.599 INFO:tasks.workunit.client.1.vm04.stdout:8/577: symlink d0/d3/dd/d89/lb4 0 2026-03-10T14:07:48.600 INFO:tasks.workunit.client.1.vm04.stdout:8/578: mkdir d0/d3/dd/d89/db5 0 2026-03-10T14:07:48.601 INFO:tasks.workunit.client.1.vm04.stdout:8/579: stat d0/d3/d63/d12/d69/f8c 0 2026-03-10T14:07:48.601 INFO:tasks.workunit.client.1.vm04.stdout:8/580: write d0/f23 [2122846,98943] 0 2026-03-10T14:07:48.605 INFO:tasks.workunit.client.1.vm04.stdout:8/581: symlink d0/d3/d63/d12/d51/lb6 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/582: symlink d0/d3/dd/d89/lb7 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/583: mkdir d0/d3/d73/db8 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/584: creat d0/d3/d5/fb9 x:0 0 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/585: truncate d0/d3/d63/d12/f2c 80779 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/586: creat d0/d3/d63/d29/fba x:0 0 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/587: rename d0/d3/d73/f9a to d0/d3/d63/d29/fbb 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/588: getdents d0/d3/d73/db8 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/589: write d0/d3/d63/d12/f47 [2227001,30626] 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/590: dwrite d0/d3/dd/d89/fa9 [0,4194304] 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/591: rename d0/d3/d73/lb1 to d0/d3/d73/db8/lbc 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/592: fsync d0/d75/d8a/f9e 0 2026-03-10T14:07:48.631 INFO:tasks.workunit.client.1.vm04.stdout:8/593: fdatasync d0/d75/d8a/fa1 0 2026-03-10T14:07:48.631 
INFO:tasks.workunit.client.1.vm04.stdout:8/594: write d0/d75/d8a/f9e [558095,75198] 0 2026-03-10T14:07:48.632 INFO:tasks.workunit.client.1.vm04.stdout:8/595: dwrite d0/d3/d63/f8f [0,4194304] 0 2026-03-10T14:07:48.635 INFO:tasks.workunit.client.1.vm04.stdout:8/596: mknod d0/d3/d63/d29/cbd 0 2026-03-10T14:07:48.660 INFO:tasks.workunit.client.1.vm04.stdout:6/431: read d3/de/d35/d3f/d2d/d32/d23/d24/f25 [582258,115014] 0 2026-03-10T14:07:48.661 INFO:tasks.workunit.client.1.vm04.stdout:6/432: unlink d3/de/d35/d3f/d2d/d32/d23/d4e/f69 0 2026-03-10T14:07:48.675 INFO:tasks.workunit.client.0.vm03.stdout:6/60: sync 2026-03-10T14:07:48.676 INFO:tasks.workunit.client.0.vm03.stdout:7/105: dread d5/d9/d14/ff [0,4194304] 0 2026-03-10T14:07:48.677 INFO:tasks.workunit.client.0.vm03.stdout:6/61: mkdir d8/d11 0 2026-03-10T14:07:48.677 INFO:tasks.workunit.client.0.vm03.stdout:6/62: chown c6 205836 1 2026-03-10T14:07:48.679 INFO:tasks.workunit.client.0.vm03.stdout:7/106: creat d5/f1b x:0 0 0 2026-03-10T14:07:48.683 INFO:tasks.workunit.client.0.vm03.stdout:7/107: dwrite d5/d9/f17 [4194304,4194304] 0 2026-03-10T14:07:48.687 INFO:tasks.workunit.client.0.vm03.stdout:7/108: mkdir d5/d9/d14/d1c 0 2026-03-10T14:07:48.703 INFO:tasks.workunit.client.1.vm04.stdout:0/490: sync 2026-03-10T14:07:48.705 INFO:tasks.workunit.client.0.vm03.stdout:1/68: fdatasync d0/f4 0 2026-03-10T14:07:48.709 INFO:tasks.workunit.client.1.vm04.stdout:0/491: dread - d0/d6e/f8b zero size 2026-03-10T14:07:48.722 INFO:tasks.workunit.client.1.vm04.stdout:0/492: dread d0/d2/d15/d22/d38/d56/d66/f29 [0,4194304] 0 2026-03-10T14:07:48.725 INFO:tasks.workunit.client.0.vm03.stdout:6/63: fsync f3 0 2026-03-10T14:07:48.729 INFO:tasks.workunit.client.1.vm04.stdout:0/493: dread - d0/d2/d15/d49/f7c zero size 2026-03-10T14:07:48.729 INFO:tasks.workunit.client.0.vm03.stdout:6/64: dread f3 [0,4194304] 0 2026-03-10T14:07:48.729 INFO:tasks.workunit.client.0.vm03.stdout:6/65: chown l4 1503943 1 2026-03-10T14:07:48.732 
INFO:tasks.workunit.client.1.vm04.stdout:0/494: creat d0/d2/d15/d49/d50/d61/d75/f98 x:0 0 0 2026-03-10T14:07:48.734 INFO:tasks.workunit.client.0.vm03.stdout:6/66: write d8/db/df/f10 [3579987,6083] 0 2026-03-10T14:07:48.734 INFO:tasks.workunit.client.1.vm04.stdout:0/495: write d0/d2/d15/d22/d38/d56/d66/f2c [4033375,41227] 0 2026-03-10T14:07:48.735 INFO:tasks.workunit.client.1.vm04.stdout:0/496: dread - d0/d2/d15/d49/d50/d61/d75/f80 zero size 2026-03-10T14:07:48.739 INFO:tasks.workunit.client.0.vm03.stdout:6/67: mkdir d8/db/d12 0 2026-03-10T14:07:48.740 INFO:tasks.workunit.client.1.vm04.stdout:0/497: creat d0/f99 x:0 0 0 2026-03-10T14:07:48.776 INFO:tasks.workunit.client.1.vm04.stdout:5/599: write d7/d9/f20 [3582506,91709] 0 2026-03-10T14:07:48.780 INFO:tasks.workunit.client.1.vm04.stdout:0/498: dread d0/d2/f12 [4194304,4194304] 0 2026-03-10T14:07:48.781 INFO:tasks.workunit.client.0.vm03.stdout:9/75: fdatasync d2/f3 0 2026-03-10T14:07:48.782 INFO:tasks.workunit.client.1.vm04.stdout:5/600: symlink d7/d2d/d32/d34/lc1 0 2026-03-10T14:07:48.786 INFO:tasks.workunit.client.1.vm04.stdout:0/499: chown d0/d2/d15/d22/d38/d56/f67 2419 1 2026-03-10T14:07:48.787 INFO:tasks.workunit.client.0.vm03.stdout:9/76: dwrite d2/d14/f1a [0,4194304] 0 2026-03-10T14:07:48.789 INFO:tasks.workunit.client.1.vm04.stdout:0/500: rename d0/d2/d25/f33 to d0/d2/d15/d49/d50/d61/d75/f9a 0 2026-03-10T14:07:48.789 INFO:tasks.workunit.client.0.vm03.stdout:9/77: mknod d2/d14/c1c 0 2026-03-10T14:07:48.790 INFO:tasks.workunit.client.1.vm04.stdout:5/601: link d7/d12/d2b/d3e/d3f/c7c d7/d12/d45/cc2 0 2026-03-10T14:07:48.797 INFO:tasks.workunit.client.0.vm03.stdout:9/78: dwrite d2/f15 [0,4194304] 0 2026-03-10T14:07:48.801 INFO:tasks.workunit.client.0.vm03.stdout:9/79: symlink d2/d14/l1d 0 2026-03-10T14:07:48.802 INFO:tasks.workunit.client.1.vm04.stdout:0/501: rename d0/d2/d15/d22/d38/d56/l6a to d0/d2/d15/d22/d38/d56/l9b 0 2026-03-10T14:07:48.806 INFO:tasks.workunit.client.0.vm03.stdout:9/80: dwrite d2/ff 
[0,4194304] 0 2026-03-10T14:07:48.810 INFO:tasks.workunit.client.1.vm04.stdout:0/502: rename d0/d2/d15/d49/d50/d61/c91 to d0/d2/d15/d22/d38/d56/d66/c9c 0 2026-03-10T14:07:48.822 INFO:tasks.workunit.client.1.vm04.stdout:0/503: readlink d0/d2/l39 0 2026-03-10T14:07:48.823 INFO:tasks.workunit.client.1.vm04.stdout:0/504: dwrite d0/d2/d15/f2f [0,4194304] 0 2026-03-10T14:07:48.851 INFO:tasks.workunit.client.0.vm03.stdout:2/67: dwrite f1 [0,4194304] 0 2026-03-10T14:07:48.857 INFO:tasks.workunit.client.0.vm03.stdout:0/85: dwrite d3/fe [0,4194304] 0 2026-03-10T14:07:48.857 INFO:tasks.workunit.client.0.vm03.stdout:0/86: chown d3/l5 4100170 1 2026-03-10T14:07:48.858 INFO:tasks.workunit.client.0.vm03.stdout:0/87: chown d3/l5 41 1 2026-03-10T14:07:48.859 INFO:tasks.workunit.client.1.vm04.stdout:5/602: sync 2026-03-10T14:07:48.863 INFO:tasks.workunit.client.0.vm03.stdout:2/68: rmdir d5 39 2026-03-10T14:07:48.863 INFO:tasks.workunit.client.0.vm03.stdout:5/92: getdents d4/d16/d19 0 2026-03-10T14:07:48.865 INFO:tasks.workunit.client.0.vm03.stdout:5/93: dread d4/d6/fa [0,4194304] 0 2026-03-10T14:07:48.866 INFO:tasks.workunit.client.0.vm03.stdout:5/94: dread - d4/d16/d19/f25 zero size 2026-03-10T14:07:48.871 INFO:tasks.workunit.client.0.vm03.stdout:0/88: creat d3/d17/f1a x:0 0 0 2026-03-10T14:07:48.872 INFO:tasks.workunit.client.0.vm03.stdout:0/89: chown d3/l5 11117735 1 2026-03-10T14:07:48.872 INFO:tasks.workunit.client.0.vm03.stdout:2/69: dwrite d5/fa [0,4194304] 0 2026-03-10T14:07:48.873 INFO:tasks.workunit.client.0.vm03.stdout:0/90: write d3/d11/f13 [2020770,45141] 0 2026-03-10T14:07:48.873 INFO:tasks.workunit.client.0.vm03.stdout:0/91: stat d3/f19 0 2026-03-10T14:07:48.873 INFO:tasks.workunit.client.0.vm03.stdout:0/92: read d3/f9 [4167916,64489] 0 2026-03-10T14:07:48.875 INFO:tasks.workunit.client.0.vm03.stdout:2/70: truncate d5/d10/d17/f18 800903 0 2026-03-10T14:07:48.878 INFO:tasks.workunit.client.0.vm03.stdout:2/71: write d5/d10/f12 [608005,12004] 0 2026-03-10T14:07:48.883 
INFO:tasks.workunit.client.0.vm03.stdout:5/95: unlink d4/lf 0 2026-03-10T14:07:48.888 INFO:tasks.workunit.client.0.vm03.stdout:5/96: dwrite d4/d6/f10 [0,4194304] 0 2026-03-10T14:07:48.894 INFO:tasks.workunit.client.0.vm03.stdout:0/93: mkdir d3/d11/d1b 0 2026-03-10T14:07:48.895 INFO:tasks.workunit.client.0.vm03.stdout:5/97: symlink d4/d16/l27 0 2026-03-10T14:07:48.897 INFO:tasks.workunit.client.0.vm03.stdout:5/98: dread d4/f17 [0,4194304] 0 2026-03-10T14:07:48.899 INFO:tasks.workunit.client.0.vm03.stdout:0/94: getdents d3/d16 0 2026-03-10T14:07:48.900 INFO:tasks.workunit.client.0.vm03.stdout:5/99: dwrite d4/d13/f24 [0,4194304] 0 2026-03-10T14:07:48.902 INFO:tasks.workunit.client.0.vm03.stdout:5/100: truncate d4/d16/d19/f25 881486 0 2026-03-10T14:07:48.902 INFO:tasks.workunit.client.0.vm03.stdout:5/101: readlink d4/d16/l27 0 2026-03-10T14:07:48.905 INFO:tasks.workunit.client.0.vm03.stdout:5/102: mknod d4/d6/de/c28 0 2026-03-10T14:07:48.906 INFO:tasks.workunit.client.0.vm03.stdout:5/103: dread d4/d16/f1b [0,4194304] 0 2026-03-10T14:07:48.908 INFO:tasks.workunit.client.0.vm03.stdout:5/104: link d4/d13/f24 d4/f29 0 2026-03-10T14:07:48.909 INFO:tasks.workunit.client.0.vm03.stdout:5/105: stat d4/d16/f1b 0 2026-03-10T14:07:48.909 INFO:tasks.workunit.client.0.vm03.stdout:5/106: truncate d4/d16/f1c 5235111 0 2026-03-10T14:07:48.926 INFO:tasks.workunit.client.1.vm04.stdout:5/603: sync 2026-03-10T14:07:48.927 INFO:tasks.workunit.client.0.vm03.stdout:2/72: sync 2026-03-10T14:07:48.928 INFO:tasks.workunit.client.0.vm03.stdout:2/73: dread - d5/d10/f16 zero size 2026-03-10T14:07:48.931 INFO:tasks.workunit.client.1.vm04.stdout:5/604: rmdir d7/d59 39 2026-03-10T14:07:48.931 INFO:tasks.workunit.client.0.vm03.stdout:2/74: creat d5/d10/d17/f19 x:0 0 0 2026-03-10T14:07:48.931 INFO:tasks.workunit.client.1.vm04.stdout:5/605: chown d7/d26/l66 1141307551 1 2026-03-10T14:07:48.934 INFO:tasks.workunit.client.1.vm04.stdout:5/606: read d7/d12/d2b/d3e/d3f/f88 [1064875,64349] 0 
2026-03-10T14:07:48.938 INFO:tasks.workunit.client.1.vm04.stdout:5/607: dwrite d7/d2d/d32/f9d [0,4194304] 0 2026-03-10T14:07:48.944 INFO:tasks.workunit.client.1.vm04.stdout:5/608: unlink d7/d9/fd 0 2026-03-10T14:07:49.004 INFO:tasks.workunit.client.0.vm03.stdout:2/75: dread d5/d10/f12 [0,4194304] 0 2026-03-10T14:07:49.013 INFO:tasks.workunit.client.0.vm03.stdout:2/76: dwrite d5/d10/f16 [0,4194304] 0 2026-03-10T14:07:49.023 INFO:tasks.workunit.client.0.vm03.stdout:2/77: link d5/le d5/d10/l1a 0 2026-03-10T14:07:49.024 INFO:tasks.workunit.client.0.vm03.stdout:2/78: symlink d5/d10/l1b 0 2026-03-10T14:07:49.025 INFO:tasks.workunit.client.0.vm03.stdout:2/79: readlink d5/d10/l1a 0 2026-03-10T14:07:49.026 INFO:tasks.workunit.client.0.vm03.stdout:2/80: mkdir d5/d10/d1c 0 2026-03-10T14:07:49.028 INFO:tasks.workunit.client.0.vm03.stdout:0/95: dread d3/f10 [0,4194304] 0 2026-03-10T14:07:49.028 INFO:tasks.workunit.client.0.vm03.stdout:0/96: chown d3/f10 11083 1 2026-03-10T14:07:49.053 INFO:tasks.workunit.client.0.vm03.stdout:2/81: creat d5/d10/d1c/f1d x:0 0 0 2026-03-10T14:07:49.053 INFO:tasks.workunit.client.0.vm03.stdout:2/82: fdatasync f4 0 2026-03-10T14:07:49.055 INFO:tasks.workunit.client.0.vm03.stdout:2/83: creat d5/f1e x:0 0 0 2026-03-10T14:07:49.056 INFO:tasks.workunit.client.0.vm03.stdout:2/84: truncate d5/d10/d17/f19 244615 0 2026-03-10T14:07:49.059 INFO:tasks.workunit.client.0.vm03.stdout:2/85: unlink d5/d10/c15 0 2026-03-10T14:07:49.060 INFO:tasks.workunit.client.0.vm03.stdout:2/86: mkdir d5/d10/d1f 0 2026-03-10T14:07:49.063 INFO:tasks.workunit.client.0.vm03.stdout:2/87: write d5/f9 [3577179,79934] 0 2026-03-10T14:07:49.070 INFO:tasks.workunit.client.0.vm03.stdout:2/88: dwrite d5/d10/d17/f19 [0,4194304] 0 2026-03-10T14:07:49.075 INFO:tasks.workunit.client.0.vm03.stdout:2/89: creat d5/d10/d17/f20 x:0 0 0 2026-03-10T14:07:49.086 INFO:tasks.workunit.client.0.vm03.stdout:2/90: dwrite d5/d10/d17/f19 [0,4194304] 0 2026-03-10T14:07:49.093 
INFO:tasks.workunit.client.0.vm03.stdout:2/91: chown d5/fb 85 1 2026-03-10T14:07:49.096 INFO:tasks.workunit.client.0.vm03.stdout:2/92: dread d5/d10/f13 [0,4194304] 0 2026-03-10T14:07:49.096 INFO:tasks.workunit.client.0.vm03.stdout:2/93: read d5/d10/f13 [2671729,86555] 0 2026-03-10T14:07:49.097 INFO:tasks.workunit.client.0.vm03.stdout:2/94: write d5/d10/f16 [2807210,69880] 0 2026-03-10T14:07:49.098 INFO:tasks.workunit.client.0.vm03.stdout:2/95: fsync f4 0 2026-03-10T14:07:49.100 INFO:tasks.workunit.client.0.vm03.stdout:2/96: dread d5/d10/d17/f18 [0,4194304] 0 2026-03-10T14:07:49.105 INFO:tasks.workunit.client.0.vm03.stdout:2/97: creat d5/d10/d1c/f21 x:0 0 0 2026-03-10T14:07:49.110 INFO:tasks.workunit.client.0.vm03.stdout:2/98: rename f0 to d5/d10/f22 0 2026-03-10T14:07:49.117 INFO:tasks.workunit.client.0.vm03.stdout:2/99: rename d5/d10/d1c/f21 to d5/f23 0 2026-03-10T14:07:49.129 INFO:tasks.workunit.client.1.vm04.stdout:4/541: write d4/f21 [2411069,117462] 0 2026-03-10T14:07:49.136 INFO:tasks.workunit.client.1.vm04.stdout:4/542: readlink d4/df/db2/db4/d47/d70/d74/lbd 0 2026-03-10T14:07:49.136 INFO:tasks.workunit.client.0.vm03.stdout:2/100: sync 2026-03-10T14:07:49.137 INFO:tasks.workunit.client.0.vm03.stdout:2/101: read d5/d10/f16 [1585117,92268] 0 2026-03-10T14:07:49.138 INFO:tasks.workunit.client.0.vm03.stdout:2/102: fsync f4 0 2026-03-10T14:07:49.140 INFO:tasks.workunit.client.1.vm04.stdout:4/543: link d4/d14/l4a d4/d14/dac/db7/lc4 0 2026-03-10T14:07:49.140 INFO:tasks.workunit.client.1.vm04.stdout:4/544: fdatasync d4/df/db2/db4/f4b 0 2026-03-10T14:07:49.141 INFO:tasks.workunit.client.0.vm03.stdout:2/103: dread d5/d10/d17/f19 [0,4194304] 0 2026-03-10T14:07:49.141 INFO:tasks.workunit.client.1.vm04.stdout:4/545: chown d4/f5f 33117143 1 2026-03-10T14:07:49.142 INFO:tasks.workunit.client.0.vm03.stdout:2/104: readlink d5/l11 0 2026-03-10T14:07:49.142 INFO:tasks.workunit.client.1.vm04.stdout:4/546: stat d4/d14/d3c/f46 0 2026-03-10T14:07:49.143 
INFO:tasks.workunit.client.0.vm03.stdout:2/105: creat d5/d10/d1f/f24 x:0 0 0 2026-03-10T14:07:49.145 INFO:tasks.workunit.client.0.vm03.stdout:2/106: dread d5/ff [0,4194304] 0 2026-03-10T14:07:49.147 INFO:tasks.workunit.client.1.vm04.stdout:4/547: rmdir d4/d14 39 2026-03-10T14:07:49.147 INFO:tasks.workunit.client.0.vm03.stdout:2/107: dread f1 [0,4194304] 0 2026-03-10T14:07:49.148 INFO:tasks.workunit.client.1.vm04.stdout:4/548: dread - d4/df/d31/fa7 zero size 2026-03-10T14:07:49.148 INFO:tasks.workunit.client.0.vm03.stdout:2/108: chown d5/d10/f16 53596740 1 2026-03-10T14:07:49.154 INFO:tasks.workunit.client.1.vm04.stdout:4/549: chown d4/f96 27826 1 2026-03-10T14:07:49.169 INFO:tasks.workunit.client.1.vm04.stdout:4/550: link d4/d14/d1b/f99 d4/df/d34/fc5 0 2026-03-10T14:07:49.174 INFO:tasks.workunit.client.1.vm04.stdout:4/551: dread d4/d14/d1b/f9d [0,4194304] 0 2026-03-10T14:07:49.182 INFO:tasks.workunit.client.1.vm04.stdout:4/552: dwrite d4/df/db2/db4/d47/d4f/fa6 [0,4194304] 0 2026-03-10T14:07:49.187 INFO:tasks.workunit.client.1.vm04.stdout:4/553: chown d4/df/db2/db4/d47/d4f/f84 125588 1 2026-03-10T14:07:49.192 INFO:tasks.workunit.client.1.vm04.stdout:4/554: dread d4/df/d34/f95 [0,4194304] 0 2026-03-10T14:07:49.235 INFO:tasks.workunit.client.1.vm04.stdout:9/478: truncate d9/da/d5d/f90 3218943 0 2026-03-10T14:07:49.237 INFO:tasks.workunit.client.1.vm04.stdout:3/556: dwrite da/dc/d35/d52/f72 [4194304,4194304] 0 2026-03-10T14:07:49.244 INFO:tasks.workunit.client.1.vm04.stdout:9/479: truncate d9/d5c/d93/d96/d9d/fa2 973329 0 2026-03-10T14:07:49.247 INFO:tasks.workunit.client.1.vm04.stdout:3/557: unlink da/dc/d3f/c5f 0 2026-03-10T14:07:49.257 INFO:tasks.workunit.client.1.vm04.stdout:3/558: dwrite da/dc/d3f/d54/d66/f9e [0,4194304] 0 2026-03-10T14:07:49.260 INFO:tasks.workunit.client.1.vm04.stdout:9/480: dread d9/f20 [0,4194304] 0 2026-03-10T14:07:49.265 INFO:tasks.workunit.client.1.vm04.stdout:9/481: mkdir d9/da/dd/d1c/da3 0 2026-03-10T14:07:49.277 
INFO:tasks.workunit.client.1.vm04.stdout:9/482: dread d9/da/dd/f31 [0,4194304] 0 2026-03-10T14:07:49.284 INFO:tasks.workunit.client.1.vm04.stdout:9/483: dread d9/da/dd/f31 [0,4194304] 0 2026-03-10T14:07:49.287 INFO:tasks.workunit.client.1.vm04.stdout:3/559: link da/dc/l7f da/dc/d35/d52/d70/lc3 0 2026-03-10T14:07:49.291 INFO:tasks.workunit.client.1.vm04.stdout:2/551: dwrite d0/d14/d91/d3a/d3e/f61 [0,4194304] 0 2026-03-10T14:07:49.292 INFO:tasks.workunit.client.1.vm04.stdout:9/484: dread d9/d5c/f6e [0,4194304] 0 2026-03-10T14:07:49.293 INFO:tasks.workunit.client.1.vm04.stdout:7/544: dwrite d2/dc/de/d2d/d60/d7c/f84 [0,4194304] 0 2026-03-10T14:07:49.297 INFO:tasks.workunit.client.1.vm04.stdout:9/485: read d9/d5c/f6e [104461,90114] 0 2026-03-10T14:07:49.316 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:49 vm04.local ceph-mon[55966]: pgmap v153: 65 pgs: 65 active+clean; 1.3 GiB data, 5.1 GiB used, 115 GiB / 120 GiB avail; 26 MiB/s rd, 69 MiB/s wr, 380 op/s 2026-03-10T14:07:49.316 INFO:tasks.workunit.client.1.vm04.stdout:2/552: truncate d0/d14/d39/f44 3008671 0 2026-03-10T14:07:49.321 INFO:tasks.workunit.client.1.vm04.stdout:7/545: mknod d2/dc/de/d2d/d60/d7c/d44/dc0/ccf 0 2026-03-10T14:07:49.338 INFO:tasks.workunit.client.1.vm04.stdout:9/486: link d9/d33/f4b d9/d1d/fa4 0 2026-03-10T14:07:49.349 INFO:tasks.workunit.client.1.vm04.stdout:1/519: dwrite d3/d22/d2f/d57/f68 [0,4194304] 0 2026-03-10T14:07:49.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:49 vm03.local ceph-mon[49718]: pgmap v153: 65 pgs: 65 active+clean; 1.3 GiB data, 5.1 GiB used, 115 GiB / 120 GiB avail; 26 MiB/s rd, 69 MiB/s wr, 380 op/s 2026-03-10T14:07:49.382 INFO:tasks.workunit.client.1.vm04.stdout:9/487: mkdir d9/d1d/da5 0 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:1/520: truncate d3/f14 2089798 0 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:9/488: dread d9/d44/f51 [0,4194304] 0 2026-03-10T14:07:49.393 
INFO:tasks.workunit.client.1.vm04.stdout:9/489: rename d9/d44/d70/f86 to d9/d5c/fa6 0 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:1/521: creat d3/d20/d60/fb6 x:0 0 0 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:1/522: dread - d3/d22/f8e zero size 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:9/490: fdatasync d9/d44/d4d/f99 0 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:2/553: link d0/d14/d39/l97 d0/d14/d91/d8/dd/d26/d46/laa 0 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:1/523: dread d3/f2c [0,4194304] 0 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:1/524: mkdir d3/d8f/db7 0 2026-03-10T14:07:49.393 INFO:tasks.workunit.client.1.vm04.stdout:1/525: stat d3/d22/d63/fb3 0 2026-03-10T14:07:49.394 INFO:tasks.workunit.client.1.vm04.stdout:1/526: creat d3/d20/d60/d82/d13/d38/db5/fb8 x:0 0 0 2026-03-10T14:07:49.395 INFO:tasks.workunit.client.1.vm04.stdout:1/527: fdatasync d3/d20/d60/d82/d13/d38/d58/faf 0 2026-03-10T14:07:49.396 INFO:tasks.workunit.client.1.vm04.stdout:1/528: fsync d3/d22/d2f/d57/fac 0 2026-03-10T14:07:49.399 INFO:tasks.workunit.client.1.vm04.stdout:1/529: dwrite f1 [0,4194304] 0 2026-03-10T14:07:49.401 INFO:tasks.workunit.client.1.vm04.stdout:1/530: chown d3/d20/d60/d82/d13/d38/d58/d5b/f7c 60 1 2026-03-10T14:07:49.410 INFO:tasks.workunit.client.1.vm04.stdout:1/531: dwrite d3/d20/d60/d82/d13/d1a/f28 [4194304,4194304] 0 2026-03-10T14:07:49.416 INFO:tasks.workunit.client.1.vm04.stdout:1/532: fdatasync d3/d22/d63/f7f 0 2026-03-10T14:07:49.417 INFO:tasks.workunit.client.1.vm04.stdout:1/533: chown d3/d22/l8c 126376 1 2026-03-10T14:07:49.423 INFO:tasks.workunit.client.1.vm04.stdout:1/534: creat d3/d20/d60/d82/d13/da0/fb9 x:0 0 0 2026-03-10T14:07:49.424 INFO:tasks.workunit.client.1.vm04.stdout:1/535: mknod d3/d5c/d79/cba 0 2026-03-10T14:07:49.427 INFO:tasks.workunit.client.1.vm04.stdout:1/536: link d3/d20/d60/d82/d13/d38/l7e d3/d22/d63/d35/d6c/lbb 0 
2026-03-10T14:07:49.488 INFO:tasks.workunit.client.1.vm04.stdout:7/546: sync 2026-03-10T14:07:49.506 INFO:tasks.workunit.client.1.vm04.stdout:8/597: getdents d0/d3/d63/d12/d51 0 2026-03-10T14:07:49.575 INFO:tasks.workunit.client.1.vm04.stdout:6/433: dwrite d3/de/d35/d3a/d43/d4c/f53 [0,4194304] 0 2026-03-10T14:07:49.579 INFO:tasks.workunit.client.1.vm04.stdout:6/434: chown d3/de/d35/d3f/d2d/d32/d23/d24/f25 28680863 1 2026-03-10T14:07:49.606 INFO:tasks.workunit.client.1.vm04.stdout:0/505: dwrite d0/d2/d15/d49/d50/f55 [0,4194304] 0 2026-03-10T14:07:49.609 INFO:tasks.workunit.client.1.vm04.stdout:0/506: chown d0/d2/d15/d22/d38/d56/d66/l3a 17898221 1 2026-03-10T14:07:49.616 INFO:tasks.workunit.client.1.vm04.stdout:0/507: link d0/d2/d15/d22/l69 d0/d2/d25/l9d 0 2026-03-10T14:07:49.659 INFO:tasks.workunit.client.1.vm04.stdout:6/435: sync 2026-03-10T14:07:49.671 INFO:tasks.workunit.client.0.vm03.stdout:4/87: rmdir d5/d9 39 2026-03-10T14:07:49.672 INFO:tasks.workunit.client.0.vm03.stdout:4/88: stat d5/d9 0 2026-03-10T14:07:49.675 INFO:tasks.workunit.client.0.vm03.stdout:4/89: rename d5/d9/db/c18 to d5/c1b 0 2026-03-10T14:07:49.675 INFO:tasks.workunit.client.0.vm03.stdout:4/90: dread - d5/d9/fa zero size 2026-03-10T14:07:49.677 INFO:tasks.workunit.client.0.vm03.stdout:4/91: unlink d5/f13 0 2026-03-10T14:07:49.679 INFO:tasks.workunit.client.0.vm03.stdout:4/92: symlink d5/l1c 0 2026-03-10T14:07:49.680 INFO:tasks.workunit.client.1.vm04.stdout:5/609: dwrite d7/d2d/f6d [0,4194304] 0 2026-03-10T14:07:49.681 INFO:tasks.workunit.client.0.vm03.stdout:4/93: rename d5/l1c to d5/d9/db/l1d 0 2026-03-10T14:07:49.684 INFO:tasks.workunit.client.0.vm03.stdout:4/94: dread d5/f14 [0,4194304] 0 2026-03-10T14:07:49.693 INFO:tasks.workunit.client.0.vm03.stdout:4/95: mkdir d5/d9/db/d19/d1e 0 2026-03-10T14:07:49.694 INFO:tasks.workunit.client.0.vm03.stdout:4/96: read - d5/d9/fa zero size 2026-03-10T14:07:49.697 INFO:tasks.workunit.client.0.vm03.stdout:4/97: rename d5/f8 to d5/f1f 0 
2026-03-10T14:07:49.698 INFO:tasks.workunit.client.0.vm03.stdout:4/98: truncate d5/d9/f16 98408 0 2026-03-10T14:07:49.699 INFO:tasks.workunit.client.0.vm03.stdout:4/99: write d5/f1f [3476809,47647] 0 2026-03-10T14:07:49.702 INFO:tasks.workunit.client.0.vm03.stdout:4/100: creat d5/d9/db/f20 x:0 0 0 2026-03-10T14:07:49.702 INFO:tasks.workunit.client.0.vm03.stdout:4/101: fsync d5/d9/fa 0 2026-03-10T14:07:49.706 INFO:tasks.workunit.client.0.vm03.stdout:4/102: creat d5/d9/db/d19/d1e/f21 x:0 0 0 2026-03-10T14:07:49.706 INFO:tasks.workunit.client.0.vm03.stdout:4/103: truncate d5/d9/db/ff 2036718 0 2026-03-10T14:07:49.731 INFO:tasks.workunit.client.1.vm04.stdout:6/436: creat d3/de/d35/d3f/d2d/d32/d23/f7c x:0 0 0 2026-03-10T14:07:49.733 INFO:tasks.workunit.client.1.vm04.stdout:4/555: dwrite d4/d14/f98 [0,4194304] 0 2026-03-10T14:07:49.754 INFO:tasks.workunit.client.1.vm04.stdout:4/556: mknod d4/d14/d6d/cc6 0 2026-03-10T14:07:49.755 INFO:tasks.workunit.client.1.vm04.stdout:4/557: chown d4/df/db2/db4/l68 115101 1 2026-03-10T14:07:49.760 INFO:tasks.workunit.client.0.vm03.stdout:3/74: getdents . 
0 2026-03-10T14:07:49.761 INFO:tasks.workunit.client.1.vm04.stdout:3/560: dwrite da/f25 [0,4194304] 0 2026-03-10T14:07:49.798 INFO:tasks.workunit.client.0.vm03.stdout:6/68: fsync d8/db/df/f10 0 2026-03-10T14:07:49.799 INFO:tasks.workunit.client.0.vm03.stdout:6/69: mkdir d8/d11/d13 0 2026-03-10T14:07:49.800 INFO:tasks.workunit.client.0.vm03.stdout:6/70: truncate d8/fd 822708 0 2026-03-10T14:07:49.804 INFO:tasks.workunit.client.0.vm03.stdout:6/71: dwrite f5 [0,4194304] 0 2026-03-10T14:07:49.809 INFO:tasks.workunit.client.0.vm03.stdout:7/109: getdents d5 0 2026-03-10T14:07:49.810 INFO:tasks.workunit.client.0.vm03.stdout:7/110: chown d5/d9/d14/d1c 328457 1 2026-03-10T14:07:49.811 INFO:tasks.workunit.client.0.vm03.stdout:6/72: rename c7 to d8/d11/c14 0 2026-03-10T14:07:49.816 INFO:tasks.workunit.client.0.vm03.stdout:6/73: symlink d8/l15 0 2026-03-10T14:07:49.816 INFO:tasks.workunit.client.0.vm03.stdout:7/111: getdents d5 0 2026-03-10T14:07:49.816 INFO:tasks.workunit.client.0.vm03.stdout:6/74: readlink d8/l15 0 2026-03-10T14:07:49.828 INFO:tasks.workunit.client.0.vm03.stdout:6/75: link d8/db/cc d8/d11/c16 0 2026-03-10T14:07:49.830 INFO:tasks.workunit.client.0.vm03.stdout:6/76: dread f5 [0,4194304] 0 2026-03-10T14:07:49.830 INFO:tasks.workunit.client.0.vm03.stdout:6/77: readlink d8/l15 0 2026-03-10T14:07:49.832 INFO:tasks.workunit.client.0.vm03.stdout:6/78: dread d8/db/df/f10 [0,4194304] 0 2026-03-10T14:07:49.834 INFO:tasks.workunit.client.0.vm03.stdout:7/112: link d5/d9/d14/c13 d5/d9/d14/d1c/c1d 0 2026-03-10T14:07:49.836 INFO:tasks.workunit.client.0.vm03.stdout:7/113: dread d5/f7 [4194304,4194304] 0 2026-03-10T14:07:49.846 INFO:tasks.workunit.client.0.vm03.stdout:6/79: rename c6 to d8/db/d12/c17 0 2026-03-10T14:07:49.854 INFO:tasks.workunit.client.0.vm03.stdout:7/114: creat d5/d9/f1e x:0 0 0 2026-03-10T14:07:49.856 INFO:tasks.workunit.client.0.vm03.stdout:6/80: rename d8/d11/d13 to d8/d11/d18 0 2026-03-10T14:07:49.858 INFO:tasks.workunit.client.0.vm03.stdout:6/81: dread 
f2 [0,4194304] 0 2026-03-10T14:07:49.862 INFO:tasks.workunit.client.0.vm03.stdout:7/115: creat d5/d9/f1f x:0 0 0 2026-03-10T14:07:49.862 INFO:tasks.workunit.client.0.vm03.stdout:7/116: write d5/d9/f10 [423593,93391] 0 2026-03-10T14:07:49.871 INFO:tasks.workunit.client.0.vm03.stdout:6/82: mknod d8/db/c19 0 2026-03-10T14:07:49.873 INFO:tasks.workunit.client.0.vm03.stdout:6/83: dread d8/fd [0,4194304] 0 2026-03-10T14:07:49.879 INFO:tasks.workunit.client.0.vm03.stdout:6/84: chown d8/db/cc 122539218 1 2026-03-10T14:07:49.882 INFO:tasks.workunit.client.0.vm03.stdout:6/85: dread f2 [0,4194304] 0 2026-03-10T14:07:49.882 INFO:tasks.workunit.client.0.vm03.stdout:6/86: write f3 [7183537,121565] 0 2026-03-10T14:07:49.882 INFO:tasks.workunit.client.0.vm03.stdout:6/87: dread - d8/fe zero size 2026-03-10T14:07:49.885 INFO:tasks.workunit.client.0.vm03.stdout:6/88: dwrite f0 [0,4194304] 0 2026-03-10T14:07:49.892 INFO:tasks.workunit.client.0.vm03.stdout:6/89: dwrite d8/fd [0,4194304] 0 2026-03-10T14:07:49.919 INFO:tasks.workunit.client.0.vm03.stdout:7/117: sync 2026-03-10T14:07:49.921 INFO:tasks.workunit.client.0.vm03.stdout:7/118: rmdir d5/d9/d14 39 2026-03-10T14:07:49.925 INFO:tasks.workunit.client.0.vm03.stdout:7/119: mkdir d5/d9/d14/d1c/d20 0 2026-03-10T14:07:50.000 INFO:tasks.workunit.client.1.vm04.stdout:1/537: truncate d3/d20/d60/d82/d13/d38/d58/faf 756222 0 2026-03-10T14:07:50.007 INFO:tasks.workunit.client.0.vm03.stdout:9/81: truncate d2/fc 1847910 0 2026-03-10T14:07:50.009 INFO:tasks.workunit.client.0.vm03.stdout:9/82: creat d2/f1e x:0 0 0 2026-03-10T14:07:50.027 INFO:tasks.workunit.client.0.vm03.stdout:9/83: sync 2026-03-10T14:07:50.028 INFO:tasks.workunit.client.0.vm03.stdout:9/84: write d2/f1e [357955,117833] 0 2026-03-10T14:07:50.030 INFO:tasks.workunit.client.0.vm03.stdout:9/85: mknod d2/c1f 0 2026-03-10T14:07:50.033 INFO:tasks.workunit.client.0.vm03.stdout:9/86: truncate d2/f3 1093032 0 2026-03-10T14:07:50.038 INFO:tasks.workunit.client.0.vm03.stdout:9/87: link 
d2/c18 d2/d14/c20 0 2026-03-10T14:07:50.041 INFO:tasks.workunit.client.0.vm03.stdout:9/88: symlink d2/l21 0 2026-03-10T14:07:50.042 INFO:tasks.workunit.client.0.vm03.stdout:9/89: write d2/d14/f1a [3243538,67474] 0 2026-03-10T14:07:50.046 INFO:tasks.workunit.client.1.vm04.stdout:7/547: write d2/dc/d4d/dcd/f3c [1382783,107083] 0 2026-03-10T14:07:50.060 INFO:tasks.workunit.client.1.vm04.stdout:8/598: dwrite d0/d3/d5/f30 [0,4194304] 0 2026-03-10T14:07:50.065 INFO:tasks.workunit.client.0.vm03.stdout:8/83: dwrite f2 [4194304,4194304] 0 2026-03-10T14:07:50.066 INFO:tasks.workunit.client.0.vm03.stdout:8/84: fsync da/f15 0 2026-03-10T14:07:50.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:50 vm04.local ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:07:50.072 INFO:tasks.workunit.client.0.vm03.stdout:8/85: read - f5 zero size 2026-03-10T14:07:50.072 INFO:tasks.workunit.client.0.vm03.stdout:8/86: readlink l1 0 2026-03-10T14:07:50.072 INFO:tasks.workunit.client.0.vm03.stdout:8/87: stat da/fe 0 2026-03-10T14:07:50.072 INFO:tasks.workunit.client.0.vm03.stdout:8/88: chown da/ff 194 1 2026-03-10T14:07:50.075 INFO:tasks.workunit.client.0.vm03.stdout:8/89: fdatasync f6 0 2026-03-10T14:07:50.080 INFO:tasks.workunit.client.0.vm03.stdout:8/90: dwrite da/ff [0,4194304] 0 2026-03-10T14:07:50.089 INFO:tasks.workunit.client.0.vm03.stdout:8/91: dread da/f17 [0,4194304] 0 2026-03-10T14:07:50.089 INFO:tasks.workunit.client.0.vm03.stdout:8/92: truncate da/f12 781680 0 2026-03-10T14:07:50.092 INFO:tasks.workunit.client.0.vm03.stdout:8/93: write f2 [8383225,94437] 0 2026-03-10T14:07:50.102 INFO:tasks.workunit.client.1.vm04.stdout:8/599: dread d0/d3/d5/f66 [0,4194304] 0 2026-03-10T14:07:50.102 INFO:tasks.workunit.client.0.vm03.stdout:8/94: read da/fd [7786,53279] 0 2026-03-10T14:07:50.102 INFO:tasks.workunit.client.0.vm03.stdout:8/95: mknod da/c1a 0 2026-03-10T14:07:50.103 
INFO:tasks.workunit.client.0.vm03.stdout:8/96: creat da/f1b x:0 0 0 2026-03-10T14:07:50.108 INFO:tasks.workunit.client.0.vm03.stdout:8/97: unlink da/c1a 0 2026-03-10T14:07:50.108 INFO:tasks.workunit.client.0.vm03.stdout:8/98: write da/f12 [287624,83041] 0 2026-03-10T14:07:50.108 INFO:tasks.workunit.client.0.vm03.stdout:8/99: readlink l1 0 2026-03-10T14:07:50.171 INFO:tasks.workunit.client.0.vm03.stdout:1/69: dwrite d0/fc [0,4194304] 0 2026-03-10T14:07:50.172 INFO:tasks.workunit.client.0.vm03.stdout:1/70: write d0/f4 [3880167,41209] 0 2026-03-10T14:07:50.173 INFO:tasks.workunit.client.0.vm03.stdout:1/71: stat d0/fa 0 2026-03-10T14:07:50.174 INFO:tasks.workunit.client.0.vm03.stdout:1/72: chown d0/d2/f9 8938065 1 2026-03-10T14:07:50.176 INFO:tasks.workunit.client.0.vm03.stdout:1/73: dread d0/f10 [0,4194304] 0 2026-03-10T14:07:50.194 INFO:tasks.workunit.client.0.vm03.stdout:1/74: creat d0/d2/df/f1b x:0 0 0 2026-03-10T14:07:50.195 INFO:tasks.workunit.client.0.vm03.stdout:1/75: dwrite d0/d2/f14 [0,4194304] 0 2026-03-10T14:07:50.195 INFO:tasks.workunit.client.0.vm03.stdout:1/76: dwrite d0/d2/df/f1b [0,4194304] 0 2026-03-10T14:07:50.205 INFO:tasks.workunit.client.0.vm03.stdout:1/77: dwrite d0/d2/f1a [0,4194304] 0 2026-03-10T14:07:50.216 INFO:tasks.workunit.client.0.vm03.stdout:1/78: symlink d0/d18/l1c 0 2026-03-10T14:07:50.218 INFO:tasks.workunit.client.0.vm03.stdout:1/79: mkdir d0/d18/d1d 0 2026-03-10T14:07:50.220 INFO:tasks.workunit.client.0.vm03.stdout:1/80: write d0/f17 [1018543,58389] 0 2026-03-10T14:07:50.263 INFO:tasks.workunit.client.1.vm04.stdout:7/548: sync 2026-03-10T14:07:50.286 INFO:tasks.workunit.client.0.vm03.stdout:1/81: sync 2026-03-10T14:07:50.290 INFO:tasks.workunit.client.0.vm03.stdout:1/82: dwrite d0/d2/f14 [0,4194304] 0 2026-03-10T14:07:50.291 INFO:tasks.workunit.client.0.vm03.stdout:1/83: write d0/d2/f9 [4344303,117883] 0 2026-03-10T14:07:50.293 INFO:tasks.workunit.client.0.vm03.stdout:1/84: chown d0/f10 30415 1 2026-03-10T14:07:50.293 
INFO:tasks.workunit.client.0.vm03.stdout:1/85: dread d0/d2/f1a [0,4194304] 0 2026-03-10T14:07:50.301 INFO:tasks.workunit.client.0.vm03.stdout:1/86: dwrite d0/d2/df/f1b [0,4194304] 0 2026-03-10T14:07:50.308 INFO:tasks.workunit.client.0.vm03.stdout:1/87: dread d0/f11 [0,4194304] 0 2026-03-10T14:07:50.317 INFO:tasks.workunit.client.0.vm03.stdout:1/88: creat d0/d2/df/d16/f1e x:0 0 0 2026-03-10T14:07:50.320 INFO:tasks.workunit.client.0.vm03.stdout:1/89: dread d0/d2/f14 [0,4194304] 0 2026-03-10T14:07:50.327 INFO:tasks.workunit.client.0.vm03.stdout:1/90: dwrite d0/d2/f9 [4194304,4194304] 0 2026-03-10T14:07:50.331 INFO:tasks.workunit.client.0.vm03.stdout:0/97: write d3/fe [5166351,127498] 0 2026-03-10T14:07:50.336 INFO:tasks.workunit.client.0.vm03.stdout:1/91: fdatasync d0/f11 0 2026-03-10T14:07:50.339 INFO:tasks.workunit.client.0.vm03.stdout:0/98: symlink d3/d11/l1c 0 2026-03-10T14:07:50.340 INFO:tasks.workunit.client.0.vm03.stdout:0/99: write d3/d17/f1a [693336,93707] 0 2026-03-10T14:07:50.341 INFO:tasks.workunit.client.0.vm03.stdout:0/100: fsync d3/f9 0 2026-03-10T14:07:50.342 INFO:tasks.workunit.client.0.vm03.stdout:0/101: mknod d3/d11/d1b/c1d 0 2026-03-10T14:07:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:50 vm03.local ceph-mon[49718]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:07:50.383 INFO:tasks.workunit.client.0.vm03.stdout:1/92: fsync d0/f4 0 2026-03-10T14:07:50.386 INFO:tasks.workunit.client.0.vm03.stdout:0/102: dread d3/d17/f1a [0,4194304] 0 2026-03-10T14:07:50.389 INFO:tasks.workunit.client.0.vm03.stdout:1/93: dread d0/f4 [4194304,4194304] 0 2026-03-10T14:07:50.391 INFO:tasks.workunit.client.0.vm03.stdout:0/103: dwrite d3/d11/f18 [0,4194304] 0 2026-03-10T14:07:50.400 INFO:tasks.workunit.client.0.vm03.stdout:0/104: creat d3/f1e x:0 0 0 2026-03-10T14:07:50.401 INFO:tasks.workunit.client.0.vm03.stdout:1/94: dwrite d0/d2/fe [0,4194304] 0 2026-03-10T14:07:50.403 
INFO:tasks.workunit.client.0.vm03.stdout:1/95: write d0/fc [875642,69707] 0 2026-03-10T14:07:50.404 INFO:tasks.workunit.client.0.vm03.stdout:1/96: truncate d0/d2/f1a 5167170 0 2026-03-10T14:07:50.406 INFO:tasks.workunit.client.0.vm03.stdout:0/105: mknod d3/d11/c1f 0 2026-03-10T14:07:50.406 INFO:tasks.workunit.client.0.vm03.stdout:1/97: dread d0/f10 [0,4194304] 0 2026-03-10T14:07:50.409 INFO:tasks.workunit.client.0.vm03.stdout:0/106: mkdir d3/d17/d20 0 2026-03-10T14:07:50.409 INFO:tasks.workunit.client.0.vm03.stdout:1/98: unlink d0/d18/l1c 0 2026-03-10T14:07:50.411 INFO:tasks.workunit.client.0.vm03.stdout:0/107: mkdir d3/d16/d21 0 2026-03-10T14:07:50.411 INFO:tasks.workunit.client.0.vm03.stdout:0/108: dread - d3/f1e zero size 2026-03-10T14:07:50.415 INFO:tasks.workunit.client.0.vm03.stdout:0/109: rename d3/d11/f12 to d3/d17/d20/f22 0 2026-03-10T14:07:50.415 INFO:tasks.workunit.client.0.vm03.stdout:0/110: write d3/d11/f18 [2257017,110247] 0 2026-03-10T14:07:50.416 INFO:tasks.workunit.client.0.vm03.stdout:0/111: write d3/d11/f13 [544616,90268] 0 2026-03-10T14:07:50.418 INFO:tasks.workunit.client.0.vm03.stdout:0/112: dread d3/d17/f1a [0,4194304] 0 2026-03-10T14:07:50.419 INFO:tasks.workunit.client.1.vm04.stdout:2/554: rename d0/d14/d91/d8/dd/d26 to d0/d14/d91/d4a/d8c/dab 0 2026-03-10T14:07:50.420 INFO:tasks.workunit.client.1.vm04.stdout:0/508: rename d0/d2/d15 to d0/d2/d15/d22/d62/d9e 22 2026-03-10T14:07:50.420 INFO:tasks.workunit.client.1.vm04.stdout:2/555: stat d0/d14/d91/d8/d17/d4e/d85/f89 0 2026-03-10T14:07:50.424 INFO:tasks.workunit.client.0.vm03.stdout:0/113: symlink d3/d11/l23 0 2026-03-10T14:07:50.425 INFO:tasks.workunit.client.0.vm03.stdout:0/114: rename d3/d17 to d3/d17/d20/d24 22 2026-03-10T14:07:50.425 INFO:tasks.workunit.client.0.vm03.stdout:0/115: truncate d3/f19 123447 0 2026-03-10T14:07:50.426 INFO:tasks.workunit.client.1.vm04.stdout:2/556: creat d0/d14/d91/d4a/d8c/fac x:0 0 0 2026-03-10T14:07:50.426 INFO:tasks.workunit.client.0.vm03.stdout:1/99: sync 
2026-03-10T14:07:50.431 INFO:tasks.workunit.client.0.vm03.stdout:1/100: unlink d0/fc 0 2026-03-10T14:07:50.432 INFO:tasks.workunit.client.1.vm04.stdout:0/509: dwrite d0/f99 [0,4194304] 0 2026-03-10T14:07:50.432 INFO:tasks.workunit.client.0.vm03.stdout:0/116: dwrite d3/f10 [0,4194304] 0 2026-03-10T14:07:50.434 INFO:tasks.workunit.client.0.vm03.stdout:1/101: creat d0/d2/df/f1f x:0 0 0 2026-03-10T14:07:50.434 INFO:tasks.workunit.client.0.vm03.stdout:0/117: readlink d3/d11/l23 0 2026-03-10T14:07:50.437 INFO:tasks.workunit.client.1.vm04.stdout:1/538: write d3/f8 [2385927,106156] 0 2026-03-10T14:07:50.438 INFO:tasks.workunit.client.0.vm03.stdout:1/102: mkdir d0/d2/df/d16/d20 0 2026-03-10T14:07:50.439 INFO:tasks.workunit.client.0.vm03.stdout:0/118: dread d3/d17/d20/f22 [0,4194304] 0 2026-03-10T14:07:50.440 INFO:tasks.workunit.client.0.vm03.stdout:1/103: creat d0/d18/f21 x:0 0 0 2026-03-10T14:07:50.444 INFO:tasks.workunit.client.1.vm04.stdout:2/557: mkdir d0/d14/d39/d47/d70/dad 0 2026-03-10T14:07:50.445 INFO:tasks.workunit.client.0.vm03.stdout:1/104: dwrite d0/d2/df/f1b [0,4194304] 0 2026-03-10T14:07:50.445 INFO:tasks.workunit.client.1.vm04.stdout:0/510: creat d0/d2/d25/f9f x:0 0 0 2026-03-10T14:07:50.446 INFO:tasks.workunit.client.1.vm04.stdout:2/558: dread - d0/d14/d39/d47/f7e zero size 2026-03-10T14:07:50.449 INFO:tasks.workunit.client.0.vm03.stdout:1/105: chown d0/d2/f14 7065720 1 2026-03-10T14:07:50.449 INFO:tasks.workunit.client.1.vm04.stdout:0/511: readlink d0/d2/d15/d22/l69 0 2026-03-10T14:07:50.449 INFO:tasks.workunit.client.0.vm03.stdout:1/106: truncate d0/fa 885180 0 2026-03-10T14:07:50.450 INFO:tasks.workunit.client.0.vm03.stdout:1/107: readlink d0/l12 0 2026-03-10T14:07:50.460 INFO:tasks.workunit.client.0.vm03.stdout:1/108: sync 2026-03-10T14:07:50.460 INFO:tasks.workunit.client.0.vm03.stdout:1/109: dread - d0/d2/df/d16/f1e zero size 2026-03-10T14:07:50.462 INFO:tasks.workunit.client.0.vm03.stdout:1/110: symlink d0/d18/l22 0 2026-03-10T14:07:50.464 
INFO:tasks.workunit.client.0.vm03.stdout:1/111: symlink d0/d2/df/d16/d20/l23 0 2026-03-10T14:07:50.465 INFO:tasks.workunit.client.0.vm03.stdout:1/112: unlink d0/d2/df/d16/d20/l23 0 2026-03-10T14:07:50.467 INFO:tasks.workunit.client.0.vm03.stdout:1/113: link d0/d2/df/f1b d0/f24 0 2026-03-10T14:07:50.566 INFO:tasks.workunit.client.1.vm04.stdout:5/610: symlink d7/d26/d6b/lc3 0 2026-03-10T14:07:50.566 INFO:tasks.workunit.client.0.vm03.stdout:1/114: sync 2026-03-10T14:07:50.567 INFO:tasks.workunit.client.1.vm04.stdout:6/437: symlink d3/de/d35/d3a/d43/l7d 0 2026-03-10T14:07:50.572 INFO:tasks.workunit.client.0.vm03.stdout:1/115: dwrite d0/d2/df/f1f [0,4194304] 0 2026-03-10T14:07:50.573 INFO:tasks.workunit.client.1.vm04.stdout:6/438: mknod d3/de/d35/d3f/d2d/d32/d61/c7e 0 2026-03-10T14:07:50.574 INFO:tasks.workunit.client.1.vm04.stdout:2/559: sync 2026-03-10T14:07:50.575 INFO:tasks.workunit.client.1.vm04.stdout:8/600: dwrite d0/d3/d63/d29/f45 [4194304,4194304] 0 2026-03-10T14:07:50.577 INFO:tasks.workunit.client.0.vm03.stdout:1/116: dread d0/d2/f1a [0,4194304] 0 2026-03-10T14:07:50.578 INFO:tasks.workunit.client.0.vm03.stdout:1/117: chown d0/d2/df/d16/l19 7 1 2026-03-10T14:07:50.580 INFO:tasks.workunit.client.0.vm03.stdout:1/118: creat d0/d18/f25 x:0 0 0 2026-03-10T14:07:50.581 INFO:tasks.workunit.client.0.vm03.stdout:1/119: write d0/fa [689307,99913] 0 2026-03-10T14:07:50.582 INFO:tasks.workunit.client.0.vm03.stdout:1/120: dread d0/f11 [0,4194304] 0 2026-03-10T14:07:50.588 INFO:tasks.workunit.client.1.vm04.stdout:2/560: stat d0/d14/d91/d3a/d3e 0 2026-03-10T14:07:50.589 INFO:tasks.workunit.client.1.vm04.stdout:8/601: dwrite d0/d3/dd/d89/fa5 [0,4194304] 0 2026-03-10T14:07:50.589 INFO:tasks.workunit.client.0.vm03.stdout:1/121: creat d0/d18/f26 x:0 0 0 2026-03-10T14:07:50.589 INFO:tasks.workunit.client.0.vm03.stdout:1/122: mkdir d0/d2/df/d27 0 2026-03-10T14:07:50.589 INFO:tasks.workunit.client.0.vm03.stdout:1/123: dread d0/d2/df/f1f [0,4194304] 0 2026-03-10T14:07:50.590 
INFO:tasks.workunit.client.0.vm03.stdout:1/124: mknod d0/d2/df/d16/d20/c28 0 2026-03-10T14:07:50.596 INFO:tasks.workunit.client.0.vm03.stdout:1/125: symlink d0/l29 0 2026-03-10T14:07:50.596 INFO:tasks.workunit.client.1.vm04.stdout:6/439: creat d3/de/d35/d3f/d2d/d32/d5c/f7f x:0 0 0 2026-03-10T14:07:50.596 INFO:tasks.workunit.client.1.vm04.stdout:2/561: truncate d0/d14/d1b/f8a 259751 0 2026-03-10T14:07:50.596 INFO:tasks.workunit.client.1.vm04.stdout:7/549: dwrite d2/d2a/f9c [0,4194304] 0 2026-03-10T14:07:50.596 INFO:tasks.workunit.client.1.vm04.stdout:6/440: write d3/f57 [4244695,88742] 0 2026-03-10T14:07:50.596 INFO:tasks.workunit.client.1.vm04.stdout:2/562: stat d0/d14/d91/f1d 0 2026-03-10T14:07:50.606 INFO:tasks.workunit.client.1.vm04.stdout:2/563: mknod d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/cae 0 2026-03-10T14:07:50.612 INFO:tasks.workunit.client.1.vm04.stdout:4/558: rename d4/d14/c9b to d4/df/d34/d6f/cc7 0 2026-03-10T14:07:50.618 INFO:tasks.workunit.client.1.vm04.stdout:2/564: dwrite d0/d14/d91/d3a/d3e/f61 [4194304,4194304] 0 2026-03-10T14:07:50.620 INFO:tasks.workunit.client.1.vm04.stdout:4/559: mkdir d4/df/db2/db4/d47/d4f/d8c/dc8 0 2026-03-10T14:07:50.620 INFO:tasks.workunit.client.1.vm04.stdout:2/565: dread - d0/d14/d91/d4a/fa6 zero size 2026-03-10T14:07:50.621 INFO:tasks.workunit.client.1.vm04.stdout:2/566: rmdir d0/d14/d39/d47/d70 39 2026-03-10T14:07:50.621 INFO:tasks.workunit.client.1.vm04.stdout:9/491: unlink d9/da/dd/f91 0 2026-03-10T14:07:50.625 INFO:tasks.workunit.client.1.vm04.stdout:3/561: rename da/dc/d35/d52/f72 to da/dc/d3f/d61/dc1/fc4 0 2026-03-10T14:07:50.626 INFO:tasks.workunit.client.1.vm04.stdout:2/567: truncate d0/d14/d91/d8/d17/d35/f94 3177902 0 2026-03-10T14:07:50.632 INFO:tasks.workunit.client.1.vm04.stdout:3/562: symlink da/dc/d3f/lc5 0 2026-03-10T14:07:50.635 INFO:tasks.workunit.client.1.vm04.stdout:2/568: symlink d0/d14/d91/d8/d17/d4e/d85/d86/laf 0 2026-03-10T14:07:50.635 INFO:tasks.workunit.client.1.vm04.stdout:3/563: unlink da/d30/f4e 0 
2026-03-10T14:07:50.638 INFO:tasks.workunit.client.1.vm04.stdout:3/564: dwrite da/d30/f55 [0,4194304] 0 2026-03-10T14:07:50.645 INFO:tasks.workunit.client.1.vm04.stdout:3/565: fdatasync da/dc/d35/d52/fa8 0 2026-03-10T14:07:50.654 INFO:tasks.workunit.client.1.vm04.stdout:3/566: dread - da/dc/d35/d52/d53/f96 zero size 2026-03-10T14:07:50.654 INFO:tasks.workunit.client.1.vm04.stdout:3/567: rmdir da/dc/d35/d52/d53 39 2026-03-10T14:07:50.654 INFO:tasks.workunit.client.1.vm04.stdout:3/568: fsync da/dc/d3f/d61/f8c 0 2026-03-10T14:07:50.654 INFO:tasks.workunit.client.1.vm04.stdout:3/569: chown da/dc/l12 140661758 1 2026-03-10T14:07:50.675 INFO:tasks.workunit.client.1.vm04.stdout:6/441: creat d3/de/d35/f80 x:0 0 0 2026-03-10T14:07:50.676 INFO:tasks.workunit.client.1.vm04.stdout:8/602: rmdir d0/d3/d63/d29 39 2026-03-10T14:07:50.678 INFO:tasks.workunit.client.1.vm04.stdout:8/603: write d0/d3/d63/d12/f50 [2003408,26009] 0 2026-03-10T14:07:50.692 INFO:tasks.workunit.client.1.vm04.stdout:2/569: dread d0/d14/d1b/f29 [0,4194304] 0 2026-03-10T14:07:50.704 INFO:tasks.workunit.client.1.vm04.stdout:6/442: creat d3/de/d35/d3f/d2d/d32/f81 x:0 0 0 2026-03-10T14:07:50.729 INFO:tasks.workunit.client.1.vm04.stdout:6/443: read - d3/de/d35/d3a/d43/d4c/d5e/d76/f77 zero size 2026-03-10T14:07:50.729 INFO:tasks.workunit.client.1.vm04.stdout:6/444: stat d3/de/d35/d3f/d2d/f2e 0 2026-03-10T14:07:50.729 INFO:tasks.workunit.client.1.vm04.stdout:6/445: creat d3/de/d35/d3f/d2d/f82 x:0 0 0 2026-03-10T14:07:50.729 INFO:tasks.workunit.client.1.vm04.stdout:6/446: chown d3/de/d35/d3f/d2d/d32/d61/c72 4 1 2026-03-10T14:07:50.733 INFO:tasks.workunit.client.1.vm04.stdout:8/604: sync 2026-03-10T14:07:50.740 INFO:tasks.workunit.client.1.vm04.stdout:8/605: dread - d0/d3/fa3 zero size 2026-03-10T14:07:50.751 INFO:tasks.workunit.client.1.vm04.stdout:0/512: write d0/f70 [863259,15064] 0 2026-03-10T14:07:50.752 INFO:tasks.workunit.client.1.vm04.stdout:0/513: readlink d0/d2/d15/d49/d50/d61/l8c 0 2026-03-10T14:07:50.754 
INFO:tasks.workunit.client.1.vm04.stdout:0/514: mknod d0/ca0 0 2026-03-10T14:07:50.755 INFO:tasks.workunit.client.1.vm04.stdout:0/515: chown d0/d2/d15/d49/d50/l8e 6 1 2026-03-10T14:07:50.755 INFO:tasks.workunit.client.1.vm04.stdout:0/516: chown d0/f72 109661 1 2026-03-10T14:07:50.756 INFO:tasks.workunit.client.1.vm04.stdout:8/606: dread d0/d3/d63/f5f [0,4194304] 0 2026-03-10T14:07:50.762 INFO:tasks.workunit.client.1.vm04.stdout:0/517: symlink d0/d2/d15/d22/d38/la1 0 2026-03-10T14:07:50.762 INFO:tasks.workunit.client.1.vm04.stdout:6/447: rename d3/de/d35/d3f/d2d/d32/d61 to d3/de/d35/d3f/d2d/d32/d23/d83 0 2026-03-10T14:07:50.763 INFO:tasks.workunit.client.1.vm04.stdout:5/611: dread d7/d2d/d32/f3b [0,4194304] 0 2026-03-10T14:07:50.780 INFO:tasks.workunit.client.1.vm04.stdout:0/518: mkdir d0/da2 0 2026-03-10T14:07:50.781 INFO:tasks.workunit.client.1.vm04.stdout:0/519: dread - d0/d2/d15/d22/d38/d56/d66/f54 zero size 2026-03-10T14:07:50.786 INFO:tasks.workunit.client.1.vm04.stdout:6/448: mknod d3/d1d/d73/c84 0 2026-03-10T14:07:50.790 INFO:tasks.workunit.client.1.vm04.stdout:5/612: readlink d7/d12/d2b/l90 0 2026-03-10T14:07:50.792 INFO:tasks.workunit.client.1.vm04.stdout:6/449: creat d3/de/d35/d3f/d2d/d32/d23/d83/f85 x:0 0 0 2026-03-10T14:07:50.794 INFO:tasks.workunit.client.1.vm04.stdout:8/607: truncate d0/d3/d63/d12/f2c 28556 0 2026-03-10T14:07:50.794 INFO:tasks.workunit.client.1.vm04.stdout:0/520: dwrite d0/d2/d15/d22/d38/d56/f58 [0,4194304] 0 2026-03-10T14:07:50.796 INFO:tasks.workunit.client.1.vm04.stdout:7/550: dwrite d2/dc/d4d/d7f/fc2 [0,4194304] 0 2026-03-10T14:07:50.800 INFO:tasks.workunit.client.1.vm04.stdout:6/450: chown d3/de/d35/d3f/d2d/f2e 19287 1 2026-03-10T14:07:50.802 INFO:tasks.workunit.client.1.vm04.stdout:6/451: chown d3/f2a 30 1 2026-03-10T14:07:50.803 INFO:tasks.workunit.client.0.vm03.stdout:5/107: truncate d4/d6/f10 3891861 0 2026-03-10T14:07:50.808 INFO:tasks.workunit.client.1.vm04.stdout:7/551: write d2/dc/d4d/dcd/fc5 [4183,31313] 0 
2026-03-10T14:07:50.809 INFO:tasks.workunit.client.0.vm03.stdout:5/108: symlink d4/l2a 0 2026-03-10T14:07:50.809 INFO:tasks.workunit.client.1.vm04.stdout:7/552: write d2/dc/de/d2d/d60/fbd [37576,35460] 0 2026-03-10T14:07:50.810 INFO:tasks.workunit.client.0.vm03.stdout:5/109: mknod d4/d13/c2b 0 2026-03-10T14:07:50.812 INFO:tasks.workunit.client.1.vm04.stdout:5/613: mkdir d7/d9/dc4 0 2026-03-10T14:07:50.815 INFO:tasks.workunit.client.0.vm03.stdout:5/110: dwrite d4/d13/f24 [0,4194304] 0 2026-03-10T14:07:50.818 INFO:tasks.workunit.client.1.vm04.stdout:9/492: write d9/f4f [2607274,127127] 0 2026-03-10T14:07:50.819 INFO:tasks.workunit.client.1.vm04.stdout:0/521: readlink d0/l43 0 2026-03-10T14:07:50.822 INFO:tasks.workunit.client.1.vm04.stdout:0/522: write d0/d2/d15/d49/d50/d61/d75/f98 [953566,29281] 0 2026-03-10T14:07:50.826 INFO:tasks.workunit.client.1.vm04.stdout:6/452: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f86 x:0 0 0 2026-03-10T14:07:50.830 INFO:tasks.workunit.client.1.vm04.stdout:7/553: rmdir d2/dc/de/d2d/d38 39 2026-03-10T14:07:50.830 INFO:tasks.workunit.client.1.vm04.stdout:8/608: dread d0/d3/d63/d12/d69/f81 [0,4194304] 0 2026-03-10T14:07:50.833 INFO:tasks.workunit.client.1.vm04.stdout:3/570: write da/dc/d35/d52/d53/f96 [535138,125775] 0 2026-03-10T14:07:50.837 INFO:tasks.workunit.client.1.vm04.stdout:3/571: chown da/dc/d3f/d61/f94 5775354 1 2026-03-10T14:07:50.845 INFO:tasks.workunit.client.1.vm04.stdout:9/493: creat d9/da/d5d/fa7 x:0 0 0 2026-03-10T14:07:50.847 INFO:tasks.workunit.client.1.vm04.stdout:8/609: chown d0/d3/d73/db8/lbc 101972 1 2026-03-10T14:07:50.847 INFO:tasks.workunit.client.1.vm04.stdout:5/614: dread - d7/d59/fad zero size 2026-03-10T14:07:50.855 INFO:tasks.workunit.client.1.vm04.stdout:3/572: mknod da/d8e/db5/cc6 0 2026-03-10T14:07:50.856 INFO:tasks.workunit.client.1.vm04.stdout:1/539: write d3/f14 [3098694,66631] 0 2026-03-10T14:07:50.858 INFO:tasks.workunit.client.1.vm04.stdout:2/570: dwrite d0/d14/d91/d4a/d8c/dab/f33 [0,4194304] 0 
2026-03-10T14:07:50.866 INFO:tasks.workunit.client.1.vm04.stdout:0/523: link d0/d2/d15/d49/c63 d0/d2/d90/ca3 0 2026-03-10T14:07:50.870 INFO:tasks.workunit.client.1.vm04.stdout:4/560: dread d4/d14/d3c/d5e/f79 [0,4194304] 0 2026-03-10T14:07:50.878 INFO:tasks.workunit.client.1.vm04.stdout:3/573: creat da/dc/d3f/d61/fc7 x:0 0 0 2026-03-10T14:07:50.879 INFO:tasks.workunit.client.1.vm04.stdout:8/610: mknod d0/d3/d63/cbe 0 2026-03-10T14:07:50.884 INFO:tasks.workunit.client.1.vm04.stdout:0/524: mkdir d0/d2/d15/d49/d50/d5c/da4 0 2026-03-10T14:07:50.884 INFO:tasks.workunit.client.1.vm04.stdout:0/525: read - d0/d2/d15/d22/d38/d56/f84 zero size 2026-03-10T14:07:50.886 INFO:tasks.workunit.client.1.vm04.stdout:7/554: link d2/dc/de/d2d/d38/f5a d2/d2a/d42/d86/fd0 0 2026-03-10T14:07:50.887 INFO:tasks.workunit.client.1.vm04.stdout:7/555: chown d2/dc/de/d2d/d60/d7c/d3b/f49 3 1 2026-03-10T14:07:50.890 INFO:tasks.workunit.client.1.vm04.stdout:1/540: truncate d3/d22/d63/f89 563049 0 2026-03-10T14:07:50.890 INFO:tasks.workunit.client.1.vm04.stdout:1/541: chown d3/f14 526231672 1 2026-03-10T14:07:50.891 INFO:tasks.workunit.client.1.vm04.stdout:1/542: read - d3/d20/d60/d82/d13/d38/db5/fb8 zero size 2026-03-10T14:07:50.894 INFO:tasks.workunit.client.1.vm04.stdout:8/611: rename d0/d3/dd/f5d to d0/d3/d63/d12/d51/d67/d96/fbf 0 2026-03-10T14:07:50.897 INFO:tasks.workunit.client.1.vm04.stdout:5/615: rmdir d7/d9/dc4 0 2026-03-10T14:07:50.898 INFO:tasks.workunit.client.1.vm04.stdout:4/561: mkdir d4/df/db2/db6/dc9 0 2026-03-10T14:07:50.900 INFO:tasks.workunit.client.1.vm04.stdout:0/526: mknod d0/d2/d15/d22/d62/ca5 0 2026-03-10T14:07:50.901 INFO:tasks.workunit.client.1.vm04.stdout:7/556: mknod d2/dc/d4d/dcd/dc6/cd1 0 2026-03-10T14:07:50.903 INFO:tasks.workunit.client.1.vm04.stdout:1/543: unlink d3/d8f/l90 0 2026-03-10T14:07:50.908 INFO:tasks.workunit.client.1.vm04.stdout:9/494: sync 2026-03-10T14:07:50.916 INFO:tasks.workunit.client.1.vm04.stdout:7/557: creat d2/dc/de/dae/fd2 x:0 0 0 
2026-03-10T14:07:50.917 INFO:tasks.workunit.client.1.vm04.stdout:0/527: write d0/d2/d15/f57 [1177551,86184] 0 2026-03-10T14:07:50.918 INFO:tasks.workunit.client.1.vm04.stdout:0/528: chown d0/f70 21358 1 2026-03-10T14:07:50.918 INFO:tasks.workunit.client.1.vm04.stdout:0/529: chown d0/f70 2227013 1 2026-03-10T14:07:50.918 INFO:tasks.workunit.client.1.vm04.stdout:7/558: read d2/dc/d4d/dcd/fb0 [38081,48800] 0 2026-03-10T14:07:50.919 INFO:tasks.workunit.client.1.vm04.stdout:0/530: chown d0/d2/d15/d22/d38/d56/d66/f54 591 1 2026-03-10T14:07:50.922 INFO:tasks.workunit.client.1.vm04.stdout:0/531: stat d0/d2/d15/d22/d38/d56/f79 0 2026-03-10T14:07:50.922 INFO:tasks.workunit.client.1.vm04.stdout:3/574: getdents da/dc/d35/d52 0 2026-03-10T14:07:50.928 INFO:tasks.workunit.client.1.vm04.stdout:1/544: symlink d3/lbc 0 2026-03-10T14:07:50.928 INFO:tasks.workunit.client.1.vm04.stdout:8/612: rename d0/d3/d63/d12/f47 to d0/d3/dd/fc0 0 2026-03-10T14:07:50.929 INFO:tasks.workunit.client.1.vm04.stdout:9/495: creat d9/d1d/da5/fa8 x:0 0 0 2026-03-10T14:07:50.929 INFO:tasks.workunit.client.1.vm04.stdout:7/559: readlink d2/d6b/l6e 0 2026-03-10T14:07:50.932 INFO:tasks.workunit.client.1.vm04.stdout:5/616: link d7/d12/d2b/l90 d7/d12/d2b/d93/d9e/lc5 0 2026-03-10T14:07:50.933 INFO:tasks.workunit.client.1.vm04.stdout:8/613: fsync d0/d3/d63/d12/d51/f4f 0 2026-03-10T14:07:50.945 INFO:tasks.workunit.client.1.vm04.stdout:9/496: creat d9/d44/d4d/fa9 x:0 0 0 2026-03-10T14:07:50.946 INFO:tasks.workunit.client.1.vm04.stdout:9/497: chown d9/da/dd/d1c/da3 704 1 2026-03-10T14:07:50.947 INFO:tasks.workunit.client.1.vm04.stdout:1/545: creat d3/d22/d6d/fbd x:0 0 0 2026-03-10T14:07:50.951 INFO:tasks.workunit.client.1.vm04.stdout:1/546: dwrite d3/d20/d60/d82/d13/d38/db5/fb8 [0,4194304] 0 2026-03-10T14:07:50.983 INFO:tasks.workunit.client.0.vm03.stdout:2/109: getdents d5/d10/d17 0 2026-03-10T14:07:50.987 INFO:tasks.workunit.client.0.vm03.stdout:2/110: dwrite d5/d10/d1c/f1d [0,4194304] 0 2026-03-10T14:07:51.004 
INFO:tasks.workunit.client.0.vm03.stdout:2/111: chown d5/le 120910 1 2026-03-10T14:07:51.004 INFO:tasks.workunit.client.0.vm03.stdout:2/112: dread - d5/f1e zero size 2026-03-10T14:07:51.005 INFO:tasks.workunit.client.0.vm03.stdout:2/113: chown d5/fa 16208 1 2026-03-10T14:07:51.007 INFO:tasks.workunit.client.0.vm03.stdout:2/114: dread d5/fa [0,4194304] 0 2026-03-10T14:07:51.007 INFO:tasks.workunit.client.0.vm03.stdout:2/115: fdatasync f4 0 2026-03-10T14:07:51.008 INFO:tasks.workunit.client.0.vm03.stdout:2/116: mknod d5/d10/d17/c25 0 2026-03-10T14:07:51.010 INFO:tasks.workunit.client.0.vm03.stdout:2/117: creat d5/d10/d17/f26 x:0 0 0 2026-03-10T14:07:51.012 INFO:tasks.workunit.client.0.vm03.stdout:2/118: link d5/d10/d17/c25 d5/d10/d17/c27 0 2026-03-10T14:07:51.012 INFO:tasks.workunit.client.0.vm03.stdout:2/119: readlink d5/l11 0 2026-03-10T14:07:51.013 INFO:tasks.workunit.client.0.vm03.stdout:2/120: creat d5/d10/d17/f28 x:0 0 0 2026-03-10T14:07:51.014 INFO:tasks.workunit.client.0.vm03.stdout:2/121: write d5/d10/f22 [2385750,127982] 0 2026-03-10T14:07:51.014 INFO:tasks.workunit.client.0.vm03.stdout:2/122: read - d5/f1e zero size 2026-03-10T14:07:51.015 INFO:tasks.workunit.client.0.vm03.stdout:2/123: write d5/d10/d1f/f24 [662526,87857] 0 2026-03-10T14:07:51.016 INFO:tasks.workunit.client.0.vm03.stdout:2/124: dread - d5/f23 zero size 2026-03-10T14:07:51.016 INFO:tasks.workunit.client.0.vm03.stdout:2/125: chown d5/f9 19713 1 2026-03-10T14:07:51.019 INFO:tasks.workunit.client.0.vm03.stdout:2/126: truncate d5/fb 588292 0 2026-03-10T14:07:51.020 INFO:tasks.workunit.client.0.vm03.stdout:2/127: readlink d5/l11 0 2026-03-10T14:07:51.023 INFO:tasks.workunit.client.0.vm03.stdout:2/128: symlink d5/d10/l29 0 2026-03-10T14:07:51.063 INFO:tasks.workunit.client.0.vm03.stdout:4/104: chown d5/d9/db/l1d 0 1 2026-03-10T14:07:51.064 INFO:tasks.workunit.client.0.vm03.stdout:4/105: mknod d5/c22 0 2026-03-10T14:07:51.066 INFO:tasks.workunit.client.0.vm03.stdout:4/106: unlink d5/d9/f17 0 
2026-03-10T14:07:51.069 INFO:tasks.workunit.client.0.vm03.stdout:4/107: dwrite f3 [0,4194304] 0 2026-03-10T14:07:51.086 INFO:tasks.workunit.client.0.vm03.stdout:4/108: creat d5/d9/db/d19/d1e/f23 x:0 0 0 2026-03-10T14:07:51.092 INFO:tasks.workunit.client.0.vm03.stdout:3/75: getdents . 0 2026-03-10T14:07:51.100 INFO:tasks.workunit.client.0.vm03.stdout:3/76: chown l13 3 1 2026-03-10T14:07:51.100 INFO:tasks.workunit.client.0.vm03.stdout:4/109: creat d5/d9/db/f24 x:0 0 0 2026-03-10T14:07:51.101 INFO:tasks.workunit.client.0.vm03.stdout:4/110: dwrite d5/d9/db/d19/d1e/f23 [0,4194304] 0 2026-03-10T14:07:51.102 INFO:tasks.workunit.client.0.vm03.stdout:4/111: chown d5/f1f 60474728 1 2026-03-10T14:07:51.107 INFO:tasks.workunit.client.0.vm03.stdout:4/112: dwrite d5/d9/db/f20 [0,4194304] 0 2026-03-10T14:07:51.107 INFO:tasks.workunit.client.0.vm03.stdout:4/113: fdatasync d5/f1f 0 2026-03-10T14:07:51.110 INFO:tasks.workunit.client.0.vm03.stdout:3/77: rename l11 to l16 0 2026-03-10T14:07:51.128 INFO:tasks.workunit.client.0.vm03.stdout:4/114: link d5/d9/f16 d5/d9/f25 0 2026-03-10T14:07:51.130 INFO:tasks.workunit.client.0.vm03.stdout:3/78: creat f17 x:0 0 0 2026-03-10T14:07:51.130 INFO:tasks.workunit.client.0.vm03.stdout:6/90: rmdir d8 39 2026-03-10T14:07:51.132 INFO:tasks.workunit.client.0.vm03.stdout:3/79: dread - f15 zero size 2026-03-10T14:07:51.136 INFO:tasks.workunit.client.0.vm03.stdout:6/91: dwrite d8/fe [0,4194304] 0 2026-03-10T14:07:51.143 INFO:tasks.workunit.client.0.vm03.stdout:6/92: dwrite d8/db/df/f10 [4194304,4194304] 0 2026-03-10T14:07:51.199 INFO:tasks.workunit.client.0.vm03.stdout:7/120: truncate d5/d9/f10 423888 0 2026-03-10T14:07:51.241 INFO:tasks.workunit.client.1.vm04.stdout:6/453: truncate d3/ff 2052943 0 2026-03-10T14:07:51.246 INFO:tasks.workunit.client.0.vm03.stdout:7/121: sync 2026-03-10T14:07:51.248 INFO:tasks.workunit.client.0.vm03.stdout:7/122: stat d5/d9/f10 0 2026-03-10T14:07:51.250 INFO:tasks.workunit.client.0.vm03.stdout:7/123: write d5/fb 
[1833244,92372] 0 2026-03-10T14:07:51.255 INFO:tasks.workunit.client.0.vm03.stdout:7/124: dwrite d5/d9/d14/ff [0,4194304] 0 2026-03-10T14:07:51.257 INFO:tasks.workunit.client.1.vm04.stdout:2/571: write d0/d14/d91/d4a/d8c/dab/f36 [480610,101699] 0 2026-03-10T14:07:51.258 INFO:tasks.workunit.client.0.vm03.stdout:7/125: mkdir d5/d9/d14/d21 0 2026-03-10T14:07:51.259 INFO:tasks.workunit.client.0.vm03.stdout:7/126: write d5/f6 [5743107,12390] 0 2026-03-10T14:07:51.268 INFO:tasks.workunit.client.0.vm03.stdout:7/127: rmdir d5/d9/d14/d1c/d20 0 2026-03-10T14:07:51.268 INFO:tasks.workunit.client.1.vm04.stdout:4/562: write d4/d14/f27 [2633081,82939] 0 2026-03-10T14:07:51.276 INFO:tasks.workunit.client.1.vm04.stdout:8/614: mkdir d0/dc1 0 2026-03-10T14:07:51.280 INFO:tasks.workunit.client.1.vm04.stdout:0/532: dwrite d0/d2/d25/f64 [0,4194304] 0 2026-03-10T14:07:51.285 INFO:tasks.workunit.client.1.vm04.stdout:0/533: chown d0/d2/d15/d22/d38/la1 331801 1 2026-03-10T14:07:51.297 INFO:tasks.workunit.client.1.vm04.stdout:9/498: chown d9/d44/d59/c87 120027844 1 2026-03-10T14:07:51.305 INFO:tasks.workunit.client.1.vm04.stdout:3/575: link da/dc/d35/d52/d6d/fab da/dc/d47/fc8 0 2026-03-10T14:07:51.309 INFO:tasks.workunit.client.1.vm04.stdout:1/547: mknod d3/d22/d2f/d57/cbe 0 2026-03-10T14:07:51.316 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:51 vm04.local ceph-mon[55966]: pgmap v154: 65 pgs: 65 active+clean; 1.4 GiB data, 5.3 GiB used, 115 GiB / 120 GiB avail; 23 MiB/s rd, 72 MiB/s wr, 298 op/s 2026-03-10T14:07:51.328 INFO:tasks.workunit.client.1.vm04.stdout:6/454: dread d3/f9 [0,4194304] 0 2026-03-10T14:07:51.344 INFO:tasks.workunit.client.0.vm03.stdout:8/100: dwrite da/f17 [0,4194304] 0 2026-03-10T14:07:51.345 INFO:tasks.workunit.client.0.vm03.stdout:8/101: creat da/f1c x:0 0 0 2026-03-10T14:07:51.355 INFO:tasks.workunit.client.1.vm04.stdout:4/563: chown d4/df/c81 311 1 2026-03-10T14:07:51.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:51 vm03.local ceph-mon[49718]: 
pgmap v154: 65 pgs: 65 active+clean; 1.4 GiB data, 5.3 GiB used, 115 GiB / 120 GiB avail; 23 MiB/s rd, 72 MiB/s wr, 298 op/s 2026-03-10T14:07:51.366 INFO:tasks.workunit.client.0.vm03.stdout:8/102: dread f4 [0,4194304] 0 2026-03-10T14:07:51.370 INFO:tasks.workunit.client.1.vm04.stdout:1/548: creat d3/d5c/fbf x:0 0 0 2026-03-10T14:07:51.370 INFO:tasks.workunit.client.0.vm03.stdout:1/126: write d0/f24 [4536622,114641] 0 2026-03-10T14:07:51.372 INFO:tasks.workunit.client.0.vm03.stdout:1/127: write d0/d2/f9 [5067849,37534] 0 2026-03-10T14:07:51.373 INFO:tasks.workunit.client.0.vm03.stdout:8/103: dwrite f2 [4194304,4194304] 0 2026-03-10T14:07:51.381 INFO:tasks.workunit.client.0.vm03.stdout:1/128: creat d0/d2/f2a x:0 0 0 2026-03-10T14:07:51.384 INFO:tasks.workunit.client.0.vm03.stdout:8/104: symlink da/l1d 0 2026-03-10T14:07:51.389 INFO:tasks.workunit.client.0.vm03.stdout:8/105: unlink da/l18 0 2026-03-10T14:07:51.390 INFO:tasks.workunit.client.0.vm03.stdout:1/129: truncate d0/f11 26651 0 2026-03-10T14:07:51.391 INFO:tasks.workunit.client.0.vm03.stdout:8/106: dread da/f16 [0,4194304] 0 2026-03-10T14:07:51.394 INFO:tasks.workunit.client.0.vm03.stdout:8/107: mkdir da/d1e 0 2026-03-10T14:07:51.395 INFO:tasks.workunit.client.0.vm03.stdout:8/108: creat da/f1f x:0 0 0 2026-03-10T14:07:51.396 INFO:tasks.workunit.client.0.vm03.stdout:8/109: creat da/f20 x:0 0 0 2026-03-10T14:07:51.396 INFO:tasks.workunit.client.0.vm03.stdout:8/110: chown f4 925677 1 2026-03-10T14:07:51.396 INFO:tasks.workunit.client.0.vm03.stdout:8/111: stat da/f20 0 2026-03-10T14:07:51.399 INFO:tasks.workunit.client.0.vm03.stdout:8/112: rmdir da/d1e 0 2026-03-10T14:07:51.400 INFO:tasks.workunit.client.0.vm03.stdout:8/113: mknod da/c21 0 2026-03-10T14:07:51.401 INFO:tasks.workunit.client.0.vm03.stdout:8/114: symlink da/l22 0 2026-03-10T14:07:51.401 INFO:tasks.workunit.client.0.vm03.stdout:8/115: truncate da/fe 506895 0 2026-03-10T14:07:51.402 INFO:tasks.workunit.client.0.vm03.stdout:8/116: chown f7 6407 1 
2026-03-10T14:07:51.403 INFO:tasks.workunit.client.0.vm03.stdout:8/117: mknod da/c23 0 2026-03-10T14:07:51.403 INFO:tasks.workunit.client.0.vm03.stdout:8/118: chown da/f1b 23687 1 2026-03-10T14:07:51.406 INFO:tasks.workunit.client.0.vm03.stdout:8/119: dread f6 [0,4194304] 0 2026-03-10T14:07:51.415 INFO:tasks.workunit.client.0.vm03.stdout:8/120: fdatasync f2 0 2026-03-10T14:07:51.417 INFO:tasks.workunit.client.0.vm03.stdout:8/121: mkdir da/d24 0 2026-03-10T14:07:51.419 INFO:tasks.workunit.client.0.vm03.stdout:8/122: creat da/d24/f25 x:0 0 0 2026-03-10T14:07:51.419 INFO:tasks.workunit.client.0.vm03.stdout:8/123: dread - da/f1f zero size 2026-03-10T14:07:51.420 INFO:tasks.workunit.client.0.vm03.stdout:8/124: mknod da/c26 0 2026-03-10T14:07:51.424 INFO:tasks.workunit.client.0.vm03.stdout:8/125: dread da/f12 [0,4194304] 0 2026-03-10T14:07:51.424 INFO:tasks.workunit.client.1.vm04.stdout:1/549: chown d3/d22/d63/d35/d6c/l86 2 1 2026-03-10T14:07:51.425 INFO:tasks.workunit.client.0.vm03.stdout:8/126: symlink da/d24/l27 0 2026-03-10T14:07:51.426 INFO:tasks.workunit.client.0.vm03.stdout:8/127: truncate da/f12 1076860 0 2026-03-10T14:07:51.426 INFO:tasks.workunit.client.0.vm03.stdout:8/128: write da/f20 [156442,68538] 0 2026-03-10T14:07:51.427 INFO:tasks.workunit.client.1.vm04.stdout:4/564: creat d4/d14/d64/fca x:0 0 0 2026-03-10T14:07:51.427 INFO:tasks.workunit.client.1.vm04.stdout:7/560: getdents d2/d2a/d42 0 2026-03-10T14:07:51.427 INFO:tasks.workunit.client.1.vm04.stdout:1/550: mknod d3/d20/d60/d82/d13/d38/cc0 0 2026-03-10T14:07:51.427 INFO:tasks.workunit.client.0.vm03.stdout:8/129: write da/f1b [352092,61141] 0 2026-03-10T14:07:51.445 INFO:tasks.workunit.client.1.vm04.stdout:7/561: creat d2/dc/de/d2d/d60/d81/fd3 x:0 0 0 2026-03-10T14:07:51.446 INFO:tasks.workunit.client.1.vm04.stdout:7/562: readlink d2/dc/de/d2d/d60/d7c/la7 0 2026-03-10T14:07:51.446 INFO:tasks.workunit.client.1.vm04.stdout:7/563: chown d2/f4 104 1 2026-03-10T14:07:51.449 
INFO:tasks.workunit.client.1.vm04.stdout:6/455: write d3/ff [856576,63213] 0 2026-03-10T14:07:51.459 INFO:tasks.workunit.client.1.vm04.stdout:6/456: dread - d3/de/d35/d3f/d2d/d32/d23/d47/f62 zero size 2026-03-10T14:07:51.460 INFO:tasks.workunit.client.1.vm04.stdout:6/457: fsync d3/de/d35/d3f/d2d/d32/d23/d83/f85 0 2026-03-10T14:07:51.465 INFO:tasks.workunit.client.0.vm03.stdout:0/119: rename d3/d17/d20 to d3/d11/d25 0 2026-03-10T14:07:51.465 INFO:tasks.workunit.client.0.vm03.stdout:0/120: chown d3/d11/f18 151 1 2026-03-10T14:07:51.465 INFO:tasks.workunit.client.1.vm04.stdout:6/458: creat d3/de/d35/d3f/d2d/d32/d23/d83/f87 x:0 0 0 2026-03-10T14:07:51.467 INFO:tasks.workunit.client.0.vm03.stdout:4/115: rename d5/d9/db/c15 to d5/d9/c26 0 2026-03-10T14:07:51.469 INFO:tasks.workunit.client.0.vm03.stdout:0/121: mknod d3/d17/c26 0 2026-03-10T14:07:51.469 INFO:tasks.workunit.client.0.vm03.stdout:0/122: fsync d3/f19 0 2026-03-10T14:07:51.470 INFO:tasks.workunit.client.0.vm03.stdout:0/123: write d3/d11/f13 [300640,122010] 0 2026-03-10T14:07:51.470 INFO:tasks.workunit.client.0.vm03.stdout:0/124: read d3/f10 [2360417,71161] 0 2026-03-10T14:07:51.473 INFO:tasks.workunit.client.1.vm04.stdout:8/615: dwrite d0/d3/dd/d89/fa5 [0,4194304] 0 2026-03-10T14:07:51.483 INFO:tasks.workunit.client.1.vm04.stdout:1/551: link d3/d22/d63/d35/l61 d3/d22/d6d/lc1 0 2026-03-10T14:07:51.484 INFO:tasks.workunit.client.0.vm03.stdout:0/125: write d3/d11/d25/f22 [1427397,45963] 0 2026-03-10T14:07:51.484 INFO:tasks.workunit.client.0.vm03.stdout:0/126: chown d3/d11/f13 11074767 1 2026-03-10T14:07:51.485 INFO:tasks.workunit.client.0.vm03.stdout:4/116: rename d5/c1b to d5/d9/db/c27 0 2026-03-10T14:07:51.486 INFO:tasks.workunit.client.0.vm03.stdout:4/117: write d5/d9/db/f24 [284460,548] 0 2026-03-10T14:07:51.486 INFO:tasks.workunit.client.0.vm03.stdout:4/118: write d5/fe [516240,37760] 0 2026-03-10T14:07:51.488 INFO:tasks.workunit.client.1.vm04.stdout:5/617: truncate d7/d9/f20 2168587 0 2026-03-10T14:07:51.491 
INFO:tasks.workunit.client.0.vm03.stdout:0/127: symlink d3/d11/d25/l27 0 2026-03-10T14:07:51.493 INFO:tasks.workunit.client.1.vm04.stdout:0/534: write d0/d2/d15/d22/d38/f5b [33052,63250] 0 2026-03-10T14:07:51.493 INFO:tasks.workunit.client.0.vm03.stdout:4/119: rename d5/d9/db/d19/d1e/f21 to d5/d9/db/f28 0 2026-03-10T14:07:51.493 INFO:tasks.workunit.client.0.vm03.stdout:4/120: fsync d5/d9/db/f20 0 2026-03-10T14:07:51.495 INFO:tasks.workunit.client.1.vm04.stdout:8/616: creat d0/d3/d63/d12/d51/d67/fc2 x:0 0 0 2026-03-10T14:07:51.497 INFO:tasks.workunit.client.1.vm04.stdout:9/499: write d9/da/dd/f24 [3154680,104526] 0 2026-03-10T14:07:51.502 INFO:tasks.workunit.client.0.vm03.stdout:0/128: creat d3/f28 x:0 0 0 2026-03-10T14:07:51.504 INFO:tasks.workunit.client.1.vm04.stdout:9/500: stat d9/d1d/c2b 0 2026-03-10T14:07:51.506 INFO:tasks.workunit.client.0.vm03.stdout:0/129: dwrite d3/d11/f18 [0,4194304] 0 2026-03-10T14:07:51.506 INFO:tasks.workunit.client.0.vm03.stdout:0/130: chown d3 460481 1 2026-03-10T14:07:51.512 INFO:tasks.workunit.client.1.vm04.stdout:2/572: truncate d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/f7b 2041450 0 2026-03-10T14:07:51.513 INFO:tasks.workunit.client.1.vm04.stdout:3/576: write da/dc/d35/d52/d6d/f9d [646968,35930] 0 2026-03-10T14:07:51.518 INFO:tasks.workunit.client.1.vm04.stdout:0/535: unlink d0/d2/d15/d22/d38/d56/d66/l5a 0 2026-03-10T14:07:51.536 INFO:tasks.workunit.client.1.vm04.stdout:7/564: write d2/d2a/f92 [616040,123235] 0 2026-03-10T14:07:51.537 INFO:tasks.workunit.client.1.vm04.stdout:3/577: fdatasync da/d30/f42 0 2026-03-10T14:07:51.539 INFO:tasks.workunit.client.1.vm04.stdout:4/565: dwrite d4/df/f4e [0,4194304] 0 2026-03-10T14:07:51.541 INFO:tasks.workunit.client.1.vm04.stdout:4/566: readlink d4/df/d34/d6f/l75 0 2026-03-10T14:07:51.546 INFO:tasks.workunit.client.1.vm04.stdout:6/459: write d3/de/d35/d3f/d2d/f21 [254496,46093] 0 2026-03-10T14:07:51.546 INFO:tasks.workunit.client.1.vm04.stdout:0/536: creat d0/d2/d15/d22/d62/fa6 x:0 0 0 
2026-03-10T14:07:51.548 INFO:tasks.workunit.client.1.vm04.stdout:6/460: chown d3/de/d35/d3f/d2d 980842 1 2026-03-10T14:07:51.550 INFO:tasks.workunit.client.1.vm04.stdout:1/552: mkdir d3/d20/d60/d82/d13/d38/d58/d5b/d7b/dc2 0 2026-03-10T14:07:51.558 INFO:tasks.workunit.client.1.vm04.stdout:0/537: dwrite d0/f99 [4194304,4194304] 0 2026-03-10T14:07:51.559 INFO:tasks.workunit.client.1.vm04.stdout:6/461: dwrite d3/de/d35/d3f/d2d/f21 [0,4194304] 0 2026-03-10T14:07:51.565 INFO:tasks.workunit.client.1.vm04.stdout:7/565: symlink d2/dc/de/d2d/d60/d7c/d36/daa/ld4 0 2026-03-10T14:07:51.566 INFO:tasks.workunit.client.1.vm04.stdout:4/567: mkdir d4/d14/d64/dcb 0 2026-03-10T14:07:51.580 INFO:tasks.workunit.client.1.vm04.stdout:4/568: dread d4/d14/f27 [0,4194304] 0 2026-03-10T14:07:51.584 INFO:tasks.workunit.client.1.vm04.stdout:3/578: creat da/dc/d35/d52/da3/dac/fc9 x:0 0 0 2026-03-10T14:07:51.591 INFO:tasks.workunit.client.1.vm04.stdout:5/618: getdents d7/d2d/d76 0 2026-03-10T14:07:51.600 INFO:tasks.workunit.client.1.vm04.stdout:3/579: dread da/dc/d3f/d61/dc1/fc4 [0,4194304] 0 2026-03-10T14:07:51.610 INFO:tasks.workunit.client.1.vm04.stdout:0/538: mkdir d0/d2/d15/d22/d38/d56/da7 0 2026-03-10T14:07:51.626 INFO:tasks.workunit.client.1.vm04.stdout:3/580: dwrite da/dc/d35/d52/d53/f96 [0,4194304] 0 2026-03-10T14:07:51.626 INFO:tasks.workunit.client.1.vm04.stdout:6/462: rename d3/l79 to d3/de/d35/d3f/d2d/d32/l88 0 2026-03-10T14:07:51.626 INFO:tasks.workunit.client.1.vm04.stdout:6/463: fsync d3/de/d35/f80 0 2026-03-10T14:07:51.627 INFO:tasks.workunit.client.1.vm04.stdout:4/569: chown d4/df/db2/db4/d47/d70/la2 451 1 2026-03-10T14:07:51.627 INFO:tasks.workunit.client.1.vm04.stdout:7/566: dread d2/dc/de/d2d/d60/faf [0,4194304] 0 2026-03-10T14:07:51.628 INFO:tasks.workunit.client.1.vm04.stdout:5/619: dwrite d7/d12/d2b/d3e/d57/d8a/fb0 [0,4194304] 0 2026-03-10T14:07:51.630 INFO:tasks.workunit.client.1.vm04.stdout:3/581: creat da/dc/d35/d52/da3/dac/fca x:0 0 0 2026-03-10T14:07:51.638 
INFO:tasks.workunit.client.1.vm04.stdout:5/620: stat d7/d2d/d32/l73 0 2026-03-10T14:07:51.641 INFO:tasks.workunit.client.1.vm04.stdout:0/539: rmdir d0/d6e/d87 0 2026-03-10T14:07:51.642 INFO:tasks.workunit.client.1.vm04.stdout:5/621: dwrite f4 [0,4194304] 0 2026-03-10T14:07:51.645 INFO:tasks.workunit.client.1.vm04.stdout:5/622: mkdir d7/d59/d7e/dc6 0 2026-03-10T14:07:51.646 INFO:tasks.workunit.client.1.vm04.stdout:5/623: rmdir d7/d12/d2b/d3e/d3f 39 2026-03-10T14:07:51.649 INFO:tasks.workunit.client.1.vm04.stdout:5/624: creat d7/d12/d2b/d3e/d3f/dc0/fc7 x:0 0 0 2026-03-10T14:07:51.657 INFO:tasks.workunit.client.0.vm03.stdout:5/111: truncate d4/d16/f1c 3231309 0 2026-03-10T14:07:51.689 INFO:tasks.workunit.client.1.vm04.stdout:8/617: dwrite d0/d3/d63/d12/d51/f97 [0,4194304] 0 2026-03-10T14:07:51.689 INFO:tasks.workunit.client.1.vm04.stdout:5/625: symlink d7/d12/d2b/d93/d9e/lc8 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/112: rename d4/c22 to d4/d13/d1f/c2c 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/113: stat d4/d6/de/c28 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/114: chown d4/f17 867 1 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/115: dwrite d4/d6/de/f14 [0,4194304] 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/116: write d4/d13/f24 [5094208,88793] 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/117: chown d4/d6/de 28 1 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/118: write d4/d16/d19/f25 [1163383,108817] 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/119: unlink d4/d16/f1b 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/120: dread d4/fd [0,4194304] 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/121: write d4/d6/f10 [41603,58022] 0 2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/122: link d4/d16/d19/f25 d4/d16/f2d 0 
2026-03-10T14:07:51.690 INFO:tasks.workunit.client.0.vm03.stdout:5/123: mknod d4/d13/c2e 0 2026-03-10T14:07:51.692 INFO:tasks.workunit.client.1.vm04.stdout:5/626: link d7/d26/d6b/d6e/fa3 d7/d2d/d69/fc9 0 2026-03-10T14:07:51.703 INFO:tasks.workunit.client.1.vm04.stdout:5/627: creat d7/d12/d2b/d93/fca x:0 0 0 2026-03-10T14:07:51.721 INFO:tasks.workunit.client.1.vm04.stdout:5/628: readlink d7/d12/d2b/d3e/d57/l6f 0 2026-03-10T14:07:51.782 INFO:tasks.workunit.client.1.vm04.stdout:2/573: dwrite d0/d14/d91/d4a/f7c [0,4194304] 0 2026-03-10T14:07:51.787 INFO:tasks.workunit.client.1.vm04.stdout:2/574: getdents d0/d14/d39/d47/d93 0 2026-03-10T14:07:51.788 INFO:tasks.workunit.client.0.vm03.stdout:5/124: sync 2026-03-10T14:07:51.789 INFO:tasks.workunit.client.0.vm03.stdout:5/125: read d4/d13/d1f/f20 [2576060,108533] 0 2026-03-10T14:07:51.799 INFO:tasks.workunit.client.0.vm03.stdout:5/126: dwrite d4/d6/f10 [0,4194304] 0 2026-03-10T14:07:51.803 INFO:tasks.workunit.client.1.vm04.stdout:9/501: sync 2026-03-10T14:07:51.804 INFO:tasks.workunit.client.0.vm03.stdout:5/127: symlink d4/d6/l2f 0 2026-03-10T14:07:51.805 INFO:tasks.workunit.client.1.vm04.stdout:9/502: rename d9/d33/f45 to d9/da/d5d/d81/faa 0 2026-03-10T14:07:51.806 INFO:tasks.workunit.client.0.vm03.stdout:5/128: mknod d4/d13/d1f/c30 0 2026-03-10T14:07:51.815 INFO:tasks.workunit.client.0.vm03.stdout:5/129: mknod d4/c31 0 2026-03-10T14:07:51.832 INFO:tasks.workunit.client.0.vm03.stdout:5/130: stat d4/d6/fb 0 2026-03-10T14:07:51.832 INFO:tasks.workunit.client.0.vm03.stdout:5/131: write d4/d13/d1f/f20 [4903621,27795] 0 2026-03-10T14:07:51.832 INFO:tasks.workunit.client.0.vm03.stdout:5/132: dread d4/f17 [0,4194304] 0 2026-03-10T14:07:51.832 INFO:tasks.workunit.client.0.vm03.stdout:5/133: chown d4/d13 1987 1 2026-03-10T14:07:51.832 INFO:tasks.workunit.client.0.vm03.stdout:5/134: getdents d4/d16/d19/d23 0 2026-03-10T14:07:51.832 INFO:tasks.workunit.client.0.vm03.stdout:5/135: dread d4/d6/f10 [0,4194304] 0 2026-03-10T14:07:51.869 
INFO:tasks.workunit.client.1.vm04.stdout:9/503: dread d9/da/dd/f47 [0,4194304] 0 2026-03-10T14:07:51.872 INFO:tasks.workunit.client.1.vm04.stdout:9/504: read - d9/da/f9e zero size 2026-03-10T14:07:51.873 INFO:tasks.workunit.client.1.vm04.stdout:1/553: write d3/fa [1100264,16249] 0 2026-03-10T14:07:51.873 INFO:tasks.workunit.client.1.vm04.stdout:6/464: write d3/de/d35/d3a/d43/d4c/f4d [2223688,18596] 0 2026-03-10T14:07:51.878 INFO:tasks.workunit.client.1.vm04.stdout:9/505: rmdir d9/d1d 39 2026-03-10T14:07:51.883 INFO:tasks.workunit.client.1.vm04.stdout:7/567: write d2/d2a/f35 [745683,62253] 0 2026-03-10T14:07:51.883 INFO:tasks.workunit.client.1.vm04.stdout:0/540: write d0/d2/d25/f7f [2208845,12258] 0 2026-03-10T14:07:51.883 INFO:tasks.workunit.client.1.vm04.stdout:1/554: creat d3/d20/d60/d82/d13/fc3 x:0 0 0 2026-03-10T14:07:51.886 INFO:tasks.workunit.client.1.vm04.stdout:3/582: dwrite da/dc/d3f/d61/f94 [0,4194304] 0 2026-03-10T14:07:51.891 INFO:tasks.workunit.client.1.vm04.stdout:9/506: mkdir d9/d1d/da5/dab 0 2026-03-10T14:07:51.893 INFO:tasks.workunit.client.1.vm04.stdout:4/570: dwrite d4/df/db2/db4/d47/d70/d74/f76 [0,4194304] 0 2026-03-10T14:07:51.893 INFO:tasks.workunit.client.1.vm04.stdout:3/583: unlink da/f10 0 2026-03-10T14:07:51.895 INFO:tasks.workunit.client.1.vm04.stdout:3/584: write da/dc/d35/d52/d6d/f9d [1465820,115150] 0 2026-03-10T14:07:51.899 INFO:tasks.workunit.client.1.vm04.stdout:9/507: read d9/da/dd/f24 [3261316,14463] 0 2026-03-10T14:07:51.903 INFO:tasks.workunit.client.1.vm04.stdout:3/585: mkdir da/dc/d47/d9b/dcb 0 2026-03-10T14:07:51.904 INFO:tasks.workunit.client.1.vm04.stdout:3/586: chown da/d8e/db5/cc6 223547692 1 2026-03-10T14:07:51.905 INFO:tasks.workunit.client.1.vm04.stdout:3/587: stat da/dc/fc2 0 2026-03-10T14:07:51.906 INFO:tasks.workunit.client.1.vm04.stdout:9/508: fsync d9/da/dd/d74/f92 0 2026-03-10T14:07:51.918 INFO:tasks.workunit.client.1.vm04.stdout:9/509: truncate d9/d5c/f77 3428948 0 2026-03-10T14:07:51.922 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.924+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/746586211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d400a42d0 msgr2=0x7f9d400a46e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:51.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.924+0000 7f9d4e8b9700 1 --2- 192.168.123.103:0/746586211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d400a42d0 0x7f9d400a46e0 secure :-1 s=READY pgs=311 cs=0 l=1 rev1=1 crypto rx=0x7f9d44009b00 tx=0x7f9d44009e10 comp rx=0 tx=0).stop 2026-03-10T14:07:51.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.925+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/746586211 shutdown_connections 2026-03-10T14:07:51.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.925+0000 7f9d4e8b9700 1 --2- 192.168.123.103:0/746586211 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d400a5410 0x7f9d400a5880 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:51.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.926+0000 7f9d4e8b9700 1 --2- 192.168.123.103:0/746586211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d400a42d0 0x7f9d400a46e0 unknown :-1 s=CLOSED pgs=311 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:51.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.926+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/746586211 >> 192.168.123.103:0/746586211 conn(0x7f9d4009f7a0 msgr2=0x7f9d400a1bf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:51.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.926+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/746586211 shutdown_connections 2026-03-10T14:07:51.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.926+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/746586211 wait complete. 
2026-03-10T14:07:51.925 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.927+0000 7f9d4e8b9700 1 Processor -- start 2026-03-10T14:07:51.925 INFO:tasks.workunit.client.1.vm04.stdout:9/510: dwrite d9/d44/d4d/f99 [0,4194304] 0 2026-03-10T14:07:51.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.930+0000 7f9d4e8b9700 1 -- start start 2026-03-10T14:07:51.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.930+0000 7f9d4e8b9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d400a42d0 0x7f9d400b36b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:51.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.930+0000 7f9d4e8b9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d400a5410 0x7f9d400b3bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:51.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.930+0000 7f9d4e8b9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d400b4210 con 0x7f9d400a42d0 2026-03-10T14:07:51.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.930+0000 7f9d4e8b9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d400b4350 con 0x7f9d400a5410 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.930+0000 7f9d4d0b6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d400a5410 0x7f9d400b3bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.930+0000 7f9d4d0b6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d400a5410 0x7f9d400b3bf0 unknown :-1 s=HELLO_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:39834/0 (socket says 192.168.123.103:39834) 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.930+0000 7f9d4d0b6700 1 -- 192.168.123.103:0/1429681961 learned_addr learned my addr 192.168.123.103:0/1429681961 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.931+0000 7f9d4d8b7700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d400a42d0 0x7f9d400b36b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.931+0000 7f9d4d0b6700 1 -- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d400a42d0 msgr2=0x7f9d400b36b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.931+0000 7f9d4d0b6700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d400a42d0 0x7f9d400b36b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.931+0000 7f9d4d0b6700 1 -- 192.168.123.103:0/1429681961 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d440097e0 con 0x7f9d400a5410 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.931+0000 7f9d4d0b6700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d400a5410 0x7f9d400b3bf0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f9d3800b770 
tx=0x7f9d3800bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.933+0000 7f9d3effd700 1 -- 192.168.123.103:0/1429681961 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d3800f820 con 0x7f9d400a5410 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.933+0000 7f9d3effd700 1 -- 192.168.123.103:0/1429681961 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d3800fe60 con 0x7f9d400a5410 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.934+0000 7f9d3effd700 1 -- 192.168.123.103:0/1429681961 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d3800d610 con 0x7f9d400a5410 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.934+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/1429681961 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d401504c0 con 0x7f9d400a5410 2026-03-10T14:07:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.934+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/1429681961 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d40150990 con 0x7f9d400a5410 2026-03-10T14:07:51.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.936+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/1429681961 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d40004eb0 con 0x7f9d400a5410 2026-03-10T14:07:51.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.937+0000 7f9d3effd700 1 -- 192.168.123.103:0/1429681961 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9d3800f980 con 
0x7f9d400a5410 2026-03-10T14:07:51.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.937+0000 7f9d3effd700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d3406c5a0 0x7f9d3406ea50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:51.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.937+0000 7f9d3effd700 1 -- 192.168.123.103:0/1429681961 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f9d3808b680 con 0x7f9d400a5410 2026-03-10T14:07:51.936 INFO:tasks.workunit.client.0.vm03.stdout:2/129: dwrite d5/d10/d1c/f1d [4194304,4194304] 0 2026-03-10T14:07:51.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.938+0000 7f9d4d8b7700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d3406c5a0 0x7f9d3406ea50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:51.940 INFO:tasks.workunit.client.1.vm04.stdout:9/511: dwrite d9/d5c/d93/d96/f9c [0,4194304] 0 2026-03-10T14:07:51.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.942+0000 7f9d3effd700 1 -- 192.168.123.103:0/1429681961 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9d38059910 con 0x7f9d400a5410 2026-03-10T14:07:51.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:51.942+0000 7f9d4d8b7700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d3406c5a0 0x7f9d3406ea50 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f9d44009fd0 tx=0x7f9d4401a040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:51.944 
INFO:tasks.workunit.client.1.vm04.stdout:4/571: dread d4/df/d31/f3d [4194304,4194304] 0 2026-03-10T14:07:51.949 INFO:tasks.workunit.client.0.vm03.stdout:2/130: dwrite d5/d10/d1c/f1d [0,4194304] 0 2026-03-10T14:07:51.959 INFO:tasks.workunit.client.0.vm03.stdout:2/131: mkdir d5/d2a 0 2026-03-10T14:07:51.961 INFO:tasks.workunit.client.1.vm04.stdout:8/618: write d0/d3/dd/fc [1676576,15437] 0 2026-03-10T14:07:51.963 INFO:tasks.workunit.client.0.vm03.stdout:2/132: dwrite d5/f1e [0,4194304] 0 2026-03-10T14:07:51.967 INFO:tasks.workunit.client.1.vm04.stdout:9/512: link d9/d44/l79 d9/da/dd/d1c/da3/lac 0 2026-03-10T14:07:51.967 INFO:tasks.workunit.client.1.vm04.stdout:5/629: write d7/d12/d2b/d3e/d57/d77/fa2 [637576,62397] 0 2026-03-10T14:07:51.977 INFO:tasks.workunit.client.1.vm04.stdout:5/630: dread d7/d12/d2b/d3e/d3f/f88 [0,4194304] 0 2026-03-10T14:07:51.996 INFO:tasks.workunit.client.1.vm04.stdout:8/619: dread - d0/d3/d63/d29/fba zero size 2026-03-10T14:07:52.000 INFO:tasks.workunit.client.1.vm04.stdout:9/513: symlink d9/da/lad 0 2026-03-10T14:07:52.018 INFO:tasks.workunit.client.1.vm04.stdout:8/620: readlink d0/d3/l72 0 2026-03-10T14:07:52.022 INFO:tasks.workunit.client.1.vm04.stdout:6/465: write d3/de/d35/d3f/d2d/f34 [794382,58831] 0 2026-03-10T14:07:52.022 INFO:tasks.workunit.client.1.vm04.stdout:2/575: dwrite d0/d14/d1b/f32 [4194304,4194304] 0 2026-03-10T14:07:52.026 INFO:tasks.workunit.client.1.vm04.stdout:5/631: rename d7/d2d/d32/l40 to d7/lcb 0 2026-03-10T14:07:52.027 INFO:tasks.workunit.client.1.vm04.stdout:7/568: write d2/dc/de/d2d/d60/d7c/d44/f51 [2421399,73223] 0 2026-03-10T14:07:52.032 INFO:tasks.workunit.client.1.vm04.stdout:6/466: creat d3/de/d35/d3f/d2d/f89 x:0 0 0 2026-03-10T14:07:52.032 INFO:tasks.workunit.client.1.vm04.stdout:9/514: creat d9/fae x:0 0 0 2026-03-10T14:07:52.033 INFO:tasks.workunit.client.1.vm04.stdout:1/555: dwrite d3/d20/d60/d82/d13/d38/d58/d5b/f7c [0,4194304] 0 2026-03-10T14:07:52.039 INFO:tasks.workunit.client.1.vm04.stdout:3/588: 
dwrite da/d8e/f71 [0,4194304] 0 2026-03-10T14:07:52.040 INFO:tasks.workunit.client.1.vm04.stdout:5/632: symlink d7/d59/d7d/d9a/lcc 0 2026-03-10T14:07:52.046 INFO:tasks.workunit.client.1.vm04.stdout:9/515: symlink d9/d5c/d93/d96/d9d/laf 0 2026-03-10T14:07:52.046 INFO:tasks.workunit.client.1.vm04.stdout:1/556: mkdir d3/d20/d60/d82/d13/d38/db5/dc4 0 2026-03-10T14:07:52.049 INFO:tasks.workunit.client.1.vm04.stdout:7/569: dwrite d2/dc/d4d/dcd/f77 [0,4194304] 0 2026-03-10T14:07:52.050 INFO:tasks.workunit.client.1.vm04.stdout:3/589: creat da/dc/fcc x:0 0 0 2026-03-10T14:07:52.057 INFO:tasks.workunit.client.1.vm04.stdout:1/557: mkdir d3/d20/d60/d82/d13/da0/dc5 0 2026-03-10T14:07:52.058 INFO:tasks.workunit.client.1.vm04.stdout:9/516: mknod d9/d5c/d93/cb0 0 2026-03-10T14:07:52.067 INFO:tasks.workunit.client.1.vm04.stdout:6/467: creat d3/de/d35/d3a/d43/f8a x:0 0 0 2026-03-10T14:07:52.072 INFO:tasks.workunit.client.1.vm04.stdout:7/570: mkdir d2/dd5 0 2026-03-10T14:07:52.073 INFO:tasks.workunit.client.1.vm04.stdout:6/468: dwrite d3/de/d35/d3f/d2d/f82 [0,4194304] 0 2026-03-10T14:07:52.075 INFO:tasks.workunit.client.1.vm04.stdout:1/558: rename d3/d22/d2f/d57/f68 to d3/d20/d60/d82/d13/da0/dc5/fc6 0 2026-03-10T14:07:52.088 INFO:tasks.workunit.client.1.vm04.stdout:7/571: creat d2/dc/de/d2d/d5c/da9/fd6 x:0 0 0 2026-03-10T14:07:52.088 INFO:tasks.workunit.client.1.vm04.stdout:7/572: write d2/dc/de/dae/fd2 [644660,4671] 0 2026-03-10T14:07:52.088 INFO:tasks.workunit.client.1.vm04.stdout:1/559: symlink d3/d20/d60/d82/d13/d38/db5/lc7 0 2026-03-10T14:07:52.088 INFO:tasks.workunit.client.1.vm04.stdout:7/573: chown d2/dc/de/d2d/d60/d7c/d64/c76 2486 1 2026-03-10T14:07:52.088 INFO:tasks.workunit.client.1.vm04.stdout:1/560: dwrite d3/d5c/fbf [0,4194304] 0 2026-03-10T14:07:52.089 INFO:tasks.workunit.client.1.vm04.stdout:6/469: symlink d3/de/d35/d3f/d2d/d32/d23/d47/l8b 0 2026-03-10T14:07:52.089 INFO:tasks.workunit.client.1.vm04.stdout:3/590: getdents da/d8e/db5 0 2026-03-10T14:07:52.095 
INFO:tasks.workunit.client.1.vm04.stdout:3/591: mkdir da/dc/d35/dcd 0 2026-03-10T14:07:52.099 INFO:tasks.workunit.client.1.vm04.stdout:3/592: rename da/dc/d3f/cad to da/d8e/cce 0 2026-03-10T14:07:52.100 INFO:tasks.workunit.client.1.vm04.stdout:4/572: dwrite d4/df/d34/fc5 [0,4194304] 0 2026-03-10T14:07:52.102 INFO:tasks.workunit.client.1.vm04.stdout:7/574: dwrite d2/dc/de/d2d/d38/f41 [4194304,4194304] 0 2026-03-10T14:07:52.102 INFO:tasks.workunit.client.1.vm04.stdout:3/593: truncate da/dc/d35/d52/da3/dac/fca 432297 0 2026-03-10T14:07:52.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.112+0000 7f9d4e8b9700 1 -- 192.168.123.103:0/1429681961 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9d40150cc0 con 0x7f9d3406c5a0 2026-03-10T14:07:52.113 INFO:tasks.workunit.client.1.vm04.stdout:7/575: mknod d2/dc/d4d/d7f/cd7 0 2026-03-10T14:07:52.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.115+0000 7f9d3effd700 1 -- 192.168.123.103:0/1429681961 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f9d40150cc0 con 0x7f9d3406c5a0 2026-03-10T14:07:52.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.121+0000 7f9d3cfb9700 1 -- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d3406c5a0 msgr2=0x7f9d3406ea50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.122+0000 7f9d3cfb9700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d3406c5a0 0x7f9d3406ea50 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f9d44009fd0 tx=0x7f9d4401a040 comp rx=0 tx=0).stop 2026-03-10T14:07:52.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.123+0000 7f9d3cfb9700 1 -- 
192.168.123.103:0/1429681961 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d400a5410 msgr2=0x7f9d400b3bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.123+0000 7f9d3cfb9700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d400a5410 0x7f9d400b3bf0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f9d3800b770 tx=0x7f9d3800bb30 comp rx=0 tx=0).stop 2026-03-10T14:07:52.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.125+0000 7f9d3cfb9700 1 -- 192.168.123.103:0/1429681961 shutdown_connections 2026-03-10T14:07:52.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.125+0000 7f9d3cfb9700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d400a42d0 0x7f9d400b36b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.125+0000 7f9d3cfb9700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d3406c5a0 0x7f9d3406ea50 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.125+0000 7f9d3cfb9700 1 --2- 192.168.123.103:0/1429681961 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d400a5410 0x7f9d400b3bf0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.125+0000 7f9d3cfb9700 1 -- 192.168.123.103:0/1429681961 >> 192.168.123.103:0/1429681961 conn(0x7f9d4009f7a0 msgr2=0x7f9d400a8640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:52.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.126+0000 
7f9d3cfb9700 1 -- 192.168.123.103:0/1429681961 shutdown_connections 2026-03-10T14:07:52.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.126+0000 7f9d3cfb9700 1 -- 192.168.123.103:0/1429681961 wait complete. 2026-03-10T14:07:52.138 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:07:52.182 INFO:tasks.workunit.client.1.vm04.stdout:1/561: dread d3/d22/d2f/f3c [0,4194304] 0 2026-03-10T14:07:52.184 INFO:tasks.workunit.client.1.vm04.stdout:1/562: dread d3/d5c/fbf [0,4194304] 0 2026-03-10T14:07:52.185 INFO:tasks.workunit.client.1.vm04.stdout:1/563: mknod d3/d22/d2f/d57/cc8 0 2026-03-10T14:07:52.187 INFO:tasks.workunit.client.1.vm04.stdout:1/564: chown d3/d20/d60/d82/d13/d38/d58/d5b/c87 5419 1 2026-03-10T14:07:52.189 INFO:tasks.workunit.client.1.vm04.stdout:1/565: truncate d3/d22/d2f/d57/fac 1524077 0 2026-03-10T14:07:52.191 INFO:tasks.workunit.client.1.vm04.stdout:1/566: link d3/d22/d63/c4a d3/d22/d63/cc9 0 2026-03-10T14:07:52.192 INFO:tasks.workunit.client.1.vm04.stdout:1/567: dread - d3/d22/f8e zero size 2026-03-10T14:07:52.193 INFO:tasks.workunit.client.1.vm04.stdout:1/568: symlink d3/d5c/lca 0 2026-03-10T14:07:52.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.221+0000 7f6c8176c700 1 -- 192.168.123.103:0/2341676048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1036d0 msgr2=0x7f6c7c103b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.222+0000 7f6c8176c700 1 --2- 192.168.123.103:0/2341676048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1036d0 0x7f6c7c103b20 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7f6c70009a60 tx=0x7f6c70009d70 comp rx=0 tx=0).stop 2026-03-10T14:07:52.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.227+0000 7f6c8176c700 1 -- 192.168.123.103:0/2341676048 shutdown_connections 2026-03-10T14:07:52.225 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.227+0000 7f6c8176c700 1 --2- 192.168.123.103:0/2341676048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1036d0 0x7f6c7c103b20 unknown :-1 s=CLOSED pgs=312 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.227+0000 7f6c8176c700 1 --2- 192.168.123.103:0/2341676048 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c7c1024d0 0x7f6c7c1028e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.227+0000 7f6c8176c700 1 -- 192.168.123.103:0/2341676048 >> 192.168.123.103:0/2341676048 conn(0x7f6c7c0fda60 msgr2=0x7f6c7c0ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:52.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.227+0000 7f6c8176c700 1 -- 192.168.123.103:0/2341676048 shutdown_connections 2026-03-10T14:07:52.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.227+0000 7f6c8176c700 1 -- 192.168.123.103:0/2341676048 wait complete. 
2026-03-10T14:07:52.225 INFO:tasks.workunit.client.1.vm04.stdout:8/621: dwrite d0/d3/d73/f98 [0,4194304] 0 2026-03-10T14:07:52.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.228+0000 7f6c8176c700 1 Processor -- start 2026-03-10T14:07:52.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.228+0000 7f6c8176c700 1 -- start start 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c8176c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1024d0 0x7f6c7c197db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c8176c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c7c1036d0 0x7f6c7c1982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c8176c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c7c198910 con 0x7f6c7c1024d0 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c8176c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c7c198a50 con 0x7f6c7c1036d0 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c7bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1024d0 0x7f6c7c197db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c7bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1024d0 0x7f6c7c197db0 unknown :-1 s=HELLO_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47318/0 (socket says 192.168.123.103:47318) 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c7bfff700 1 -- 192.168.123.103:0/3084204939 learned_addr learned my addr 192.168.123.103:0/3084204939 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c7bfff700 1 -- 192.168.123.103:0/3084204939 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c7c1036d0 msgr2=0x7f6c7c1982f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c7bfff700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c7c1036d0 0x7f6c7c1982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.229+0000 7f6c7bfff700 1 -- 192.168.123.103:0/3084204939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6c70009710 con 0x7f6c7c1024d0 2026-03-10T14:07:52.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.230+0000 7f6c7bfff700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1024d0 0x7f6c7c197db0 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f6c6c00ea00 tx=0x7f6c6c00edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:52.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.230+0000 7f6c797fa700 1 -- 192.168.123.103:0/3084204939 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c6c00cb80 con 0x7f6c7c1024d0 
2026-03-10T14:07:52.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.230+0000 7f6c797fa700 1 -- 192.168.123.103:0/3084204939 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6c6c004500 con 0x7f6c7c1024d0 2026-03-10T14:07:52.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.230+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6c7c19d500 con 0x7f6c7c1024d0 2026-03-10T14:07:52.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.230+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6c7c19da50 con 0x7f6c7c1024d0 2026-03-10T14:07:52.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.231+0000 7f6c797fa700 1 -- 192.168.123.103:0/3084204939 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c6c004020 con 0x7f6c7c1024d0 2026-03-10T14:07:52.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.231+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6c7c04ea50 con 0x7f6c7c1024d0 2026-03-10T14:07:52.235 INFO:tasks.workunit.client.1.vm04.stdout:2/576: dwrite d0/d14/d39/fa2 [0,4194304] 0 2026-03-10T14:07:52.235 INFO:tasks.workunit.client.1.vm04.stdout:5/633: dwrite d7/d26/f30 [0,4194304] 0 2026-03-10T14:07:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.234+0000 7f6c797fa700 1 -- 192.168.123.103:0/3084204939 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6c6c003710 con 0x7f6c7c1024d0 2026-03-10T14:07:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.234+0000 7f6c797fa700 1 --2- 192.168.123.103:0/3084204939 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c6406c720 0x7f6c6406ebd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.235+0000 7f6c797fa700 1 -- 192.168.123.103:0/3084204939 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f6c6c014070 con 0x7f6c7c1024d0 2026-03-10T14:07:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.235+0000 7f6c7b7fe700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c6406c720 0x7f6c6406ebd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.235+0000 7f6c7b7fe700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c6406c720 0x7f6c6406ebd0 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f6c70003820 tx=0x7f6c70005b40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.236+0000 7f6c797fa700 1 -- 192.168.123.103:0/3084204939 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6c6c050720 con 0x7f6c7c1024d0 2026-03-10T14:07:52.236 INFO:tasks.workunit.client.1.vm04.stdout:8/622: link d0/d3/d5/f66 d0/d3/fc3 0 2026-03-10T14:07:52.237 INFO:tasks.workunit.client.1.vm04.stdout:5/634: stat d7/d26/d6b/d6e 0 2026-03-10T14:07:52.238 INFO:tasks.workunit.client.1.vm04.stdout:8/623: write d0/d3/dd/d78/f8d [792126,17554] 0 2026-03-10T14:07:52.241 INFO:tasks.workunit.client.1.vm04.stdout:5/635: chown d7/d26/d6b/d6e/d82/ca8 141 1 2026-03-10T14:07:52.241 
INFO:tasks.workunit.client.1.vm04.stdout:2/577: rename d0/d14/d91/d8/d17/d4e/d85/d86/l9b to d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/lb0 0 2026-03-10T14:07:52.241 INFO:tasks.workunit.client.1.vm04.stdout:8/624: rmdir d0/d3/d63/d12/d51/d67/d96 39 2026-03-10T14:07:52.242 INFO:tasks.workunit.client.1.vm04.stdout:8/625: write d0/d3/d63/d29/fba [93159,81430] 0 2026-03-10T14:07:52.243 INFO:tasks.workunit.client.1.vm04.stdout:5/636: chown d7/d12/d2b/d3e/c4f 4008 1 2026-03-10T14:07:52.245 INFO:tasks.workunit.client.1.vm04.stdout:2/578: creat d0/d14/d91/d4a/d8c/d92/fb1 x:0 0 0 2026-03-10T14:07:52.248 INFO:tasks.workunit.client.1.vm04.stdout:5/637: read d7/d12/d2b/f53 [1262789,36907] 0 2026-03-10T14:07:52.249 INFO:tasks.workunit.client.1.vm04.stdout:2/579: mknod d0/d14/d39/d47/cb2 0 2026-03-10T14:07:52.252 INFO:tasks.workunit.client.1.vm04.stdout:2/580: mkdir d0/d14/d91/d4a/d8c/dab/db3 0 2026-03-10T14:07:52.262 INFO:tasks.workunit.client.1.vm04.stdout:5/638: rename d7/d12/d2b/d3e/d57/c6a to d7/d26/d6b/d6e/ccd 0 2026-03-10T14:07:52.262 INFO:tasks.workunit.client.1.vm04.stdout:2/581: symlink d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/lb4 0 2026-03-10T14:07:52.262 INFO:tasks.workunit.client.1.vm04.stdout:8/626: dwrite d0/d3/d5/fb9 [0,4194304] 0 2026-03-10T14:07:52.264 INFO:tasks.workunit.client.1.vm04.stdout:5/639: readlink d7/d26/l3d 0 2026-03-10T14:07:52.268 INFO:tasks.workunit.client.1.vm04.stdout:8/627: dwrite d0/d3/d63/d12/d51/d67/fb2 [0,4194304] 0 2026-03-10T14:07:52.293 INFO:tasks.workunit.client.1.vm04.stdout:8/628: dwrite d0/d3/dd/fa7 [0,4194304] 0 2026-03-10T14:07:52.293 INFO:tasks.workunit.client.1.vm04.stdout:4/573: sync 2026-03-10T14:07:52.299 INFO:tasks.workunit.client.1.vm04.stdout:5/640: mkdir d7/d2d/d69/dce 0 2026-03-10T14:07:52.299 INFO:tasks.workunit.client.1.vm04.stdout:6/470: read d3/de/d35/d3a/d43/d4c/f4d [3114207,22183] 0 2026-03-10T14:07:52.300 INFO:tasks.workunit.client.1.vm04.stdout:7/576: read d2/dc/f8d [2796399,36799] 0 2026-03-10T14:07:52.304 
INFO:tasks.workunit.client.1.vm04.stdout:9/517: dwrite f5 [4194304,4194304] 0 2026-03-10T14:07:52.310 INFO:tasks.workunit.client.1.vm04.stdout:7/577: dwrite d2/dc/de/f73 [4194304,4194304] 0 2026-03-10T14:07:52.324 INFO:tasks.workunit.client.1.vm04.stdout:8/629: getdents d0/dc1 0 2026-03-10T14:07:52.328 INFO:tasks.workunit.client.1.vm04.stdout:6/471: mknod d3/de/d35/d3f/d2d/d32/d23/d47/c8c 0 2026-03-10T14:07:52.331 INFO:tasks.workunit.client.1.vm04.stdout:9/518: mkdir d9/d5c/d93/db1 0 2026-03-10T14:07:52.338 INFO:tasks.workunit.client.1.vm04.stdout:7/578: symlink d2/dc/ld8 0 2026-03-10T14:07:52.343 INFO:tasks.workunit.client.1.vm04.stdout:8/630: unlink d0/f23 0 2026-03-10T14:07:52.343 INFO:tasks.workunit.client.1.vm04.stdout:6/472: rename d3/d1d/f41 to d3/f8d 0 2026-03-10T14:07:52.348 INFO:tasks.workunit.client.1.vm04.stdout:7/579: rename d2/dc/de/d2d/d60/d7c/l85 to d2/dc/d4d/dcd/ld9 0 2026-03-10T14:07:52.348 INFO:tasks.workunit.client.1.vm04.stdout:7/580: stat d2/d94/f7e 0 2026-03-10T14:07:52.348 INFO:tasks.workunit.client.1.vm04.stdout:7/581: dread - d2/dc/de/d2d/d5c/da9/fc3 zero size 2026-03-10T14:07:52.350 INFO:tasks.workunit.client.1.vm04.stdout:6/473: mkdir d3/de/d35/d3f/d2d/d32/d23/d24/d8e 0 2026-03-10T14:07:52.352 INFO:tasks.workunit.client.1.vm04.stdout:7/582: mknod d2/dc/de/d2d/d5c/cda 0 2026-03-10T14:07:52.353 INFO:tasks.workunit.client.1.vm04.stdout:6/474: mkdir d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f 0 2026-03-10T14:07:52.354 INFO:tasks.workunit.client.1.vm04.stdout:7/583: symlink d2/dc/de/d2d/d38/d50/dc8/ldb 0 2026-03-10T14:07:52.354 INFO:tasks.workunit.client.1.vm04.stdout:7/584: dread - d2/dc/de/d2d/d38/d50/dc8/f7b zero size 2026-03-10T14:07:52.365 INFO:tasks.workunit.client.1.vm04.stdout:9/519: dread d9/d33/f8f [0,4194304] 0 2026-03-10T14:07:52.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.378+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: 
{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6c7c108020 con 0x7f6c6406c720 2026-03-10T14:07:52.376 INFO:tasks.workunit.client.1.vm04.stdout:3/594: dwrite da/dc/d35/d52/d6d/f75 [0,4194304] 0 2026-03-10T14:07:52.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.379+0000 7f6c797fa700 1 -- 192.168.123.103:0/3084204939 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f6c7c108020 con 0x7f6c6406c720 2026-03-10T14:07:52.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c6406c720 msgr2=0x7f6c6406ebd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c6406c720 0x7f6c6406ebd0 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f6c70003820 tx=0x7f6c70005b40 comp rx=0 tx=0).stop 2026-03-10T14:07:52.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1024d0 msgr2=0x7f6c7c197db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1024d0 0x7f6c7c197db0 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7f6c6c00ea00 tx=0x7f6c6c00edc0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 shutdown_connections 2026-03-10T14:07:52.397 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c7c1024d0 0x7f6c7c197db0 unknown :-1 s=CLOSED pgs=313 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6c6406c720 0x7f6c6406ebd0 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 --2- 192.168.123.103:0/3084204939 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c7c1036d0 0x7f6c7c1982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 >> 192.168.123.103:0/3084204939 conn(0x7f6c7c0fda60 msgr2=0x7f6c7c106900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:52.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 shutdown_connections 2026-03-10T14:07:52.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.382+0000 7f6c8176c700 1 -- 192.168.123.103:0/3084204939 wait complete. 
2026-03-10T14:07:52.397 INFO:tasks.workunit.client.1.vm04.stdout:7/585: dread d2/f20 [0,4194304] 0 2026-03-10T14:07:52.397 INFO:tasks.workunit.client.1.vm04.stdout:7/586: chown d2/dc/de/l13 242827263 1 2026-03-10T14:07:52.397 INFO:tasks.workunit.client.1.vm04.stdout:3/595: symlink da/dc/d47/d9b/dcb/lcf 0 2026-03-10T14:07:52.397 INFO:tasks.workunit.client.1.vm04.stdout:7/587: fdatasync d2/dc/de/d2d/d38/f8a 0 2026-03-10T14:07:52.402 INFO:tasks.workunit.client.1.vm04.stdout:7/588: symlink d2/dc/de/d2d/d5c/da9/ldc 0 2026-03-10T14:07:52.405 INFO:tasks.workunit.client.1.vm04.stdout:7/589: mknod d2/dc/de/d2d/d38/d50/dc8/cdd 0 2026-03-10T14:07:52.406 INFO:tasks.workunit.client.1.vm04.stdout:7/590: dread - d2/dc/de/d2d/d60/d7c/d44/f66 zero size 2026-03-10T14:07:52.411 INFO:tasks.workunit.client.1.vm04.stdout:7/591: rename d2/dc/d4d/f74 to d2/dc/de/d2d/d60/d7c/d36/d8b/fde 0 2026-03-10T14:07:52.415 INFO:tasks.workunit.client.1.vm04.stdout:7/592: link d2/dc/de/d2d/d60/d7c/d36/l9d d2/dc/d4d/d7f/ldf 0 2026-03-10T14:07:52.418 INFO:tasks.workunit.client.1.vm04.stdout:7/593: symlink d2/dc/de/d2d/d5c/le0 0 2026-03-10T14:07:52.421 INFO:tasks.workunit.client.1.vm04.stdout:7/594: read - d2/fbe zero size 2026-03-10T14:07:52.422 INFO:tasks.workunit.client.1.vm04.stdout:7/595: fsync d2/dc/d4d/dcd/f7a 0 2026-03-10T14:07:52.423 INFO:tasks.workunit.client.1.vm04.stdout:7/596: readlink d2/dc/de/d2d/d60/d7c/d3b/l88 0 2026-03-10T14:07:52.430 INFO:tasks.workunit.client.1.vm04.stdout:7/597: dread d2/dc/de/f1e [0,4194304] 0 2026-03-10T14:07:52.444 INFO:tasks.workunit.client.1.vm04.stdout:7/598: dread d2/d2a/f35 [0,4194304] 0 2026-03-10T14:07:52.449 INFO:tasks.workunit.client.1.vm04.stdout:7/599: symlink d2/dc/de/d2d/d60/d81/le1 0 2026-03-10T14:07:52.451 INFO:tasks.workunit.client.1.vm04.stdout:8/631: fsync d0/d3/d73/f98 0 2026-03-10T14:07:52.452 INFO:tasks.workunit.client.1.vm04.stdout:8/632: write d0/d3/dd/d78/fae [43022,130814] 0 2026-03-10T14:07:52.454 
INFO:tasks.workunit.client.1.vm04.stdout:7/600: dread d2/dc/de/f1e [0,4194304] 0 2026-03-10T14:07:52.459 INFO:tasks.workunit.client.1.vm04.stdout:7/601: symlink d2/dc/d4d/dcd/le2 0 2026-03-10T14:07:52.464 INFO:tasks.workunit.client.1.vm04.stdout:8/633: mknod d0/d75/d8a/cc4 0 2026-03-10T14:07:52.467 INFO:tasks.workunit.client.1.vm04.stdout:7/602: creat d2/dc/de/d2d/d60/d7c/d36/d8b/fe3 x:0 0 0 2026-03-10T14:07:52.467 INFO:tasks.workunit.client.1.vm04.stdout:7/603: fdatasync d2/dc/de/f98 0 2026-03-10T14:07:52.468 INFO:tasks.workunit.client.1.vm04.stdout:8/634: mknod d0/d3/dd/d89/db5/cc5 0 2026-03-10T14:07:52.473 INFO:tasks.workunit.client.1.vm04.stdout:7/604: fdatasync d2/dc/de/d2d/d60/d7c/f97 0 2026-03-10T14:07:52.473 INFO:tasks.workunit.client.1.vm04.stdout:1/569: dwrite d3/d22/d63/d35/f9a [0,4194304] 0 2026-03-10T14:07:52.474 INFO:tasks.workunit.client.0.vm03.stdout:4/121: dwrite d5/d9/fa [4194304,4194304] 0 2026-03-10T14:07:52.476 INFO:tasks.workunit.client.0.vm03.stdout:4/122: write d5/d9/db/f24 [1159549,73040] 0 2026-03-10T14:07:52.481 INFO:tasks.workunit.client.1.vm04.stdout:7/605: stat d2/dc/de/d2d/d60/d7c/d64/c76 0 2026-03-10T14:07:52.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.489+0000 7fb618e3e700 1 -- 192.168.123.103:0/716575918 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb614071980 msgr2=0x7fb614071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.489+0000 7fb618e3e700 1 --2- 192.168.123.103:0/716575918 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb614071980 0x7fb614071d90 secure :-1 s=READY pgs=314 cs=0 l=1 rev1=1 crypto rx=0x7fb604007780 tx=0x7fb60400c050 comp rx=0 tx=0).stop 2026-03-10T14:07:52.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.489+0000 7fb618e3e700 1 -- 192.168.123.103:0/716575918 shutdown_connections 2026-03-10T14:07:52.488 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.489+0000 7fb618e3e700 1 --2- 192.168.123.103:0/716575918 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614072360 0x7fb6140770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.489+0000 7fb618e3e700 1 --2- 192.168.123.103:0/716575918 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb614071980 0x7fb614071d90 unknown :-1 s=CLOSED pgs=314 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.489+0000 7fb618e3e700 1 -- 192.168.123.103:0/716575918 >> 192.168.123.103:0/716575918 conn(0x7fb61406d1a0 msgr2=0x7fb61406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:52.488 INFO:tasks.workunit.client.0.vm03.stdout:4/123: sync 2026-03-10T14:07:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.489+0000 7fb618e3e700 1 -- 192.168.123.103:0/716575918 shutdown_connections 2026-03-10T14:07:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.489+0000 7fb618e3e700 1 -- 192.168.123.103:0/716575918 wait complete. 
2026-03-10T14:07:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.494+0000 7fb618e3e700 1 Processor -- start 2026-03-10T14:07:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.494+0000 7fb618e3e700 1 -- start start 2026-03-10T14:07:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.494+0000 7fb618e3e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614072360 0x7fb6141313a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.494+0000 7fb618e3e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb6141318e0 0x7fb61407f590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.494+0000 7fb618e3e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb614131de0 con 0x7fb6141318e0 2026-03-10T14:07:52.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.494+0000 7fb618e3e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb614131f20 con 0x7fb614072360 2026-03-10T14:07:52.493 INFO:tasks.workunit.client.0.vm03.stdout:4/124: rename d5/f1f to d5/d9/db/f29 0 2026-03-10T14:07:52.493 INFO:tasks.workunit.client.0.vm03.stdout:4/125: chown d5/d9/db/f20 14385462 1 2026-03-10T14:07:52.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.496+0000 7fb611d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb6141318e0 0x7fb61407f590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:52.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.496+0000 7fb611d9b700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb6141318e0 0x7fb61407f590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47336/0 (socket says 192.168.123.103:47336) 2026-03-10T14:07:52.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.496+0000 7fb611d9b700 1 -- 192.168.123.103:0/509899684 learned_addr learned my addr 192.168.123.103:0/509899684 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:52.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.496+0000 7fb61259c700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614072360 0x7fb6141313a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:52.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.496+0000 7fb61259c700 1 -- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb6141318e0 msgr2=0x7fb61407f590 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.496+0000 7fb61259c700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb6141318e0 0x7fb61407f590 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.496+0000 7fb61259c700 1 -- 192.168.123.103:0/509899684 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb604007430 con 0x7fb614072360 2026-03-10T14:07:52.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.497+0000 7fb61259c700 1 --2- 192.168.123.103:0/509899684 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614072360 0x7fb6141313a0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fb604007400 tx=0x7fb60400c990 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:52.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.497+0000 7fb6037fe700 1 -- 192.168.123.103:0/509899684 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb604021070 con 0x7fb614072360 2026-03-10T14:07:52.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.497+0000 7fb618e3e700 1 -- 192.168.123.103:0/509899684 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb61407fad0 con 0x7fb614072360 2026-03-10T14:07:52.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.497+0000 7fb618e3e700 1 -- 192.168.123.103:0/509899684 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb61407fff0 con 0x7fb614072360 2026-03-10T14:07:52.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.498+0000 7fb6037fe700 1 -- 192.168.123.103:0/509899684 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb604004500 con 0x7fb614072360 2026-03-10T14:07:52.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.498+0000 7fb6037fe700 1 -- 192.168.123.103:0/509899684 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb60400f050 con 0x7fb614072360 2026-03-10T14:07:52.497 INFO:tasks.workunit.client.0.vm03.stdout:4/126: creat d5/d9/db/f2a x:0 0 0 2026-03-10T14:07:52.497 INFO:tasks.workunit.client.1.vm04.stdout:7/606: creat d2/dc/d4d/fe4 x:0 0 0 2026-03-10T14:07:52.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.500+0000 7fb6037fe700 1 -- 192.168.123.103:0/509899684 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 
89910+0+0 (secure 0 0 0) 0x7fb604003680 con 0x7fb614072360 2026-03-10T14:07:52.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.500+0000 7fb6037fe700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb5fc06c6d0 0x7fb5fc06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.501+0000 7fb611d9b700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb5fc06c6d0 0x7fb5fc06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:52.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.501+0000 7fb6037fe700 1 -- 192.168.123.103:0/509899684 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb60408bb70 con 0x7fb614072360 2026-03-10T14:07:52.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.499+0000 7fb618e3e700 1 -- 192.168.123.103:0/509899684 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb5f4005320 con 0x7fb614072360 2026-03-10T14:07:52.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.502+0000 7fb611d9b700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb5fc06c6d0 0x7fb5fc06eb80 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fb60c00e3b0 tx=0x7fb60c00b040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:52.500 INFO:tasks.workunit.client.0.vm03.stdout:4/127: unlink d5/f14 0 2026-03-10T14:07:52.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.506+0000 7fb6037fe700 1 -- 192.168.123.103:0/509899684 <== mon.1 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb604059af0 con 0x7fb614072360 2026-03-10T14:07:52.515 INFO:tasks.workunit.client.1.vm04.stdout:7/607: creat d2/d94/fe5 x:0 0 0 2026-03-10T14:07:52.519 INFO:tasks.workunit.client.1.vm04.stdout:1/570: link d3/d20/d60/d82/d13/d38/d58/d5b/d92/c95 d3/d22/d2f/d57/ccb 0 2026-03-10T14:07:52.519 INFO:tasks.workunit.client.1.vm04.stdout:0/541: dread d0/d2/d15/d49/d50/d61/f8d [0,4194304] 0 2026-03-10T14:07:52.529 INFO:tasks.workunit.client.1.vm04.stdout:7/608: read d2/dc/de/d2d/d60/f91 [319403,115221] 0 2026-03-10T14:07:52.530 INFO:tasks.workunit.client.1.vm04.stdout:1/571: fsync d3/d20/d60/d82/d13/d38/d58/d5b/d92/f93 0 2026-03-10T14:07:52.532 INFO:tasks.workunit.client.1.vm04.stdout:0/542: dread d0/d2/d15/d49/f94 [0,4194304] 0 2026-03-10T14:07:52.541 INFO:tasks.workunit.client.1.vm04.stdout:0/543: write d0/d2/d25/f2a [3983557,62398] 0 2026-03-10T14:07:52.550 INFO:tasks.workunit.client.1.vm04.stdout:0/544: symlink d0/d2/d15/d49/d50/d61/la8 0 2026-03-10T14:07:52.551 INFO:tasks.workunit.client.1.vm04.stdout:0/545: dread - d0/d2/d15/d22/d38/d56/f84 zero size 2026-03-10T14:07:52.553 INFO:tasks.workunit.client.1.vm04.stdout:0/546: mknod d0/d2/d15/d22/ca9 0 2026-03-10T14:07:52.562 INFO:tasks.workunit.client.1.vm04.stdout:0/547: mknod d0/da2/caa 0 2026-03-10T14:07:52.564 INFO:tasks.workunit.client.0.vm03.stdout:3/80: getdents . 
0 2026-03-10T14:07:52.568 INFO:tasks.workunit.client.0.vm03.stdout:3/81: fsync f10 0 2026-03-10T14:07:52.568 INFO:tasks.workunit.client.0.vm03.stdout:3/82: read - fe zero size 2026-03-10T14:07:52.568 INFO:tasks.workunit.client.0.vm03.stdout:3/83: rmdir - no directory 2026-03-10T14:07:52.568 INFO:tasks.workunit.client.0.vm03.stdout:3/84: write fe [495695,113820] 0 2026-03-10T14:07:52.574 INFO:tasks.workunit.client.1.vm04.stdout:2/582: dwrite d0/d14/d91/d8/d17/d35/f5f [0,4194304] 0 2026-03-10T14:07:52.575 INFO:tasks.workunit.client.1.vm04.stdout:2/583: readlink d0/l2b 0 2026-03-10T14:07:52.578 INFO:tasks.workunit.client.1.vm04.stdout:0/548: dwrite d0/d2/d15/d49/f7c [0,4194304] 0 2026-03-10T14:07:52.579 INFO:tasks.workunit.client.1.vm04.stdout:2/584: chown d0/d14/d91/l3f 961375342 1 2026-03-10T14:07:52.582 INFO:tasks.workunit.client.1.vm04.stdout:2/585: fsync d0/d14/d39/fa2 0 2026-03-10T14:07:52.583 INFO:tasks.workunit.client.1.vm04.stdout:0/549: stat d0/d2/d25/f7f 0 2026-03-10T14:07:52.604 INFO:tasks.workunit.client.1.vm04.stdout:2/586: creat d0/d14/d91/d3a/fb5 x:0 0 0 2026-03-10T14:07:52.605 INFO:tasks.workunit.client.0.vm03.stdout:6/93: truncate d8/fd 1381599 0 2026-03-10T14:07:52.607 INFO:tasks.workunit.client.0.vm03.stdout:6/94: symlink d8/d11/d18/l1a 0 2026-03-10T14:07:52.611 INFO:tasks.workunit.client.0.vm03.stdout:6/95: rmdir d8/db 39 2026-03-10T14:07:52.611 INFO:tasks.workunit.client.0.vm03.stdout:6/96: write f2 [3590393,98377] 0 2026-03-10T14:07:52.611 INFO:tasks.workunit.client.0.vm03.stdout:6/97: fdatasync d8/fe 0 2026-03-10T14:07:52.613 INFO:tasks.workunit.client.1.vm04.stdout:0/550: read d0/f14 [2492964,71629] 0 2026-03-10T14:07:52.614 INFO:tasks.workunit.client.0.vm03.stdout:6/98: mkdir d8/d1b 0 2026-03-10T14:07:52.616 INFO:tasks.workunit.client.0.vm03.stdout:6/99: unlink d8/l15 0 2026-03-10T14:07:52.618 INFO:tasks.workunit.client.1.vm04.stdout:2/587: link d0/d14/d91/d4a/d8c/dab/f33 d0/d14/d1b/d45/fb6 0 2026-03-10T14:07:52.618 
INFO:tasks.workunit.client.1.vm04.stdout:0/551: truncate d0/d2/d15/d22/f30 4955593 0 2026-03-10T14:07:52.618 INFO:tasks.workunit.client.0.vm03.stdout:6/100: mkdir d8/d1b/d1c 0 2026-03-10T14:07:52.624 INFO:tasks.workunit.client.1.vm04.stdout:0/552: rename d0/d2/d15/d22/d38/la1 to d0/d2/d15/d49/lab 0 2026-03-10T14:07:52.625 INFO:tasks.workunit.client.1.vm04.stdout:0/553: write d0/d2/d15/f59 [2645965,119819] 0 2026-03-10T14:07:52.625 INFO:tasks.workunit.client.1.vm04.stdout:0/554: chown d0/d2/d15/d22/d62/l85 5 1 2026-03-10T14:07:52.632 INFO:tasks.workunit.client.1.vm04.stdout:5/641: write d7/d2d/d32/f3b [2752573,86605] 0 2026-03-10T14:07:52.633 INFO:tasks.workunit.client.1.vm04.stdout:0/555: dread d0/d2/f12 [4194304,4194304] 0 2026-03-10T14:07:52.637 INFO:tasks.workunit.client.0.vm03.stdout:6/101: sync 2026-03-10T14:07:52.637 INFO:tasks.workunit.client.1.vm04.stdout:5/642: dwrite d7/d12/d2b/d3e/d3f/dc0/fc7 [0,4194304] 0 2026-03-10T14:07:52.637 INFO:tasks.workunit.client.0.vm03.stdout:6/102: chown d8/d11 46862 1 2026-03-10T14:07:52.640 INFO:tasks.workunit.client.0.vm03.stdout:6/103: dread f2 [4194304,4194304] 0 2026-03-10T14:07:52.644 INFO:tasks.workunit.client.1.vm04.stdout:2/588: getdents d0/d14/d91/d8/d17/d4e/d85 0 2026-03-10T14:07:52.645 INFO:tasks.workunit.client.0.vm03.stdout:9/90: dwrite d2/fc [0,4194304] 0 2026-03-10T14:07:52.646 INFO:tasks.workunit.client.1.vm04.stdout:2/589: chown d0/d14/d39/d47 146062852 1 2026-03-10T14:07:52.648 INFO:tasks.workunit.client.0.vm03.stdout:9/91: write d2/d14/f16 [479114,91873] 0 2026-03-10T14:07:52.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.650+0000 7fb618e3e700 1 -- 192.168.123.103:0/509899684 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fb5f4000bf0 con 0x7fb5fc06c6d0 2026-03-10T14:07:52.650 INFO:tasks.workunit.client.1.vm04.stdout:2/590: dwrite d0/d14/d91/d4a/fa6 [0,4194304] 0 2026-03-10T14:07:52.652 
INFO:tasks.workunit.client.1.vm04.stdout:0/556: mknod d0/d2/d15/d22/d38/cac 0 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (3m) 2m ago 4m 21.4M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 error 2m ago 4m - - 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (4m) 2m ago 4m 8342k - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 2m ago 4m 7419k - 18.2.0 dc2bc1663786 57962aef7443 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (4m) 2m ago 4m 7407k - 18.2.0 dc2bc1663786 0918365fa827 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 2m ago 4m 82.1M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (2m) 2m ago 2m 16.4M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (2m) 2m ago 2m 16.1M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (2m) 2m ago 2m 16.8M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (2m) 2m ago 2m 18.5M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:9283,8765,8443 running (5m) 2m ago 5m 499M - 18.2.0 dc2bc1663786 378306a7bb3c 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 
*:8443,9283,8765 running (4m) 2m ago 4m 446M - 18.2.0 dc2bc1663786 f2d79432e040 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 2m ago 5m 54.7M 2048M 18.2.0 dc2bc1663786 f59cc7d5bdfd 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (3m) 2m ago 3m 44.2M 2048M 18.2.0 dc2bc1663786 4113774b34c7 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 2m ago 4m 14.2M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (4m) 2m ago 4m 14.1M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 2m ago 3m 48.8M 4096M 18.2.0 dc2bc1663786 5a222b855ee3 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 2m ago 3m 45.7M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (3m) 2m ago 3m 45.3M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:07:52.656 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (3m) 2m ago 3m 47.2M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:07:52.657 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (2m) 2m ago 2m 45.6M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:07:52.657 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (2m) 2m ago 2m 44.0M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:07:52.657 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 2m ago 4m 36.2M - 2.43.0 a07b618ecd1d fcef697ff8c4 2026-03-10T14:07:52.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.658+0000 7fb6037fe700 1 -- 192.168.123.103:0/509899684 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3264 (secure 0 0 0) 0x7fb5f4000bf0 con 
0x7fb5fc06c6d0 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 -- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb5fc06c6d0 msgr2=0x7fb5fc06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb5fc06c6d0 0x7fb5fc06eb80 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fb60c00e3b0 tx=0x7fb60c00b040 comp rx=0 tx=0).stop 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 -- 192.168.123.103:0/509899684 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614072360 msgr2=0x7fb6141313a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614072360 0x7fb6141313a0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fb604007400 tx=0x7fb60400c990 comp rx=0 tx=0).stop 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 -- 192.168.123.103:0/509899684 shutdown_connections 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb5fc06c6d0 0x7fb5fc06eb80 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb614072360 
0x7fb6141313a0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 --2- 192.168.123.103:0/509899684 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb6141318e0 0x7fb61407f590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 -- 192.168.123.103:0/509899684 >> 192.168.123.103:0/509899684 conn(0x7fb61406d1a0 msgr2=0x7fb614076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 -- 192.168.123.103:0/509899684 shutdown_connections 2026-03-10T14:07:52.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.663+0000 7fb6017fa700 1 -- 192.168.123.103:0/509899684 wait complete. 2026-03-10T14:07:52.670 INFO:tasks.workunit.client.1.vm04.stdout:2/591: symlink d0/d14/d91/d4a/lb7 0 2026-03-10T14:07:52.673 INFO:tasks.workunit.client.1.vm04.stdout:0/557: mknod d0/d6e/cad 0 2026-03-10T14:07:52.681 INFO:tasks.workunit.client.1.vm04.stdout:4/574: write d4/d14/fad [774041,110413] 0 2026-03-10T14:07:52.691 INFO:tasks.workunit.client.0.vm03.stdout:9/92: unlink d2/ca 0 2026-03-10T14:07:52.695 INFO:tasks.workunit.client.0.vm03.stdout:9/93: symlink d2/d14/l22 0 2026-03-10T14:07:52.702 INFO:tasks.workunit.client.1.vm04.stdout:0/558: read d0/d2/f17 [1225531,82973] 0 2026-03-10T14:07:52.728 INFO:tasks.workunit.client.1.vm04.stdout:5/643: fsync d7/d12/d2b/f4d 0 2026-03-10T14:07:52.730 INFO:tasks.workunit.client.1.vm04.stdout:2/592: mkdir d0/db8 0 2026-03-10T14:07:52.731 INFO:tasks.workunit.client.1.vm04.stdout:2/593: dread - d0/d14/d39/d47/f7e zero size 2026-03-10T14:07:52.731 INFO:tasks.workunit.client.0.vm03.stdout:7/128: write d5/f7 [8298534,56806] 0 2026-03-10T14:07:52.734 
INFO:tasks.workunit.client.1.vm04.stdout:4/575: chown d4/d14/dac/lbe 233842 1 2026-03-10T14:07:52.736 INFO:tasks.workunit.client.0.vm03.stdout:7/129: creat d5/d9/f22 x:0 0 0 2026-03-10T14:07:52.743 INFO:tasks.workunit.client.0.vm03.stdout:7/130: rename d5/d9/f1e to d5/d9/d14/f23 0 2026-03-10T14:07:52.743 INFO:tasks.workunit.client.0.vm03.stdout:7/131: fdatasync d5/f16 0 2026-03-10T14:07:52.744 INFO:tasks.workunit.client.0.vm03.stdout:7/132: truncate d5/d9/f1f 79493 0 2026-03-10T14:07:52.744 INFO:tasks.workunit.client.1.vm04.stdout:0/559: symlink d0/lae 0 2026-03-10T14:07:52.748 INFO:tasks.workunit.client.1.vm04.stdout:4/576: mknod d4/df/d34/ccc 0 2026-03-10T14:07:52.749 INFO:tasks.workunit.client.0.vm03.stdout:7/133: mknod d5/c24 0 2026-03-10T14:07:52.752 INFO:tasks.workunit.client.0.vm03.stdout:7/134: dwrite d5/d9/f1f [0,4194304] 0 2026-03-10T14:07:52.753 INFO:tasks.workunit.client.1.vm04.stdout:0/560: symlink d0/d2/d15/d22/d38/d56/d66/laf 0 2026-03-10T14:07:52.755 INFO:tasks.workunit.client.0.vm03.stdout:7/135: truncate d5/d9/f22 548741 0 2026-03-10T14:07:52.757 INFO:tasks.workunit.client.0.vm03.stdout:7/136: read d5/d9/f19 [1814199,52587] 0 2026-03-10T14:07:52.760 INFO:tasks.workunit.client.0.vm03.stdout:7/137: dwrite d5/f7 [0,4194304] 0 2026-03-10T14:07:52.762 INFO:tasks.workunit.client.1.vm04.stdout:2/594: truncate d0/d14/d39/d47/d70/f74 107378 0 2026-03-10T14:07:52.766 INFO:tasks.workunit.client.0.vm03.stdout:7/138: dread d5/d9/d14/ff [0,4194304] 0 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.767+0000 7f416cac4700 1 -- 192.168.123.103:0/2966479887 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168071950 msgr2=0x7f4168071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.767+0000 7f416cac4700 1 --2- 192.168.123.103:0/2966479887 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168071950 
0x7f4168071d60 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f4158009a60 tx=0x7f4158009d70 comp rx=0 tx=0).stop 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.767+0000 7f416cac4700 1 -- 192.168.123.103:0/2966479887 shutdown_connections 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.767+0000 7f416cac4700 1 --2- 192.168.123.103:0/2966479887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4168072330 0x7f41680770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.767+0000 7f416cac4700 1 --2- 192.168.123.103:0/2966479887 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168071950 0x7f4168071d60 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.767+0000 7f416cac4700 1 -- 192.168.123.103:0/2966479887 >> 192.168.123.103:0/2966479887 conn(0x7f416806d1a0 msgr2=0x7f416806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.767+0000 7f416cac4700 1 -- 192.168.123.103:0/2966479887 shutdown_connections 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.767+0000 7f416cac4700 1 -- 192.168.123.103:0/2966479887 wait complete. 
2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f416cac4700 1 Processor -- start 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f416cac4700 1 -- start start 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f416cac4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168072330 0x7f4168131330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f416cac4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4168131870 0x7f416807f4d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f416cac4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4168131d70 con 0x7f4168131870 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f416cac4700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4168131ee0 con 0x7f4168072330 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f41677fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168072330 0x7f4168131330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f41677fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168072330 0x7f4168131330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:39896/0 (socket says 192.168.123.103:39896) 2026-03-10T14:07:52.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.768+0000 7f41677fe700 1 -- 192.168.123.103:0/527567025 learned_addr learned my addr 192.168.123.103:0/527567025 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:52.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.770+0000 7f4166ffd700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4168131870 0x7f416807f4d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:52.768 INFO:tasks.workunit.client.1.vm04.stdout:6/475: dwrite d3/de/f6d [0,4194304] 0 2026-03-10T14:07:52.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.771+0000 7f4166ffd700 1 -- 192.168.123.103:0/527567025 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168072330 msgr2=0x7f4168131330 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:52.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.771+0000 7f4166ffd700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168072330 0x7f4168131330 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:52.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.771+0000 7f4166ffd700 1 -- 192.168.123.103:0/527567025 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4158009710 con 0x7f4168131870 2026-03-10T14:07:52.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.771+0000 7f4166ffd700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4168131870 0x7f416807f4d0 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7f416000e7c0 
tx=0x7f416000ead0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:52.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.772+0000 7f4164ff9700 1 -- 192.168.123.103:0/527567025 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f416000f780 con 0x7f4168131870 2026-03-10T14:07:52.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.772+0000 7f416cac4700 1 -- 192.168.123.103:0/527567025 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f416807fa70 con 0x7f4168131870 2026-03-10T14:07:52.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.772+0000 7f416cac4700 1 -- 192.168.123.103:0/527567025 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f416807ff90 con 0x7f4168131870 2026-03-10T14:07:52.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.772+0000 7f4164ff9700 1 -- 192.168.123.103:0/527567025 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f416000fdc0 con 0x7f4168131870 2026-03-10T14:07:52.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.773+0000 7f4164ff9700 1 -- 192.168.123.103:0/527567025 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f416000f8e0 con 0x7f4168131870 2026-03-10T14:07:52.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.773+0000 7f416cac4700 1 -- 192.168.123.103:0/527567025 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4154005320 con 0x7f4168131870 2026-03-10T14:07:52.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.777+0000 7f4164ff9700 1 -- 192.168.123.103:0/527567025 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4160015070 con 
0x7f4168131870 2026-03-10T14:07:52.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.777+0000 7f4164ff9700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f415006c680 0x7f415006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:52.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.777+0000 7f41677fe700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f415006c680 0x7f415006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:52.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.778+0000 7f41677fe700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f415006c680 0x7f415006eb30 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f4158009a30 tx=0x7f4158019040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:52.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.778+0000 7f4164ff9700 1 -- 192.168.123.103:0/527567025 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f4160018470 con 0x7f4168131870 2026-03-10T14:07:52.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:52.784+0000 7f4164ff9700 1 -- 192.168.123.103:0/527567025 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f416005a4b0 con 0x7f4168131870 2026-03-10T14:07:52.798 INFO:tasks.workunit.client.1.vm04.stdout:4/577: dread d4/fe [0,4194304] 0 2026-03-10T14:07:53.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.029+0000 7f416cac4700 1 -- 192.168.123.103:0/527567025 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f4154005cc0 con 0x7f4168131870 2026-03-10T14:07:53.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:52 vm03.local ceph-mon[49718]: from='client.24391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:53.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:52 vm03.local ceph-mon[49718]: from='client.14656 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:53.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:52 vm03.local ceph-mon[49718]: pgmap v155: 65 pgs: 65 active+clean; 1.5 GiB data, 5.6 GiB used, 114 GiB / 120 GiB avail; 24 MiB/s rd, 83 MiB/s wr, 301 op/s 2026-03-10T14:07:53.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:52 vm03.local ceph-mon[49718]: from='client.24395 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:53.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.034+0000 7f4164ff9700 1 -- 192.168.123.103:0/527567025 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f416005a040 con 0x7f4168131870 2026-03-10T14:07:53.032 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:07:53.032 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:07:53.032 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:07:53.032 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:07:53.032 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:07:53.032 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T14:07:53.033 
INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:07:53.033 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:07:53.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.040+0000 7f414e7fc700 1 -- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f415006c680 msgr2=0x7f415006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.040+0000 7f414e7fc700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f415006c680 0x7f415006eb30 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f4158009a30 tx=0x7f4158019040 comp rx=0 tx=0).stop 2026-03-10T14:07:53.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.040+0000 7f414e7fc700 1 -- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4168131870 msgr2=0x7f416807f4d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.040 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.040+0000 7f414e7fc700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4168131870 0x7f416807f4d0 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7f416000e7c0 tx=0x7f416000ead0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.042+0000 7f414e7fc700 1 -- 192.168.123.103:0/527567025 shutdown_connections 2026-03-10T14:07:53.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.042+0000 7f414e7fc700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f415006c680 0x7f415006eb30 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.042+0000 7f414e7fc700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4168072330 0x7f4168131330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.042+0000 7f414e7fc700 1 --2- 192.168.123.103:0/527567025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4168131870 0x7f416807f4d0 unknown :-1 s=CLOSED pgs=315 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.042+0000 7f414e7fc700 1 -- 192.168.123.103:0/527567025 >> 192.168.123.103:0/527567025 conn(0x7f416806d1a0 msgr2=0x7f41680705e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:53.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.043+0000 7f414e7fc700 1 -- 192.168.123.103:0/527567025 shutdown_connections 2026-03-10T14:07:53.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.043+0000 7f414e7fc700 1 -- 192.168.123.103:0/527567025 wait 
complete. 2026-03-10T14:07:53.107 INFO:tasks.workunit.client.1.vm04.stdout:9/520: dwrite d9/ff [4194304,4194304] 0 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 -- 192.168.123.103:0/3402407529 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6398071980 msgr2=0x7f6398071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 --2- 192.168.123.103:0/3402407529 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6398071980 0x7f6398071d90 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f63880099c0 tx=0x7f6388009cd0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 -- 192.168.123.103:0/3402407529 shutdown_connections 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 --2- 192.168.123.103:0/3402407529 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6398072360 0x7f63980770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 --2- 192.168.123.103:0/3402407529 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6398071980 0x7f6398071d90 unknown :-1 s=CLOSED pgs=316 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 -- 192.168.123.103:0/3402407529 >> 192.168.123.103:0/3402407529 conn(0x7f639806d1a0 msgr2=0x7f639806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 -- 192.168.123.103:0/3402407529 shutdown_connections 2026-03-10T14:07:53.123 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 -- 192.168.123.103:0/3402407529 wait complete. 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 Processor -- start 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 -- start start 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6398072360 0x7f6398131340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6398131880 0x7f639807f4e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6398131d80 con 0x7f6398131880 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.125+0000 7f639e22a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6398131ef0 con 0x7f6398072360 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f63977fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6398072360 0x7f6398131340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f63977fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6398072360 0x7f6398131340 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:39918/0 (socket says 192.168.123.103:39918) 2026-03-10T14:07:53.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f63977fe700 1 -- 192.168.123.103:0/1384323973 learned_addr learned my addr 192.168.123.103:0/1384323973 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f63977fe700 1 -- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6398131880 msgr2=0x7f639807f4e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f63977fe700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6398131880 0x7f639807f4e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f63977fe700 1 -- 192.168.123.103:0/1384323973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f63880096b0 con 0x7f6398072360 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f63977fe700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6398072360 0x7f6398131340 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f638800f6c0 tx=0x7f638800f7a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f6394ff9700 1 -- 192.168.123.103:0/1384323973 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f63880050d0 con 
0x7f6398072360 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f639e22a700 1 -- 192.168.123.103:0/1384323973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f639807fa20 con 0x7f6398072360 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.126+0000 7f639e22a700 1 -- 192.168.123.103:0/1384323973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f639807fee0 con 0x7f6398072360 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.127+0000 7f6394ff9700 1 -- 192.168.123.103:0/1384323973 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6388005230 con 0x7f6398072360 2026-03-10T14:07:53.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.127+0000 7f6394ff9700 1 -- 192.168.123.103:0/1384323973 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f638801f430 con 0x7f6398072360 2026-03-10T14:07:53.125 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.127+0000 7f639e22a700 1 -- 192.168.123.103:0/1384323973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6384005320 con 0x7f6398072360 2026-03-10T14:07:53.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.128+0000 7f6394ff9700 1 -- 192.168.123.103:0/1384323973 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6388003680 con 0x7f6398072360 2026-03-10T14:07:53.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.128+0000 7f6394ff9700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f638006c6d0 0x7f638006eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.126 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.128+0000 7f6394ff9700 1 -- 192.168.123.103:0/1384323973 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f6388012070 con 0x7f6398072360 2026-03-10T14:07:53.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.129+0000 7f6396ffd700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f638006c6d0 0x7f638006eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:53.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.129+0000 7f6396ffd700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f638006c6d0 0x7f638006eb80 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f63900060b0 tx=0x7f6390006040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:53.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.130+0000 7f6394ff9700 1 -- 192.168.123.103:0/1384323973 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f63880570f0 con 0x7f6398072360 2026-03-10T14:07:53.159 INFO:tasks.workunit.client.0.vm03.stdout:1/130: read d0/f11 [4271,59515] 0 2026-03-10T14:07:53.160 INFO:tasks.workunit.client.1.vm04.stdout:3/596: dwrite da/d3e/f63 [0,4194304] 0 2026-03-10T14:07:53.161 INFO:tasks.workunit.client.1.vm04.stdout:3/597: readlink da/dc/d3f/l4b 0 2026-03-10T14:07:53.166 INFO:tasks.workunit.client.1.vm04.stdout:8/635: write d0/d3/d63/f5f [884427,78512] 0 2026-03-10T14:07:53.166 INFO:tasks.workunit.client.1.vm04.stdout:7/609: write d2/dc/d4d/dcd/fb0 [157891,9339] 0 2026-03-10T14:07:53.169 INFO:tasks.workunit.client.1.vm04.stdout:1/572: dwrite d3/d20/d60/d82/d13/d38/f3d [0,4194304] 
0 2026-03-10T14:07:53.198 INFO:tasks.workunit.client.1.vm04.stdout:0/561: creat d0/fb0 x:0 0 0 2026-03-10T14:07:53.198 INFO:tasks.workunit.client.1.vm04.stdout:0/562: rename d0 to d0/d2/d15/d49/d50/d61/d75/db1 22 2026-03-10T14:07:53.199 INFO:tasks.workunit.client.1.vm04.stdout:9/521: mknod d9/da/dd/cb2 0 2026-03-10T14:07:53.199 INFO:tasks.workunit.client.1.vm04.stdout:2/595: link d0/d14/d91/d8/d17/d4e/l9a d0/d14/d39/lb9 0 2026-03-10T14:07:53.200 INFO:tasks.workunit.client.1.vm04.stdout:2/596: chown d0/d14/d91/d8/d17/d4e/d85/f90 13014400 1 2026-03-10T14:07:53.228 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:52 vm04.local ceph-mon[55966]: from='client.24391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:53.228 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:52 vm04.local ceph-mon[55966]: from='client.14656 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:53.228 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:52 vm04.local ceph-mon[55966]: pgmap v155: 65 pgs: 65 active+clean; 1.5 GiB data, 5.6 GiB used, 114 GiB / 120 GiB avail; 24 MiB/s rd, 83 MiB/s wr, 301 op/s 2026-03-10T14:07:53.228 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:52 vm04.local ceph-mon[55966]: from='client.24395 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.280+0000 7f639e22a700 1 -- 192.168.123.103:0/1384323973 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f6384005cc0 con 0x7f6398072360 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.280+0000 7f6394ff9700 1 -- 192.168.123.103:0/1384323973 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 
==== 76+0+1882 (secure 0 0 0) 0x7f6388024770 con 0x7f6398072360 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:e14 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:07:53.278 
INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:07:53.278 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export 
targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:07:53.279 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:07:53.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.291+0000 7f637e7fc700 1 -- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f638006c6d0 msgr2=0x7f638006eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.291+0000 7f637e7fc700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f638006c6d0 0x7f638006eb80 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f63900060b0 tx=0x7f6390006040 comp rx=0 tx=0).stop 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.291+0000 7f637e7fc700 1 -- 192.168.123.103:0/1384323973 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6398072360 msgr2=0x7f6398131340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.291+0000 7f637e7fc700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6398072360 0x7f6398131340 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f638800f6c0 tx=0x7f638800f7a0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.292+0000 7f637e7fc700 1 -- 192.168.123.103:0/1384323973 shutdown_connections 2026-03-10T14:07:53.290 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.292+0000 7f637e7fc700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f638006c6d0 0x7f638006eb80 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.292+0000 7f637e7fc700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6398072360 0x7f6398131340 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.292+0000 7f637e7fc700 1 --2- 192.168.123.103:0/1384323973 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6398131880 0x7f639807f4e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.292+0000 7f637e7fc700 1 -- 192.168.123.103:0/1384323973 >> 192.168.123.103:0/1384323973 conn(0x7f639806d1a0 msgr2=0x7f63980705c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.292+0000 7f637e7fc700 1 -- 192.168.123.103:0/1384323973 shutdown_connections 2026-03-10T14:07:53.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.293+0000 7f637e7fc700 1 -- 192.168.123.103:0/1384323973 wait complete. 
2026-03-10T14:07:53.292 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:07:53.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.380+0000 7f29d6e80700 1 -- 192.168.123.103:0/2804785389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29d0071a60 msgr2=0x7f29d0071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.380+0000 7f29d6e80700 1 --2- 192.168.123.103:0/2804785389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29d0071a60 0x7f29d0071e70 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7f29cc009b00 tx=0x7f29cc009e10 comp rx=0 tx=0).stop 2026-03-10T14:07:53.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.380+0000 7f29d6e80700 1 -- 192.168.123.103:0/2804785389 shutdown_connections 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.380+0000 7f29d6e80700 1 --2- 192.168.123.103:0/2804785389 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29d0072440 0x7f29d010be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.380+0000 7f29d6e80700 1 --2- 192.168.123.103:0/2804785389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29d0071a60 0x7f29d0071e70 unknown :-1 s=CLOSED pgs=317 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.380+0000 7f29d6e80700 1 -- 192.168.123.103:0/2804785389 >> 192.168.123.103:0/2804785389 conn(0x7f29d006d1a0 msgr2=0x7f29d006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.380+0000 7f29d6e80700 1 -- 192.168.123.103:0/2804785389 shutdown_connections 2026-03-10T14:07:53.379 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.380+0000 7f29d6e80700 1 -- 192.168.123.103:0/2804785389 wait complete. 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d6e80700 1 Processor -- start 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d6e80700 1 -- start start 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d6e80700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29d0072440 0x7f29d0116a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d6e80700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29d0116f70 0x7f29d01b27e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d6e80700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29d0117470 con 0x7f29d0072440 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d6e80700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29d01175e0 con 0x7f29d0116f70 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d5e7e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29d0072440 0x7f29d0116a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d567d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29d0116f70 0x7f29d01b27e0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d567d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29d0116f70 0x7f29d01b27e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:39934/0 (socket says 192.168.123.103:39934) 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.381+0000 7f29d567d700 1 -- 192.168.123.103:0/4088538353 learned_addr learned my addr 192.168.123.103:0/4088538353 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:53.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.382+0000 7f29d567d700 1 -- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29d0072440 msgr2=0x7f29d0116a30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.382+0000 7f29d567d700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29d0072440 0x7f29d0116a30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.382+0000 7f29d567d700 1 -- 192.168.123.103:0/4088538353 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29cc0097e0 con 0x7f29d0116f70 2026-03-10T14:07:53.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.382+0000 7f29d567d700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29d0116f70 0x7f29d01b27e0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f29c800bf40 
tx=0x7f29c800bf70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:53.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.384+0000 7f29c6ffd700 1 -- 192.168.123.103:0/4088538353 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f29c800cb40 con 0x7f29d0116f70 2026-03-10T14:07:53.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.384+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f29d01b2d80 con 0x7f29d0116f70 2026-03-10T14:07:53.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.384+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f29d01b32a0 con 0x7f29d0116f70 2026-03-10T14:07:53.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.384+0000 7f29c6ffd700 1 -- 192.168.123.103:0/4088538353 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f29c800cca0 con 0x7f29d0116f70 2026-03-10T14:07:53.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.384+0000 7f29c6ffd700 1 -- 192.168.123.103:0/4088538353 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f29c8012720 con 0x7f29d0116f70 2026-03-10T14:07:53.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.386+0000 7f29c6ffd700 1 -- 192.168.123.103:0/4088538353 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f29c8014440 con 0x7f29d0116f70 2026-03-10T14:07:53.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.386+0000 7f29c6ffd700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f29bc06c7a0 0x7f29bc06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.386+0000 7f29c6ffd700 1 -- 192.168.123.103:0/4088538353 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f29c808af30 con 0x7f29d0116f70 2026-03-10T14:07:53.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.386+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f29b4005320 con 0x7f29d0116f70 2026-03-10T14:07:53.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.387+0000 7f29d5e7e700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f29bc06c7a0 0x7f29bc06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:53.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.387+0000 7f29d5e7e700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f29bc06c7a0 0x7f29bc06ec50 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f29cc000c00 tx=0x7f29cc011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:53.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.390+0000 7f29c6ffd700 1 -- 192.168.123.103:0/4088538353 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f29c804e720 con 0x7f29d0116f70 2026-03-10T14:07:53.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.538+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 
-- 0x7f29b4000bf0 con 0x7f29bc06c7a0 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.544+0000 7f29c6ffd700 1 -- 192.168.123.103:0/4088538353 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f29b4000bf0 con 0x7f29bc06c7a0 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "0/23 daemons upgraded", 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm04", 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:07:53.542 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f29bc06c7a0 msgr2=0x7f29bc06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f29bc06c7a0 0x7f29bc06ec50 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f29cc000c00 tx=0x7f29cc011040 comp rx=0 tx=0).stop 2026-03-10T14:07:53.545 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29d0116f70 msgr2=0x7f29d01b27e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29d0116f70 0x7f29d01b27e0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f29c800bf40 tx=0x7f29c800bf70 comp rx=0 tx=0).stop 2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 shutdown_connections 2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29d0072440 0x7f29d0116a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f29bc06c7a0 0x7f29bc06ec50 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 --2- 192.168.123.103:0/4088538353 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29d0116f70 0x7f29d01b27e0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.547+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 >> 192.168.123.103:0/4088538353 conn(0x7f29d006d1a0 msgr2=0x7f29d010b260 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T14:07:53.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.548+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 shutdown_connections 2026-03-10T14:07:53.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.548+0000 7f29d6e80700 1 -- 192.168.123.103:0/4088538353 wait complete. 2026-03-10T14:07:53.624 INFO:tasks.workunit.client.0.vm03.stdout:8/130: getdents da/d24 0 2026-03-10T14:07:53.628 INFO:tasks.workunit.client.0.vm03.stdout:8/131: rename da/f1c to da/d24/f28 0 2026-03-10T14:07:53.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.635+0000 7f81398f6700 1 -- 192.168.123.103:0/2236621680 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134072360 msgr2=0x7f81340770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.635+0000 7f81398f6700 1 --2- 192.168.123.103:0/2236621680 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134072360 0x7f81340770e0 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f812c009230 tx=0x7f812c009260 comp rx=0 tx=0).stop 2026-03-10T14:07:53.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.635+0000 7f81398f6700 1 -- 192.168.123.103:0/2236621680 shutdown_connections 2026-03-10T14:07:53.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.635+0000 7f81398f6700 1 --2- 192.168.123.103:0/2236621680 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134072360 0x7f81340770e0 unknown :-1 s=CLOSED pgs=318 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.635+0000 7f81398f6700 1 --2- 192.168.123.103:0/2236621680 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8134071980 0x7f8134071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.633 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.635+0000 7f81398f6700 1 -- 192.168.123.103:0/2236621680 >> 192.168.123.103:0/2236621680 conn(0x7f813406d1a0 msgr2=0x7f813406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.636+0000 7f81398f6700 1 -- 192.168.123.103:0/2236621680 shutdown_connections 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.636+0000 7f81398f6700 1 -- 192.168.123.103:0/2236621680 wait complete. 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.636+0000 7f81398f6700 1 Processor -- start 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.636+0000 7f81398f6700 1 -- start start 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.636+0000 7f81398f6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8134071980 0x7f8134080400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.636+0000 7f81398f6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134080940 0x7f8134080db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.636+0000 7f81398f6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f813412dd80 con 0x7f8134080940 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.636+0000 7f81398f6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f813412def0 con 0x7f8134071980 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f8132ffd700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8134071980 0x7f8134080400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f81327fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134080940 0x7f8134080db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f81327fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134080940 0x7f8134080db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45532/0 (socket says 192.168.123.103:45532) 2026-03-10T14:07:53.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f81327fc700 1 -- 192.168.123.103:0/1837299510 learned_addr learned my addr 192.168.123.103:0/1837299510 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:07:53.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f8132ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8134071980 0x7f8134080400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:37378/0 (socket says 192.168.123.103:37378) 2026-03-10T14:07:53.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f81327fc700 1 -- 192.168.123.103:0/1837299510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8134071980 msgr2=0x7f8134080400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.635 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f81327fc700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8134071980 0x7f8134080400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f81327fc700 1 -- 192.168.123.103:0/1837299510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f812c008ee0 con 0x7f8134080940 2026-03-10T14:07:53.635 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.637+0000 7f81327fc700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134080940 0x7f8134080db0 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7f812c011fd0 tx=0x7f812c00bbc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:53.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.641+0000 7f81388f4700 1 -- 192.168.123.103:0/1837299510 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f812c010300 con 0x7f8134080940 2026-03-10T14:07:53.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.641+0000 7f81398f6700 1 -- 192.168.123.103:0/1837299510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f813412e110 con 0x7f8134080940 2026-03-10T14:07:53.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.641+0000 7f81398f6700 1 -- 192.168.123.103:0/1837299510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f813412e600 con 0x7f8134080940 2026-03-10T14:07:53.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.641+0000 7f81388f4700 1 -- 192.168.123.103:0/1837299510 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 
==== 1139+0+0 (secure 0 0 0) 0x7f812c007dd0 con 0x7f8134080940 2026-03-10T14:07:53.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.641+0000 7f81388f4700 1 -- 192.168.123.103:0/1837299510 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f812c008750 con 0x7f8134080940 2026-03-10T14:07:53.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.643+0000 7f81398f6700 1 -- 192.168.123.103:0/1837299510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f813404ea50 con 0x7f8134080940 2026-03-10T14:07:53.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.649+0000 7f81388f4700 1 -- 192.168.123.103:0/1837299510 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f812c0078f0 con 0x7f8134080940 2026-03-10T14:07:53.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.649+0000 7f81388f4700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f811c06c7a0 0x7f811c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:07:53.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.649+0000 7f81388f4700 1 -- 192.168.123.103:0/1837299510 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f812c08d870 con 0x7f8134080940 2026-03-10T14:07:53.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.650+0000 7f8132ffd700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f811c06c7a0 0x7f811c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:07:53.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.650+0000 7f8132ffd700 1 --2- 
192.168.123.103:0/1837299510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f811c06c7a0 0x7f811c06ec50 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f812400bed0 tx=0x7f812400d040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:07:53.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.653+0000 7f81388f4700 1 -- 192.168.123.103:0/1837299510 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f812c05bb00 con 0x7f8134080940 2026-03-10T14:07:53.652 INFO:tasks.workunit.client.0.vm03.stdout:0/131: getdents d3/d17 0 2026-03-10T14:07:53.653 INFO:tasks.workunit.client.0.vm03.stdout:8/132: write da/f12 [1169648,69761] 0 2026-03-10T14:07:53.653 INFO:tasks.workunit.client.0.vm03.stdout:8/133: stat da/c11 0 2026-03-10T14:07:53.654 INFO:tasks.workunit.client.0.vm03.stdout:8/134: chown f2 5 1 2026-03-10T14:07:53.663 INFO:tasks.workunit.client.1.vm04.stdout:5/644: write d7/d2d/f64 [4130299,120998] 0 2026-03-10T14:07:53.670 INFO:tasks.workunit.client.0.vm03.stdout:8/135: rename da/c13 to da/c29 0 2026-03-10T14:07:53.674 INFO:tasks.workunit.client.0.vm03.stdout:8/136: creat da/f2a x:0 0 0 2026-03-10T14:07:53.685 INFO:tasks.workunit.client.0.vm03.stdout:0/132: getdents d3/d11 0 2026-03-10T14:07:53.700 INFO:tasks.workunit.client.0.vm03.stdout:5/136: read d4/d16/f1c [1906015,74464] 0 2026-03-10T14:07:53.700 INFO:tasks.workunit.client.0.vm03.stdout:5/137: write d4/d13/d1f/f20 [1423208,16453] 0 2026-03-10T14:07:53.702 INFO:tasks.workunit.client.0.vm03.stdout:0/133: symlink d3/d16/d21/l29 0 2026-03-10T14:07:53.703 INFO:tasks.workunit.client.0.vm03.stdout:0/134: write d3/f1e [370822,126442] 0 2026-03-10T14:07:53.705 INFO:tasks.workunit.client.0.vm03.stdout:0/135: fsync d3/d17/f1a 0 2026-03-10T14:07:53.710 INFO:tasks.workunit.client.0.vm03.stdout:5/138: dwrite d4/d16/f1c [0,4194304] 0 
2026-03-10T14:07:53.711 INFO:tasks.workunit.client.0.vm03.stdout:0/136: dread d3/f19 [0,4194304] 0 2026-03-10T14:07:53.715 INFO:tasks.workunit.client.1.vm04.stdout:7/610: chown d2/dc/d4d/l53 619248 1 2026-03-10T14:07:53.716 INFO:tasks.workunit.client.0.vm03.stdout:0/137: creat d3/d11/d25/f2a x:0 0 0 2026-03-10T14:07:53.716 INFO:tasks.workunit.client.0.vm03.stdout:5/139: creat d4/d6/de/f32 x:0 0 0 2026-03-10T14:07:53.716 INFO:tasks.workunit.client.0.vm03.stdout:0/138: rename d3/d16 to d3/d16/d21/d2b 22 2026-03-10T14:07:53.722 INFO:tasks.workunit.client.1.vm04.stdout:3/598: truncate da/dc/d3f/d54/d66/f80 719461 0 2026-03-10T14:07:53.723 INFO:tasks.workunit.client.1.vm04.stdout:8/636: creat d0/d3/d5/fc6 x:0 0 0 2026-03-10T14:07:53.724 INFO:tasks.workunit.client.0.vm03.stdout:2/133: dwrite d5/ff [0,4194304] 0 2026-03-10T14:07:53.724 INFO:tasks.workunit.client.0.vm03.stdout:2/134: fsync d5/d10/d17/f20 0 2026-03-10T14:07:53.725 INFO:tasks.workunit.client.0.vm03.stdout:2/135: fsync d5/f1e 0 2026-03-10T14:07:53.728 INFO:tasks.workunit.client.1.vm04.stdout:1/573: mkdir d3/d20/d60/d82/d13/d38/d58/dcc 0 2026-03-10T14:07:53.778 INFO:tasks.workunit.client.1.vm04.stdout:1/574: chown d3/d22/d2f/d57/l7a 100405 1 2026-03-10T14:07:53.778 INFO:tasks.workunit.client.0.vm03.stdout:5/140: getdents d4/d13/d1f 0 2026-03-10T14:07:53.779 INFO:tasks.workunit.client.0.vm03.stdout:4/128: truncate d5/d9/db/f10 995538 0 2026-03-10T14:07:53.779 INFO:tasks.workunit.client.0.vm03.stdout:5/141: symlink d4/d6/de/l33 0 2026-03-10T14:07:53.779 INFO:tasks.workunit.client.0.vm03.stdout:3/85: truncate fe 333789 0 2026-03-10T14:07:53.779 INFO:tasks.workunit.client.0.vm03.stdout:2/136: dwrite d5/f1e [0,4194304] 0 2026-03-10T14:07:53.779 INFO:tasks.workunit.client.0.vm03.stdout:6/104: getdents d8/d11/d18 0 2026-03-10T14:07:53.779 INFO:tasks.workunit.client.0.vm03.stdout:5/142: mkdir d4/d13/d1f/d34 0 2026-03-10T14:07:53.779 INFO:tasks.workunit.client.0.vm03.stdout:3/86: getdents . 
0 2026-03-10T14:07:53.781 INFO:tasks.workunit.client.0.vm03.stdout:6/105: mknod d8/d1b/d1c/c1d 0 2026-03-10T14:07:53.787 INFO:tasks.workunit.client.0.vm03.stdout:3/87: rename f15 to f18 0 2026-03-10T14:07:53.792 INFO:tasks.workunit.client.0.vm03.stdout:7/139: read d5/d9/f10 [11594,17190] 0 2026-03-10T14:07:53.795 INFO:tasks.workunit.client.0.vm03.stdout:3/88: fsync f10 0 2026-03-10T14:07:53.799 INFO:tasks.workunit.client.1.vm04.stdout:9/522: dread d9/d1d/f23 [0,4194304] 0 2026-03-10T14:07:53.801 INFO:tasks.workunit.client.1.vm04.stdout:6/476: link d3/ff d3/de/d35/d3a/d43/f90 0 2026-03-10T14:07:53.804 INFO:tasks.workunit.client.1.vm04.stdout:3/599: unlink da/fb 0 2026-03-10T14:07:53.805 INFO:tasks.workunit.client.1.vm04.stdout:0/563: creat d0/d2/d90/fb2 x:0 0 0 2026-03-10T14:07:53.806 INFO:tasks.workunit.client.0.vm03.stdout:9/94: truncate d2/f1e 150512 0 2026-03-10T14:07:53.806 INFO:tasks.workunit.client.1.vm04.stdout:1/575: unlink d3/d22/f48 0 2026-03-10T14:07:53.808 INFO:tasks.workunit.client.1.vm04.stdout:5/645: creat d7/d9/db5/fcf x:0 0 0 2026-03-10T14:07:53.809 INFO:tasks.workunit.client.1.vm04.stdout:5/646: chown d7/d26/l43 7 1 2026-03-10T14:07:53.812 INFO:tasks.workunit.client.1.vm04.stdout:9/523: rename d9/da/dd/f24 to d9/da/dd/d1c/da3/fb3 0 2026-03-10T14:07:53.816 INFO:tasks.workunit.client.1.vm04.stdout:2/597: dread d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/f7b [0,4194304] 0 2026-03-10T14:07:53.816 INFO:tasks.workunit.client.1.vm04.stdout:6/477: dread d3/f9 [0,4194304] 0 2026-03-10T14:07:53.818 INFO:tasks.workunit.client.0.vm03.stdout:8/137: sync 2026-03-10T14:07:53.818 INFO:tasks.workunit.client.0.vm03.stdout:2/137: sync 2026-03-10T14:07:53.818 INFO:tasks.workunit.client.1.vm04.stdout:9/524: dread d9/da/dd/f47 [0,4194304] 0 2026-03-10T14:07:53.820 INFO:tasks.workunit.client.1.vm04.stdout:0/564: symlink d0/da2/lb3 0 2026-03-10T14:07:53.821 INFO:tasks.workunit.client.0.vm03.stdout:8/138: dwrite da/f15 [0,4194304] 0 2026-03-10T14:07:53.828 
INFO:tasks.workunit.client.1.vm04.stdout:3/600: dread da/dc/f2a [0,4194304] 0 2026-03-10T14:07:53.830 INFO:tasks.workunit.client.1.vm04.stdout:7/611: link d2/dc/de/d2d/d5c/da9/ldc d2/dc/de/d2d/d38/d50/dc8/le6 0 2026-03-10T14:07:53.832 INFO:tasks.workunit.client.1.vm04.stdout:9/525: mknod d9/d44/cb4 0 2026-03-10T14:07:53.836 INFO:tasks.workunit.client.1.vm04.stdout:3/601: creat da/d30/fd0 x:0 0 0 2026-03-10T14:07:53.840 INFO:tasks.workunit.client.1.vm04.stdout:2/598: rename d0/d14/l58 to d0/d14/d91/d4a/d66/lba 0 2026-03-10T14:07:53.840 INFO:tasks.workunit.client.1.vm04.stdout:7/612: mkdir d2/dc/de/d11/de7 0 2026-03-10T14:07:53.848 INFO:tasks.workunit.client.1.vm04.stdout:2/599: rename d0/l76 to d0/d14/d91/d4a/d8c/lbb 0 2026-03-10T14:07:53.852 INFO:tasks.workunit.client.0.vm03.stdout:2/138: link d5/d10/f16 d5/d10/d17/f2b 0 2026-03-10T14:07:53.853 INFO:tasks.workunit.client.0.vm03.stdout:2/139: truncate d5/f23 901989 0 2026-03-10T14:07:53.855 INFO:tasks.workunit.client.1.vm04.stdout:0/565: creat d0/d2/d15/fb4 x:0 0 0 2026-03-10T14:07:53.858 INFO:tasks.workunit.client.1.vm04.stdout:7/613: mkdir d2/dc/de/d2d/d60/de8 0 2026-03-10T14:07:53.859 INFO:tasks.workunit.client.0.vm03.stdout:2/140: dread d5/d10/d1f/f24 [0,4194304] 0 2026-03-10T14:07:53.860 INFO:tasks.workunit.client.1.vm04.stdout:9/526: rename d9/d1d to d9/d58/db5 0 2026-03-10T14:07:53.863 INFO:tasks.workunit.client.1.vm04.stdout:2/600: symlink d0/d14/d91/d8/d17/d35/lbc 0 2026-03-10T14:07:53.869 INFO:tasks.workunit.client.0.vm03.stdout:2/141: dwrite d5/d10/d17/f18 [0,4194304] 0 2026-03-10T14:07:53.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.874+0000 7f81398f6700 1 -- 192.168.123.103:0/1837299510 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f813412e8e0 con 0x7f8134080940 2026-03-10T14:07:53.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.875+0000 7f81388f4700 1 -- 192.168.123.103:0/1837299510 
<== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+151 (secure 0 0 0) 0x7f812c05b690 con 0x7f8134080940 2026-03-10T14:07:53.873 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 failed cephadm daemon(s) 2026-03-10T14:07:53.873 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s) 2026-03-10T14:07:53.873 INFO:teuthology.orchestra.run.vm03.stdout: daemon ceph-exporter.vm03 on vm03 is in error state 2026-03-10T14:07:53.874 INFO:tasks.workunit.client.1.vm04.stdout:0/566: dread d0/f99 [4194304,4194304] 0 2026-03-10T14:07:53.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.878+0000 7f811a7fc700 1 -- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f811c06c7a0 msgr2=0x7f811c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.878+0000 7f811a7fc700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f811c06c7a0 0x7f811c06ec50 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f812400bed0 tx=0x7f812400d040 comp rx=0 tx=0).stop 2026-03-10T14:07:53.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.878+0000 7f811a7fc700 1 -- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134080940 msgr2=0x7f8134080db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:07:53.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.878+0000 7f811a7fc700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134080940 0x7f8134080db0 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7f812c011fd0 tx=0x7f812c00bbc0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.879+0000 
7f811a7fc700 1 -- 192.168.123.103:0/1837299510 shutdown_connections 2026-03-10T14:07:53.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.879+0000 7f811a7fc700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f811c06c7a0 0x7f811c06ec50 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.879+0000 7f811a7fc700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8134071980 0x7f8134080400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.879+0000 7f811a7fc700 1 --2- 192.168.123.103:0/1837299510 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8134080940 0x7f8134080db0 unknown :-1 s=CLOSED pgs=319 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:07:53.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.879+0000 7f811a7fc700 1 -- 192.168.123.103:0/1837299510 >> 192.168.123.103:0/1837299510 conn(0x7f813406d1a0 msgr2=0x7f8134075630 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:07:53.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.879+0000 7f811a7fc700 1 -- 192.168.123.103:0/1837299510 shutdown_connections 2026-03-10T14:07:53.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:07:53.879+0000 7f811a7fc700 1 -- 192.168.123.103:0/1837299510 wait complete. 
2026-03-10T14:07:53.879 INFO:tasks.workunit.client.1.vm04.stdout:2/601: creat d0/d14/d91/d3a/d3e/fbd x:0 0 0
2026-03-10T14:07:53.891 INFO:tasks.workunit.client.0.vm03.stdout:2/142: dwrite d5/d10/f13 [0,4194304] 0
2026-03-10T14:07:53.898 INFO:tasks.workunit.client.1.vm04.stdout:0/567: rename d0/d2/d15/d22/d38/d56/d66/l3a to d0/d2/d15/d22/lb5 0
2026-03-10T14:07:53.901 INFO:tasks.workunit.client.1.vm04.stdout:0/568: dread - d0/d2/d90/fb2 zero size
2026-03-10T14:07:53.910 INFO:tasks.workunit.client.1.vm04.stdout:0/569: mknod d0/d2/d15/d22/d38/d56/d66/cb6 0
2026-03-10T14:07:53.912 INFO:tasks.workunit.client.1.vm04.stdout:0/570: fdatasync d0/d2/d90/fb2 0
2026-03-10T14:07:53.924 INFO:tasks.workunit.client.1.vm04.stdout:9/527: dread d9/da/dd/f48 [0,4194304] 0
2026-03-10T14:07:53.925 INFO:tasks.workunit.client.1.vm04.stdout:9/528: readlink d9/da/l12 0
2026-03-10T14:07:53.938 INFO:tasks.workunit.client.0.vm03.stdout:2/143: dread d5/d10/f22 [0,4194304] 0
2026-03-10T14:07:53.938 INFO:tasks.workunit.client.0.vm03.stdout:1/131: truncate d0/f24 546709 0
2026-03-10T14:07:53.939 INFO:tasks.workunit.client.0.vm03.stdout:2/144: truncate d5/d10/d1f/f24 1591099 0
2026-03-10T14:07:53.939 INFO:tasks.workunit.client.0.vm03.stdout:2/145: chown d5/d10/d1c 113 1
2026-03-10T14:07:53.939 INFO:tasks.workunit.client.1.vm04.stdout:9/529: creat d9/d5c/fb6 x:0 0 0
2026-03-10T14:07:53.941 INFO:tasks.workunit.client.0.vm03.stdout:2/146: dread d5/fb [0,4194304] 0
2026-03-10T14:07:53.944 INFO:tasks.workunit.client.1.vm04.stdout:6/478: dread d3/ff [0,4194304] 0
2026-03-10T14:07:53.948 INFO:tasks.workunit.client.1.vm04.stdout:9/530: dwrite d9/d5c/fb6 [0,4194304] 0
2026-03-10T14:07:53.948 INFO:tasks.workunit.client.0.vm03.stdout:2/147: dread d5/fa [0,4194304] 0
2026-03-10T14:07:53.951 INFO:tasks.workunit.client.1.vm04.stdout:6/479: chown d3/de/d35/d3f/d2d/d32/d23/d83/f87 2209992 1
2026-03-10T14:07:53.952 INFO:tasks.workunit.client.0.vm03.stdout:2/148: write d5/ff [1869411,57383] 0
2026-03-10T14:07:53.956 INFO:tasks.workunit.client.1.vm04.stdout:9/531: rename d9/l3c to d9/da/lb7 0
2026-03-10T14:07:53.960 INFO:tasks.workunit.client.1.vm04.stdout:9/532: chown d9/da/f57 1861 1
2026-03-10T14:07:53.961 INFO:tasks.workunit.client.1.vm04.stdout:9/533: truncate d9/d44/d4d/fa9 854606 0
2026-03-10T14:07:53.968 INFO:tasks.workunit.client.0.vm03.stdout:0/139: dwrite d3/f10 [0,4194304] 0
2026-03-10T14:07:53.968 INFO:tasks.workunit.client.1.vm04.stdout:9/534: unlink d9/d58/db5/fa4 0
2026-03-10T14:07:53.969 INFO:tasks.workunit.client.0.vm03.stdout:0/140: write d3/d11/f18 [2515929,116090] 0
2026-03-10T14:07:53.970 INFO:tasks.workunit.client.1.vm04.stdout:9/535: fsync d9/f4a 0
2026-03-10T14:07:53.970 INFO:tasks.workunit.client.0.vm03.stdout:0/141: chown d3/d11/d25/l27 1066806 1
2026-03-10T14:07:53.971 INFO:tasks.workunit.client.0.vm03.stdout:0/142: write d3/f28 [252183,15242] 0
2026-03-10T14:07:53.971 INFO:tasks.workunit.client.0.vm03.stdout:0/143: readlink d3/d11/l1c 0
2026-03-10T14:07:53.974 INFO:tasks.workunit.client.1.vm04.stdout:9/536: mknod d9/d5c/d93/db1/cb8 0
2026-03-10T14:07:53.975 INFO:tasks.workunit.client.0.vm03.stdout:0/144: dwrite d3/f10 [0,4194304] 0
2026-03-10T14:07:53.976 INFO:tasks.workunit.client.1.vm04.stdout:9/537: symlink d9/da/d5d/lb9 0
2026-03-10T14:07:53.979 INFO:tasks.workunit.client.1.vm04.stdout:9/538: creat d9/d58/fba x:0 0 0
2026-03-10T14:07:53.984 INFO:tasks.workunit.client.1.vm04.stdout:4/578: write d4/d14/fa8 [887444,23352] 0
2026-03-10T14:07:53.999 INFO:tasks.workunit.client.0.vm03.stdout:0/145: rename d3/d11/d1b to d3/d11/d2c 0
2026-03-10T14:07:54.009 INFO:tasks.workunit.client.1.vm04.stdout:8/637: dwrite d0/d3/d63/d12/d51/d67/d96/f71 [0,4194304] 0
2026-03-10T14:07:54.009 INFO:tasks.workunit.client.0.vm03.stdout:0/146: dread d3/fe [0,4194304] 0
2026-03-10T14:07:54.009 INFO:tasks.workunit.client.0.vm03.stdout:0/147: write d3/f9 [3066167,49250] 0
2026-03-10T14:07:54.016 INFO:tasks.workunit.client.0.vm03.stdout:0/148: creat d3/d16/f2d x:0 0 0
2026-03-10T14:07:54.018 INFO:tasks.workunit.client.0.vm03.stdout:0/149: dread d3/f9 [0,4194304] 0
2026-03-10T14:07:54.018 INFO:tasks.workunit.client.1.vm04.stdout:1/576: dwrite d3/d22/d63/f65 [0,4194304] 0
2026-03-10T14:07:54.019 INFO:tasks.workunit.client.1.vm04.stdout:5/647: dwrite d7/d12/d2b/f72 [0,4194304] 0
2026-03-10T14:07:54.019 INFO:tasks.workunit.client.1.vm04.stdout:5/648: readlink d7/d9/l68 0
2026-03-10T14:07:54.027 INFO:tasks.workunit.client.1.vm04.stdout:8/638: symlink d0/d3/d73/db8/lc7 0
2026-03-10T14:07:54.031 INFO:tasks.workunit.client.0.vm03.stdout:3/89: mknod c19 0
2026-03-10T14:07:54.034 INFO:tasks.workunit.client.0.vm03.stdout:0/150: unlink d3/d17/f1a 0
2026-03-10T14:07:54.035 INFO:tasks.workunit.client.1.vm04.stdout:3/602: dwrite da/dc/d47/d9b/fbe [0,4194304] 0
2026-03-10T14:07:54.038 INFO:tasks.workunit.client.0.vm03.stdout:0/151: dwrite d3/d11/d25/f22 [0,4194304] 0
2026-03-10T14:07:54.040 INFO:tasks.workunit.client.1.vm04.stdout:8/639: mkdir d0/d3/d63/d12/d51/d67/d96/dc8 0
2026-03-10T14:07:54.041 INFO:tasks.workunit.client.0.vm03.stdout:0/152: dread d3/f9 [0,4194304] 0
2026-03-10T14:07:54.043 INFO:tasks.workunit.client.0.vm03.stdout:0/153: dread d3/f10 [0,4194304] 0
2026-03-10T14:07:54.044 INFO:tasks.workunit.client.1.vm04.stdout:5/649: mknod d7/d26/d6b/cd0 0
2026-03-10T14:07:54.046 INFO:tasks.workunit.client.0.vm03.stdout:9/95: write d2/d14/f1b [898819,72715] 0
2026-03-10T14:07:54.046 INFO:tasks.workunit.client.0.vm03.stdout:9/96: readlink d2/d14/l1d 0
2026-03-10T14:07:54.047 INFO:tasks.workunit.client.1.vm04.stdout:7/614: dwrite d2/dc/de/dae/fb2 [0,4194304] 0
2026-03-10T14:07:54.050 INFO:tasks.workunit.client.1.vm04.stdout:7/615: dwrite d2/dc/de/d2d/d60/fbd [0,4194304] 0
2026-03-10T14:07:54.060 INFO:tasks.workunit.client.0.vm03.stdout:3/90: mknod c1a 0
2026-03-10T14:07:54.062 INFO:tasks.workunit.client.0.vm03.stdout:9/97: fdatasync d2/f3 0
2026-03-10T14:07:54.063 INFO:tasks.workunit.client.0.vm03.stdout:0/154: getdents d3/d11/d2c 0
2026-03-10T14:07:54.064 INFO:tasks.workunit.client.0.vm03.stdout:3/91: link fc f1b 0
2026-03-10T14:07:54.065 INFO:tasks.workunit.client.1.vm04.stdout:5/650: fsync d7/d26/d6b/d6e/f81 0
2026-03-10T14:07:54.065 INFO:tasks.workunit.client.0.vm03.stdout:3/92: readlink l5 0
2026-03-10T14:07:54.068 INFO:tasks.workunit.client.1.vm04.stdout:5/651: write d7/d9/f28 [2082659,9384] 0
2026-03-10T14:07:54.069 INFO:tasks.workunit.client.1.vm04.stdout:3/603: link da/dc/d35/d37/l41 da/dc/d3f/d61/dc1/ld1 0
2026-03-10T14:07:54.070 INFO:tasks.workunit.client.1.vm04.stdout:3/604: write da/dc/d35/d52/d6d/f75 [1165338,57228] 0
2026-03-10T14:07:54.072 INFO:tasks.workunit.client.1.vm04.stdout:8/640: creat d0/d3/d63/fc9 x:0 0 0
2026-03-10T14:07:54.072 INFO:tasks.workunit.client.1.vm04.stdout:5/652: rmdir d7/d12/d2b/d3e/d57/d77/da5 39
2026-03-10T14:07:54.076 INFO:tasks.workunit.client.1.vm04.stdout:3/605: link da/dc/d35/d52/f6f da/dc/d47/d9b/fd2 0
2026-03-10T14:07:54.095 INFO:tasks.workunit.client.1.vm04.stdout:5/653: read d7/d12/f51 [603293,1112] 0
2026-03-10T14:07:54.096 INFO:tasks.workunit.client.1.vm04.stdout:5/654: creat d7/d9/db5/fd1 x:0 0 0
2026-03-10T14:07:54.099 INFO:tasks.workunit.client.1.vm04.stdout:5/655: rename d7/d12/d2b/d3e/d3f/da6/la7 to d7/d12/d2b/d3e/d57/ld2 0
2026-03-10T14:07:54.100 INFO:tasks.workunit.client.1.vm04.stdout:5/656: symlink d7/d2d/d32/ld3 0
2026-03-10T14:07:54.102 INFO:tasks.workunit.client.1.vm04.stdout:5/657: creat d7/d59/d7e/d87/fd4 x:0 0 0
2026-03-10T14:07:54.104 INFO:tasks.workunit.client.1.vm04.stdout:5/658: stat d7/d12/d2b/d93/d9e/lc5 0
2026-03-10T14:07:54.104 INFO:tasks.workunit.client.1.vm04.stdout:5/659: chown d7/db7 25 1
2026-03-10T14:07:54.213 INFO:tasks.workunit.client.1.vm04.stdout:0/571: write d0/d2/d15/d22/d38/d56/f5e [926645,13261] 0
2026-03-10T14:07:54.215 INFO:tasks.workunit.client.1.vm04.stdout:0/572: rename d0/d2/d15/d49/f7c to d0/d2/d15/d49/d50/fb7 0
2026-03-10T14:07:54.224 INFO:tasks.workunit.client.1.vm04.stdout:0/573: rmdir d0/d2/d15/d49/d50/d61/d75 39
2026-03-10T14:07:54.232 INFO:tasks.workunit.client.1.vm04.stdout:0/574: creat d0/d2/d15/d49/d50/d61/fb8 x:0 0 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.1.vm04.stdout:0/575: chown d0/d2/d15/d22/d38/d56/d66/f54 102534 1
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:1/132: symlink d0/d2/l2b 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:1/133: dread d0/d2/fe [0,4194304] 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:0/155: mknod d3/c2e 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:1/134: fsync d0/d2/f1a 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:1/135: write d0/fa [1308939,29425] 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:5/143: mkdir d4/d35 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:5/144: fdatasync d4/d16/f1c 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:0/156: getdents d3/d16/d21 0
2026-03-10T14:07:54.245 INFO:tasks.workunit.client.0.vm03.stdout:0/157: readlink d3/d11/d25/l27 0
2026-03-10T14:07:54.247 INFO:tasks.workunit.client.0.vm03.stdout:5/145: symlink d4/d6/l36 0
2026-03-10T14:07:54.248 INFO:tasks.workunit.client.0.vm03.stdout:0/158: symlink d3/d17/l2f 0
2026-03-10T14:07:54.251 INFO:tasks.workunit.client.0.vm03.stdout:0/159: mkdir d3/d11/d25/d30 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.1.vm04.stdout:1/577: dread d3/d20/d60/d82/fd [0,4194304] 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.1.vm04.stdout:1/578: symlink d3/d22/d63/d35/d6c/lcd 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.1.vm04.stdout:1/579: dread d3/d22/f2b [0,4194304] 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.1.vm04.stdout:1/580: mkdir d3/d8f/db7/dce 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.1.vm04.stdout:1/581: symlink d3/d8f/db7/dce/lcf 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.0.vm03.stdout:0/160: dwrite d3/d16/f2d [0,4194304] 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.0.vm03.stdout:0/161: stat d3/f19 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.0.vm03.stdout:5/146: creat d4/f37 x:0 0 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.0.vm03.stdout:5/147: dread d4/d13/f24 [0,4194304] 0
2026-03-10T14:07:54.268 INFO:tasks.workunit.client.0.vm03.stdout:5/148: write d4/f37 [195486,112427] 0
2026-03-10T14:07:54.271 INFO:tasks.workunit.client.1.vm04.stdout:1/582: dread d3/d22/d63/f65 [0,4194304] 0
2026-03-10T14:07:54.271 INFO:tasks.workunit.client.0.vm03.stdout:5/149: dwrite d4/d6/fb [0,4194304] 0
2026-03-10T14:07:54.275 INFO:tasks.workunit.client.1.vm04.stdout:1/583: dwrite d3/d20/d60/d82/d13/d38/d58/d5b/f7c [0,4194304] 0
2026-03-10T14:07:54.278 INFO:tasks.workunit.client.0.vm03.stdout:0/162: link d3/f19 d3/d16/f31 0
2026-03-10T14:07:54.285 INFO:tasks.workunit.client.0.vm03.stdout:5/150: getdents d4/d35 0
2026-03-10T14:07:54.294 INFO:tasks.workunit.client.0.vm03.stdout:5/151: write d4/f17 [696019,66232] 0
2026-03-10T14:07:54.294 INFO:tasks.workunit.client.0.vm03.stdout:5/152: dwrite d4/f37 [0,4194304] 0
2026-03-10T14:07:54.294 INFO:tasks.workunit.client.0.vm03.stdout:0/163: dread d3/fe [0,4194304] 0
2026-03-10T14:07:54.294 INFO:tasks.workunit.client.1.vm04.stdout:3/606: sync
2026-03-10T14:07:54.294 INFO:tasks.workunit.client.1.vm04.stdout:2/602: sync
2026-03-10T14:07:54.294 INFO:tasks.workunit.client.1.vm04.stdout:7/616: sync
2026-03-10T14:07:54.295 INFO:tasks.workunit.client.0.vm03.stdout:5/153: truncate d4/d13/d1f/f21 800465 0
2026-03-10T14:07:54.298 INFO:tasks.workunit.client.0.vm03.stdout:0/164: creat d3/d11/d25/d30/f32 x:0 0 0
2026-03-10T14:07:54.298 INFO:tasks.workunit.client.1.vm04.stdout:2/603: read - d0/d14/d54/f9c zero size
2026-03-10T14:07:54.298 INFO:tasks.workunit.client.0.vm03.stdout:0/165: fsync d3/d11/d25/f2a 0
2026-03-10T14:07:54.299 INFO:tasks.workunit.client.0.vm03.stdout:0/166: mknod d3/d16/d21/c33 0
2026-03-10T14:07:54.299 INFO:tasks.workunit.client.0.vm03.stdout:0/167: chown d3/d16 1 1
2026-03-10T14:07:54.299 INFO:tasks.workunit.client.1.vm04.stdout:7/617: read d2/dc/d4d/d7f/f96 [163846,125128] 0
2026-03-10T14:07:54.301 INFO:tasks.workunit.client.1.vm04.stdout:2/604: unlink d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/lb4 0
2026-03-10T14:07:54.301 INFO:tasks.workunit.client.1.vm04.stdout:2/605: stat d0/d14/d39 0
2026-03-10T14:07:54.304 INFO:tasks.workunit.client.0.vm03.stdout:0/168: dwrite d3/f10 [0,4194304] 0
2026-03-10T14:07:54.311 INFO:tasks.workunit.client.1.vm04.stdout:2/606: getdents d0/db8 0
2026-03-10T14:07:54.311 INFO:tasks.workunit.client.1.vm04.stdout:7/618: link d2/dc/de/d2d/d60/d7c/d3b/cc1 d2/dac/ce9 0
2026-03-10T14:07:54.312 INFO:tasks.workunit.client.1.vm04.stdout:2/607: fsync d0/d14/d39/d47/d70/f8d 0
2026-03-10T14:07:54.324 INFO:tasks.workunit.client.1.vm04.stdout:3/607: sync
2026-03-10T14:07:54.329 INFO:tasks.workunit.client.1.vm04.stdout:3/608: dwrite da/dc/d35/d52/d6d/f75 [0,4194304] 0
2026-03-10T14:07:54.331 INFO:tasks.workunit.client.1.vm04.stdout:3/609: write da/dc/d35/d52/da3/dac/fc9 [568670,46898] 0
2026-03-10T14:07:54.334 INFO:tasks.workunit.client.1.vm04.stdout:3/610: truncate da/dc/fcc 695412 0
2026-03-10T14:07:54.338 INFO:tasks.workunit.client.1.vm04.stdout:2/608: rename d0/d14/d91/d8/d17/d4e/d85/d86/laf to d0/d14/d91/d3a/d3e/lbe 0
2026-03-10T14:07:54.343 INFO:tasks.workunit.client.0.vm03.stdout:9/98: dread d2/f11 [0,4194304] 0
2026-03-10T14:07:54.346 INFO:tasks.workunit.client.0.vm03.stdout:9/99: write d2/d14/f1b [1904898,115085] 0
2026-03-10T14:07:54.350 INFO:tasks.workunit.client.1.vm04.stdout:3/611: read f4 [6194106,60502] 0
2026-03-10T14:07:54.354 INFO:tasks.workunit.client.1.vm04.stdout:3/612: dread da/dc/d3f/d54/d66/fa7 [0,4194304] 0
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/527567025' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/1384323973' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: from='client.24405 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: Upgrade: Updating mgr.vm04.ywwcto
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:07:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:53 vm03.local ceph-mon[49718]: Deploying daemon mgr.vm04.ywwcto on vm04
2026-03-10T14:07:54.426 INFO:tasks.workunit.client.1.vm04.stdout:6/480: write d3/de/d35/d3f/d2d/d32/f1f [945009,12395] 0
2026-03-10T14:07:54.433 INFO:tasks.workunit.client.1.vm04.stdout:7/619: dread d2/d94/f29 [0,4194304] 0
2026-03-10T14:07:54.434 INFO:tasks.workunit.client.0.vm03.stdout:4/129: rename d5/d9/db/d19/d1e to d5/d9/d2b 0
2026-03-10T14:07:54.436 INFO:tasks.workunit.client.0.vm03.stdout:8/139: rename da/d24 to da/d24/d2b 22
2026-03-10T14:07:54.436 INFO:tasks.workunit.client.0.vm03.stdout:8/140: truncate da/d24/f28 135838 0
2026-03-10T14:07:54.437 INFO:tasks.workunit.client.1.vm04.stdout:9/539: dwrite d9/da/dd/d1c/f3b [0,4194304] 0
2026-03-10T14:07:54.440 INFO:tasks.workunit.client.0.vm03.stdout:2/149: rename d5/fb to d5/d10/d17/f2c 0
2026-03-10T14:07:54.440 INFO:tasks.workunit.client.0.vm03.stdout:2/150: readlink d5/d10/l29 0
2026-03-10T14:07:54.446 INFO:tasks.workunit.client.1.vm04.stdout:4/579: dwrite d4/d14/f27 [4194304,4194304] 0
2026-03-10T14:07:54.446 INFO:tasks.workunit.client.0.vm03.stdout:5/154: rename d4/d6/l36 to d4/d13/l38 0
2026-03-10T14:07:54.446 INFO:tasks.workunit.client.1.vm04.stdout:4/580: readlink d4/d14/dac/lbe 0
2026-03-10T14:07:54.449 INFO:tasks.workunit.client.1.vm04.stdout:7/620: creat d2/dc/de/d2d/d5c/da9/fea x:0 0 0
2026-03-10T14:07:54.449 INFO:tasks.workunit.client.0.vm03.stdout:2/151: dread d5/d10/d1f/f24 [0,4194304] 0
2026-03-10T14:07:54.453 INFO:tasks.workunit.client.1.vm04.stdout:9/540: rmdir d9/d44/d4d/d7d 39
2026-03-10T14:07:54.453 INFO:tasks.workunit.client.0.vm03.stdout:5/155: mknod d4/d16/d19/c39 0
2026-03-10T14:07:54.457 INFO:tasks.workunit.client.0.vm03.stdout:5/156: dwrite d4/d6/fb [4194304,4194304] 0
2026-03-10T14:07:54.469 INFO:tasks.workunit.client.1.vm04.stdout:8/641: write d0/d3/d63/d12/d51/d67/d96/fbf [4295771,130471] 0
2026-03-10T14:07:54.471 INFO:tasks.workunit.client.0.vm03.stdout:2/152: dwrite d5/d10/d17/f19 [0,4194304] 0
2026-03-10T14:07:54.479 INFO:tasks.workunit.client.1.vm04.stdout:8/642: chown d0/d3/d63/d29/l77 130211 1
2026-03-10T14:07:54.484 INFO:tasks.workunit.client.0.vm03.stdout:8/141: link da/lc da/d24/l2c 0
2026-03-10T14:07:54.496 INFO:tasks.workunit.client.1.vm04.stdout:9/541: fdatasync d9/da/d5d/d81/f98 0
2026-03-10T14:07:54.496 INFO:tasks.workunit.client.1.vm04.stdout:5/660: dwrite d7/d26/d6b/d6e/fa3 [4194304,4194304] 0
2026-03-10T14:07:54.496 INFO:tasks.workunit.client.0.vm03.stdout:4/130: mknod d5/d9/db/c2c 0
2026-03-10T14:07:54.496 INFO:tasks.workunit.client.0.vm03.stdout:2/153: creat d5/f2d x:0 0 0
2026-03-10T14:07:54.496 INFO:tasks.workunit.client.1.vm04.stdout:8/643: dread d0/d3/d5/fb9 [0,4194304] 0
2026-03-10T14:07:54.498 INFO:tasks.workunit.client.0.vm03.stdout:8/142: creat da/d24/f2d x:0 0 0
2026-03-10T14:07:54.500 INFO:tasks.workunit.client.0.vm03.stdout:4/131: symlink d5/l2d 0
2026-03-10T14:07:54.503 INFO:tasks.workunit.client.0.vm03.stdout:2/154: creat d5/d10/d1c/f2e x:0 0 0
2026-03-10T14:07:54.505 INFO:tasks.workunit.client.1.vm04.stdout:0/576: write d0/d2/f17 [4788797,18407] 0
2026-03-10T14:07:54.505 INFO:tasks.workunit.client.0.vm03.stdout:8/143: unlink l1 0
2026-03-10T14:07:54.507 INFO:tasks.workunit.client.0.vm03.stdout:4/132: symlink d5/d9/l2e 0
2026-03-10T14:07:54.508 INFO:tasks.workunit.client.1.vm04.stdout:0/577: chown d0/d2/d25/c46 72904885 1
2026-03-10T14:07:54.509 INFO:tasks.workunit.client.0.vm03.stdout:4/133: dread d5/d9/f25 [0,4194304] 0
2026-03-10T14:07:54.510 INFO:tasks.workunit.client.1.vm04.stdout:9/542: creat d9/d58/db5/da5/dab/fbb x:0 0 0
2026-03-10T14:07:54.513 INFO:tasks.workunit.client.0.vm03.stdout:2/155: unlink d5/d10/d17/f2b 0
2026-03-10T14:07:54.514 INFO:tasks.workunit.client.0.vm03.stdout:5/157: dread d4/d16/f2d [0,4194304] 0
2026-03-10T14:07:54.517 INFO:tasks.workunit.client.0.vm03.stdout:2/156: dwrite d5/f2d [0,4194304] 0
2026-03-10T14:07:54.525 INFO:tasks.workunit.client.1.vm04.stdout:0/578: mknod d0/d2/d15/d49/d50/d5c/da4/cb9 0
2026-03-10T14:07:54.530 INFO:tasks.workunit.client.1.vm04.stdout:9/543: fdatasync d9/d5c/f77 0
2026-03-10T14:07:54.535 INFO:tasks.workunit.client.0.vm03.stdout:4/134: unlink d5/d9/fa 0
2026-03-10T14:07:54.538 INFO:tasks.workunit.client.1.vm04.stdout:5/661: dread d7/d2d/d32/f9d [0,4194304] 0
2026-03-10T14:07:54.538 INFO:tasks.workunit.client.1.vm04.stdout:8/644: dread d0/d3/f35 [0,4194304] 0
2026-03-10T14:07:54.540 INFO:tasks.workunit.client.1.vm04.stdout:8/645: mknod d0/d75/d8a/cca 0
2026-03-10T14:07:54.545 INFO:tasks.workunit.client.0.vm03.stdout:2/157: chown d5/d10/d17/c27 972765216 1
2026-03-10T14:07:54.545 INFO:tasks.workunit.client.1.vm04.stdout:5/662: write d7/d12/f42 [8311269,73054] 0
2026-03-10T14:07:54.545 INFO:tasks.workunit.client.1.vm04.stdout:0/579: getdents d0/d2/d15/d49/d50/d61/d75 0
2026-03-10T14:07:54.545 INFO:tasks.workunit.client.1.vm04.stdout:5/663: mknod d7/db7/cd5 0
2026-03-10T14:07:54.549 INFO:tasks.workunit.client.1.vm04.stdout:0/580: dwrite d0/d2/f12 [0,4194304] 0
2026-03-10T14:07:54.550 INFO:tasks.workunit.client.0.vm03.stdout:4/135: mknod d5/d9/db/d19/c2f 0
2026-03-10T14:07:54.552 INFO:tasks.workunit.client.0.vm03.stdout:5/158: mknod d4/d35/c3a 0
2026-03-10T14:07:54.552 INFO:tasks.workunit.client.0.vm03.stdout:5/159: chown d4/d13/d1f/f20 0 1
2026-03-10T14:07:54.553 INFO:tasks.workunit.client.1.vm04.stdout:0/581: readlink d0/da2/lb3 0
2026-03-10T14:07:54.554 INFO:tasks.workunit.client.1.vm04.stdout:5/664: rename d7/d59/d7e/fb6 to d7/d12/d2b/d3e/d57/d8a/fd6 0
2026-03-10T14:07:54.558 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/527567025' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:07:54.558 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/1384323973' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T14:07:54.558 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: from='client.24405 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:07:54.559 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: Upgrade: Updating mgr.vm04.ywwcto
2026-03-10T14:07:54.559 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep'
2026-03-10T14:07:54.559 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:07:54.559 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T14:07:54.559 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:07:54.559 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: Deploying daemon mgr.vm04.ywwcto on vm04
2026-03-10T14:07:54.564 INFO:tasks.workunit.client.0.vm03.stdout:4/136: symlink d5/d9/db/d19/l30 0
2026-03-10T14:07:54.565 INFO:tasks.workunit.client.0.vm03.stdout:4/137: truncate d5/d9/db/f2a 245266 0
2026-03-10T14:07:54.568 INFO:tasks.workunit.client.0.vm03.stdout:2/158: symlink d5/d2a/l2f 0
2026-03-10T14:07:54.578 INFO:tasks.workunit.client.0.vm03.stdout:5/160: mknod d4/d16/d19/d23/c3b 0
2026-03-10T14:07:54.578 INFO:tasks.workunit.client.0.vm03.stdout:5/161: chown d4/d6/fa 62 1
2026-03-10T14:07:54.578 INFO:tasks.workunit.client.0.vm03.stdout:2/159: mknod d5/d2a/c30 0
2026-03-10T14:07:54.578 INFO:tasks.workunit.client.0.vm03.stdout:5/162: mknod d4/d13/c3c 0
2026-03-10T14:07:54.581 INFO:tasks.workunit.client.0.vm03.stdout:5/163: link d4/d13/d1f/c2c d4/d13/d1f/c3d 0
2026-03-10T14:07:54.584 INFO:tasks.workunit.client.0.vm03.stdout:5/164: mknod d4/d6/c3e 0
2026-03-10T14:07:54.585 INFO:tasks.workunit.client.0.vm03.stdout:5/165: fsync d4/d6/de/f32 0
2026-03-10T14:07:54.588 INFO:tasks.workunit.client.0.vm03.stdout:5/166: write d4/d6/f10 [1453238,93149] 0
2026-03-10T14:07:54.597 INFO:tasks.workunit.client.1.vm04.stdout:6/481: dread d3/de/d35/d3a/f51 [0,4194304] 0
2026-03-10T14:07:54.610 INFO:tasks.workunit.client.1.vm04.stdout:9/544: dread d9/d58/f5e [0,4194304] 0
2026-03-10T14:07:54.616 INFO:tasks.workunit.client.0.vm03.stdout:6/106: link d8/l9 d8/l1e 0
2026-03-10T14:07:54.618 INFO:tasks.workunit.client.1.vm04.stdout:8/646: dread d0/d75/d8a/f9e [0,4194304] 0
2026-03-10T14:07:54.623 INFO:tasks.workunit.client.0.vm03.stdout:6/107: creat d8/db/f1f x:0 0 0
2026-03-10T14:07:54.623 INFO:tasks.workunit.client.0.vm03.stdout:6/108: dread - d8/db/f1f zero size
2026-03-10T14:07:54.628 INFO:tasks.workunit.client.1.vm04.stdout:8/647: dwrite d0/d3/d63/d12/d51/f97 [0,4194304] 0
2026-03-10T14:07:54.631 INFO:tasks.workunit.client.1.vm04.stdout:8/648: symlink d0/d3/d5/lcb 0
2026-03-10T14:07:54.633 INFO:tasks.workunit.client.1.vm04.stdout:8/649: fdatasync d0/f42 0
2026-03-10T14:07:54.633 INFO:tasks.workunit.client.1.vm04.stdout:8/650: chown d0/d75/c7b 27827383 1
2026-03-10T14:07:54.634 INFO:tasks.workunit.client.0.vm03.stdout:6/109: dwrite f0 [8388608,4194304] 0
2026-03-10T14:07:54.634 INFO:tasks.workunit.client.1.vm04.stdout:8/651: chown d0/d3/d63/d12/d69/l7a 10 1
2026-03-10T14:07:54.640 INFO:tasks.workunit.client.0.vm03.stdout:6/110: symlink d8/l20 0
2026-03-10T14:07:54.643 INFO:tasks.workunit.client.0.vm03.stdout:6/111: rmdir d8/d11/d18 39
2026-03-10T14:07:54.647 INFO:tasks.workunit.client.0.vm03.stdout:9/100: rename d2/d14/c20 to d2/c23 0
2026-03-10T14:07:54.648 INFO:tasks.workunit.client.0.vm03.stdout:6/112: creat d8/d11/d18/f21 x:0 0 0
2026-03-10T14:07:54.649 INFO:tasks.workunit.client.0.vm03.stdout:6/113: chown d8/db/f1f 109336862 1
2026-03-10T14:07:54.650 INFO:tasks.workunit.client.0.vm03.stdout:6/114: chown d8/d11/c14 121 1
2026-03-10T14:07:54.651 INFO:tasks.workunit.client.0.vm03.stdout:9/101: mknod d2/c24 0
2026-03-10T14:07:54.652 INFO:tasks.workunit.client.1.vm04.stdout:1/584: write d3/d22/d2f/f5d [649832,73048] 0
2026-03-10T14:07:54.653 INFO:tasks.workunit.client.1.vm04.stdout:8/652: creat d0/dc1/fcc x:0 0 0
2026-03-10T14:07:54.653 INFO:tasks.workunit.client.1.vm04.stdout:1/585: chown d3/d5c/fbf 85310 1
2026-03-10T14:07:54.653 INFO:tasks.workunit.client.1.vm04.stdout:8/653: chown d0/d3/d63/d12/d51/d67/f85 0 1
2026-03-10T14:07:54.653 INFO:tasks.workunit.client.1.vm04.stdout:8/654: readlink d0/d3/d63/d12/d51/l49 0
2026-03-10T14:07:54.653 INFO:tasks.workunit.client.1.vm04.stdout:8/655: read d0/d3/d5/f15 [2052897,43935] 0
2026-03-10T14:07:54.655 INFO:tasks.workunit.client.0.vm03.stdout:6/115: dwrite f2 [0,4194304] 0
2026-03-10T14:07:54.659 INFO:tasks.workunit.client.0.vm03.stdout:9/102: unlink d2/ff 0
2026-03-10T14:07:54.660 INFO:tasks.workunit.client.0.vm03.stdout:6/116: dwrite f3 [8388608,4194304] 0
2026-03-10T14:07:54.685 INFO:tasks.workunit.client.1.vm04.stdout:1/586: sync
2026-03-10T14:07:54.704 INFO:tasks.workunit.client.1.vm04.stdout:2/609: write d0/d14/d39/d47/d70/f8d [3383207,93599] 0
2026-03-10T14:07:54.704 INFO:tasks.workunit.client.1.vm04.stdout:1/587: symlink d3/d22/d2f/d57/ld0 0
2026-03-10T14:07:54.718 INFO:tasks.workunit.client.1.vm04.stdout:1/588: truncate d3/d20/d60/d82/f56 180926 0
2026-03-10T14:07:54.722 INFO:tasks.workunit.client.1.vm04.stdout:1/589: fdatasync d3/d22/d63/fb3 0
2026-03-10T14:07:54.724 INFO:tasks.workunit.client.0.vm03.stdout:7/140: symlink d5/l25 0
2026-03-10T14:07:54.726 INFO:tasks.workunit.client.0.vm03.stdout:7/141: mkdir d5/d9/d14/d26 0
2026-03-10T14:07:54.729 INFO:tasks.workunit.client.0.vm03.stdout:7/142: link d5/d9/f10 d5/d9/d14/d26/f27 0
2026-03-10T14:07:54.730 INFO:tasks.workunit.client.0.vm03.stdout:7/143: mkdir d5/d9/d14/d21/d28 0
2026-03-10T14:07:54.734 INFO:tasks.workunit.client.0.vm03.stdout:2/160: dread d5/f9 [0,4194304] 0
2026-03-10T14:07:54.735 INFO:tasks.workunit.client.0.vm03.stdout:7/144: dread d5/d9/d14/ff [0,4194304] 0
2026-03-10T14:07:54.737 INFO:tasks.workunit.client.1.vm04.stdout:1/590: sync
2026-03-10T14:07:54.737 INFO:tasks.workunit.client.0.vm03.stdout:2/161: unlink d5/d10/d17/c25 0
2026-03-10T14:07:54.738 INFO:tasks.workunit.client.0.vm03.stdout:2/162: truncate d5/d10/d17/f28 3495 0
2026-03-10T14:07:54.739 INFO:tasks.workunit.client.0.vm03.stdout:7/145: write d5/d9/d14/ff [1520862,64136] 0
2026-03-10T14:07:54.741 INFO:tasks.workunit.client.0.vm03.stdout:7/146: creat d5/d9/d14/d21/f29 x:0 0 0
2026-03-10T14:07:54.742 INFO:tasks.workunit.client.0.vm03.stdout:2/163: dread d5/f9 [0,4194304] 0
2026-03-10T14:07:54.745 INFO:tasks.workunit.client.0.vm03.stdout:7/147: mknod d5/d9/d14/d21/d28/c2a 0
2026-03-10T14:07:54.745 INFO:tasks.workunit.client.0.vm03.stdout:7/148: write d5/d9/f19 [1663333,81619] 0
2026-03-10T14:07:54.746 INFO:tasks.workunit.client.0.vm03.stdout:7/149: stat d5/d9/d14/ff 0
2026-03-10T14:07:54.752 INFO:tasks.workunit.client.0.vm03.stdout:7/150: dwrite d5/d9/d14/ff [0,4194304] 0
2026-03-10T14:07:54.754 INFO:tasks.workunit.client.0.vm03.stdout:7/151: dread - d5/d9/d14/f23 zero size
2026-03-10T14:07:54.756 INFO:tasks.workunit.client.0.vm03.stdout:7/152: truncate d5/d9/d14/f23 611782 0
2026-03-10T14:07:54.757 INFO:tasks.workunit.client.0.vm03.stdout:7/153: mknod d5/d9/d14/d26/c2b 0
2026-03-10T14:07:54.759 INFO:tasks.workunit.client.0.vm03.stdout:7/154: mknod d5/d9/d14/d1c/c2c 0
2026-03-10T14:07:54.760 INFO:tasks.workunit.client.0.vm03.stdout:7/155: rmdir d5/d9/d14/d26 39
2026-03-10T14:07:54.761 INFO:tasks.workunit.client.0.vm03.stdout:7/156: creat d5/d9/d14/f2d x:0 0 0
2026-03-10T14:07:54.761 INFO:tasks.workunit.client.0.vm03.stdout:7/157: stat d5/f6 0
2026-03-10T14:07:54.762 INFO:tasks.workunit.client.0.vm03.stdout:7/158: chown d5/d9/f17 464119676 1
2026-03-10T14:07:54.763 INFO:tasks.workunit.client.0.vm03.stdout:7/159: chown d5/d9/d14/f2d 3147650 1
2026-03-10T14:07:54.763 INFO:tasks.workunit.client.1.vm04.stdout:1/591: creat d3/d20/d60/d82/d13/da0/dc5/fd1 x:0 0 0
2026-03-10T14:07:54.766 INFO:tasks.workunit.client.0.vm03.stdout:7/160: dwrite d5/d9/f19 [0,4194304] 0
2026-03-10T14:07:54.773 INFO:tasks.workunit.client.0.vm03.stdout:7/161: unlink d5/f7 0
2026-03-10T14:07:54.773 INFO:tasks.workunit.client.0.vm03.stdout:7/162: chown d5/l8 63275 1
2026-03-10T14:07:54.773 INFO:tasks.workunit.client.0.vm03.stdout:7/163: creat d5/d9/f2e x:0 0 0
2026-03-10T14:07:54.773 INFO:tasks.workunit.client.0.vm03.stdout:7/164: creat d5/d9/d14/f2f x:0 0 0
2026-03-10T14:07:54.774 INFO:tasks.workunit.client.0.vm03.stdout:7/165: dread d5/d9/f19 [0,4194304] 0
2026-03-10T14:07:54.780 INFO:tasks.workunit.client.1.vm04.stdout:3/613: write da/dc/d3f/d54/fa9 [765429,112584] 0
2026-03-10T14:07:54.781 INFO:tasks.workunit.client.0.vm03.stdout:7/166: dwrite d5/d9/f17 [8388608,4194304] 0
2026-03-10T14:07:54.789 INFO:tasks.workunit.client.1.vm04.stdout:8/656: dread d0/d3/d63/d12/d51/f64 [0,4194304] 0
2026-03-10T14:07:54.789 INFO:tasks.workunit.client.0.vm03.stdout:7/167: link d5/d9/d14/f2f d5/d9/f30 0
2026-03-10T14:07:54.797 INFO:tasks.workunit.client.1.vm04.stdout:8/657: dwrite d0/d3/d63/d12/f50 [0,4194304] 0
2026-03-10T14:07:54.824 INFO:tasks.workunit.client.0.vm03.stdout:2/164: sync
2026-03-10T14:07:54.839 INFO:tasks.workunit.client.0.vm03.stdout:7/168: sync
2026-03-10T14:07:54.842 INFO:tasks.workunit.client.0.vm03.stdout:7/169: dread d5/d9/f19 [0,4194304] 0
2026-03-10T14:07:54.844 INFO:tasks.workunit.client.0.vm03.stdout:7/170: symlink d5/d9/d14/d21/d28/l31 0
2026-03-10T14:07:54.896 INFO:tasks.workunit.client.0.vm03.stdout:3/93: getdents . 0
2026-03-10T14:07:54.987 INFO:tasks.workunit.client.0.vm03.stdout:1/136: creat d0/f2c x:0 0 0
2026-03-10T14:07:54.990 INFO:tasks.workunit.client.0.vm03.stdout:0/169: rmdir d3/d17 39
2026-03-10T14:07:54.994 INFO:tasks.workunit.client.0.vm03.stdout:5/167: truncate d4/f29 5869999 0
2026-03-10T14:07:55.011 INFO:tasks.workunit.client.0.vm03.stdout:8/144: truncate f4 3933927 0
2026-03-10T14:07:55.022 INFO:tasks.workunit.client.0.vm03.stdout:6/117: dwrite f5 [4194304,4194304] 0
2026-03-10T14:07:55.031 INFO:tasks.workunit.client.0.vm03.stdout:2/165: write d5/fa [2868113,68858] 0
2026-03-10T14:07:55.219 INFO:tasks.workunit.client.1.vm04.stdout:6/482: symlink d3/de/d35/d3a/d43/d4c/d5e/l91 0
2026-03-10T14:07:55.227 INFO:tasks.workunit.client.1.vm04.stdout:6/483: creat d3/de/f92 x:0 0 0
2026-03-10T14:07:55.227 INFO:tasks.workunit.client.1.vm04.stdout:6/484: dread - d3/de/d35/f80 zero size
2026-03-10T14:07:55.228 INFO:tasks.workunit.client.1.vm04.stdout:6/485: chown d3/de/d35/d3f/d2d/d32/d5c 1464750 1
2026-03-10T14:07:55.231 INFO:tasks.workunit.client.1.vm04.stdout:6/486: write d3/de/d35/d3a/d43/d4c/f4d [3137754,112212] 0
2026-03-10T14:07:55.232 INFO:tasks.workunit.client.1.vm04.stdout:6/487: chown d3/de/d35/d3f/d2d/d32/d23/d47/f62 563963 1
2026-03-10T14:07:55.233 INFO:tasks.workunit.client.1.vm04.stdout:6/488: dread - d3/de/d35/d3f/d2d/d32/d23/d24/f36 zero size
2026-03-10T14:07:55.272 INFO:tasks.workunit.client.1.vm04.stdout:8/658: creat d0/d3/d63/d12/d51/d67/fcd x:0 0 0
2026-03-10T14:07:55.274 INFO:tasks.workunit.client.1.vm04.stdout:8/659: creat d0/d3/d63/d29/fce x:0 0 0
2026-03-10T14:07:55.282 INFO:tasks.workunit.client.1.vm04.stdout:5/665: creat d7/d12/d2b/d3e/fd7 x:0 0 0
2026-03-10T14:07:55.284 INFO:tasks.workunit.client.1.vm04.stdout:8/660: mkdir d0/d3/d63/d12/d51/d67/d96/dc8/dcf 0
2026-03-10T14:07:55.284 INFO:tasks.workunit.client.1.vm04.stdout:4/581: write d4/f5f [3503614,89496] 0
2026-03-10T14:07:55.287 INFO:tasks.workunit.client.1.vm04.stdout:8/661: fsync d0/d75/f83 0
2026-03-10T14:07:55.289 INFO:tasks.workunit.client.1.vm04.stdout:4/582: symlink d4/d14/d3c/d62/lcd 0
2026-03-10T14:07:55.290 INFO:tasks.workunit.client.1.vm04.stdout:8/662: dread - d0/d3/d63/d29/f9b zero size
2026-03-10T14:07:55.292 INFO:tasks.workunit.client.1.vm04.stdout:8/663: creat d0/d3/dd/d89/fd0 x:0 0 0
2026-03-10T14:07:55.292 INFO:tasks.workunit.client.1.vm04.stdout:4/583: symlink d4/d14/d1b/lce 0
2026-03-10T14:07:55.296 INFO:tasks.workunit.client.1.vm04.stdout:8/664: dwrite d0/d3/d5/f15 [4194304,4194304] 0
2026-03-10T14:07:55.299 INFO:tasks.workunit.client.1.vm04.stdout:4/584: truncate d4/df/f2e 800188 0
2026-03-10T14:07:55.334 INFO:tasks.workunit.client.1.vm04.stdout:4/585: readlink d4/d14/dac/lbe 0
2026-03-10T14:07:55.334 INFO:tasks.workunit.client.1.vm04.stdout:8/665: fsync d0/d3/d63/d29/f9b 0
2026-03-10T14:07:55.334 INFO:tasks.workunit.client.1.vm04.stdout:8/666: unlink d0/d3/d63/d12/c84 0
2026-03-10T14:07:55.334 INFO:tasks.workunit.client.1.vm04.stdout:8/667: chown d0/d3/d63/d12/d51/d67/d96/fbf 10 1
2026-03-10T14:07:55.334 INFO:tasks.workunit.client.1.vm04.stdout:8/668: chown d0/d75/d8a 2 1
2026-03-10T14:07:55.334 INFO:tasks.workunit.client.1.vm04.stdout:4/586: symlink d4/df/lcf 0
2026-03-10T14:07:55.334 INFO:tasks.workunit.client.1.vm04.stdout:4/587: mkdir d4/df/db2/db6/dc9/dd0 0
2026-03-10T14:07:55.334 INFO:tasks.workunit.client.1.vm04.stdout:4/588: fsync d4/d14/d64/fca 0
2026-03-10T14:07:55.334 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/1837299510' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T14:07:55.334 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:54 vm04.local ceph-mon[55966]: pgmap v156: 65 pgs: 65 active+clean; 1.7 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 27 MiB/s rd, 107 MiB/s wr, 290 op/s
2026-03-10T14:07:55.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:54 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/1837299510' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T14:07:55.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:54 vm03.local ceph-mon[49718]: pgmap v156: 65 pgs: 65 active+clean; 1.7 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 27 MiB/s rd, 107 MiB/s wr, 290 op/s
2026-03-10T14:07:55.369 INFO:tasks.workunit.client.1.vm04.stdout:7/621: dwrite d2/dc/f26 [0,4194304] 0
2026-03-10T14:07:55.383 INFO:tasks.workunit.client.1.vm04.stdout:7/622: dread d2/d94/f3d [0,4194304] 0
2026-03-10T14:07:55.384 INFO:tasks.workunit.client.1.vm04.stdout:7/623: mknod d2/dc/de/d2d/d38/d50/ceb 0
2026-03-10T14:07:55.387 INFO:tasks.workunit.client.1.vm04.stdout:7/624: mknod d2/dc/de/cec 0
2026-03-10T14:07:55.390 INFO:tasks.workunit.client.1.vm04.stdout:7/625: truncate d2/d94/f7e 2307648 0
2026-03-10T14:07:55.395 INFO:tasks.workunit.client.1.vm04.stdout:7/626: creat d2/dc/de/fed x:0 0 0
2026-03-10T14:07:55.395 INFO:tasks.workunit.client.1.vm04.stdout:7/627: chown d2/dc/de/d2d/d38/d50/dc8/f7b 67 1
2026-03-10T14:07:55.403 INFO:tasks.workunit.client.1.vm04.stdout:7/628: dread d2/dc/de/d2d/d60/d7c/d44/f51 [0,4194304] 0
2026-03-10T14:07:55.404 INFO:tasks.workunit.client.1.vm04.stdout:7/629: read - d2/dc/de/d2d/d38/f8a zero size
2026-03-10T14:07:55.409 INFO:tasks.workunit.client.1.vm04.stdout:7/630: rmdir d2/dc/de/dae 39
2026-03-10T14:07:55.739 INFO:tasks.workunit.client.1.vm04.stdout:3/614: write da/dc/d35/d52/d6d/fab [60997,115626] 0
2026-03-10T14:07:55.740 INFO:tasks.workunit.client.1.vm04.stdout:3/615: symlink da/dc/d3f/d61/dc1/ld3 0
2026-03-10T14:07:55.750 INFO:tasks.workunit.client.1.vm04.stdout:3/616: dread da/dc/f90 [0,4194304] 0
2026-03-10T14:07:55.752 INFO:tasks.workunit.client.1.vm04.stdout:3/617: truncate da/dc/d35/d52/d70/f8f 935321 0
2026-03-10T14:07:55.753 INFO:tasks.workunit.client.1.vm04.stdout:3/618: symlink da/dc/d35/d52/d53/d78/ld4 0
2026-03-10T14:07:55.756 INFO:tasks.workunit.client.1.vm04.stdout:3/619: dwrite da/dc/d47/d9b/fbe [0,4194304] 0
2026-03-10T14:07:55.766 INFO:tasks.workunit.client.1.vm04.stdout:3/620: dwrite da/dc/d35/f5b [0,4194304] 0
2026-03-10T14:07:55.772 INFO:tasks.workunit.client.1.vm04.stdout:3/621: write da/dc/d47/fbd [259838,58264] 0
2026-03-10T14:07:55.772 INFO:tasks.workunit.client.1.vm04.stdout:3/622: chown da/dc/d35/d52/da3/dac 10 1
2026-03-10T14:07:56.944 INFO:tasks.workunit.client.1.vm04.stdout:8/669: creat d0/d75/fd1 x:0 0 0
2026-03-10T14:07:56.999 INFO:tasks.workunit.client.1.vm04.stdout:6/489: mknod d3/c93 0
2026-03-10T14:07:56.999 INFO:tasks.workunit.client.1.vm04.stdout:1/592: unlink d3/d20/fa3 0
2026-03-10T14:07:57.020 INFO:tasks.workunit.client.1.vm04.stdout:8/670: dwrite d0/d3/d63/d12/d51/f4f [0,4194304] 0
2026-03-10T14:07:57.022 INFO:tasks.workunit.client.1.vm04.stdout:5/666: dwrite d7/d12/d2b/d3e/f4a [0,4194304] 0
2026-03-10T14:07:57.039 INFO:tasks.workunit.client.1.vm04.stdout:5/667: dread d7/d12/d2b/f53 [0,4194304] 0
2026-03-10T14:07:57.078 INFO:tasks.workunit.client.1.vm04.stdout:7/631: rmdir d2/d94 39
2026-03-10T14:07:57.083 INFO:tasks.workunit.client.1.vm04.stdout:4/589: dwrite d4/d14/d1b/f9d [0,4194304] 0
2026-03-10T14:07:57.090 INFO:tasks.workunit.client.1.vm04.stdout:0/582: rename d0/d2/d15/d49/d50/f53 to d0/d2/d15/d22/d38/d56/d66/fba 0
2026-03-10T14:07:57.107 INFO:tasks.workunit.client.1.vm04.stdout:3/623: dread da/dc/d35/d52/d70/f8f [0,4194304] 0
2026-03-10T14:07:57.153 INFO:tasks.workunit.client.1.vm04.stdout:8/671: creat d0/d3/dd/d78/fd2 x:0 0 0
2026-03-10T14:07:57.157 INFO:tasks.workunit.client.1.vm04.stdout:5/668: creat d7/d59/d7d/d9a/fd8 x:0 0 0
2026-03-10T14:07:57.171 INFO:tasks.workunit.client.1.vm04.stdout:0/583: creat d0/d2/d25/fbb x:0 0 0
2026-03-10T14:07:57.175 INFO:tasks.workunit.client.1.vm04.stdout:6/490: link d3/de/d35/d3f/d2d/f89 d3/de/d35/d3a/d43/d4c/d5e/d76/f94 0
2026-03-10T14:07:57.176 INFO:tasks.workunit.client.1.vm04.stdout:6/491: write d3/de/d35/d3a/d43/f8a [834592,38035] 0
2026-03-10T14:07:57.184 INFO:tasks.workunit.client.1.vm04.stdout:1/593: write d3/d20/d60/d82/d13/d38/d58/faf [1762918,124174] 0
2026-03-10T14:07:57.188 INFO:tasks.workunit.client.1.vm04.stdout:7/632: getdents d2/dc/de/d2d/d60/d7c/d64/dbf 0
2026-03-10T14:07:57.214 INFO:tasks.workunit.client.1.vm04.stdout:3/624: symlink da/dc/d35/dcd/ld5 0
2026-03-10T14:07:57.215 INFO:tasks.workunit.client.1.vm04.stdout:6/492: symlink d3/de/d35/d3f/d2d/d32/d23/d83/l95 0
2026-03-10T14:07:57.217 INFO:tasks.workunit.client.1.vm04.stdout:5/669: write d7/fa [3549680,19659] 0
2026-03-10T14:07:57.217 INFO:tasks.workunit.client.1.vm04.stdout:0/584: creat d0/d2/d15/d49/fbc x:0 0 0
2026-03-10T14:07:57.217 INFO:tasks.workunit.client.1.vm04.stdout:8/672: write d0/d3/d5/f30 [4214600,124198] 0
2026-03-10T14:07:57.217 INFO:tasks.workunit.client.1.vm04.stdout:5/670: chown d7/d12/d2b/d3e/d3f/da6 76310 1
2026-03-10T14:07:57.219 INFO:tasks.workunit.client.1.vm04.stdout:8/673: chown d0/d3/d63/d29/la8 12 1
2026-03-10T14:07:57.227 INFO:tasks.workunit.client.1.vm04.stdout:3/625: read da/dc/d35/d52/da3/dac/fc9 [315195,62723] 0
2026-03-10T14:07:57.230 INFO:tasks.workunit.client.1.vm04.stdout:9/545: rename d9/d44/d59/c87 to d9/cbc 0
2026-03-10T14:07:57.231 INFO:tasks.workunit.client.1.vm04.stdout:9/546: stat d9/da/dd/c38 0
2026-03-10T14:07:57.232 INFO:tasks.workunit.client.1.vm04.stdout:9/547: chown d9/d58/db5/da5/dab/fbb 847 1
2026-03-10T14:07:57.239 INFO:tasks.workunit.client.1.vm04.stdout:0/585: mknod d0/d2/d90/cbd 0
2026-03-10T14:07:57.239
INFO:tasks.workunit.client.1.vm04.stdout:5/671: creat d7/d12/d2b/d93/fd9 x:0 0 0 2026-03-10T14:07:57.241 INFO:tasks.workunit.client.1.vm04.stdout:7/633: dread d2/dc/f4a [0,4194304] 0 2026-03-10T14:07:57.245 INFO:tasks.workunit.client.1.vm04.stdout:3/626: fsync da/f22 0 2026-03-10T14:07:57.245 INFO:tasks.workunit.client.1.vm04.stdout:1/594: dread d3/f8 [0,4194304] 0 2026-03-10T14:07:57.252 INFO:tasks.workunit.client.1.vm04.stdout:2/610: rename d0/d14/d39/d47/f5d to d0/d14/d91/d3a/fbf 0 2026-03-10T14:07:57.256 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:56 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:07:57.256 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:56 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:07:57.256 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:56 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:07:57.256 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:56 vm04.local ceph-mon[55966]: pgmap v157: 65 pgs: 65 active+clean; 1.7 GiB data, 6.3 GiB used, 114 GiB / 120 GiB avail; 22 MiB/s rd, 96 MiB/s wr, 217 op/s 2026-03-10T14:07:57.261 INFO:tasks.workunit.client.1.vm04.stdout:9/548: write d9/d44/d4d/f66 [561169,50853] 0 2026-03-10T14:07:57.265 INFO:tasks.workunit.client.1.vm04.stdout:7/634: chown d2/dc/de/dae/fd2 428388 1 2026-03-10T14:07:57.269 INFO:tasks.workunit.client.1.vm04.stdout:6/493: write d3/de/d35/d3f/d2d/d32/d23/d47/f6e [818003,22657] 0 2026-03-10T14:07:57.279 INFO:tasks.workunit.client.1.vm04.stdout:3/627: write da/dc/d35/d52/d6d/d8a/fa0 [605363,122024] 0 2026-03-10T14:07:57.294 INFO:tasks.workunit.client.1.vm04.stdout:5/672: mknod d7/d12/d2b/d3e/dae/cda 0 2026-03-10T14:07:57.294 INFO:tasks.workunit.client.1.vm04.stdout:0/586: mkdir d0/d2/dbe 0 2026-03-10T14:07:57.295 
INFO:tasks.workunit.client.1.vm04.stdout:8/674: creat d0/d3/d63/fd3 x:0 0 0 2026-03-10T14:07:57.309 INFO:tasks.workunit.client.1.vm04.stdout:2/611: dwrite d0/d14/d91/d4a/d8c/fac [0,4194304] 0 2026-03-10T14:07:57.310 INFO:tasks.workunit.client.1.vm04.stdout:6/494: chown d3/d1d/d73/fc 212 1 2026-03-10T14:07:57.312 INFO:tasks.workunit.client.1.vm04.stdout:2/612: readlink d0/d14/d91/d8/l62 0 2026-03-10T14:07:57.314 INFO:tasks.workunit.client.1.vm04.stdout:7/635: dwrite d2/dc/de/d2d/d38/d50/fbc [0,4194304] 0 2026-03-10T14:07:57.350 INFO:tasks.workunit.client.1.vm04.stdout:0/587: creat d0/d2/d15/d22/d38/d56/fbf x:0 0 0 2026-03-10T14:07:57.351 INFO:tasks.workunit.client.1.vm04.stdout:0/588: fdatasync d0/d2/d25/fbb 0 2026-03-10T14:07:57.351 INFO:tasks.workunit.client.1.vm04.stdout:8/675: dread d0/d3/d73/f91 [0,4194304] 0 2026-03-10T14:07:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:56 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:07:57.379 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:56 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:07:57.379 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:56 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:07:57.379 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:56 vm03.local ceph-mon[49718]: pgmap v157: 65 pgs: 65 active+clean; 1.7 GiB data, 6.3 GiB used, 114 GiB / 120 GiB avail; 22 MiB/s rd, 96 MiB/s wr, 217 op/s 2026-03-10T14:07:57.379 INFO:tasks.workunit.client.1.vm04.stdout:1/595: creat d3/d20/fd2 x:0 0 0 2026-03-10T14:07:57.379 INFO:tasks.workunit.client.1.vm04.stdout:1/596: chown d3/d22/d63 15615 1 2026-03-10T14:07:57.379 INFO:tasks.workunit.client.1.vm04.stdout:4/590: rename d4/df/d31/c9f to d4/df/db2/db4/d47/cd1 0 2026-03-10T14:07:57.382 
INFO:tasks.workunit.client.1.vm04.stdout:7/636: mkdir d2/dc/de/d2d/d5c/da9/dee 0 2026-03-10T14:07:57.392 INFO:tasks.workunit.client.1.vm04.stdout:0/589: fsync d0/d2/d15/d49/d50/d61/d75/f9a 0 2026-03-10T14:07:57.392 INFO:tasks.workunit.client.1.vm04.stdout:1/597: creat d3/d20/d60/d82/d13/d38/fd3 x:0 0 0 2026-03-10T14:07:57.393 INFO:tasks.workunit.client.1.vm04.stdout:1/598: chown d3/d22/d2f/d57/ld0 82243526 1 2026-03-10T14:07:57.400 INFO:tasks.workunit.client.1.vm04.stdout:2/613: dread d0/d14/d91/d8/d17/f73 [0,4194304] 0 2026-03-10T14:07:57.403 INFO:tasks.workunit.client.1.vm04.stdout:8/676: truncate d0/d3/d63/f3c 750717 0 2026-03-10T14:07:57.407 INFO:tasks.workunit.client.1.vm04.stdout:8/677: dwrite d0/d3/d63/d12/d51/d67/fb2 [4194304,4194304] 0 2026-03-10T14:07:57.411 INFO:tasks.workunit.client.1.vm04.stdout:0/590: creat d0/d2/d25/fc0 x:0 0 0 2026-03-10T14:07:57.411 INFO:tasks.workunit.client.1.vm04.stdout:1/599: creat d3/d5c/d79/d98/fd4 x:0 0 0 2026-03-10T14:07:57.412 INFO:tasks.workunit.client.1.vm04.stdout:1/600: chown d3/d20/d60/d82/d13/d38/cc0 1 1 2026-03-10T14:07:57.413 INFO:tasks.workunit.client.1.vm04.stdout:1/601: read - d3/d20/d60/d82/d13/da0/dc5/fd1 zero size 2026-03-10T14:07:57.414 INFO:tasks.workunit.client.1.vm04.stdout:1/602: chown d3/d20/d60/d82/d13/d38/d58/c81 11934779 1 2026-03-10T14:07:57.417 INFO:tasks.workunit.client.1.vm04.stdout:2/614: creat d0/d14/d91/d8/dd/fc0 x:0 0 0 2026-03-10T14:07:57.426 INFO:tasks.workunit.client.1.vm04.stdout:8/678: rename d0/d3/d5/c54 to d0/d3/dd/d89/db5/cd4 0 2026-03-10T14:07:57.449 INFO:tasks.workunit.client.1.vm04.stdout:8/679: mkdir d0/d3/d73/db8/dd5 0 2026-03-10T14:07:57.449 INFO:tasks.workunit.client.1.vm04.stdout:1/603: mkdir d3/d22/d63/d35/dd5 0 2026-03-10T14:07:57.449 INFO:tasks.workunit.client.1.vm04.stdout:1/604: readlink d3/d22/d2f/d57/l5f 0 2026-03-10T14:07:57.449 INFO:tasks.workunit.client.1.vm04.stdout:2/615: symlink d0/d14/d91/d4a/d8c/dab/d46/lc1 0 2026-03-10T14:07:57.449 
INFO:tasks.workunit.client.1.vm04.stdout:8/680: mknod d0/d3/d63/d12/d51/d8b/cd6 0 2026-03-10T14:07:57.449 INFO:tasks.workunit.client.1.vm04.stdout:8/681: truncate d0/dc1/fcc 559789 0 2026-03-10T14:07:57.449 INFO:tasks.workunit.client.1.vm04.stdout:1/605: creat d3/d22/d63/d35/dd5/fd6 x:0 0 0 2026-03-10T14:07:57.449 INFO:tasks.workunit.client.1.vm04.stdout:1/606: mknod d3/d20/d60/d82/d13/da0/cd7 0 2026-03-10T14:07:57.449 INFO:tasks.workunit.client.1.vm04.stdout:1/607: symlink d3/d20/d60/d82/d13/d38/d58/d5b/ld8 0 2026-03-10T14:07:57.461 INFO:tasks.workunit.client.1.vm04.stdout:8/682: dread d0/d3/dd/d76/f9c [0,4194304] 0 2026-03-10T14:07:57.462 INFO:tasks.workunit.client.1.vm04.stdout:8/683: chown d0/d3/d63/d12/d51/f8e 9589315 1 2026-03-10T14:07:57.466 INFO:tasks.workunit.client.1.vm04.stdout:9/549: write d9/da/d5d/d81/faa [858180,32272] 0 2026-03-10T14:07:57.468 INFO:tasks.workunit.client.1.vm04.stdout:8/684: dread d0/d3/dd/fc [0,4194304] 0 2026-03-10T14:07:57.469 INFO:tasks.workunit.client.1.vm04.stdout:5/673: dwrite d7/d12/f51 [0,4194304] 0 2026-03-10T14:07:57.471 INFO:tasks.workunit.client.1.vm04.stdout:9/550: truncate d9/d5c/f6e 804773 0 2026-03-10T14:07:57.480 INFO:tasks.workunit.client.1.vm04.stdout:9/551: chown d9/da/l37 131492271 1 2026-03-10T14:07:57.480 INFO:tasks.workunit.client.1.vm04.stdout:9/552: truncate d9/da/dd/f47 1187072 0 2026-03-10T14:07:57.481 INFO:tasks.workunit.client.1.vm04.stdout:6/495: write d3/f19 [6092161,53146] 0 2026-03-10T14:07:57.481 INFO:tasks.workunit.client.1.vm04.stdout:5/674: symlink d7/d2d/d69/db8/ldb 0 2026-03-10T14:07:57.483 INFO:tasks.workunit.client.1.vm04.stdout:9/553: dwrite d9/da/dd/d74/fa0 [0,4194304] 0 2026-03-10T14:07:57.491 INFO:tasks.workunit.client.1.vm04.stdout:6/496: creat d3/de/d35/d3f/f96 x:0 0 0 2026-03-10T14:07:57.493 INFO:tasks.workunit.client.1.vm04.stdout:8/685: dread d0/d3/d63/f8f [0,4194304] 0 2026-03-10T14:07:57.494 INFO:tasks.workunit.client.1.vm04.stdout:6/497: creat d3/de/d35/d3f/d2d/d32/f97 x:0 0 0 
2026-03-10T14:07:57.496 INFO:tasks.workunit.client.1.vm04.stdout:6/498: dread - d3/de/d35/d3f/d2d/d32/f97 zero size 2026-03-10T14:07:57.498 INFO:tasks.workunit.client.1.vm04.stdout:6/499: creat d3/de/d35/d3f/d2d/f98 x:0 0 0 2026-03-10T14:07:57.499 INFO:tasks.workunit.client.1.vm04.stdout:6/500: fdatasync d3/de/d35/d3a/d43/d4c/d5e/d76/f77 0 2026-03-10T14:07:57.543 INFO:tasks.workunit.client.1.vm04.stdout:6/501: readlink d3/de/d35/d3f/l1e 0 2026-03-10T14:07:57.543 INFO:tasks.workunit.client.1.vm04.stdout:6/502: chown d3/de/d35/d3f/d2d/d32/d23/d47 13 1 2026-03-10T14:07:57.543 INFO:tasks.workunit.client.1.vm04.stdout:6/503: write d3/de/d35/d3a/d43/f8a [393507,100109] 0 2026-03-10T14:07:57.552 INFO:tasks.workunit.client.1.vm04.stdout:9/554: dread d9/d44/d4d/f66 [0,4194304] 0 2026-03-10T14:07:57.563 INFO:tasks.workunit.client.1.vm04.stdout:9/555: creat d9/da/dd/d74/fbd x:0 0 0 2026-03-10T14:07:57.563 INFO:tasks.workunit.client.1.vm04.stdout:9/556: link d9/da/dd/f8a d9/d33/fbe 0 2026-03-10T14:07:57.563 INFO:tasks.workunit.client.1.vm04.stdout:9/557: stat d9/l78 0 2026-03-10T14:07:57.563 INFO:tasks.workunit.client.1.vm04.stdout:9/558: creat d9/da/d8c/fbf x:0 0 0 2026-03-10T14:07:57.567 INFO:tasks.workunit.client.1.vm04.stdout:7/637: write d2/dc/de/d2d/d60/d7c/f8f [950734,44286] 0 2026-03-10T14:07:57.578 INFO:tasks.workunit.client.1.vm04.stdout:0/591: dwrite d0/d2/d15/d49/d50/d61/f96 [0,4194304] 0 2026-03-10T14:07:57.583 INFO:tasks.workunit.client.1.vm04.stdout:7/638: dwrite d2/d2a/f35 [0,4194304] 0 2026-03-10T14:07:57.586 INFO:tasks.workunit.client.1.vm04.stdout:3/628: sync 2026-03-10T14:07:57.586 INFO:tasks.workunit.client.1.vm04.stdout:4/591: sync 2026-03-10T14:07:57.589 INFO:tasks.workunit.client.1.vm04.stdout:0/592: mkdir d0/d2/d15/d22/d38/d56/dc1 0 2026-03-10T14:07:57.594 INFO:tasks.workunit.client.1.vm04.stdout:2/616: dwrite d0/d14/d91/d8/f42 [0,4194304] 0 2026-03-10T14:07:57.596 INFO:tasks.workunit.client.1.vm04.stdout:6/504: dread d3/de/d35/d3f/d2d/d32/d5c/f75 
[0,4194304] 0 2026-03-10T14:07:57.597 INFO:tasks.workunit.client.1.vm04.stdout:6/505: stat d3/de/d35/d3f/f22 0 2026-03-10T14:07:57.597 INFO:tasks.workunit.client.1.vm04.stdout:3/629: symlink da/dc/d3f/d54/d66/ld6 0 2026-03-10T14:07:57.599 INFO:tasks.workunit.client.1.vm04.stdout:2/617: chown d0/d14/d39/f44 3968 1 2026-03-10T14:07:57.604 INFO:tasks.workunit.client.1.vm04.stdout:6/506: readlink d3/de/d35/d3f/d2d/d32/l88 0 2026-03-10T14:07:57.609 INFO:tasks.workunit.client.1.vm04.stdout:7/639: truncate d2/dc/de/d2d/d60/d7c/d36/f87 1012697 0 2026-03-10T14:07:57.610 INFO:tasks.workunit.client.1.vm04.stdout:4/592: truncate d4/df/d31/f5d 1159430 0 2026-03-10T14:07:57.614 INFO:tasks.workunit.client.1.vm04.stdout:2/618: mkdir d0/d14/d91/d8/d17/d4e/d85/d86/d96/dc2 0 2026-03-10T14:07:57.615 INFO:tasks.workunit.client.1.vm04.stdout:6/507: dread d3/de/d35/d3f/d2d/d38/f50 [0,4194304] 0 2026-03-10T14:07:57.628 INFO:tasks.workunit.client.1.vm04.stdout:5/675: write d7/d9/f20 [1879118,100785] 0 2026-03-10T14:07:57.628 INFO:tasks.workunit.client.1.vm04.stdout:1/608: rename d3/d20/d60/d82 to d3/d22/d63/d35/dd9 0 2026-03-10T14:07:57.630 INFO:tasks.workunit.client.1.vm04.stdout:1/609: readlink d3/d22/d6d/l70 0 2026-03-10T14:07:57.635 INFO:tasks.workunit.client.1.vm04.stdout:4/593: truncate d4/d14/d1b/f6c 704513 0 2026-03-10T14:07:57.642 INFO:tasks.workunit.client.1.vm04.stdout:8/686: rename d0/d3/dd/fc to d0/d3/d5/fd7 0 2026-03-10T14:07:57.647 INFO:tasks.workunit.client.1.vm04.stdout:5/676: dread d7/f4b [0,4194304] 0 2026-03-10T14:07:57.654 INFO:tasks.workunit.client.1.vm04.stdout:1/610: fsync d3/d22/d6d/fbd 0 2026-03-10T14:07:57.655 INFO:tasks.workunit.client.1.vm04.stdout:0/593: dread d0/d2/d25/f7f [0,4194304] 0 2026-03-10T14:07:57.655 INFO:tasks.workunit.client.1.vm04.stdout:0/594: chown d0/d2/d15/d22/d38/d56/l97 4 1 2026-03-10T14:07:57.695 INFO:tasks.workunit.client.1.vm04.stdout:7/640: rename d2/dc/d4d/dcd/dc6/fcb to d2/d2a/fef 0 2026-03-10T14:07:57.700 
INFO:tasks.workunit.client.1.vm04.stdout:8/687: mknod d0/d75/cd8 0 2026-03-10T14:07:57.706 INFO:tasks.workunit.client.1.vm04.stdout:5/677: creat d7/d59/d7d/fdc x:0 0 0 2026-03-10T14:07:57.720 INFO:tasks.workunit.client.1.vm04.stdout:0/595: read - d0/d2/d15/d22/d38/d56/d66/f7e zero size 2026-03-10T14:07:57.723 INFO:tasks.workunit.client.1.vm04.stdout:8/688: sync 2026-03-10T14:07:57.723 INFO:tasks.workunit.client.1.vm04.stdout:8/689: readlink d0/d3/d63/d12/d51/lb6 0 2026-03-10T14:07:57.727 INFO:tasks.workunit.client.1.vm04.stdout:8/690: dwrite d0/d3/d63/d12/d51/f4f [0,4194304] 0 2026-03-10T14:07:57.746 INFO:tasks.workunit.client.1.vm04.stdout:9/559: dwrite d9/da/fb [0,4194304] 0 2026-03-10T14:07:57.752 INFO:tasks.workunit.client.1.vm04.stdout:3/630: dwrite da/f22 [0,4194304] 0 2026-03-10T14:07:57.761 INFO:tasks.workunit.client.1.vm04.stdout:2/619: write d0/d14/d54/f9c [92181,59053] 0 2026-03-10T14:07:57.802 INFO:tasks.workunit.client.1.vm04.stdout:6/508: dwrite d3/de/f46 [0,4194304] 0 2026-03-10T14:07:57.849 INFO:tasks.workunit.client.1.vm04.stdout:5/678: rmdir d7/d2d/d69/db8 39 2026-03-10T14:07:57.852 INFO:tasks.workunit.client.1.vm04.stdout:5/679: chown d7/d59/d7d/d9a/fd8 1832 1 2026-03-10T14:07:57.852 INFO:tasks.workunit.client.1.vm04.stdout:0/596: chown d0/d2/c37 3339 1 2026-03-10T14:07:57.854 INFO:tasks.workunit.client.1.vm04.stdout:0/597: chown d0/d2/d15/d49/d50/d61/f96 645425 1 2026-03-10T14:07:57.871 INFO:tasks.workunit.client.1.vm04.stdout:9/560: chown d9/da/dd/f85 19273319 1 2026-03-10T14:07:57.876 INFO:tasks.workunit.client.1.vm04.stdout:3/631: rename da/dc/d35/d52/d6d/d8a to da/dc/d35/dcd/dd7 0 2026-03-10T14:07:57.879 INFO:tasks.workunit.client.1.vm04.stdout:7/641: write d2/dc/de/d2d/d60/d7c/d3b/f48 [1206486,99453] 0 2026-03-10T14:07:57.882 INFO:tasks.workunit.client.1.vm04.stdout:6/509: mknod d3/de/d35/d3f/d2d/d32/d23/d24/d6f/c99 0 2026-03-10T14:07:57.883 INFO:tasks.workunit.client.1.vm04.stdout:6/510: chown d3/de/d35/d3a/d43/d4c/d5e/l91 58 1 
2026-03-10T14:07:57.885 INFO:tasks.workunit.client.1.vm04.stdout:6/511: chown d3/f9 47697344 1 2026-03-10T14:07:57.889 INFO:tasks.workunit.client.1.vm04.stdout:4/594: creat d4/df/db2/db4/d47/d70/fd2 x:0 0 0 2026-03-10T14:07:57.897 INFO:tasks.workunit.client.1.vm04.stdout:5/680: chown d7/d12/d2b/d3e/c49 1958 1 2026-03-10T14:07:57.901 INFO:tasks.workunit.client.1.vm04.stdout:3/632: read da/dc/d35/f50 [353538,84394] 0 2026-03-10T14:07:57.909 INFO:tasks.workunit.client.1.vm04.stdout:1/611: getdents d3/d22/d6d 0 2026-03-10T14:07:57.913 INFO:tasks.workunit.client.1.vm04.stdout:6/512: write d3/de/d35/d3a/f51 [3425837,124256] 0 2026-03-10T14:07:57.913 INFO:tasks.workunit.client.1.vm04.stdout:7/642: dread d2/dc/d4d/dcd/f7a [0,4194304] 0 2026-03-10T14:07:57.914 INFO:tasks.workunit.client.1.vm04.stdout:2/620: dread d0/d14/d91/d8/d17/d4e/f78 [0,4194304] 0 2026-03-10T14:07:57.917 INFO:tasks.workunit.client.1.vm04.stdout:7/643: write d2/dc/de/d2d/d60/d7c/d3b/f48 [786929,76194] 0 2026-03-10T14:07:57.931 INFO:tasks.workunit.client.1.vm04.stdout:3/633: truncate da/dc/d35/d37/f5e 5222991 0 2026-03-10T14:07:57.938 INFO:tasks.workunit.client.1.vm04.stdout:4/595: write d4/df/f2e [246294,90172] 0 2026-03-10T14:07:57.940 INFO:tasks.workunit.client.1.vm04.stdout:1/612: write d3/d22/d63/d35/dd9/d13/d38/db5/fb8 [1989458,111860] 0 2026-03-10T14:07:57.942 INFO:tasks.workunit.client.1.vm04.stdout:1/613: dread d3/fc [0,4194304] 0 2026-03-10T14:07:57.943 INFO:tasks.workunit.client.1.vm04.stdout:1/614: chown d3/d22/d2f/f39 4 1 2026-03-10T14:07:57.946 INFO:tasks.workunit.client.1.vm04.stdout:8/691: getdents d0/d3/d63/d12/d51/d67 0 2026-03-10T14:07:57.946 INFO:tasks.workunit.client.1.vm04.stdout:1/615: dread - d3/d22/d63/d35/dd9/d13/da0/dc5/fd1 zero size 2026-03-10T14:07:57.948 INFO:tasks.workunit.client.1.vm04.stdout:8/692: chown d0/d75/c7b 25382300 1 2026-03-10T14:07:57.954 INFO:tasks.workunit.client.1.vm04.stdout:8/693: dwrite d0/d3/d63/fd3 [0,4194304] 0 2026-03-10T14:07:57.958 
INFO:tasks.workunit.client.1.vm04.stdout:6/513: fdatasync d3/de/d35/d3f/d2d/d38/f60 0 2026-03-10T14:07:57.963 INFO:tasks.workunit.client.1.vm04.stdout:1/616: sync 2026-03-10T14:07:57.970 INFO:tasks.workunit.client.1.vm04.stdout:9/561: link d9/da/f13 d9/da/dd/fc0 0 2026-03-10T14:07:57.971 INFO:tasks.workunit.client.1.vm04.stdout:9/562: truncate d9/d5c/fa6 1709373 0 2026-03-10T14:07:57.975 INFO:tasks.workunit.client.1.vm04.stdout:3/634: symlink da/dc/d35/d52/d53/ld8 0 2026-03-10T14:07:57.987 INFO:tasks.workunit.client.1.vm04.stdout:2/621: fdatasync d0/d14/d91/f9 0 2026-03-10T14:07:57.989 INFO:tasks.workunit.client.1.vm04.stdout:7/644: write d2/dc/de/d2d/d5c/f8e [1550484,66475] 0 2026-03-10T14:07:57.992 INFO:tasks.workunit.client.1.vm04.stdout:7/645: dread - d2/dc/de/d2d/d60/d7c/d64/f6a zero size 2026-03-10T14:07:57.994 INFO:tasks.workunit.client.1.vm04.stdout:4/596: dwrite d4/df/d34/fc5 [0,4194304] 0 2026-03-10T14:07:58.000 INFO:tasks.workunit.client.1.vm04.stdout:8/694: creat d0/d3/d63/d12/d51/d67/d96/dc8/fd9 x:0 0 0 2026-03-10T14:07:58.001 INFO:tasks.workunit.client.1.vm04.stdout:6/514: dread - d3/d1d/f55 zero size 2026-03-10T14:07:58.002 INFO:tasks.workunit.client.1.vm04.stdout:6/515: chown d3/d1d/d73/c84 66 1 2026-03-10T14:07:58.008 INFO:tasks.workunit.client.1.vm04.stdout:0/598: getdents d0/d2/d15/d49/d50/d5c/da4 0 2026-03-10T14:07:58.011 INFO:tasks.workunit.client.1.vm04.stdout:9/563: truncate d9/f4f 4563755 0 2026-03-10T14:07:58.012 INFO:tasks.workunit.client.1.vm04.stdout:9/564: stat d9/d44 0 2026-03-10T14:07:58.016 INFO:tasks.workunit.client.1.vm04.stdout:5/681: creat d7/d12/d2b/d3e/d57/d77/fdd x:0 0 0 2026-03-10T14:07:58.022 INFO:tasks.workunit.client.1.vm04.stdout:1/617: dread d3/d22/d63/d35/dd9/d13/d1a/f28 [4194304,4194304] 0 2026-03-10T14:07:58.023 INFO:tasks.workunit.client.1.vm04.stdout:1/618: write d3/d22/d2f/f5d [1344835,37439] 0 2026-03-10T14:07:58.029 INFO:tasks.workunit.client.1.vm04.stdout:3/635: write da/dc/d3f/d61/dc1/fc4 [2043486,97623] 0 
2026-03-10T14:07:58.043 INFO:tasks.workunit.client.1.vm04.stdout:6/516: rmdir d3/de/d35/d3f/d2d/d32/d23/d24/d6f/d71 39 2026-03-10T14:07:58.044 INFO:tasks.workunit.client.1.vm04.stdout:8/695: write d0/d3/f6b [1891762,46246] 0 2026-03-10T14:07:58.046 INFO:tasks.workunit.client.1.vm04.stdout:0/599: unlink d0/d2/d15/d49/fbc 0 2026-03-10T14:07:58.047 INFO:tasks.workunit.client.1.vm04.stdout:0/600: fdatasync d0/d2/d15/d49/d50/d61/fb8 0 2026-03-10T14:07:58.060 INFO:tasks.workunit.client.1.vm04.stdout:9/565: dwrite d9/da/f57 [0,4194304] 0 2026-03-10T14:07:58.063 INFO:tasks.workunit.client.1.vm04.stdout:2/622: mkdir d0/d14/d39/d47/d70/dc3 0 2026-03-10T14:07:58.081 INFO:tasks.workunit.client.1.vm04.stdout:4/597: creat d4/df/db2/db6/dc9/dd0/fd3 x:0 0 0 2026-03-10T14:07:58.092 INFO:tasks.workunit.client.1.vm04.stdout:3/636: chown da/dc/d47/d9b/fd2 0 1 2026-03-10T14:07:58.093 INFO:tasks.workunit.client.1.vm04.stdout:7/646: write d2/d2a/d42/d86/fd0 [2379140,69243] 0 2026-03-10T14:07:58.106 INFO:tasks.workunit.client.1.vm04.stdout:5/682: dread - d7/d12/d2b/d3e/d57/d77/da5/faa zero size 2026-03-10T14:07:58.106 INFO:tasks.workunit.client.1.vm04.stdout:0/601: write d0/d2/d15/d22/d38/d56/d66/f2e [4185404,37961] 0 2026-03-10T14:07:58.106 INFO:tasks.workunit.client.1.vm04.stdout:0/602: chown d0/d2/d15/d22/d38/f71 0 1 2026-03-10T14:07:58.120 INFO:tasks.workunit.client.1.vm04.stdout:1/619: mkdir d3/d22/d63/d35/dd9/d13/d38/db5/dc4/dda 0 2026-03-10T14:07:58.128 INFO:tasks.workunit.client.1.vm04.stdout:4/598: dread - d4/df/db2/db4/d47/d70/fbf zero size 2026-03-10T14:07:58.128 INFO:tasks.workunit.client.1.vm04.stdout:2/623: write d0/d14/d91/d4a/fa6 [616699,117104] 0 2026-03-10T14:07:58.144 INFO:tasks.workunit.client.1.vm04.stdout:8/696: creat d0/d3/d63/d12/d51/d67/d96/dc8/dcf/fda x:0 0 0 2026-03-10T14:07:58.145 INFO:tasks.workunit.client.1.vm04.stdout:3/637: symlink da/dc/d35/dcd/ld9 0 2026-03-10T14:07:58.145 INFO:tasks.workunit.client.1.vm04.stdout:7/647: dread - d2/dc/de/d2d/d5c/da9/fc3 
zero size 2026-03-10T14:07:58.147 INFO:tasks.workunit.client.1.vm04.stdout:6/517: dwrite d3/d1d/d73/f2b [0,4194304] 0 2026-03-10T14:07:58.163 INFO:tasks.workunit.client.1.vm04.stdout:0/603: rename d0/f14 to d0/d2/d15/d22/d38/d56/dc1/fc2 0 2026-03-10T14:07:58.172 INFO:tasks.workunit.client.1.vm04.stdout:1/620: fsync d3/f2c 0 2026-03-10T14:07:58.185 INFO:tasks.workunit.client.1.vm04.stdout:8/697: fdatasync d0/d3/fc3 0 2026-03-10T14:07:58.192 INFO:tasks.workunit.client.1.vm04.stdout:3/638: creat da/dc/d3f/d61/dc1/fda x:0 0 0 2026-03-10T14:07:58.198 INFO:tasks.workunit.client.1.vm04.stdout:5/683: write d7/f24 [2223062,61733] 0 2026-03-10T14:07:58.200 INFO:tasks.workunit.client.1.vm04.stdout:5/684: chown d7/d26/d6b/d6e/fa3 4 1 2026-03-10T14:07:58.201 INFO:tasks.workunit.client.1.vm04.stdout:9/566: dwrite d9/da/dd/f47 [0,4194304] 0 2026-03-10T14:07:58.202 INFO:tasks.workunit.client.1.vm04.stdout:2/624: dwrite d0/d14/d91/d8/d17/d4e/f78 [0,4194304] 0 2026-03-10T14:07:58.209 INFO:tasks.workunit.client.1.vm04.stdout:7/648: symlink d2/dc/de/d2d/d60/d7c/d64/dbf/lf0 0 2026-03-10T14:07:58.209 INFO:tasks.workunit.client.1.vm04.stdout:8/698: mkdir d0/d3/dd/d76/ddb 0 2026-03-10T14:07:58.221 INFO:tasks.workunit.client.1.vm04.stdout:4/599: rename d4/df/d34/c6b to d4/df/db2/cd4 0 2026-03-10T14:07:58.241 INFO:tasks.workunit.client.1.vm04.stdout:0/604: getdents d0/d2/d15/d22/d38/d56/da7 0 2026-03-10T14:07:58.241 INFO:tasks.workunit.client.1.vm04.stdout:9/567: creat d9/d33/fc1 x:0 0 0 2026-03-10T14:07:58.241 INFO:tasks.workunit.client.1.vm04.stdout:0/605: chown d0/d2/d15/d49/d50/c6c 506 1 2026-03-10T14:07:58.241 INFO:tasks.workunit.client.1.vm04.stdout:0/606: readlink d0/d2/d15/d22/l95 0 2026-03-10T14:07:58.241 INFO:tasks.workunit.client.1.vm04.stdout:1/621: mkdir d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb 0 2026-03-10T14:07:58.242 INFO:tasks.workunit.client.1.vm04.stdout:4/600: read - d4/df/db2/db4/d47/d4f/d8c/fa4 zero size 2026-03-10T14:07:58.242 
INFO:tasks.workunit.client.1.vm04.stdout:0/607: creat d0/d2/d15/d22/d38/d56/dc1/fc3 x:0 0 0 2026-03-10T14:07:58.242 INFO:tasks.workunit.client.1.vm04.stdout:1/622: unlink d3/d5c/d79/d98/fd4 0 2026-03-10T14:07:58.242 INFO:tasks.workunit.client.1.vm04.stdout:6/518: getdents d3/de/d35/d3f/d2d/d32/d23 0 2026-03-10T14:07:58.242 INFO:tasks.workunit.client.1.vm04.stdout:4/601: creat d4/d14/d3c/d62/fd5 x:0 0 0 2026-03-10T14:07:58.243 INFO:tasks.workunit.client.1.vm04.stdout:2/625: sync 2026-03-10T14:07:58.249 INFO:tasks.workunit.client.1.vm04.stdout:0/608: write d0/f99 [9219468,127719] 0 2026-03-10T14:07:58.250 INFO:tasks.workunit.client.1.vm04.stdout:0/609: write d0/d2/d15/d22/d38/d56/fbf [704487,42986] 0 2026-03-10T14:07:58.254 INFO:tasks.workunit.client.1.vm04.stdout:2/626: dread d0/d14/d39/f44 [0,4194304] 0 2026-03-10T14:07:58.266 INFO:tasks.workunit.client.1.vm04.stdout:7/649: dread d2/dc/de/d2d/d60/d7c/f84 [0,4194304] 0 2026-03-10T14:07:58.280 INFO:tasks.workunit.client.1.vm04.stdout:4/602: truncate d4/df/d34/f95 2346061 0 2026-03-10T14:07:58.287 INFO:tasks.workunit.client.1.vm04.stdout:0/610: creat d0/d2/d15/d49/d50/d61/d75/fc4 x:0 0 0 2026-03-10T14:07:58.287 INFO:tasks.workunit.client.1.vm04.stdout:0/611: chown d0/d2/d15/d22 7697174 1 2026-03-10T14:07:58.291 INFO:tasks.workunit.client.1.vm04.stdout:3/639: dwrite da/d3e/f4c [0,4194304] 0 2026-03-10T14:07:58.302 INFO:tasks.workunit.client.1.vm04.stdout:5/685: dwrite d7/f21 [0,4194304] 0 2026-03-10T14:07:58.315 INFO:tasks.workunit.client.1.vm04.stdout:8/699: write d0/d3/d63/f5b [1267960,114507] 0 2026-03-10T14:07:58.328 INFO:tasks.workunit.client.1.vm04.stdout:9/568: write d9/da/dd/f8a [403110,10708] 0 2026-03-10T14:07:58.331 INFO:tasks.workunit.client.1.vm04.stdout:1/623: dwrite d3/d22/d63/d35/dd9/fd [0,4194304] 0 2026-03-10T14:07:58.332 INFO:tasks.workunit.client.1.vm04.stdout:6/519: write d3/de/d35/d3f/d2d/d38/d40/f5d [366197,49730] 0 2026-03-10T14:07:58.361 INFO:tasks.workunit.client.1.vm04.stdout:3/640: dread 
da/dc/fa4 [0,4194304] 0 2026-03-10T14:07:58.371 INFO:tasks.workunit.client.1.vm04.stdout:5/686: symlink d7/d59/d7e/d87/lde 0 2026-03-10T14:07:58.373 INFO:tasks.workunit.client.1.vm04.stdout:5/687: dread d7/d12/d2b/f53 [0,4194304] 0 2026-03-10T14:07:58.375 INFO:tasks.workunit.client.1.vm04.stdout:8/700: mkdir d0/d3/dd/d89/db5/ddc 0 2026-03-10T14:07:58.376 INFO:tasks.workunit.client.1.vm04.stdout:8/701: chown d0/d75/fd1 30732443 1 2026-03-10T14:07:58.379 INFO:tasks.workunit.client.1.vm04.stdout:7/650: symlink d2/dc/de/lf1 0 2026-03-10T14:07:58.381 INFO:tasks.workunit.client.1.vm04.stdout:9/569: mkdir d9/d5c/dc2 0 2026-03-10T14:07:58.387 INFO:tasks.workunit.client.1.vm04.stdout:6/520: fsync d3/de/d35/d3f/d2d/f82 0 2026-03-10T14:07:58.411 INFO:tasks.workunit.client.1.vm04.stdout:0/612: symlink d0/d2/d15/d49/d50/d61/lc5 0 2026-03-10T14:07:58.412 INFO:tasks.workunit.client.1.vm04.stdout:1/624: write d3/f8 [1142287,98467] 0 2026-03-10T14:07:58.412 INFO:tasks.workunit.client.1.vm04.stdout:0/613: chown d0/d2/d15/d49/d50/d5c/da4/cb9 329198 1 2026-03-10T14:07:58.440 INFO:tasks.workunit.client.1.vm04.stdout:3/641: dwrite da/dc/d47/f6b [0,4194304] 0 2026-03-10T14:07:58.444 INFO:tasks.workunit.client.1.vm04.stdout:7/651: creat d2/dc/de/d2d/d60/ff2 x:0 0 0 2026-03-10T14:07:58.446 INFO:tasks.workunit.client.1.vm04.stdout:8/702: dread d0/d3/d63/d29/fba [0,4194304] 0 2026-03-10T14:07:58.447 INFO:tasks.workunit.client.1.vm04.stdout:8/703: read - d0/d3/d63/d29/fbb zero size 2026-03-10T14:07:58.447 INFO:tasks.workunit.client.1.vm04.stdout:8/704: stat d0/d3/dd/d76/f9f 0 2026-03-10T14:07:58.455 INFO:tasks.workunit.client.1.vm04.stdout:5/688: dread d7/f1d [0,4194304] 0 2026-03-10T14:07:58.456 INFO:tasks.workunit.client.1.vm04.stdout:2/627: link d0/d14/d91/d8/d17/d4e/d85/d86/la7 d0/d14/d91/d4a/d8c/dab/d95/lc4 0 2026-03-10T14:07:58.459 INFO:tasks.workunit.client.1.vm04.stdout:3/642: fdatasync da/f28 0 2026-03-10T14:07:58.459 INFO:tasks.workunit.client.1.vm04.stdout:3/643: chown 
da/dc/d3f/d54/f5d 1117 1 2026-03-10T14:07:58.460 INFO:tasks.workunit.client.1.vm04.stdout:7/652: unlink d2/d2a/f92 0 2026-03-10T14:07:58.461 INFO:tasks.workunit.client.1.vm04.stdout:8/705: creat d0/d3/d73/fdd x:0 0 0 2026-03-10T14:07:58.461 INFO:tasks.workunit.client.1.vm04.stdout:8/706: chown d0/d3/d73/fdd 2804 1 2026-03-10T14:07:58.473 INFO:tasks.workunit.client.1.vm04.stdout:4/603: getdents d4/df 0 2026-03-10T14:07:58.479 INFO:tasks.workunit.client.1.vm04.stdout:5/689: mkdir d7/d26/ddf 0 2026-03-10T14:07:58.479 INFO:tasks.workunit.client.1.vm04.stdout:7/653: dread - d2/d2a/d42/d86/fbb zero size 2026-03-10T14:07:58.479 INFO:tasks.workunit.client.1.vm04.stdout:7/654: chown d2/dc/d4d/dcd/f7a 6236 1 2026-03-10T14:07:58.482 INFO:tasks.workunit.client.1.vm04.stdout:8/707: unlink d0/d3/d63/d12/c13 0 2026-03-10T14:07:58.494 INFO:tasks.workunit.client.1.vm04.stdout:0/614: link d0/d2/d15/d22/d38/d56/fbf d0/d2/d15/d22/d38/fc6 0 2026-03-10T14:07:58.494 INFO:tasks.workunit.client.1.vm04.stdout:2/628: symlink d0/d14/lc5 0 2026-03-10T14:07:58.495 INFO:tasks.workunit.client.1.vm04.stdout:2/629: read - d0/d14/d91/d3a/d3e/fbd zero size 2026-03-10T14:07:58.497 INFO:tasks.workunit.client.1.vm04.stdout:2/630: read d0/d14/d91/d4a/d8c/dab/f33 [3755108,129990] 0 2026-03-10T14:07:58.502 INFO:tasks.workunit.client.1.vm04.stdout:7/655: dread d2/dc/de/d2d/d60/faf [0,4194304] 0 2026-03-10T14:07:58.507 INFO:tasks.workunit.client.1.vm04.stdout:9/570: write d9/da/dd/d74/f92 [1020353,115181] 0 2026-03-10T14:07:58.513 INFO:tasks.workunit.client.1.vm04.stdout:6/521: getdents d3/de/d35/d3a/d43/d4c/d5e/d76 0 2026-03-10T14:07:58.515 INFO:tasks.workunit.client.1.vm04.stdout:1/625: getdents d3/d22/d2f/d57 0 2026-03-10T14:07:58.517 INFO:tasks.workunit.client.1.vm04.stdout:0/615: chown d0/d2/d15/d22/d38/d56/d66/c9c 504167603 1 2026-03-10T14:07:58.522 INFO:tasks.workunit.client.1.vm04.stdout:3/644: write da/fe [3837437,52132] 0 2026-03-10T14:07:58.528 INFO:tasks.workunit.client.1.vm04.stdout:8/708: mknod 
d0/cde 0 2026-03-10T14:07:58.530 INFO:tasks.workunit.client.1.vm04.stdout:0/616: write d0/d2/d15/d22/d38/fc6 [1166536,9997] 0 2026-03-10T14:07:58.531 INFO:tasks.workunit.client.1.vm04.stdout:6/522: dread d3/de/d35/d3f/d2d/f34 [0,4194304] 0 2026-03-10T14:07:58.543 INFO:tasks.workunit.client.1.vm04.stdout:3/645: sync 2026-03-10T14:07:58.547 INFO:tasks.workunit.client.1.vm04.stdout:7/656: fdatasync d2/dc/de/d2d/d60/d7c/d36/d8b/fde 0 2026-03-10T14:07:58.552 INFO:tasks.workunit.client.1.vm04.stdout:4/604: getdents d4/df/db2/db4 0 2026-03-10T14:07:58.554 INFO:tasks.workunit.client.1.vm04.stdout:5/690: write d7/d26/f54 [863332,104916] 0 2026-03-10T14:07:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:58 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:07:58.567 INFO:tasks.workunit.client.1.vm04.stdout:8/709: creat d0/d75/d8a/fdf x:0 0 0 2026-03-10T14:07:58.567 INFO:tasks.workunit.client.1.vm04.stdout:8/710: chown d0/d3/dd/d89/fa5 3313 1 2026-03-10T14:07:58.569 INFO:tasks.workunit.client.1.vm04.stdout:1/626: write d3/d22/d63/d35/dd9/d13/da0/fb9 [109971,96271] 0 2026-03-10T14:07:58.573 INFO:tasks.workunit.client.1.vm04.stdout:8/711: dread d0/d3/d63/f8f [0,4194304] 0 2026-03-10T14:07:58.581 INFO:tasks.workunit.client.1.vm04.stdout:2/631: write d0/d14/d39/d47/d70/f74 [885365,10090] 0 2026-03-10T14:07:58.583 INFO:tasks.workunit.client.1.vm04.stdout:6/523: rename d3/de/d35/d3f/d2d/d32/f1f to d3/de/d35/d3f/d2d/f9a 0 2026-03-10T14:07:58.586 INFO:tasks.workunit.client.1.vm04.stdout:3/646: creat da/dc/d35/fdb x:0 0 0 2026-03-10T14:07:58.587 INFO:tasks.workunit.client.1.vm04.stdout:4/605: fdatasync d4/d14/d64/f97 0 2026-03-10T14:07:58.589 INFO:tasks.workunit.client.1.vm04.stdout:5/691: dread - d7/d12/d2b/d93/d9e/faf zero size 2026-03-10T14:07:58.593 INFO:tasks.workunit.client.1.vm04.stdout:1/627: fdatasync d3/d22/d2f/d57/fac 0 2026-03-10T14:07:58.594 
INFO:tasks.workunit.client.1.vm04.stdout:1/628: stat d3/d22/d63/d35/dd9/d13/d38/db5/lc7 0 2026-03-10T14:07:58.603 INFO:tasks.workunit.client.1.vm04.stdout:8/712: dwrite d0/d3/d5/fb9 [4194304,4194304] 0 2026-03-10T14:07:58.605 INFO:tasks.workunit.client.1.vm04.stdout:7/657: dread d2/dc/de/f12 [0,4194304] 0 2026-03-10T14:07:58.605 INFO:tasks.workunit.client.1.vm04.stdout:7/658: dread - d2/fbe zero size 2026-03-10T14:07:58.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:58 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:07:58.610 INFO:tasks.workunit.client.1.vm04.stdout:8/713: dread d0/d3/d63/f57 [0,4194304] 0 2026-03-10T14:07:58.612 INFO:tasks.workunit.client.1.vm04.stdout:3/647: symlink da/dc/d35/d52/da3/dac/ldc 0 2026-03-10T14:07:58.618 INFO:tasks.workunit.client.1.vm04.stdout:4/606: creat d4/d14/d64/fd6 x:0 0 0 2026-03-10T14:07:58.623 INFO:tasks.workunit.client.1.vm04.stdout:5/692: rename d7/d26/fbc to d7/d9/fe0 0 2026-03-10T14:07:58.624 INFO:tasks.workunit.client.1.vm04.stdout:1/629: write d3/d22/d63/d35/dd9/d13/da0/dc5/fc6 [1247416,46140] 0 2026-03-10T14:07:58.635 INFO:tasks.workunit.client.1.vm04.stdout:8/714: mknod d0/d3/dd/d78/ce0 0 2026-03-10T14:07:58.635 INFO:tasks.workunit.client.1.vm04.stdout:7/659: chown d2/dc/de/d2d/d5c/da9/cce 10789 1 2026-03-10T14:07:58.637 INFO:tasks.workunit.client.1.vm04.stdout:7/660: dread - d2/dc/de/d2d/d38/f8a zero size 2026-03-10T14:07:58.645 INFO:tasks.workunit.client.1.vm04.stdout:5/693: symlink d7/d12/d2b/d93/d9e/le1 0 2026-03-10T14:07:58.645 INFO:tasks.workunit.client.1.vm04.stdout:4/607: mknod d4/df/db2/db4/d47/d4f/d8c/cd7 0 2026-03-10T14:07:58.646 INFO:tasks.workunit.client.1.vm04.stdout:1/630: truncate d3/d22/d63/f7f 419330 0 2026-03-10T14:07:58.646 INFO:tasks.workunit.client.1.vm04.stdout:6/524: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f9b x:0 0 0 2026-03-10T14:07:58.649 
INFO:tasks.workunit.client.1.vm04.stdout:7/661: read d2/dc/de/d2d/d38/f41 [7631291,67744] 0 2026-03-10T14:07:58.657 INFO:tasks.workunit.client.1.vm04.stdout:8/715: truncate d0/d3/d63/d12/d69/f8c 717591 0 2026-03-10T14:07:58.666 INFO:tasks.workunit.client.1.vm04.stdout:3/648: dwrite da/dc/f1d [0,4194304] 0 2026-03-10T14:07:58.672 INFO:tasks.workunit.client.1.vm04.stdout:9/571: dwrite d9/d44/d59/f7c [0,4194304] 0 2026-03-10T14:07:58.674 INFO:tasks.workunit.client.1.vm04.stdout:0/617: dwrite d0/d2/d15/d22/f81 [0,4194304] 0 2026-03-10T14:07:58.674 INFO:tasks.workunit.client.1.vm04.stdout:2/632: dwrite d0/d14/d91/d8/d17/d4e/d85/f89 [0,4194304] 0 2026-03-10T14:07:58.676 INFO:tasks.workunit.client.1.vm04.stdout:5/694: rename d7/d2d/d32/d34/f89 to d7/d59/d7e/d87/fe2 0 2026-03-10T14:07:58.677 INFO:tasks.workunit.client.1.vm04.stdout:5/695: fsync d7/d26/d6b/d6e/fa3 0 2026-03-10T14:07:58.679 INFO:tasks.workunit.client.1.vm04.stdout:1/631: fsync d3/d22/f3b 0 2026-03-10T14:07:58.681 INFO:tasks.workunit.client.1.vm04.stdout:5/696: chown d7/d12/d2b/f4d 368 1 2026-03-10T14:07:58.721 INFO:tasks.workunit.client.1.vm04.stdout:0/618: sync 2026-03-10T14:07:58.745 INFO:tasks.workunit.client.1.vm04.stdout:3/649: mknod da/d30/cdd 0 2026-03-10T14:07:58.745 INFO:tasks.workunit.client.1.vm04.stdout:3/650: read - da/dc/d35/d52/f69 zero size 2026-03-10T14:07:58.763 INFO:tasks.workunit.client.1.vm04.stdout:8/716: mknod d0/d3/d73/db8/dd5/ce1 0 2026-03-10T14:07:58.763 INFO:tasks.workunit.client.1.vm04.stdout:8/717: readlink d0/d3/dd/d89/lb7 0 2026-03-10T14:07:58.764 INFO:tasks.workunit.client.1.vm04.stdout:8/718: readlink d0/d3/d63/d29/l5a 0 2026-03-10T14:07:58.771 INFO:tasks.workunit.client.1.vm04.stdout:0/619: rename d0/d2/d25/fbb to d0/da2/fc7 0 2026-03-10T14:07:58.772 INFO:tasks.workunit.client.1.vm04.stdout:9/572: dread d9/d58/db5/f1f [0,4194304] 0 2026-03-10T14:07:58.789 INFO:tasks.workunit.client.1.vm04.stdout:7/662: fdatasync d2/d94/f7e 0 2026-03-10T14:07:58.796 
INFO:tasks.workunit.client.1.vm04.stdout:2/633: mknod d0/d14/d91/d4a/d8c/dab/db3/cc6 0 2026-03-10T14:07:58.800 INFO:tasks.workunit.client.1.vm04.stdout:3/651: dread da/dc/d35/d37/f6e [0,4194304] 0 2026-03-10T14:07:58.816 INFO:tasks.workunit.client.1.vm04.stdout:0/620: fsync d0/d2/d15/d22/d38/f5b 0 2026-03-10T14:07:58.816 INFO:tasks.workunit.client.1.vm04.stdout:9/573: mknod d9/d33/cc3 0 2026-03-10T14:07:58.817 INFO:tasks.workunit.client.1.vm04.stdout:8/719: rename d0/d3/d63/d12/d69/c6a to d0/d3/d63/d12/d51/d8b/ce2 0 2026-03-10T14:07:58.817 INFO:tasks.workunit.client.1.vm04.stdout:7/663: mknod d2/dc/de/d2d/d60/d7c/d36/daa/cf3 0 2026-03-10T14:07:58.818 INFO:tasks.workunit.client.1.vm04.stdout:8/720: chown d0/d3/d63/fd3 10161 1 2026-03-10T14:07:58.818 INFO:tasks.workunit.client.1.vm04.stdout:9/574: chown d9/da/l14 14 1 2026-03-10T14:07:58.818 INFO:tasks.workunit.client.1.vm04.stdout:7/664: write d2/d2a/d42/d86/fd0 [3673424,61893] 0 2026-03-10T14:07:58.818 INFO:tasks.workunit.client.1.vm04.stdout:9/575: readlink d9/da/d5d/lb9 0 2026-03-10T14:07:58.822 INFO:tasks.workunit.client.1.vm04.stdout:3/652: rename da/dc/d35/d52/da3 to da/dc/d35/dcd/dde 0 2026-03-10T14:07:58.823 INFO:tasks.workunit.client.1.vm04.stdout:3/653: readlink da/dc/d3f/lc5 0 2026-03-10T14:07:58.829 INFO:tasks.workunit.client.1.vm04.stdout:6/525: dwrite d3/de/d35/d3f/d2d/d32/f81 [0,4194304] 0 2026-03-10T14:07:58.835 INFO:tasks.workunit.client.1.vm04.stdout:3/654: sync 2026-03-10T14:07:58.838 INFO:tasks.workunit.client.1.vm04.stdout:4/608: dwrite d4/d14/f98 [0,4194304] 0 2026-03-10T14:07:58.866 INFO:tasks.workunit.client.1.vm04.stdout:0/621: symlink d0/d2/d15/d22/d38/d56/d66/lc8 0 2026-03-10T14:07:58.889 INFO:tasks.workunit.client.1.vm04.stdout:8/721: dread d0/d3/d63/d29/f45 [0,4194304] 0 2026-03-10T14:07:58.890 INFO:tasks.workunit.client.1.vm04.stdout:8/722: chown d0/d75/d8a/cca 126 1 2026-03-10T14:07:58.895 INFO:tasks.workunit.client.1.vm04.stdout:8/723: dwrite d0/d3/d5/f15 [4194304,4194304] 0 
2026-03-10T14:07:58.916 INFO:tasks.workunit.client.1.vm04.stdout:6/526: fsync d3/de/d35/d3f/d2d/d32/d23/f33 0 2026-03-10T14:07:58.925 INFO:tasks.workunit.client.1.vm04.stdout:4/609: symlink d4/d14/d64/ld8 0 2026-03-10T14:07:58.929 INFO:tasks.workunit.client.1.vm04.stdout:1/632: write d3/d20/f27 [18042,93172] 0 2026-03-10T14:07:58.937 INFO:tasks.workunit.client.1.vm04.stdout:8/724: read d0/d3/d63/d12/d51/f8e [2198388,57791] 0 2026-03-10T14:07:58.996 INFO:tasks.workunit.client.1.vm04.stdout:4/610: rmdir d4/df/db2/db4/d47/d4f/d8c 39 2026-03-10T14:07:59.013 INFO:tasks.workunit.client.1.vm04.stdout:2/634: link d0/d14/d1b/f29 d0/d14/fc7 0 2026-03-10T14:07:59.019 INFO:tasks.workunit.client.1.vm04.stdout:7/665: write d2/dc/de/d2d/d38/f37 [3418800,17865] 0 2026-03-10T14:07:59.019 INFO:tasks.workunit.client.1.vm04.stdout:5/697: write d7/d59/d7e/d87/fe2 [2529246,119265] 0 2026-03-10T14:07:59.022 INFO:tasks.workunit.client.1.vm04.stdout:9/576: dwrite d9/d44/d4d/f4e [0,4194304] 0 2026-03-10T14:07:59.030 INFO:tasks.workunit.client.1.vm04.stdout:5/698: dwrite d7/d12/d2b/d3e/d57/d77/fdd [0,4194304] 0 2026-03-10T14:07:59.036 INFO:tasks.workunit.client.1.vm04.stdout:5/699: fdatasync d7/d59/d7d/fdc 0 2026-03-10T14:07:59.051 INFO:tasks.workunit.client.1.vm04.stdout:0/622: mkdir d0/d2/dc9 0 2026-03-10T14:07:59.051 INFO:tasks.workunit.client.1.vm04.stdout:8/725: creat d0/d3/d63/d29/fe3 x:0 0 0 2026-03-10T14:07:59.052 INFO:tasks.workunit.client.1.vm04.stdout:0/623: truncate d0/d2/d15/fb4 797129 0 2026-03-10T14:07:59.053 INFO:tasks.workunit.client.1.vm04.stdout:0/624: chown d0/d2/d25/l9d 344091 1 2026-03-10T14:07:59.056 INFO:tasks.workunit.client.1.vm04.stdout:6/527: mkdir d3/de/d35/d3a/d43/d9c 0 2026-03-10T14:07:59.056 INFO:tasks.workunit.client.1.vm04.stdout:0/625: chown d0/d2/d15/d22/d38/d56/d66/l42 0 1 2026-03-10T14:07:59.058 INFO:tasks.workunit.client.1.vm04.stdout:5/700: read d7/f24 [4027321,12643] 0 2026-03-10T14:07:59.059 INFO:tasks.workunit.client.1.vm04.stdout:5/701: readlink 
d7/d59/l79 0 2026-03-10T14:07:59.062 INFO:tasks.workunit.client.1.vm04.stdout:1/633: dread d3/d22/d63/d35/dd9/d13/d1a/f4b [0,4194304] 0 2026-03-10T14:07:59.062 INFO:tasks.workunit.client.1.vm04.stdout:5/702: dread - d7/d12/d2b/d3e/d57/d77/da5/faa zero size 2026-03-10T14:07:59.063 INFO:tasks.workunit.client.1.vm04.stdout:1/634: chown d3/d22/d63/d35/dd9/d13/d38 2 1 2026-03-10T14:07:59.066 INFO:tasks.workunit.client.1.vm04.stdout:1/635: chown d3/d22/d63/d35/dd9/d13/d38/cc0 530 1 2026-03-10T14:07:59.069 INFO:tasks.workunit.client.1.vm04.stdout:4/611: truncate d4/df/d31/fae 239928 0 2026-03-10T14:07:59.099 INFO:tasks.workunit.client.1.vm04.stdout:2/635: write d0/d14/d91/d4a/d66/f7d [787347,119736] 0 2026-03-10T14:07:59.103 INFO:tasks.workunit.client.1.vm04.stdout:7/666: fdatasync d2/dc/f25 0 2026-03-10T14:07:59.120 INFO:tasks.workunit.client.1.vm04.stdout:9/577: dread d9/d5c/f6e [0,4194304] 0 2026-03-10T14:07:59.127 INFO:tasks.workunit.client.1.vm04.stdout:8/726: mknod d0/d3/d73/ce4 0 2026-03-10T14:07:59.131 INFO:tasks.workunit.client.1.vm04.stdout:0/626: creat d0/da2/fca x:0 0 0 2026-03-10T14:07:59.133 INFO:tasks.workunit.client.1.vm04.stdout:6/528: dread - d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f86 zero size 2026-03-10T14:07:59.149 INFO:tasks.workunit.client.1.vm04.stdout:1/636: dwrite d3/d22/d2f/f3c [0,4194304] 0 2026-03-10T14:07:59.152 INFO:tasks.workunit.client.1.vm04.stdout:4/612: creat d4/df/db2/db4/fd9 x:0 0 0 2026-03-10T14:07:59.160 INFO:tasks.workunit.client.1.vm04.stdout:4/613: dwrite d4/d14/d1b/f9d [4194304,4194304] 0 2026-03-10T14:07:59.164 INFO:tasks.workunit.client.1.vm04.stdout:4/614: chown d4/df/db2/db4/d47/d70/fd2 117 1 2026-03-10T14:07:59.187 INFO:tasks.workunit.client.1.vm04.stdout:7/667: mkdir d2/d2a/d42/d86/df4 0 2026-03-10T14:07:59.194 INFO:tasks.workunit.client.0.vm03.stdout:1/137: unlink d0/lb 0 2026-03-10T14:07:59.195 INFO:tasks.workunit.client.0.vm03.stdout:1/138: write d0/d18/f26 [556022,113966] 0 2026-03-10T14:07:59.202 
INFO:tasks.workunit.client.1.vm04.stdout:8/727: mknod d0/d3/d63/d12/d69/ce5 0 2026-03-10T14:07:59.207 INFO:tasks.workunit.client.1.vm04.stdout:3/655: dwrite da/dc/d35/d52/d53/d78/f98 [0,4194304] 0 2026-03-10T14:07:59.210 INFO:tasks.workunit.client.0.vm03.stdout:2/166: mkdir d5/d10/d31 0 2026-03-10T14:07:59.211 INFO:tasks.workunit.client.0.vm03.stdout:2/167: chown d5/d10/d1c/f2e 1766436 1 2026-03-10T14:07:59.214 INFO:tasks.workunit.client.0.vm03.stdout:2/168: dwrite d5/f2d [0,4194304] 0 2026-03-10T14:07:59.215 INFO:tasks.workunit.client.0.vm03.stdout:2/169: chown f4 7 1 2026-03-10T14:07:59.223 INFO:tasks.workunit.client.1.vm04.stdout:6/529: mknod d3/d1d/d73/c9d 0 2026-03-10T14:07:59.226 INFO:tasks.workunit.client.1.vm04.stdout:5/703: mknod d7/d12/d2b/d3e/ce3 0 2026-03-10T14:07:59.236 INFO:tasks.workunit.client.0.vm03.stdout:5/168: mkdir d4/d16/d19/d23/d3f 0 2026-03-10T14:07:59.236 INFO:tasks.workunit.client.0.vm03.stdout:3/94: unlink c7 0 2026-03-10T14:07:59.239 INFO:tasks.workunit.client.1.vm04.stdout:1/637: chown d3/d22/d63/d35/dd9/d13/d38/l99 4 1 2026-03-10T14:07:59.242 INFO:tasks.workunit.client.1.vm04.stdout:4/615: rmdir d4/df/d34/d6f 39 2026-03-10T14:07:59.245 INFO:tasks.workunit.client.1.vm04.stdout:2/636: mkdir d0/d14/d91/d4a/d8c/dab/d46/dc8 0 2026-03-10T14:07:59.252 INFO:tasks.workunit.client.0.vm03.stdout:8/145: fdatasync da/d24/f28 0 2026-03-10T14:07:59.262 INFO:tasks.workunit.client.0.vm03.stdout:8/146: dread da/ff [0,4194304] 0 2026-03-10T14:07:59.272 INFO:tasks.workunit.client.1.vm04.stdout:0/627: write d0/d6e/f8b [362046,84649] 0 2026-03-10T14:07:59.273 INFO:tasks.workunit.client.1.vm04.stdout:7/668: write d2/dc/de/d11/f19 [166616,122798] 0 2026-03-10T14:07:59.280 INFO:tasks.workunit.client.1.vm04.stdout:9/578: dwrite d9/da/d5d/f90 [0,4194304] 0 2026-03-10T14:07:59.293 INFO:tasks.workunit.client.1.vm04.stdout:8/728: creat d0/d3/d63/d29/fe6 x:0 0 0 2026-03-10T14:07:59.308 INFO:tasks.workunit.client.0.vm03.stdout:7/171: truncate d5/fb 746845 0 
2026-03-10T14:07:59.318 INFO:tasks.workunit.client.1.vm04.stdout:6/530: unlink d3/de/d35/d3f/d2d/l26 0 2026-03-10T14:07:59.319 INFO:tasks.workunit.client.0.vm03.stdout:9/103: fdatasync d2/f1e 0 2026-03-10T14:07:59.321 INFO:tasks.workunit.client.1.vm04.stdout:5/704: creat d7/d12/d2b/d93/d9e/fe4 x:0 0 0 2026-03-10T14:07:59.322 INFO:tasks.workunit.client.0.vm03.stdout:0/170: readlink d3/d17/l2f 0 2026-03-10T14:07:59.323 INFO:tasks.workunit.client.1.vm04.stdout:5/705: write d7/d9/f28 [2114858,65589] 0 2026-03-10T14:07:59.342 INFO:tasks.workunit.client.1.vm04.stdout:4/616: chown d4/df/d34/l83 70104465 1 2026-03-10T14:07:59.343 INFO:tasks.workunit.client.1.vm04.stdout:4/617: chown d4/d14/d1b/f99 1534279954 1 2026-03-10T14:07:59.453 INFO:tasks.workunit.client.1.vm04.stdout:7/669: creat d2/dc/de/d2d/d60/d7c/d44/ff5 x:0 0 0 2026-03-10T14:07:59.472 INFO:tasks.workunit.client.1.vm04.stdout:9/579: truncate d9/da/dd/f31 601972 0 2026-03-10T14:07:59.481 INFO:tasks.workunit.client.1.vm04.stdout:0/628: dwrite d0/d2/d15/d49/d50/d5c/f76 [0,4194304] 0 2026-03-10T14:07:59.489 INFO:tasks.workunit.client.1.vm04.stdout:0/629: write d0/d2/d15/d22/d38/d56/dc1/fc3 [571866,124097] 0 2026-03-10T14:07:59.490 INFO:tasks.workunit.client.1.vm04.stdout:0/630: truncate d0/d2/d25/fc0 779292 0 2026-03-10T14:07:59.502 INFO:tasks.workunit.client.1.vm04.stdout:0/631: dwrite d0/d2/d15/d22/d38/d56/dc1/fc3 [0,4194304] 0 2026-03-10T14:07:59.533 INFO:tasks.workunit.client.1.vm04.stdout:0/632: dread d0/d2/d15/d22/d38/d56/fbf [0,4194304] 0 2026-03-10T14:07:59.533 INFO:tasks.workunit.client.1.vm04.stdout:0/633: readlink d0/da2/lb3 0 2026-03-10T14:07:59.537 INFO:tasks.workunit.client.0.vm03.stdout:6/118: mknod d8/d11/c22 0 2026-03-10T14:07:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:59 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:07:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:59 vm04.local ceph-mon[55966]: 
from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:07:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:07:59 vm04.local ceph-mon[55966]: pgmap v158: 65 pgs: 65 active+clean; 1.8 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 25 MiB/s rd, 98 MiB/s wr, 212 op/s 2026-03-10T14:07:59.564 INFO:tasks.workunit.client.1.vm04.stdout:6/531: mkdir d3/de/d35/d3f/d2d/d32/d9e 0 2026-03-10T14:07:59.573 INFO:tasks.workunit.client.1.vm04.stdout:3/656: symlink da/dc/d35/d52/ldf 0 2026-03-10T14:07:59.578 INFO:tasks.workunit.client.0.vm03.stdout:2/170: write d5/d10/f22 [4316863,130426] 0 2026-03-10T14:07:59.579 INFO:tasks.workunit.client.1.vm04.stdout:1/638: getdents d3/d22/d63/d35/dd9/d13/d38/db5/dc4/dda 0 2026-03-10T14:07:59.583 INFO:tasks.workunit.client.0.vm03.stdout:4/138: write f3 [9374584,15405] 0 2026-03-10T14:07:59.594 INFO:tasks.workunit.client.0.vm03.stdout:5/169: fdatasync d4/d6/fa 0 2026-03-10T14:07:59.596 INFO:tasks.workunit.client.1.vm04.stdout:4/618: mknod d4/df/db2/cda 0 2026-03-10T14:07:59.600 INFO:tasks.workunit.client.0.vm03.stdout:3/95: stat l12 0 2026-03-10T14:07:59.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:59 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:07:59.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:59 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:07:59.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:07:59 vm03.local ceph-mon[49718]: pgmap v158: 65 pgs: 65 active+clean; 1.8 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 25 MiB/s rd, 98 MiB/s wr, 212 op/s 2026-03-10T14:07:59.631 INFO:tasks.workunit.client.1.vm04.stdout:9/580: rename d9/da/d5d/d81/faa to d9/d58/db5/da5/fc4 0 2026-03-10T14:07:59.632 INFO:tasks.workunit.client.1.vm04.stdout:9/581: fdatasync d9/d58/fba 0 2026-03-10T14:07:59.675 INFO:tasks.workunit.client.1.vm04.stdout:0/634: truncate d0/d2/d15/f59 177253 0 
2026-03-10T14:07:59.695 INFO:tasks.workunit.client.1.vm04.stdout:1/639: rmdir d3/d22/d63/d35/dd9/d13/d38/d58/d5b 39 2026-03-10T14:07:59.740 INFO:tasks.workunit.client.1.vm04.stdout:8/729: write d0/d3/d63/f8f [3647289,62038] 0 2026-03-10T14:07:59.776 INFO:tasks.workunit.client.0.vm03.stdout:7/172: rename d5/f16 to d5/f32 0 2026-03-10T14:07:59.779 INFO:tasks.workunit.client.0.vm03.stdout:7/173: dwrite d5/d9/d14/ff [0,4194304] 0 2026-03-10T14:07:59.817 INFO:tasks.workunit.client.0.vm03.stdout:0/171: creat d3/d16/f34 x:0 0 0 2026-03-10T14:07:59.820 INFO:tasks.workunit.client.0.vm03.stdout:9/104: truncate d2/f15 4410258 0 2026-03-10T14:07:59.852 INFO:tasks.workunit.client.0.vm03.stdout:2/171: write d5/d10/f13 [1524742,29258] 0 2026-03-10T14:07:59.905 INFO:tasks.workunit.client.1.vm04.stdout:2/637: dwrite d0/d14/d1b/f29 [0,4194304] 0 2026-03-10T14:07:59.917 INFO:tasks.workunit.client.1.vm04.stdout:7/670: dwrite d2/d2a/d42/d86/fbb [0,4194304] 0 2026-03-10T14:07:59.921 INFO:tasks.workunit.client.0.vm03.stdout:5/170: rmdir d4/d6/de 39 2026-03-10T14:07:59.971 INFO:tasks.workunit.client.0.vm03.stdout:9/105: fsync d2/fc 0 2026-03-10T14:08:00.007 INFO:tasks.workunit.client.1.vm04.stdout:3/657: write da/dc/f90 [1186505,39811] 0 2026-03-10T14:08:00.016 INFO:tasks.workunit.client.1.vm04.stdout:6/532: dwrite d3/de/f6d [4194304,4194304] 0 2026-03-10T14:08:00.017 INFO:tasks.workunit.client.0.vm03.stdout:2/172: dread d5/d10/f16 [0,4194304] 0 2026-03-10T14:08:00.017 INFO:tasks.workunit.client.1.vm04.stdout:3/658: dwrite da/d3e/f4c [4194304,4194304] 0 2026-03-10T14:08:00.020 INFO:tasks.workunit.client.0.vm03.stdout:2/173: dread d5/d10/d17/f2c [0,4194304] 0 2026-03-10T14:08:00.043 INFO:tasks.workunit.client.1.vm04.stdout:4/619: dwrite d4/f2c [0,4194304] 0 2026-03-10T14:08:00.065 INFO:tasks.workunit.client.1.vm04.stdout:9/582: rmdir d9/d58 39 2026-03-10T14:08:00.066 INFO:tasks.workunit.client.0.vm03.stdout:9/106: creat d2/d14/f25 x:0 0 0 2026-03-10T14:08:00.069 
INFO:tasks.workunit.client.0.vm03.stdout:1/139: write d0/f24 [152555,79813] 0 2026-03-10T14:08:00.070 INFO:tasks.workunit.client.1.vm04.stdout:0/635: rmdir d0/d6e 39 2026-03-10T14:08:00.073 INFO:tasks.workunit.client.1.vm04.stdout:1/640: truncate d3/d22/d63/d35/dd9/d13/d1a/f62 2327855 0 2026-03-10T14:08:00.074 INFO:tasks.workunit.client.1.vm04.stdout:1/641: chown d3/d22/d2f/d57/cbe 116808 1 2026-03-10T14:08:00.076 INFO:tasks.workunit.client.0.vm03.stdout:3/96: creat f1c x:0 0 0 2026-03-10T14:08:00.083 INFO:tasks.workunit.client.0.vm03.stdout:8/147: truncate da/f12 1025316 0 2026-03-10T14:08:00.083 INFO:tasks.workunit.client.0.vm03.stdout:5/171: mkdir d4/d40 0 2026-03-10T14:08:00.083 INFO:tasks.workunit.client.0.vm03.stdout:0/172: link d3/f9 d3/d17/f35 0 2026-03-10T14:08:00.083 INFO:tasks.workunit.client.1.vm04.stdout:8/730: mkdir d0/d3/dd/d89/de7 0 2026-03-10T14:08:00.102 INFO:tasks.workunit.client.0.vm03.stdout:9/107: chown d2/ce 14421845 1 2026-03-10T14:08:00.107 INFO:tasks.workunit.client.0.vm03.stdout:6/119: write d8/fd [1534476,95840] 0 2026-03-10T14:08:00.114 INFO:tasks.workunit.client.0.vm03.stdout:4/139: link d5/d9/db/f10 d5/d9/f31 0 2026-03-10T14:08:00.114 INFO:tasks.workunit.client.0.vm03.stdout:4/140: readlink d5/d9/db/l1d 0 2026-03-10T14:08:00.116 INFO:tasks.workunit.client.0.vm03.stdout:1/140: symlink d0/d18/d1d/l2d 0 2026-03-10T14:08:00.116 INFO:tasks.workunit.client.0.vm03.stdout:1/141: dread - d0/d18/f21 zero size 2026-03-10T14:08:00.117 INFO:tasks.workunit.client.0.vm03.stdout:2/174: mknod d5/d10/d31/c32 0 2026-03-10T14:08:00.135 INFO:tasks.workunit.client.0.vm03.stdout:0/173: fsync d3/f1e 0 2026-03-10T14:08:00.135 INFO:tasks.workunit.client.1.vm04.stdout:4/620: rmdir d4/df/db2/db4 39 2026-03-10T14:08:00.139 INFO:tasks.workunit.client.1.vm04.stdout:7/671: write d2/dc/de/f1e [4737857,90334] 0 2026-03-10T14:08:00.169 INFO:tasks.workunit.client.0.vm03.stdout:7/174: getdents d5/d9 0 2026-03-10T14:08:00.171 
INFO:tasks.workunit.client.0.vm03.stdout:4/141: chown d5/d9/db/ff 204775136 1 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:5/706: getdents d7/d12/d2b/d3e/d57/d77 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:8/731: mkdir d0/d3/d73/de8 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:7/672: mkdir d2/dc/de/d2d/d5c/da9/df6 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:7/673: read - d2/dc/de/d2d/d60/d7c/d44/f66 zero size 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:5/707: dread d7/d2d/d76/f8f [0,4194304] 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:5/708: write d7/d26/d6b/d6e/fa3 [4956882,129414] 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:8/732: symlink d0/d3/d63/d12/d69/le9 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:8/733: fdatasync d0/d3/d63/d12/d51/d67/d96/f71 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:2/638: creat d0/d14/d39/d47/d70/d8b/fc9 x:0 0 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:2/639: fdatasync d0/d14/d1b/f32 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:5/709: creat d7/d12/d2b/d8c/fe5 x:0 0 0 2026-03-10T14:08:00.228 INFO:tasks.workunit.client.1.vm04.stdout:5/710: creat d7/d59/d7e/d87/fe6 x:0 0 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.1.vm04.stdout:2/640: creat d0/db8/fca x:0 0 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.1.vm04.stdout:5/711: mknod d7/d26/ce7 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.1.vm04.stdout:8/734: read d0/d3/f35 [3378597,113265] 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:1/142: mkdir d0/d2/df/d16/d20/d2e 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:2/175: creat d5/d10/d17/f33 x:0 0 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:2/176: chown d5 2526 1 2026-03-10T14:08:00.229 
INFO:tasks.workunit.client.0.vm03.stdout:3/97: mkdir d1d 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:5/172: truncate d4/d6/de/f14 4633704 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:9/108: fdatasync d2/f15 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:4/142: creat d5/d9/db/f32 x:0 0 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:9/109: read - d2/d14/f25 zero size 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:1/143: symlink d0/d2/l2f 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:1/144: dread d0/f11 [0,4194304] 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:9/110: dread d2/f11 [0,4194304] 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:9/111: stat d2/d14/f16 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:1/145: dwrite d0/f2c [0,4194304] 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:3/98: mknod d1d/c1e 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:9/112: symlink d2/d14/l26 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:0/174: creat d3/d11/d25/d30/f36 x:0 0 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:8/148: getdents da 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:8/149: dread da/f16 [0,4194304] 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.0.vm03.stdout:9/113: creat d2/d14/f27 x:0 0 0 2026-03-10T14:08:00.229 INFO:tasks.workunit.client.1.vm04.stdout:8/735: read d0/d3/d63/d12/d51/d67/d96/f71 [676051,18253] 0 2026-03-10T14:08:00.237 INFO:tasks.workunit.client.1.vm04.stdout:5/712: fdatasync d7/d9/f75 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.0.vm03.stdout:8/150: creat da/f2e x:0 0 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.0.vm03.stdout:1/146: creat d0/f30 x:0 0 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.0.vm03.stdout:1/147: chown 
d0/d2/df/d16/d20/d2e 31 1 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.0.vm03.stdout:3/99: truncate fc 2990218 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.1.vm04.stdout:5/713: fdatasync d7/d12/d2b/d3e/f4a 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.1.vm04.stdout:5/714: dread d7/d12/d2b/d3e/d57/d77/fdd [0,4194304] 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.1.vm04.stdout:8/736: creat d0/d3/dd/d89/db5/fea x:0 0 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.1.vm04.stdout:8/737: symlink d0/d3/d63/d12/d69/leb 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.1.vm04.stdout:8/738: truncate d0/d3/d63/d29/fce 198937 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.1.vm04.stdout:8/739: chown d0/d3/d63/d12/c3e 3930990 1 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.1.vm04.stdout:8/740: fsync d0/f42 0 2026-03-10T14:08:00.272 INFO:tasks.workunit.client.1.vm04.stdout:8/741: chown d0/d75/f83 923065 1 2026-03-10T14:08:00.277 INFO:tasks.workunit.client.1.vm04.stdout:8/742: dread d0/d3/d63/d12/f2c [0,4194304] 0 2026-03-10T14:08:00.277 INFO:tasks.workunit.client.1.vm04.stdout:8/743: chown d0/f42 22 1 2026-03-10T14:08:00.277 INFO:tasks.workunit.client.1.vm04.stdout:8/744: chown d0/d3/d63/d12/d51/d67/d96 95387059 1 2026-03-10T14:08:00.278 INFO:tasks.workunit.client.1.vm04.stdout:8/745: truncate d0/d3/dd/fc0 4799640 0 2026-03-10T14:08:00.278 INFO:tasks.workunit.client.1.vm04.stdout:8/746: dread - d0/d75/fd1 zero size 2026-03-10T14:08:00.283 INFO:tasks.workunit.client.0.vm03.stdout:9/114: rename d2/c23 to d2/d14/c28 0 2026-03-10T14:08:00.283 INFO:tasks.workunit.client.1.vm04.stdout:6/533: write d3/de/d35/d3f/d2d/d32/d23/f31 [7768317,50280] 0 2026-03-10T14:08:00.291 INFO:tasks.workunit.client.0.vm03.stdout:6/120: sync 2026-03-10T14:08:00.292 INFO:tasks.workunit.client.0.vm03.stdout:3/100: sync 2026-03-10T14:08:00.308 INFO:tasks.workunit.client.1.vm04.stdout:9/583: dwrite d9/d5c/f77 [0,4194304] 0 2026-03-10T14:08:00.309 
INFO:tasks.workunit.client.1.vm04.stdout:0/636: dwrite d0/d2/d25/f64 [0,4194304] 0 2026-03-10T14:08:00.325 INFO:tasks.workunit.client.1.vm04.stdout:4/621: write d4/d14/d1b/f28 [924682,22942] 0 2026-03-10T14:08:00.327 INFO:tasks.workunit.client.1.vm04.stdout:1/642: write d3/d5c/fb2 [5283077,50116] 0 2026-03-10T14:08:00.328 INFO:tasks.workunit.client.1.vm04.stdout:3/659: dwrite da/dc/d3f/d54/d66/f80 [0,4194304] 0 2026-03-10T14:08:00.335 INFO:tasks.workunit.client.1.vm04.stdout:7/674: dwrite d2/dc/de/d2d/d60/d81/db3/fb9 [0,4194304] 0 2026-03-10T14:08:00.342 INFO:tasks.workunit.client.0.vm03.stdout:2/177: write d5/d10/f16 [4634209,58291] 0 2026-03-10T14:08:00.353 INFO:tasks.workunit.client.0.vm03.stdout:4/143: dwrite d5/d9/f16 [0,4194304] 0 2026-03-10T14:08:00.354 INFO:tasks.workunit.client.0.vm03.stdout:5/173: dwrite d4/d16/f1c [0,4194304] 0 2026-03-10T14:08:00.354 INFO:tasks.workunit.client.0.vm03.stdout:7/175: dwrite d5/fb [0,4194304] 0 2026-03-10T14:08:00.355 INFO:tasks.workunit.client.1.vm04.stdout:2/641: dwrite d0/d14/d91/d3a/fb5 [0,4194304] 0 2026-03-10T14:08:00.359 INFO:tasks.workunit.client.0.vm03.stdout:0/175: write d3/fe [155112,129114] 0 2026-03-10T14:08:00.359 INFO:tasks.workunit.client.0.vm03.stdout:4/144: chown d5/d9/db/f2a 0 1 2026-03-10T14:08:00.360 INFO:tasks.workunit.client.1.vm04.stdout:6/534: rmdir d3/de/d35/d3f/d2d/d38 39 2026-03-10T14:08:00.362 INFO:tasks.workunit.client.0.vm03.stdout:2/178: dread d5/d10/f22 [0,4194304] 0 2026-03-10T14:08:00.373 INFO:tasks.workunit.client.1.vm04.stdout:0/637: mkdir d0/d2/d15/d22/d38/d56/dcb 0 2026-03-10T14:08:00.379 INFO:tasks.workunit.client.1.vm04.stdout:4/622: fdatasync d4/df/d34/f7c 0 2026-03-10T14:08:00.387 INFO:tasks.workunit.client.1.vm04.stdout:5/715: truncate d7/d26/f54 136607 0 2026-03-10T14:08:00.422 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:00 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:00.462 
INFO:tasks.workunit.client.1.vm04.stdout:2/642: truncate d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/f7b 1495090 0 2026-03-10T14:08:00.507 INFO:tasks.workunit.client.0.vm03.stdout:6/121: write d8/d11/d18/f21 [972725,12458] 0 2026-03-10T14:08:00.508 INFO:tasks.workunit.client.0.vm03.stdout:6/122: write d8/fe [4486935,121897] 0 2026-03-10T14:08:00.508 INFO:tasks.workunit.client.0.vm03.stdout:3/101: symlink d1d/l1f 0 2026-03-10T14:08:00.513 INFO:tasks.workunit.client.0.vm03.stdout:3/102: sync 2026-03-10T14:08:00.529 INFO:tasks.workunit.client.1.vm04.stdout:2/643: dread d0/d14/d91/d3a/d3e/f61 [4194304,4194304] 0 2026-03-10T14:08:00.533 INFO:tasks.workunit.client.1.vm04.stdout:8/747: write d0/d3/dd/fa7 [4750455,52277] 0 2026-03-10T14:08:00.533 INFO:tasks.workunit.client.1.vm04.stdout:9/584: write d9/da/dd/fc0 [691246,113857] 0 2026-03-10T14:08:00.550 INFO:tasks.workunit.client.1.vm04.stdout:6/535: rename d3/de/d35/d3f/d2d/d32/d23/d47/c70 to d3/c9f 0 2026-03-10T14:08:00.561 INFO:tasks.workunit.client.1.vm04.stdout:6/536: dread d3/de/d35/d3f/d2d/d32/d23/f31 [4194304,4194304] 0 2026-03-10T14:08:00.584 INFO:tasks.workunit.client.1.vm04.stdout:7/675: dwrite d2/dc/de/d2d/d60/d7c/d36/d8b/fe3 [0,4194304] 0 2026-03-10T14:08:00.586 INFO:tasks.workunit.client.1.vm04.stdout:0/638: dwrite d0/d2/d25/f9f [0,4194304] 0 2026-03-10T14:08:00.593 INFO:tasks.workunit.client.1.vm04.stdout:4/623: dwrite d4/d14/d3c/f3e [4194304,4194304] 0 2026-03-10T14:08:00.606 INFO:tasks.workunit.client.1.vm04.stdout:5/716: write d7/d12/d2b/d3e/d57/d77/fdd [3125395,49267] 0 2026-03-10T14:08:00.625 INFO:tasks.workunit.client.0.vm03.stdout:7/176: write d5/d9/f2e [295924,85789] 0 2026-03-10T14:08:00.625 INFO:tasks.workunit.client.0.vm03.stdout:7/177: write d5/d9/f2e [767433,56850] 0 2026-03-10T14:08:00.639 INFO:tasks.workunit.client.1.vm04.stdout:3/660: write da/dc/d3f/f85 [3259380,53176] 0 2026-03-10T14:08:00.640 INFO:tasks.workunit.client.1.vm04.stdout:1/643: dwrite d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92/fa2 
[0,4194304] 0 2026-03-10T14:08:00.663 INFO:tasks.workunit.client.0.vm03.stdout:2/179: truncate d5/f23 106213 0 2026-03-10T14:08:00.663 INFO:tasks.workunit.client.0.vm03.stdout:9/115: mkdir d2/d29 0 2026-03-10T14:08:00.663 INFO:tasks.workunit.client.0.vm03.stdout:0/176: read d3/d16/f2d [2124824,55104] 0 2026-03-10T14:08:00.664 INFO:tasks.workunit.client.0.vm03.stdout:0/177: chown d3/fe 282512928 1 2026-03-10T14:08:00.672 INFO:tasks.workunit.client.0.vm03.stdout:6/123: creat d8/d1b/f23 x:0 0 0 2026-03-10T14:08:00.673 INFO:tasks.workunit.client.1.vm04.stdout:0/639: rmdir d0/d2/d15/d22 39 2026-03-10T14:08:00.679 INFO:tasks.workunit.client.0.vm03.stdout:3/103: rmdir d1d 39 2026-03-10T14:08:00.682 INFO:tasks.workunit.client.1.vm04.stdout:4/624: chown d4/df/db2/db4/fb8 161 1 2026-03-10T14:08:00.684 INFO:tasks.workunit.client.0.vm03.stdout:1/148: truncate d0/f2c 1780173 0 2026-03-10T14:08:00.688 INFO:tasks.workunit.client.1.vm04.stdout:5/717: chown d7/d12/d45/c58 3946872 1 2026-03-10T14:08:00.688 INFO:tasks.workunit.client.1.vm04.stdout:5/718: chown d7/d59/d7e/d87/cb1 24450555 1 2026-03-10T14:08:00.692 INFO:tasks.workunit.client.0.vm03.stdout:8/151: write da/f12 [570403,119500] 0 2026-03-10T14:08:00.692 INFO:tasks.workunit.client.1.vm04.stdout:9/585: unlink d9/d44/d59/c5b 0 2026-03-10T14:08:00.696 INFO:tasks.workunit.client.1.vm04.stdout:2/644: dwrite d0/d14/d91/f1d [8388608,4194304] 0 2026-03-10T14:08:00.712 INFO:tasks.workunit.client.1.vm04.stdout:8/748: write d0/d3/fc3 [3448377,111082] 0 2026-03-10T14:08:00.720 INFO:tasks.workunit.client.1.vm04.stdout:1/644: fsync d3/d22/d63/f65 0 2026-03-10T14:08:00.724 INFO:tasks.workunit.client.1.vm04.stdout:6/537: fsync d3/de/d35/d3f/f17 0 2026-03-10T14:08:00.729 INFO:tasks.workunit.client.1.vm04.stdout:7/676: mkdir d2/dc/de/d2d/d5c/da9/df6/df7 0 2026-03-10T14:08:00.729 INFO:tasks.workunit.client.1.vm04.stdout:7/677: fsync d2/d2a/d42/d86/fd0 0 2026-03-10T14:08:00.729 INFO:tasks.workunit.client.0.vm03.stdout:9/116: dread d2/d14/f1b 
[0,4194304] 0 2026-03-10T14:08:00.732 INFO:tasks.workunit.client.0.vm03.stdout:0/178: mknod d3/d11/c37 0 2026-03-10T14:08:00.732 INFO:tasks.workunit.client.0.vm03.stdout:2/180: truncate d5/fa 990321 0 2026-03-10T14:08:00.732 INFO:tasks.workunit.client.0.vm03.stdout:6/124: symlink d8/d1b/l24 0 2026-03-10T14:08:00.743 INFO:tasks.workunit.client.0.vm03.stdout:1/149: creat d0/d2/df/f31 x:0 0 0 2026-03-10T14:08:00.744 INFO:tasks.workunit.client.0.vm03.stdout:1/150: chown d0/f30 162110102 1 2026-03-10T14:08:00.758 INFO:tasks.workunit.client.0.vm03.stdout:8/152: symlink da/d24/l2f 0 2026-03-10T14:08:00.769 INFO:tasks.workunit.client.0.vm03.stdout:9/117: dread d2/d14/f1a [0,4194304] 0 2026-03-10T14:08:00.780 INFO:tasks.workunit.client.0.vm03.stdout:5/174: dwrite d4/f29 [4194304,4194304] 0 2026-03-10T14:08:00.780 INFO:tasks.workunit.client.0.vm03.stdout:5/175: readlink l1 0 2026-03-10T14:08:00.781 INFO:tasks.workunit.client.0.vm03.stdout:5/176: chown d4/d13/d1f/d34 78346336 1 2026-03-10T14:08:00.794 INFO:tasks.workunit.client.0.vm03.stdout:4/145: getdents d5/d9/db/d19 0 2026-03-10T14:08:00.797 INFO:tasks.workunit.client.1.vm04.stdout:4/625: dread d4/df/d31/fae [0,4194304] 0 2026-03-10T14:08:00.797 INFO:tasks.workunit.client.0.vm03.stdout:6/125: dwrite f2 [4194304,4194304] 0 2026-03-10T14:08:00.797 INFO:tasks.workunit.client.1.vm04.stdout:4/626: dread - d4/f57 zero size 2026-03-10T14:08:00.804 INFO:tasks.workunit.client.0.vm03.stdout:1/151: truncate d0/fa 1328829 0 2026-03-10T14:08:00.812 INFO:tasks.workunit.client.0.vm03.stdout:8/153: creat da/f30 x:0 0 0 2026-03-10T14:08:00.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:00 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:00.815 INFO:tasks.workunit.client.1.vm04.stdout:9/586: mknod d9/da/dd/d1c/da3/cc5 0 2026-03-10T14:08:00.821 INFO:tasks.workunit.client.0.vm03.stdout:9/118: rename d2/d14/l22 to d2/d14/l2a 0 2026-03-10T14:08:00.821 
INFO:tasks.workunit.client.1.vm04.stdout:2/645: creat d0/d14/d91/d8/d17/d4e/d85/fcb x:0 0 0 2026-03-10T14:08:00.822 INFO:tasks.workunit.client.0.vm03.stdout:5/177: symlink d4/d16/l41 0 2026-03-10T14:08:00.825 INFO:tasks.workunit.client.1.vm04.stdout:3/661: truncate da/dc/d35/d52/f6f 340282 0 2026-03-10T14:08:00.827 INFO:tasks.workunit.client.0.vm03.stdout:4/146: dwrite d5/d9/db/f28 [0,4194304] 0 2026-03-10T14:08:00.829 INFO:tasks.workunit.client.0.vm03.stdout:4/147: dread - d5/d9/db/f32 zero size 2026-03-10T14:08:00.829 INFO:tasks.workunit.client.0.vm03.stdout:4/148: fdatasync f3 0 2026-03-10T14:08:00.839 INFO:tasks.workunit.client.0.vm03.stdout:9/119: dread d2/f15 [0,4194304] 0 2026-03-10T14:08:00.843 INFO:tasks.workunit.client.0.vm03.stdout:9/120: dwrite d2/d14/f25 [0,4194304] 0 2026-03-10T14:08:00.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:00 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:00.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:00 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:00.863 INFO:tasks.workunit.client.1.vm04.stdout:1/645: symlink d3/d22/d63/d35/dd9/d13/d38/db5/dc4/ldc 0 2026-03-10T14:08:00.880 INFO:tasks.workunit.client.0.vm03.stdout:6/126: mknod d8/d1b/c25 0 2026-03-10T14:08:00.892 INFO:tasks.workunit.client.0.vm03.stdout:2/181: rename d5/d10/d1f/f24 to d5/d2a/f34 0 2026-03-10T14:08:00.912 INFO:tasks.workunit.client.1.vm04.stdout:7/678: rmdir d2/dc/de/d2d/d60/d7c/d36/daa 39 2026-03-10T14:08:00.917 INFO:tasks.workunit.client.0.vm03.stdout:1/152: unlink d0/d18/f26 0 2026-03-10T14:08:00.953 INFO:tasks.workunit.client.0.vm03.stdout:9/121: mkdir d2/d14/d2b 0 2026-03-10T14:08:00.953 INFO:tasks.workunit.client.1.vm04.stdout:5/719: mkdir d7/d12/d2b/de8 0 2026-03-10T14:08:00.953 INFO:tasks.workunit.client.1.vm04.stdout:5/720: write d7/d12/d2b/d93/fd9 [473802,107693] 0 2026-03-10T14:08:00.954 
INFO:tasks.workunit.client.1.vm04.stdout:2/646: readlink d0/d14/d91/d4a/d8c/lbb 0 2026-03-10T14:08:00.958 INFO:tasks.workunit.client.1.vm04.stdout:8/749: mkdir d0/d3/dd/dec 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.1.vm04.stdout:3/662: creat da/d3e/fe0 x:0 0 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.1.vm04.stdout:4/627: dread d4/d14/f27 [4194304,4194304] 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.1.vm04.stdout:6/538: symlink d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/la0 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.0.vm03.stdout:6/127: truncate f3 13090740 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.0.vm03.stdout:6/128: fdatasync d8/fd 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.0.vm03.stdout:6/129: readlink l4 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.0.vm03.stdout:0/179: rename d3/c15 to d3/d11/d2c/c38 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.0.vm03.stdout:2/182: fdatasync d5/d10/d17/f28 0 2026-03-10T14:08:00.986 INFO:tasks.workunit.client.0.vm03.stdout:2/183: dwrite d5/d10/d17/f26 [0,4194304] 0 2026-03-10T14:08:00.999 INFO:tasks.workunit.client.0.vm03.stdout:1/153: creat d0/d2/df/d27/f32 x:0 0 0 2026-03-10T14:08:01.004 INFO:tasks.workunit.client.1.vm04.stdout:9/587: creat d9/d58/db5/fc6 x:0 0 0 2026-03-10T14:08:01.012 INFO:tasks.workunit.client.1.vm04.stdout:1/646: sync 2026-03-10T14:08:01.012 INFO:tasks.workunit.client.1.vm04.stdout:3/663: sync 2026-03-10T14:08:01.022 INFO:tasks.workunit.client.1.vm04.stdout:5/721: fdatasync d7/f3c 0 2026-03-10T14:08:01.031 INFO:tasks.workunit.client.0.vm03.stdout:9/122: read d2/d14/f16 [115797,96701] 0 2026-03-10T14:08:01.034 INFO:tasks.workunit.client.1.vm04.stdout:2/647: mknod d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/ccc 0 2026-03-10T14:08:01.058 INFO:tasks.workunit.client.0.vm03.stdout:4/149: rename c0 to d5/d9/d2b/c33 0 2026-03-10T14:08:01.062 INFO:tasks.workunit.client.0.vm03.stdout:0/180: fsync d3/d16/f2d 0 2026-03-10T14:08:01.066 
INFO:tasks.workunit.client.1.vm04.stdout:9/588: readlink d9/l25 0 2026-03-10T14:08:01.067 INFO:tasks.workunit.client.1.vm04.stdout:9/589: chown d9/d33/f76 2250 1 2026-03-10T14:08:01.074 INFO:tasks.workunit.client.1.vm04.stdout:3/664: mknod da/dc/d47/ce1 0 2026-03-10T14:08:01.085 INFO:tasks.workunit.client.0.vm03.stdout:3/104: dwrite f1b [0,4194304] 0 2026-03-10T14:08:01.088 INFO:tasks.workunit.client.1.vm04.stdout:0/640: write d0/d2/d15/d22/d38/d56/fbf [1103942,11029] 0 2026-03-10T14:08:01.090 INFO:tasks.workunit.client.0.vm03.stdout:5/178: write d4/d16/f2d [1059295,113155] 0 2026-03-10T14:08:01.090 INFO:tasks.workunit.client.0.vm03.stdout:5/179: readlink d4/d16/d19/l26 0 2026-03-10T14:08:01.091 INFO:tasks.workunit.client.0.vm03.stdout:5/180: chown d4/d6/cc 932378684 1 2026-03-10T14:08:01.091 INFO:tasks.workunit.client.0.vm03.stdout:8/154: dwrite da/f2a [0,4194304] 0 2026-03-10T14:08:01.095 INFO:tasks.workunit.client.0.vm03.stdout:5/181: dread d4/d16/f2d [0,4194304] 0 2026-03-10T14:08:01.124 INFO:tasks.workunit.client.0.vm03.stdout:4/150: chown d5/d9/c26 382 1 2026-03-10T14:08:01.124 INFO:tasks.workunit.client.0.vm03.stdout:4/151: chown d5/d9/db/c2c 25 1 2026-03-10T14:08:01.138 INFO:tasks.workunit.client.0.vm03.stdout:3/105: dread fe [0,4194304] 0 2026-03-10T14:08:01.143 INFO:tasks.workunit.client.1.vm04.stdout:3/665: fsync da/dc/d35/dcd/dde/dac/fca 0 2026-03-10T14:08:01.146 INFO:tasks.workunit.client.1.vm04.stdout:4/628: dwrite d4/df/d34/fc3 [0,4194304] 0 2026-03-10T14:08:01.146 INFO:tasks.workunit.client.1.vm04.stdout:4/629: dread - d4/f77 zero size 2026-03-10T14:08:01.151 INFO:tasks.workunit.client.1.vm04.stdout:6/539: dwrite d3/f9 [0,4194304] 0 2026-03-10T14:08:01.164 INFO:tasks.workunit.client.1.vm04.stdout:7/679: write d2/d2a/fef [974219,126473] 0 2026-03-10T14:08:01.166 INFO:tasks.workunit.client.1.vm04.stdout:1/647: rename d3/d22/d63/f77 to d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92/fdd 0 2026-03-10T14:08:01.179 INFO:tasks.workunit.client.0.vm03.stdout:1/154: 
write d0/f11 [123590,45452] 0 2026-03-10T14:08:01.180 INFO:tasks.workunit.client.1.vm04.stdout:5/722: write d7/d26/d6b/d6e/f81 [585257,42741] 0 2026-03-10T14:08:01.180 INFO:tasks.workunit.client.0.vm03.stdout:2/184: dwrite d5/d10/d17/f18 [0,4194304] 0 2026-03-10T14:08:01.182 INFO:tasks.workunit.client.0.vm03.stdout:2/185: chown d5/d10 100 1 2026-03-10T14:08:01.182 INFO:tasks.workunit.client.0.vm03.stdout:1/155: write d0/d2/df/d27/f32 [290380,79292] 0 2026-03-10T14:08:01.192 INFO:tasks.workunit.client.0.vm03.stdout:8/155: rmdir da 39 2026-03-10T14:08:01.199 INFO:tasks.workunit.client.1.vm04.stdout:4/630: creat d4/df/db2/db4/fdb x:0 0 0 2026-03-10T14:08:01.199 INFO:tasks.workunit.client.1.vm04.stdout:5/723: read d7/f11 [1145889,117] 0 2026-03-10T14:08:01.201 INFO:tasks.workunit.client.1.vm04.stdout:0/641: mknod d0/d2/d15/d22/d38/d56/dcb/ccc 0 2026-03-10T14:08:01.202 INFO:tasks.workunit.client.1.vm04.stdout:5/724: write d7/f21 [4873175,119251] 0 2026-03-10T14:08:01.209 INFO:tasks.workunit.client.1.vm04.stdout:3/666: dread da/dc/d35/d52/d6d/fab [0,4194304] 0 2026-03-10T14:08:01.209 INFO:tasks.workunit.client.0.vm03.stdout:9/123: unlink d2/c9 0 2026-03-10T14:08:01.210 INFO:tasks.workunit.client.0.vm03.stdout:4/152: mkdir d5/d9/db/d19/d34 0 2026-03-10T14:08:01.211 INFO:tasks.workunit.client.1.vm04.stdout:7/680: mkdir d2/dc/de/d2d/d60/d7c/df8 0 2026-03-10T14:08:01.213 INFO:tasks.workunit.client.1.vm04.stdout:1/648: mknod d3/d5c/cde 0 2026-03-10T14:08:01.214 INFO:tasks.workunit.client.0.vm03.stdout:3/106: dwrite fe [0,4194304] 0 2026-03-10T14:08:01.223 INFO:tasks.workunit.client.0.vm03.stdout:1/156: rename d0/d18/d1d/l2d to d0/d2/df/d16/d20/l33 0 2026-03-10T14:08:01.242 INFO:tasks.workunit.client.1.vm04.stdout:0/642: creat d0/d2/d15/d49/d50/d5c/fcd x:0 0 0 2026-03-10T14:08:01.244 INFO:tasks.workunit.client.1.vm04.stdout:0/643: dread d0/d2/d15/d22/d38/d56/dc1/fc3 [0,4194304] 0 2026-03-10T14:08:01.250 INFO:tasks.workunit.client.1.vm04.stdout:5/725: creat 
d7/d12/d2b/d3e/d3f/da6/fe9 x:0 0 0 2026-03-10T14:08:01.250 INFO:tasks.workunit.client.1.vm04.stdout:5/726: readlink d7/d12/d2b/d93/d9e/lc5 0 2026-03-10T14:08:01.250 INFO:tasks.workunit.client.1.vm04.stdout:5/727: read d7/d9/f20 [2156353,19853] 0 2026-03-10T14:08:01.252 INFO:tasks.workunit.client.0.vm03.stdout:3/107: dwrite f17 [0,4194304] 0 2026-03-10T14:08:01.255 INFO:tasks.workunit.client.1.vm04.stdout:8/750: truncate d0/d3/fc3 1995306 0 2026-03-10T14:08:01.272 INFO:tasks.workunit.client.1.vm04.stdout:6/540: symlink d3/de/d35/d3f/d2d/la1 0 2026-03-10T14:08:01.275 INFO:tasks.workunit.client.0.vm03.stdout:2/186: mkdir d5/d35 0 2026-03-10T14:08:01.280 INFO:tasks.workunit.client.1.vm04.stdout:3/667: rmdir da/dc/d3f/d61 39 2026-03-10T14:08:01.280 INFO:tasks.workunit.client.1.vm04.stdout:9/590: rename d9/d58/db5/l6d to d9/da/dd/d1c/lc7 0 2026-03-10T14:08:01.280 INFO:tasks.workunit.client.0.vm03.stdout:1/157: mkdir d0/d2/d34 0 2026-03-10T14:08:01.280 INFO:tasks.workunit.client.0.vm03.stdout:1/158: fdatasync d0/d2/f2a 0 2026-03-10T14:08:01.280 INFO:tasks.workunit.client.0.vm03.stdout:1/159: dread - d0/d2/f2a zero size 2026-03-10T14:08:01.281 INFO:tasks.workunit.client.0.vm03.stdout:8/156: fdatasync da/f20 0 2026-03-10T14:08:01.281 INFO:tasks.workunit.client.0.vm03.stdout:1/160: dwrite d0/d2/df/f31 [0,4194304] 0 2026-03-10T14:08:01.281 INFO:tasks.workunit.client.0.vm03.stdout:8/157: readlink da/d24/l2f 0 2026-03-10T14:08:01.281 INFO:tasks.workunit.client.1.vm04.stdout:1/649: symlink d3/d22/d2f/d57/ldf 0 2026-03-10T14:08:01.285 INFO:tasks.workunit.client.0.vm03.stdout:8/158: fdatasync da/f12 0 2026-03-10T14:08:01.291 INFO:tasks.workunit.client.0.vm03.stdout:1/161: dwrite d0/f11 [0,4194304] 0 2026-03-10T14:08:01.305 INFO:tasks.workunit.client.1.vm04.stdout:7/681: mkdir d2/df9 0 2026-03-10T14:08:01.306 INFO:tasks.workunit.client.1.vm04.stdout:0/644: dread d0/d2/d15/f57 [0,4194304] 0 2026-03-10T14:08:01.310 INFO:tasks.workunit.client.1.vm04.stdout:3/668: unlink 
da/dc/d35/d52/d6d/cb8 0 2026-03-10T14:08:01.319 INFO:tasks.workunit.client.1.vm04.stdout:6/541: sync 2026-03-10T14:08:01.319 INFO:tasks.workunit.client.1.vm04.stdout:8/751: rename d0/d3/d5/f15 to d0/d3/d73/db8/dd5/fed 0 2026-03-10T14:08:01.323 INFO:tasks.workunit.client.1.vm04.stdout:9/591: symlink d9/da/dd/d1c/da3/lc8 0 2026-03-10T14:08:01.341 INFO:tasks.workunit.client.0.vm03.stdout:3/108: mkdir d1d/d20 0 2026-03-10T14:08:01.341 INFO:tasks.workunit.client.1.vm04.stdout:5/728: mknod d7/d59/d7e/dc6/cea 0 2026-03-10T14:08:01.341 INFO:tasks.workunit.client.0.vm03.stdout:1/162: mknod d0/d2/df/d16/c35 0 2026-03-10T14:08:01.345 INFO:tasks.workunit.client.0.vm03.stdout:9/124: rename d2/d14/f16 to d2/f2c 0 2026-03-10T14:08:01.345 INFO:tasks.workunit.client.1.vm04.stdout:8/752: read d0/d3/dd/d78/fae [95705,48619] 0 2026-03-10T14:08:01.346 INFO:tasks.workunit.client.0.vm03.stdout:3/109: dread fc [0,4194304] 0 2026-03-10T14:08:01.350 INFO:tasks.workunit.client.1.vm04.stdout:0/645: dread d0/d2/d15/d22/f30 [4194304,4194304] 0 2026-03-10T14:08:01.352 INFO:tasks.workunit.client.0.vm03.stdout:1/163: dread d0/f24 [0,4194304] 0 2026-03-10T14:08:01.356 INFO:tasks.workunit.client.0.vm03.stdout:3/110: dwrite fe [0,4194304] 0 2026-03-10T14:08:01.363 INFO:tasks.workunit.client.0.vm03.stdout:8/159: unlink f4 0 2026-03-10T14:08:01.367 INFO:tasks.workunit.client.1.vm04.stdout:9/592: creat d9/d58/db5/da5/fc9 x:0 0 0 2026-03-10T14:08:01.368 INFO:tasks.workunit.client.1.vm04.stdout:2/648: getdents d0/d14/d91/d8/d17/d4e/d85/d86/d96 0 2026-03-10T14:08:01.371 INFO:tasks.workunit.client.0.vm03.stdout:2/187: creat d5/f36 x:0 0 0 2026-03-10T14:08:01.371 INFO:tasks.workunit.client.0.vm03.stdout:2/188: chown d5/d10/d1f 110610 1 2026-03-10T14:08:01.374 INFO:tasks.workunit.client.1.vm04.stdout:7/682: mknod d2/dc/de/d11/de7/cfa 0 2026-03-10T14:08:01.386 INFO:tasks.workunit.client.1.vm04.stdout:5/729: mknod d7/d12/d2b/d8c/ceb 0 2026-03-10T14:08:01.388 INFO:tasks.workunit.client.0.vm03.stdout:1/164: mknod 
d0/d2/df/c36 0 2026-03-10T14:08:01.388 INFO:tasks.workunit.client.0.vm03.stdout:3/111: symlink d1d/l21 0 2026-03-10T14:08:01.389 INFO:tasks.workunit.client.0.vm03.stdout:8/160: creat da/f31 x:0 0 0 2026-03-10T14:08:01.397 INFO:tasks.workunit.client.1.vm04.stdout:0/646: mkdir d0/d2/d15/d22/d38/d56/dcb/dce 0 2026-03-10T14:08:01.397 INFO:tasks.workunit.client.1.vm04.stdout:0/647: read d0/d2/d25/f64 [2088797,71325] 0 2026-03-10T14:08:01.403 INFO:tasks.workunit.client.0.vm03.stdout:2/189: dwrite d5/ff [0,4194304] 0 2026-03-10T14:08:01.403 INFO:tasks.workunit.client.1.vm04.stdout:9/593: fdatasync d9/da/dd/d74/f75 0 2026-03-10T14:08:01.404 INFO:tasks.workunit.client.0.vm03.stdout:7/178: link d5/d9/d14/f2f d5/d9/d14/d26/f33 0 2026-03-10T14:08:01.405 INFO:tasks.workunit.client.0.vm03.stdout:7/179: dread - d5/d9/d14/f2f zero size 2026-03-10T14:08:01.405 INFO:tasks.workunit.client.1.vm04.stdout:2/649: unlink d0/d14/d91/d8/d17/c3b 0 2026-03-10T14:08:01.418 INFO:tasks.workunit.client.0.vm03.stdout:1/165: symlink d0/d2/l37 0 2026-03-10T14:08:01.420 INFO:tasks.workunit.client.1.vm04.stdout:3/669: creat da/dc/d3f/d54/fe2 x:0 0 0 2026-03-10T14:08:01.422 INFO:tasks.workunit.client.1.vm04.stdout:1/650: rename d3/d5c/d79/lae to d3/d22/d63/d35/le0 0 2026-03-10T14:08:01.423 INFO:tasks.workunit.client.1.vm04.stdout:1/651: truncate d3/d22/d63/f65 4760354 0 2026-03-10T14:08:01.423 INFO:tasks.workunit.client.1.vm04.stdout:1/652: fsync d3/d22/d2f/f39 0 2026-03-10T14:08:01.423 INFO:tasks.workunit.client.0.vm03.stdout:3/112: dread f10 [0,4194304] 0 2026-03-10T14:08:01.424 INFO:tasks.workunit.client.1.vm04.stdout:1/653: chown d3/d22/d63/d35 91 1 2026-03-10T14:08:01.425 INFO:tasks.workunit.client.0.vm03.stdout:8/161: fdatasync da/ff 0 2026-03-10T14:08:01.425 INFO:tasks.workunit.client.0.vm03.stdout:8/162: dread - da/f31 zero size 2026-03-10T14:08:01.432 INFO:tasks.workunit.client.0.vm03.stdout:7/180: mknod d5/d9/d14/d21/c34 0 2026-03-10T14:08:01.447 
INFO:tasks.workunit.client.0.vm03.stdout:6/130: dwrite f0 [0,4194304] 0 2026-03-10T14:08:01.458 INFO:tasks.workunit.client.0.vm03.stdout:5/182: write d4/d13/d1f/f20 [5129349,100191] 0 2026-03-10T14:08:01.472 INFO:tasks.workunit.client.0.vm03.stdout:6/131: creat d8/db/d12/f26 x:0 0 0 2026-03-10T14:08:01.476 INFO:tasks.workunit.client.0.vm03.stdout:1/166: getdents d0/d2/d34 0 2026-03-10T14:08:01.479 INFO:tasks.workunit.client.1.vm04.stdout:4/631: write d4/df/db2/db4/d47/d70/d74/f86 [1748957,63278] 0 2026-03-10T14:08:01.491 INFO:tasks.workunit.client.0.vm03.stdout:7/181: mkdir d5/d9/d35 0 2026-03-10T14:08:01.494 INFO:tasks.workunit.client.0.vm03.stdout:5/183: truncate d4/d13/d1f/f21 1495615 0 2026-03-10T14:08:01.499 INFO:tasks.workunit.client.1.vm04.stdout:8/753: rmdir d0/d3/dd/d89/de7 0 2026-03-10T14:08:01.508 INFO:tasks.workunit.client.0.vm03.stdout:4/153: truncate f2 98593 0 2026-03-10T14:08:01.535 INFO:tasks.workunit.client.0.vm03.stdout:9/125: getdents d2 0 2026-03-10T14:08:01.536 INFO:tasks.workunit.client.0.vm03.stdout:1/167: dread d0/d2/df/f1b [0,4194304] 0 2026-03-10T14:08:01.536 INFO:tasks.workunit.client.0.vm03.stdout:7/182: dread - d5/f1b zero size 2026-03-10T14:08:01.537 INFO:tasks.workunit.client.1.vm04.stdout:6/542: write d3/de/f92 [731245,102960] 0 2026-03-10T14:08:01.541 INFO:tasks.workunit.client.0.vm03.stdout:5/184: mknod d4/d13/d1f/c42 0 2026-03-10T14:08:01.544 INFO:tasks.workunit.client.0.vm03.stdout:3/113: rmdir d1d/d20 0 2026-03-10T14:08:01.545 INFO:tasks.workunit.client.1.vm04.stdout:4/632: creat d4/d14/d64/fdc x:0 0 0 2026-03-10T14:08:01.548 INFO:tasks.workunit.client.0.vm03.stdout:8/163: creat da/d24/f32 x:0 0 0 2026-03-10T14:08:01.552 INFO:tasks.workunit.client.0.vm03.stdout:9/126: dread d2/f2c [0,4194304] 0 2026-03-10T14:08:01.556 INFO:tasks.workunit.client.0.vm03.stdout:1/168: fdatasync d0/d2/fe 0 2026-03-10T14:08:01.556 INFO:tasks.workunit.client.1.vm04.stdout:8/754: truncate d0/d3/d63/d12/d51/d67/fcd 288557 0 2026-03-10T14:08:01.557 
INFO:tasks.workunit.client.0.vm03.stdout:5/185: mkdir d4/d13/d43 0 2026-03-10T14:08:01.557 INFO:tasks.workunit.client.1.vm04.stdout:0/648: link d0/d2/d15/d49/d50/d5c/c5f d0/d2/d15/d49/ccf 0 2026-03-10T14:08:01.561 INFO:tasks.workunit.client.1.vm04.stdout:7/683: dwrite d2/dc/de/d2d/d60/faf [0,4194304] 0 2026-03-10T14:08:01.562 INFO:tasks.workunit.client.0.vm03.stdout:3/114: symlink d1d/l22 0 2026-03-10T14:08:01.562 INFO:tasks.workunit.client.1.vm04.stdout:5/730: dwrite d7/d12/d2b/f72 [0,4194304] 0 2026-03-10T14:08:01.566 INFO:tasks.workunit.client.0.vm03.stdout:4/154: fsync d5/d9/db/f10 0 2026-03-10T14:08:01.571 INFO:tasks.workunit.client.1.vm04.stdout:9/594: dwrite d9/ff [0,4194304] 0 2026-03-10T14:08:01.572 INFO:tasks.workunit.client.0.vm03.stdout:5/186: dread d4/d6/de/f14 [0,4194304] 0 2026-03-10T14:08:01.582 INFO:tasks.workunit.client.1.vm04.stdout:5/731: mkdir d7/d12/d2b/d3e/d57/d8a/dec 0 2026-03-10T14:08:01.583 INFO:tasks.workunit.client.1.vm04.stdout:5/732: chown d7/fa 40 1 2026-03-10T14:08:01.584 INFO:tasks.workunit.client.1.vm04.stdout:7/684: dread d2/dc/f8d [0,4194304] 0 2026-03-10T14:08:01.601 INFO:tasks.workunit.client.1.vm04.stdout:2/650: write d0/d14/d91/d8/d17/f73 [4782902,86942] 0 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: pgmap v159: 65 pgs: 65 active+clean; 1.8 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 21 MiB/s rd, 78 MiB/s wr, 163 op/s 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr fail", "who": "vm03.rwbbep"}]: dispatch 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: osdmap e39: 6 total, 6 up, 6 in 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "mgr fail", "who": "vm03.rwbbep"}]': finished 2026-03-10T14:08:01.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:01 vm03.local ceph-mon[49718]: mgrmap e20: vm04.ywwcto(active, starting, since 0.0106431s) 2026-03-10T14:08:01.610 INFO:tasks.workunit.client.1.vm04.stdout:0/649: dread d0/f19 [0,4194304] 0 2026-03-10T14:08:01.613 INFO:tasks.workunit.client.1.vm04.stdout:0/650: dwrite d0/d2/d15/d22/d38/d56/d66/f2e [0,4194304] 0 2026-03-10T14:08:01.617 INFO:tasks.workunit.client.1.vm04.stdout:9/595: dread d9/d5c/d93/d96/d9d/fa2 [0,4194304] 0 2026-03-10T14:08:01.637 INFO:tasks.workunit.client.1.vm04.stdout:7/685: creat d2/dc/de/d2d/d60/ffb x:0 0 0 2026-03-10T14:08:01.638 INFO:tasks.workunit.client.1.vm04.stdout:7/686: chown 
d2/dc/de/d2d/d60/d81/le1 91885819 1 2026-03-10T14:08:01.649 INFO:tasks.workunit.client.0.vm03.stdout:4/155: fsync d5/d9/db/f29 0 2026-03-10T14:08:01.655 INFO:tasks.workunit.client.1.vm04.stdout:0/651: unlink d0/d2/d15/d49/d50/d61/d75/f9a 0 2026-03-10T14:08:01.657 INFO:tasks.workunit.client.1.vm04.stdout:9/596: truncate d9/da/dd/d1c/f27 1916612 0 2026-03-10T14:08:01.659 INFO:tasks.workunit.client.1.vm04.stdout:7/687: symlink d2/dc/de/d2d/d60/d7c/d44/dc0/lfc 0 2026-03-10T14:08:01.663 INFO:tasks.workunit.client.1.vm04.stdout:7/688: chown d2/d2a/d42/l54 3518 1 2026-03-10T14:08:01.665 INFO:tasks.workunit.client.0.vm03.stdout:4/156: fdatasync d5/f7 0 2026-03-10T14:08:01.666 INFO:tasks.workunit.client.0.vm03.stdout:4/157: fsync d5/d9/db/f20 0 2026-03-10T14:08:01.666 INFO:tasks.workunit.client.0.vm03.stdout:4/158: chown d5/f7 0 1 2026-03-10T14:08:01.668 INFO:tasks.workunit.client.1.vm04.stdout:9/597: rmdir d9/da/d5d 39 2026-03-10T14:08:01.669 INFO:tasks.workunit.client.1.vm04.stdout:3/670: write da/dc/f2a [981969,116185] 0 2026-03-10T14:08:01.673 INFO:tasks.workunit.client.1.vm04.stdout:7/689: rename d2/dc/de/d2d/d60/d7c/f97 to d2/dc/de/d11/ffd 0 2026-03-10T14:08:01.673 INFO:tasks.workunit.client.1.vm04.stdout:7/690: chown d2/dc/de/d2d/d5c/f63 207094 1 2026-03-10T14:08:01.680 INFO:tasks.workunit.client.0.vm03.stdout:4/159: symlink d5/d9/l35 0 2026-03-10T14:08:01.681 INFO:tasks.workunit.client.1.vm04.stdout:1/654: dwrite d3/d22/f8e [0,4194304] 0 2026-03-10T14:08:01.693 INFO:tasks.workunit.client.1.vm04.stdout:9/598: mknod d9/da/dd/d1c/cca 0 2026-03-10T14:08:01.693 INFO:tasks.workunit.client.1.vm04.stdout:9/599: write d9/da/dd/d1c/da3/fb3 [20059,27284] 0 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: pgmap v159: 65 pgs: 65 active+clean; 1.8 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 21 MiB/s rd, 78 MiB/s wr, 163 op/s 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 
vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr fail", "who": "vm03.rwbbep"}]: dispatch 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: osdmap e39: 6 total, 6 up, 6 in 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "mgr fail", "who": "vm03.rwbbep"}]': finished 2026-03-10T14:08:01.696 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:01 vm04.local ceph-mon[55966]: mgrmap e20: vm04.ywwcto(active, starting, since 0.0106431s) 2026-03-10T14:08:01.700 INFO:tasks.workunit.client.0.vm03.stdout:5/187: rmdir 
d4/d13/d1f/d34 0 2026-03-10T14:08:01.704 INFO:tasks.workunit.client.1.vm04.stdout:3/671: symlink da/dc/d47/d9b/dcb/le3 0 2026-03-10T14:08:01.708 INFO:tasks.workunit.client.0.vm03.stdout:3/115: link c1a d1d/c23 0 2026-03-10T14:08:01.708 INFO:tasks.workunit.client.1.vm04.stdout:9/600: chown d9/da/d5d 1186 1 2026-03-10T14:08:01.708 INFO:tasks.workunit.client.0.vm03.stdout:8/164: getdents da 0 2026-03-10T14:08:01.709 INFO:tasks.workunit.client.0.vm03.stdout:4/160: creat d5/d9/f36 x:0 0 0 2026-03-10T14:08:01.712 INFO:tasks.workunit.client.0.vm03.stdout:3/116: dread fc [0,4194304] 0 2026-03-10T14:08:01.715 INFO:tasks.workunit.client.0.vm03.stdout:4/161: mknod d5/d9/db/c37 0 2026-03-10T14:08:01.732 INFO:tasks.workunit.client.0.vm03.stdout:5/188: mknod d4/d35/c44 0 2026-03-10T14:08:01.733 INFO:tasks.workunit.client.0.vm03.stdout:5/189: dread d4/d16/d19/f25 [0,4194304] 0 2026-03-10T14:08:01.733 INFO:tasks.workunit.client.0.vm03.stdout:5/190: stat d4/d40 0 2026-03-10T14:08:01.733 INFO:tasks.workunit.client.0.vm03.stdout:5/191: chown d4/c5 751 1 2026-03-10T14:08:01.733 INFO:tasks.workunit.client.0.vm03.stdout:5/192: chown d4/d16/d19 336137 1 2026-03-10T14:08:01.733 INFO:tasks.workunit.client.0.vm03.stdout:5/193: truncate d4/f17 1650211 0 2026-03-10T14:08:01.733 INFO:tasks.workunit.client.0.vm03.stdout:5/194: link d4/c5 d4/d6/c45 0 2026-03-10T14:08:01.733 INFO:tasks.workunit.client.0.vm03.stdout:5/195: rename d4/d13/d1f/c2c to d4/d13/d1f/c46 0 2026-03-10T14:08:01.733 INFO:tasks.workunit.client.1.vm04.stdout:1/655: sync 2026-03-10T14:08:01.740 INFO:tasks.workunit.client.0.vm03.stdout:4/162: dread d5/d9/f31 [0,4194304] 0 2026-03-10T14:08:01.741 INFO:tasks.workunit.client.0.vm03.stdout:4/163: dread d5/d9/db/f10 [0,4194304] 0 2026-03-10T14:08:01.751 INFO:tasks.workunit.client.0.vm03.stdout:5/196: dread d4/d6/de/f14 [0,4194304] 0 2026-03-10T14:08:01.759 INFO:tasks.workunit.client.1.vm04.stdout:1/656: creat d3/d22/fe1 x:0 0 0 2026-03-10T14:08:01.782 
INFO:tasks.workunit.client.0.vm03.stdout:4/164: mkdir d5/d9/db/d19/d38 0 2026-03-10T14:08:01.784 INFO:tasks.workunit.client.0.vm03.stdout:5/197: fdatasync d4/d6/de/f14 0 2026-03-10T14:08:01.786 INFO:tasks.workunit.client.0.vm03.stdout:6/132: dwrite d8/db/f1f [0,4194304] 0 2026-03-10T14:08:01.788 INFO:tasks.workunit.client.0.vm03.stdout:6/133: chown d8/d11/d18/l1a 135293 1 2026-03-10T14:08:01.788 INFO:tasks.workunit.client.0.vm03.stdout:6/134: readlink d8/d11/d18/l1a 0 2026-03-10T14:08:01.791 INFO:tasks.workunit.client.0.vm03.stdout:6/135: dread f3 [0,4194304] 0 2026-03-10T14:08:01.794 INFO:tasks.workunit.client.0.vm03.stdout:6/136: dread f5 [4194304,4194304] 0 2026-03-10T14:08:01.800 INFO:tasks.workunit.client.0.vm03.stdout:5/198: mknod d4/d6/c47 0 2026-03-10T14:08:01.801 INFO:tasks.workunit.client.0.vm03.stdout:6/137: dwrite f2 [4194304,4194304] 0 2026-03-10T14:08:01.803 INFO:tasks.workunit.client.1.vm04.stdout:6/543: dwrite d3/de/d35/d3f/d2d/d32/d23/d83/f87 [0,4194304] 0 2026-03-10T14:08:01.808 INFO:tasks.workunit.client.1.vm04.stdout:4/633: dwrite d4/f96 [0,4194304] 0 2026-03-10T14:08:01.815 INFO:tasks.workunit.client.1.vm04.stdout:4/634: dwrite d4/f96 [4194304,4194304] 0 2026-03-10T14:08:01.831 INFO:tasks.workunit.client.0.vm03.stdout:4/165: rmdir d5 39 2026-03-10T14:08:01.835 INFO:tasks.workunit.client.1.vm04.stdout:6/544: rename d3/de/d35/d3a/d43/d4c/c59 to d3/d1d/ca2 0 2026-03-10T14:08:01.838 INFO:tasks.workunit.client.1.vm04.stdout:8/755: write d0/d82/fa0 [1030491,24666] 0 2026-03-10T14:08:01.841 INFO:tasks.workunit.client.0.vm03.stdout:6/138: creat d8/db/df/f27 x:0 0 0 2026-03-10T14:08:01.841 INFO:tasks.workunit.client.0.vm03.stdout:9/127: write d2/f3 [1974455,52074] 0 2026-03-10T14:08:01.843 INFO:tasks.workunit.client.0.vm03.stdout:1/169: dwrite d0/f4 [0,4194304] 0 2026-03-10T14:08:01.843 INFO:tasks.workunit.client.1.vm04.stdout:5/733: dwrite d7/d2d/f64 [0,4194304] 0 2026-03-10T14:08:01.846 INFO:tasks.workunit.client.1.vm04.stdout:5/734: chown 
d7/d59/d7e/dc6 44804 1 2026-03-10T14:08:01.848 INFO:tasks.workunit.client.1.vm04.stdout:2/651: write d0/d14/d91/d8/d17/d4e/d85/f90 [810143,42963] 0 2026-03-10T14:08:01.849 INFO:tasks.workunit.client.1.vm04.stdout:2/652: dread - d0/db8/fca zero size 2026-03-10T14:08:01.852 INFO:tasks.workunit.client.1.vm04.stdout:0/652: write d0/f19 [753383,126968] 0 2026-03-10T14:08:01.858 INFO:tasks.workunit.client.0.vm03.stdout:5/199: creat d4/d13/d43/f48 x:0 0 0 2026-03-10T14:08:01.870 INFO:tasks.workunit.client.0.vm03.stdout:9/128: fsync d2/d14/f1a 0 2026-03-10T14:08:01.870 INFO:tasks.workunit.client.1.vm04.stdout:2/653: mkdir d0/d14/d91/d4a/d66/dcd 0 2026-03-10T14:08:01.870 INFO:tasks.workunit.client.1.vm04.stdout:2/654: write d0/d14/d91/f1d [6472525,55084] 0 2026-03-10T14:08:01.877 INFO:tasks.workunit.client.0.vm03.stdout:1/170: dread d0/f4 [4194304,4194304] 0 2026-03-10T14:08:01.883 INFO:tasks.workunit.client.1.vm04.stdout:0/653: fsync d0/d2/d25/f2a 0 2026-03-10T14:08:01.883 INFO:tasks.workunit.client.0.vm03.stdout:4/166: truncate d5/d9/f31 1808248 0 2026-03-10T14:08:01.883 INFO:tasks.workunit.client.0.vm03.stdout:6/139: symlink d8/db/l28 0 2026-03-10T14:08:01.883 INFO:tasks.workunit.client.1.vm04.stdout:7/691: dread d2/dc/de/d11/ffd [0,4194304] 0 2026-03-10T14:08:01.884 INFO:tasks.workunit.client.1.vm04.stdout:8/756: truncate d0/d3/d63/d12/d51/d67/fcd 804658 0 2026-03-10T14:08:01.886 INFO:tasks.workunit.client.1.vm04.stdout:5/735: symlink d7/d12/d2b/de8/led 0 2026-03-10T14:08:01.890 INFO:tasks.workunit.client.1.vm04.stdout:0/654: unlink d0/d2/d15/d22/d62/fa6 0 2026-03-10T14:08:01.890 INFO:tasks.workunit.client.0.vm03.stdout:5/200: symlink d4/d16/d19/d23/d3f/l49 0 2026-03-10T14:08:01.890 INFO:tasks.workunit.client.0.vm03.stdout:4/167: fdatasync d5/fe 0 2026-03-10T14:08:01.890 INFO:tasks.workunit.client.0.vm03.stdout:6/140: creat d8/d1b/f29 x:0 0 0 2026-03-10T14:08:01.904 INFO:tasks.workunit.client.1.vm04.stdout:6/545: creat d3/de/d35/d3f/fa3 x:0 0 0 2026-03-10T14:08:01.910 
INFO:tasks.workunit.client.1.vm04.stdout:7/692: mkdir d2/dc/de/d2d/d5c/da9/df6/dfe 0 2026-03-10T14:08:01.923 INFO:tasks.workunit.client.1.vm04.stdout:2/655: mknod d0/d14/d39/d47/d70/dad/cce 0 2026-03-10T14:08:01.923 INFO:tasks.workunit.client.1.vm04.stdout:0/655: fdatasync d0/d2/d15/f2f 0 2026-03-10T14:08:01.924 INFO:tasks.workunit.client.1.vm04.stdout:0/656: symlink d0/d2/d15/d49/d50/d5c/ld0 0 2026-03-10T14:08:01.926 INFO:tasks.workunit.client.1.vm04.stdout:7/693: creat d2/d6b/fff x:0 0 0 2026-03-10T14:08:01.927 INFO:tasks.workunit.client.0.vm03.stdout:6/141: getdents d8 0 2026-03-10T14:08:01.928 INFO:tasks.workunit.client.1.vm04.stdout:2/656: symlink d0/d14/d91/d4a/d8c/dab/d46/dc8/lcf 0 2026-03-10T14:08:01.929 INFO:tasks.workunit.client.1.vm04.stdout:2/657: fdatasync d0/d14/d39/d47/d70/f74 0 2026-03-10T14:08:01.930 INFO:tasks.workunit.client.1.vm04.stdout:7/694: dwrite d2/dc/de/d2d/d38/d50/fbc [4194304,4194304] 0 2026-03-10T14:08:01.936 INFO:tasks.workunit.client.1.vm04.stdout:7/695: dread - d2/dc/de/d2d/d5c/da9/fea zero size 2026-03-10T14:08:01.946 INFO:tasks.workunit.client.0.vm03.stdout:6/142: creat d8/d11/f2a x:0 0 0 2026-03-10T14:08:01.956 INFO:tasks.workunit.client.0.vm03.stdout:6/143: chown d8/d1b/f23 248 1 2026-03-10T14:08:01.956 INFO:tasks.workunit.client.0.vm03.stdout:6/144: truncate d8/d1b/f29 974849 0 2026-03-10T14:08:01.956 INFO:tasks.workunit.client.0.vm03.stdout:6/145: stat d8/d11/d18/l1a 0 2026-03-10T14:08:01.956 INFO:tasks.workunit.client.1.vm04.stdout:7/696: truncate d2/dc/de/d2d/d38/fa5 295968 0 2026-03-10T14:08:01.956 INFO:tasks.workunit.client.1.vm04.stdout:7/697: fsync d2/dc/de/dae/fb2 0 2026-03-10T14:08:01.956 INFO:tasks.workunit.client.1.vm04.stdout:7/698: dwrite d2/dc/de/d2d/d5c/f8e [0,4194304] 0 2026-03-10T14:08:01.956 INFO:tasks.workunit.client.1.vm04.stdout:7/699: creat d2/dc/d4d/dcd/f100 x:0 0 0 2026-03-10T14:08:01.962 INFO:tasks.workunit.client.0.vm03.stdout:6/146: fsync f2 0 2026-03-10T14:08:01.965 
INFO:tasks.workunit.client.0.vm03.stdout:6/147: symlink d8/l2b 0 2026-03-10T14:08:01.968 INFO:tasks.workunit.client.0.vm03.stdout:6/148: truncate d8/d11/f2a 337264 0 2026-03-10T14:08:01.982 INFO:tasks.workunit.client.1.vm04.stdout:6/546: dread d3/f57 [0,4194304] 0 2026-03-10T14:08:01.983 INFO:tasks.workunit.client.1.vm04.stdout:6/547: write d3/de/d35/d3f/d2d/d32/d23/f5a [3690941,29131] 0 2026-03-10T14:08:01.985 INFO:tasks.workunit.client.1.vm04.stdout:2/658: dread d0/d14/d1b/f32 [0,4194304] 0 2026-03-10T14:08:01.986 INFO:tasks.workunit.client.1.vm04.stdout:6/548: symlink d3/de/d35/d3a/la4 0 2026-03-10T14:08:01.990 INFO:tasks.workunit.client.1.vm04.stdout:6/549: rmdir d3/d1d/d73 39 2026-03-10T14:08:01.991 INFO:tasks.workunit.client.0.vm03.stdout:9/129: sync 2026-03-10T14:08:01.995 INFO:tasks.workunit.client.1.vm04.stdout:6/550: creat d3/de/d35/d3f/d2d/d38/d40/fa5 x:0 0 0 2026-03-10T14:08:01.996 INFO:tasks.workunit.client.0.vm03.stdout:9/130: creat d2/d14/d2b/f2d x:0 0 0 2026-03-10T14:08:01.997 INFO:tasks.workunit.client.0.vm03.stdout:9/131: chown d2/d14/d2b/f2d 1010 1 2026-03-10T14:08:01.998 INFO:tasks.workunit.client.0.vm03.stdout:5/201: sync 2026-03-10T14:08:01.998 INFO:tasks.workunit.client.1.vm04.stdout:5/736: sync 2026-03-10T14:08:01.998 INFO:tasks.workunit.client.1.vm04.stdout:0/657: sync 2026-03-10T14:08:01.999 INFO:tasks.workunit.client.1.vm04.stdout:6/551: dread d3/de/d35/d3f/d2d/d32/d23/f31 [4194304,4194304] 0 2026-03-10T14:08:02.002 INFO:tasks.workunit.client.1.vm04.stdout:5/737: unlink d7/f4b 0 2026-03-10T14:08:02.002 INFO:tasks.workunit.client.1.vm04.stdout:5/738: readlink d7/d12/d2b/d93/d9e/lc5 0 2026-03-10T14:08:02.003 INFO:tasks.workunit.client.0.vm03.stdout:9/132: mknod d2/d14/d2b/c2e 0 2026-03-10T14:08:02.004 INFO:tasks.workunit.client.0.vm03.stdout:5/202: mkdir d4/d16/d19/d4a 0 2026-03-10T14:08:02.005 INFO:tasks.workunit.client.0.vm03.stdout:9/133: chown d2/c18 12058 1 2026-03-10T14:08:02.011 INFO:tasks.workunit.client.1.vm04.stdout:6/552: rename 
d3/de/d35/d3f/d2d/f82 to d3/de/d35/d3f/d2d/fa6 0 2026-03-10T14:08:02.017 INFO:tasks.workunit.client.1.vm04.stdout:6/553: truncate d3/de/d35/d3f/fa3 26322 0 2026-03-10T14:08:02.017 INFO:tasks.workunit.client.1.vm04.stdout:0/658: getdents d0/d2/d15/d22/d38 0 2026-03-10T14:08:02.017 INFO:tasks.workunit.client.1.vm04.stdout:0/659: rename d0/ca0 to d0/d2/dbe/cd1 0 2026-03-10T14:08:02.017 INFO:tasks.workunit.client.1.vm04.stdout:6/554: mknod d3/d1d/d73/ca7 0 2026-03-10T14:08:02.021 INFO:tasks.workunit.client.0.vm03.stdout:5/203: rename d4/d13/f24 to d4/d13/f4b 0 2026-03-10T14:08:02.023 INFO:tasks.workunit.client.1.vm04.stdout:6/555: fdatasync d3/de/d35/d3f/d2d/d32/d23/f7c 0 2026-03-10T14:08:02.027 INFO:tasks.workunit.client.1.vm04.stdout:0/660: link d0/d2/d15/d22/d62/l8f d0/d2/dbe/ld2 0 2026-03-10T14:08:02.027 INFO:tasks.workunit.client.0.vm03.stdout:9/134: rename d2/f3 to d2/f2f 0 2026-03-10T14:08:02.027 INFO:tasks.workunit.client.0.vm03.stdout:5/204: link d4/d6/c47 d4/c4c 0 2026-03-10T14:08:02.030 INFO:tasks.workunit.client.0.vm03.stdout:9/135: sync 2026-03-10T14:08:02.030 INFO:tasks.workunit.client.1.vm04.stdout:6/556: dread d3/de/d35/d3f/f22 [0,4194304] 0 2026-03-10T14:08:02.033 INFO:tasks.workunit.client.0.vm03.stdout:5/205: dwrite d4/d16/f2d [0,4194304] 0 2026-03-10T14:08:02.043 INFO:tasks.workunit.client.0.vm03.stdout:9/136: dread d2/f2f [0,4194304] 0 2026-03-10T14:08:02.054 INFO:tasks.workunit.client.1.vm04.stdout:9/601: write d9/da/f9e [356260,16959] 0 2026-03-10T14:08:02.060 INFO:tasks.workunit.client.1.vm04.stdout:3/672: dwrite f8 [0,4194304] 0 2026-03-10T14:08:02.063 INFO:tasks.workunit.client.1.vm04.stdout:0/661: link d0/d2/dbe/ld2 d0/d2/d15/d22/d38/d56/dc1/ld3 0 2026-03-10T14:08:02.066 INFO:tasks.workunit.client.1.vm04.stdout:6/557: creat d3/de/d35/d3a/d43/d9c/fa8 x:0 0 0 2026-03-10T14:08:02.067 INFO:tasks.workunit.client.0.vm03.stdout:9/137: write d2/fc [2596522,121944] 0 2026-03-10T14:08:02.067 INFO:tasks.workunit.client.1.vm04.stdout:9/602: rename 
d9/da/fb to d9/da/dd/d74/fcb 0 2026-03-10T14:08:02.072 INFO:tasks.workunit.client.0.vm03.stdout:5/206: symlink d4/d16/d19/d4a/l4d 0 2026-03-10T14:08:02.072 INFO:tasks.workunit.client.0.vm03.stdout:5/207: chown d4/d6/c12 85693660 1 2026-03-10T14:08:02.072 INFO:tasks.workunit.client.1.vm04.stdout:3/673: creat da/dc/d35/d52/d6d/fe4 x:0 0 0 2026-03-10T14:08:02.073 INFO:tasks.workunit.client.1.vm04.stdout:3/674: chown da/dc/d35/d37/f5e 61298288 1 2026-03-10T14:08:02.074 INFO:tasks.workunit.client.0.vm03.stdout:9/138: chown d2/d14/f27 6695085 1 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.0.vm03.stdout:9/139: dread - d2/d14/f27 zero size 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.0.vm03.stdout:9/140: dread d2/f2f [0,4194304] 0 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.0.vm03.stdout:9/141: creat d2/d14/f30 x:0 0 0 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.0.vm03.stdout:9/142: dread - d2/d14/d2b/f2d zero size 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.1.vm04.stdout:0/662: dread d0/d2/d15/d22/d38/d56/f5e [0,4194304] 0 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.1.vm04.stdout:9/603: dread d9/da/dd/d74/f75 [0,4194304] 0 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.1.vm04.stdout:9/604: creat d9/d44/d59/fcc x:0 0 0 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.1.vm04.stdout:9/605: dread d9/d5c/d93/d96/d9d/fa2 [0,4194304] 0 2026-03-10T14:08:02.097 INFO:tasks.workunit.client.1.vm04.stdout:9/606: creat d9/da/d5d/d81/fcd x:0 0 0 2026-03-10T14:08:02.115 INFO:tasks.workunit.client.0.vm03.stdout:8/165: dwrite da/fd [0,4194304] 0 2026-03-10T14:08:02.131 INFO:tasks.workunit.client.0.vm03.stdout:8/166: rmdir da/d24 39 2026-03-10T14:08:02.131 INFO:tasks.workunit.client.0.vm03.stdout:8/167: rename da/f17 to da/f33 0 2026-03-10T14:08:02.131 INFO:tasks.workunit.client.0.vm03.stdout:8/168: dread - da/d24/f32 zero size 2026-03-10T14:08:02.131 INFO:tasks.workunit.client.0.vm03.stdout:8/169: write da/f16 [1880108,56372] 0 
2026-03-10T14:08:02.131 INFO:tasks.workunit.client.0.vm03.stdout:8/170: dwrite da/d24/f25 [0,4194304] 0 2026-03-10T14:08:02.131 INFO:tasks.workunit.client.0.vm03.stdout:8/171: dread da/d24/f25 [0,4194304] 0 2026-03-10T14:08:02.132 INFO:tasks.workunit.client.0.vm03.stdout:8/172: mknod da/d24/c34 0 2026-03-10T14:08:02.169 INFO:tasks.workunit.client.1.vm04.stdout:3/675: sync 2026-03-10T14:08:02.173 INFO:tasks.workunit.client.1.vm04.stdout:3/676: dwrite da/dc/f2a [0,4194304] 0 2026-03-10T14:08:02.175 INFO:tasks.workunit.client.0.vm03.stdout:9/143: sync 2026-03-10T14:08:02.179 INFO:tasks.workunit.client.1.vm04.stdout:3/677: rename da/dc/lf to da/d8e/db5/le5 0 2026-03-10T14:08:02.179 INFO:tasks.workunit.client.1.vm04.stdout:3/678: symlink da/d8e/le6 0 2026-03-10T14:08:02.182 INFO:tasks.workunit.client.1.vm04.stdout:3/679: dwrite da/dc/d3f/d61/dc1/fc4 [0,4194304] 0 2026-03-10T14:08:02.193 INFO:tasks.workunit.client.1.vm04.stdout:3/680: mknod da/ce7 0 2026-03-10T14:08:02.217 INFO:tasks.workunit.client.1.vm04.stdout:3/681: link da/dc/fa4 da/dc/d35/dcd/dde/dac/fe8 0 2026-03-10T14:08:02.217 INFO:tasks.workunit.client.1.vm04.stdout:3/682: getdents da/dc/d3f/d61/dc1 0 2026-03-10T14:08:02.217 INFO:tasks.workunit.client.1.vm04.stdout:3/683: creat da/d30/fe9 x:0 0 0 2026-03-10T14:08:02.217 INFO:tasks.workunit.client.1.vm04.stdout:3/684: readlink da/dc/d35/d52/d53/l58 0 2026-03-10T14:08:02.217 INFO:tasks.workunit.client.1.vm04.stdout:3/685: creat da/dc/d47/fea x:0 0 0 2026-03-10T14:08:02.372 INFO:tasks.workunit.client.0.vm03.stdout:3/117: write fc [2137803,45407] 0 2026-03-10T14:08:02.373 INFO:tasks.workunit.client.0.vm03.stdout:3/118: mknod d1d/c24 0 2026-03-10T14:08:02.379 INFO:tasks.workunit.client.1.vm04.stdout:3/686: dread da/dc/d35/dcd/dde/dac/fb4 [0,4194304] 0 2026-03-10T14:08:02.380 INFO:tasks.workunit.client.1.vm04.stdout:3/687: symlink da/dc/d3f/d61/leb 0 2026-03-10T14:08:02.383 INFO:tasks.workunit.client.1.vm04.stdout:3/688: rename da/dc/d35/d52/fa8 to da/dc/d35/d37/fec 
0 2026-03-10T14:08:02.394 INFO:tasks.workunit.client.1.vm04.stdout:3/689: dread da/d30/f42 [0,4194304] 0 2026-03-10T14:08:02.396 INFO:tasks.workunit.client.1.vm04.stdout:3/690: getdents da/dc/d3f 0 2026-03-10T14:08:02.396 INFO:tasks.workunit.client.1.vm04.stdout:3/691: rmdir da 39 2026-03-10T14:08:02.398 INFO:tasks.workunit.client.1.vm04.stdout:3/692: mkdir da/ded 0 2026-03-10T14:08:02.399 INFO:tasks.workunit.client.1.vm04.stdout:3/693: getdents da/ded 0 2026-03-10T14:08:02.400 INFO:tasks.workunit.client.1.vm04.stdout:3/694: symlink da/dc/d35/dcd/dde/dac/lee 0 2026-03-10T14:08:02.401 INFO:tasks.workunit.client.1.vm04.stdout:3/695: fdatasync da/dc/d47/d9b/fbe 0 2026-03-10T14:08:02.476 INFO:tasks.workunit.client.1.vm04.stdout:1/657: dwrite d3/d22/d63/d35/dd9/d13/d38/d58/d5b/fa8 [0,4194304] 0 2026-03-10T14:08:02.488 INFO:tasks.workunit.client.1.vm04.stdout:1/658: creat d3/d22/d63/d35/dd9/d13/d38/d58/dcc/fe2 x:0 0 0 2026-03-10T14:08:02.499 INFO:tasks.workunit.client.1.vm04.stdout:1/659: dread d3/d22/d63/d35/dd9/f56 [0,4194304] 0 2026-03-10T14:08:02.509 INFO:tasks.workunit.client.1.vm04.stdout:1/660: dread d3/d5c/f71 [0,4194304] 0 2026-03-10T14:08:02.510 INFO:tasks.workunit.client.1.vm04.stdout:1/661: chown d3/d22/d63/d35/dd9/d13/d38/ca6 26 1 2026-03-10T14:08:02.604 INFO:tasks.workunit.client.1.vm04.stdout:1/662: sync 2026-03-10T14:08:02.604 INFO:tasks.workunit.client.1.vm04.stdout:1/663: write d3/d22/d2f/f39 [3559889,107914] 0 2026-03-10T14:08:02.611 INFO:tasks.workunit.client.1.vm04.stdout:1/664: truncate d3/d22/d63/f7f 96234 0 2026-03-10T14:08:02.618 INFO:tasks.workunit.client.1.vm04.stdout:1/665: creat d3/d22/d6d/fe3 x:0 0 0 2026-03-10T14:08:02.628 INFO:tasks.workunit.client.1.vm04.stdout:1/666: link d3/d22/d2f/d57/f78 d3/d5c/d79/fe4 0 2026-03-10T14:08:02.636 INFO:tasks.workunit.client.1.vm04.stdout:1/667: dread d3/d5c/fbf [0,4194304] 0 2026-03-10T14:08:02.637 INFO:tasks.workunit.client.1.vm04.stdout:1/668: chown d3/d22/d63/d35/d6c/lcd 113033124 1 
2026-03-10T14:08:02.722 INFO:tasks.workunit.client.1.vm04.stdout:4/635: dwrite d4/df/db2/db4/f37 [0,4194304] 0 2026-03-10T14:08:02.722 INFO:tasks.workunit.client.1.vm04.stdout:4/636: dread - d4/d14/d64/f97 zero size 2026-03-10T14:08:02.757 INFO:tasks.workunit.client.0.vm03.stdout:7/183: dwrite d5/d9/d14/d26/f33 [0,4194304] 0 2026-03-10T14:08:02.899 INFO:tasks.workunit.client.0.vm03.stdout:1/171: write d0/fa [115306,106846] 0 2026-03-10T14:08:02.903 INFO:tasks.workunit.client.0.vm03.stdout:1/172: rmdir d0/d2/df/d16 39 2026-03-10T14:08:02.907 INFO:tasks.workunit.client.0.vm03.stdout:1/173: fsync d0/d2/df/f1b 0 2026-03-10T14:08:02.908 INFO:tasks.workunit.client.0.vm03.stdout:1/174: fsync d0/d2/df/f31 0 2026-03-10T14:08:02.908 INFO:tasks.workunit.client.0.vm03.stdout:1/175: truncate d0/f30 359354 0 2026-03-10T14:08:02.909 INFO:tasks.workunit.client.0.vm03.stdout:1/176: chown d0/d2/l2b 3696273 1 2026-03-10T14:08:02.918 INFO:tasks.workunit.client.0.vm03.stdout:4/168: dwrite d5/d9/db/f24 [0,4194304] 0 2026-03-10T14:08:02.919 INFO:tasks.workunit.client.1.vm04.stdout:8/757: dwrite d0/d3/dd/d78/fae [0,4194304] 0 2026-03-10T14:08:02.923 INFO:tasks.workunit.client.0.vm03.stdout:4/169: dread d5/f7 [0,4194304] 0 2026-03-10T14:08:02.942 INFO:tasks.workunit.client.1.vm04.stdout:8/758: mknod d0/d3/d63/d12/d51/cee 0 2026-03-10T14:08:02.948 INFO:tasks.workunit.client.1.vm04.stdout:8/759: dread d0/d3/d63/d12/d69/f81 [0,4194304] 0 2026-03-10T14:08:02.955 INFO:tasks.workunit.client.0.vm03.stdout:1/177: creat d0/d2/df/d16/d20/f38 x:0 0 0 2026-03-10T14:08:02.997 INFO:tasks.workunit.client.0.vm03.stdout:1/178: chown d0/l12 30 1 2026-03-10T14:08:02.997 INFO:tasks.workunit.client.0.vm03.stdout:1/179: symlink d0/d2/df/d27/l39 0 2026-03-10T14:08:02.997 INFO:tasks.workunit.client.0.vm03.stdout:4/170: rmdir d5/d9 39 2026-03-10T14:08:02.997 INFO:tasks.workunit.client.0.vm03.stdout:1/180: creat d0/d2/d34/f3a x:0 0 0 2026-03-10T14:08:02.997 INFO:tasks.workunit.client.0.vm03.stdout:1/181: mkdir 
d0/d18/d3b 0 2026-03-10T14:08:02.997 INFO:tasks.workunit.client.0.vm03.stdout:1/182: chown d0/fa 596613 1 2026-03-10T14:08:03.107 INFO:tasks.workunit.client.1.vm04.stdout:7/700: dwrite d2/d94/f7e [0,4194304] 0 2026-03-10T14:08:03.108 INFO:tasks.workunit.client.1.vm04.stdout:7/701: chown d2/dc/d4d/d7f/fc2 34724 1 2026-03-10T14:08:03.109 INFO:tasks.workunit.client.1.vm04.stdout:7/702: write d2/d2a/d42/f4e [4719214,119672] 0 2026-03-10T14:08:03.140 INFO:tasks.workunit.client.1.vm04.stdout:7/703: sync 2026-03-10T14:08:03.142 INFO:tasks.workunit.client.1.vm04.stdout:7/704: symlink d2/dc/de/d2d/d60/d7c/d36/daa/l101 0 2026-03-10T14:08:03.143 INFO:tasks.workunit.client.1.vm04.stdout:7/705: mkdir d2/dc/de/d2d/d60/d7c/d44/d102 0 2026-03-10T14:08:03.146 INFO:tasks.workunit.client.1.vm04.stdout:7/706: rmdir d2/dc/de/d2d/d60/de8 0 2026-03-10T14:08:03.151 INFO:tasks.workunit.client.0.vm03.stdout:6/149: truncate d8/d11/d18/f21 546406 0 2026-03-10T14:08:03.153 INFO:tasks.workunit.client.0.vm03.stdout:6/150: mkdir d8/db/d2c 0 2026-03-10T14:08:03.155 INFO:tasks.workunit.client.0.vm03.stdout:6/151: getdents d8/db/d2c 0 2026-03-10T14:08:03.156 INFO:tasks.workunit.client.0.vm03.stdout:6/152: mkdir d8/db/d2c/d2d 0 2026-03-10T14:08:03.159 INFO:tasks.workunit.client.1.vm04.stdout:2/659: truncate d0/d14/d91/d3a/fb5 3838267 0 2026-03-10T14:08:03.160 INFO:tasks.workunit.client.1.vm04.stdout:2/660: write d0/d14/d91/d4a/d66/f72 [1180270,105326] 0 2026-03-10T14:08:03.161 INFO:tasks.workunit.client.1.vm04.stdout:2/661: stat d0/d14/d91/f9 0 2026-03-10T14:08:03.161 INFO:tasks.workunit.client.1.vm04.stdout:2/662: chown d0/d14/d91/d8/d17/d4e/d85/d86 471650 1 2026-03-10T14:08:03.163 INFO:tasks.workunit.client.1.vm04.stdout:2/663: dread d0/d14/d1b/d45/fb6 [4194304,4194304] 0 2026-03-10T14:08:03.166 INFO:tasks.workunit.client.0.vm03.stdout:6/153: rmdir d8/d1b 39 2026-03-10T14:08:03.170 INFO:tasks.workunit.client.0.vm03.stdout:6/154: truncate d8/d1b/f23 600206 0 2026-03-10T14:08:03.171 
INFO:tasks.workunit.client.0.vm03.stdout:6/155: stat d8/d11/d18/l1a 0 2026-03-10T14:08:03.171 INFO:tasks.workunit.client.0.vm03.stdout:6/156: chown d8/db/df/f10 69124091 1 2026-03-10T14:08:03.171 INFO:tasks.workunit.client.0.vm03.stdout:6/157: chown d8/db/d12 216341742 1 2026-03-10T14:08:03.179 INFO:tasks.workunit.client.1.vm04.stdout:2/664: unlink d0/d14/d39/lb9 0 2026-03-10T14:08:03.182 INFO:tasks.workunit.client.1.vm04.stdout:2/665: dwrite d0/d14/d1b/f29 [4194304,4194304] 0 2026-03-10T14:08:03.187 INFO:tasks.workunit.client.1.vm04.stdout:5/739: write d7/d12/d2b/d3e/d3f/f52 [279827,31946] 0 2026-03-10T14:08:03.201 INFO:tasks.workunit.client.1.vm04.stdout:5/740: creat d7/d12/d2b/d3e/d3f/da6/fee x:0 0 0 2026-03-10T14:08:03.202 INFO:tasks.workunit.client.1.vm04.stdout:2/666: mkdir d0/d14/d39/d47/d70/dc3/dd0 0 2026-03-10T14:08:03.204 INFO:tasks.workunit.client.1.vm04.stdout:5/741: unlink d7/d2d/f6d 0 2026-03-10T14:08:03.209 INFO:tasks.workunit.client.1.vm04.stdout:2/667: link d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/c75 d0/d14/d91/d3a/d3e/cd1 0 2026-03-10T14:08:03.219 INFO:tasks.workunit.client.1.vm04.stdout:5/742: creat d7/d12/fef x:0 0 0 2026-03-10T14:08:03.225 INFO:tasks.workunit.client.0.vm03.stdout:5/208: write d4/d6/fa [3514849,129729] 0 2026-03-10T14:08:03.226 INFO:tasks.workunit.client.0.vm03.stdout:5/209: chown d4/d6/fb 419067 1 2026-03-10T14:08:03.228 INFO:tasks.workunit.client.1.vm04.stdout:6/558: dwrite d3/de/d35/d3a/d43/d4c/d5e/d76/f77 [0,4194304] 0 2026-03-10T14:08:03.238 INFO:tasks.workunit.client.1.vm04.stdout:0/663: write d0/d2/d15/d22/d38/f93 [388326,117740] 0 2026-03-10T14:08:03.241 INFO:tasks.workunit.client.1.vm04.stdout:9/607: dwrite d9/f20 [0,4194304] 0 2026-03-10T14:08:03.243 INFO:tasks.workunit.client.1.vm04.stdout:2/668: truncate d0/d14/d1b/f32 7820023 0 2026-03-10T14:08:03.248 INFO:tasks.workunit.client.0.vm03.stdout:5/210: dread d4/f17 [0,4194304] 0 2026-03-10T14:08:03.248 INFO:tasks.workunit.client.1.vm04.stdout:5/743: mknod 
d7/d12/d2b/d3e/d57/d8a/cf0 0 2026-03-10T14:08:03.262 INFO:tasks.workunit.client.1.vm04.stdout:2/669: creat d0/d14/d91/d4a/d8c/dab/fd2 x:0 0 0 2026-03-10T14:08:03.262 INFO:tasks.workunit.client.1.vm04.stdout:2/670: readlink d0/d14/d91/d8/l62 0 2026-03-10T14:08:03.265 INFO:tasks.workunit.client.0.vm03.stdout:5/211: mkdir d4/d40/d4e 0 2026-03-10T14:08:03.265 INFO:tasks.workunit.client.1.vm04.stdout:5/744: rmdir d7/d26/d6b 39 2026-03-10T14:08:03.269 INFO:tasks.workunit.client.1.vm04.stdout:5/745: readlink d7/d2d/d32/ld3 0 2026-03-10T14:08:03.271 INFO:tasks.workunit.client.1.vm04.stdout:5/746: mkdir d7/d12/d2b/d3e/d57/d77/df1 0 2026-03-10T14:08:03.272 INFO:tasks.workunit.client.1.vm04.stdout:5/747: stat d7/d12/d2b/d3e/d3f/l78 0 2026-03-10T14:08:03.273 INFO:tasks.workunit.client.1.vm04.stdout:0/664: sync 2026-03-10T14:08:03.282 INFO:tasks.workunit.client.1.vm04.stdout:5/748: fdatasync d7/d59/d7e/d87/fd4 0 2026-03-10T14:08:03.282 INFO:tasks.workunit.client.1.vm04.stdout:0/665: mkdir d0/d2/d15/d22/d38/d56/dc1/dd4 0 2026-03-10T14:08:03.288 INFO:tasks.workunit.client.0.vm03.stdout:5/212: dread d4/fd [0,4194304] 0 2026-03-10T14:08:03.289 INFO:tasks.workunit.client.0.vm03.stdout:5/213: chown d4/d16/d19/l26 0 1 2026-03-10T14:08:03.289 INFO:tasks.workunit.client.0.vm03.stdout:5/214: write d4/d16/f1c [1197835,64437] 0 2026-03-10T14:08:03.297 INFO:tasks.workunit.client.0.vm03.stdout:8/173: getdents da/d24 0 2026-03-10T14:08:03.298 INFO:tasks.workunit.client.0.vm03.stdout:5/215: creat d4/d6/de/f4f x:0 0 0 2026-03-10T14:08:03.302 INFO:tasks.workunit.client.1.vm04.stdout:5/749: dwrite d7/d2d/f64 [0,4194304] 0 2026-03-10T14:08:03.306 INFO:tasks.workunit.client.0.vm03.stdout:9/144: write d2/f2c [885009,10249] 0 2026-03-10T14:08:03.311 INFO:tasks.workunit.client.0.vm03.stdout:8/174: dwrite f5 [0,4194304] 0 2026-03-10T14:08:03.313 INFO:tasks.workunit.client.0.vm03.stdout:9/145: dwrite d2/d14/f30 [0,4194304] 0 2026-03-10T14:08:03.316 INFO:tasks.workunit.client.0.vm03.stdout:5/216: creat 
d4/d16/d19/d23/f50 x:0 0 0 2026-03-10T14:08:03.323 INFO:tasks.workunit.client.0.vm03.stdout:8/175: dwrite da/ff [0,4194304] 0 2026-03-10T14:08:03.325 INFO:tasks.workunit.client.1.vm04.stdout:0/666: symlink d0/d2/dc9/ld5 0 2026-03-10T14:08:03.326 INFO:tasks.workunit.client.1.vm04.stdout:0/667: stat d0/d2/d15/d49/d50/f8a 0 2026-03-10T14:08:03.326 INFO:tasks.workunit.client.0.vm03.stdout:5/217: dwrite d4/d6/de/f14 [4194304,4194304] 0 2026-03-10T14:08:03.329 INFO:tasks.workunit.client.1.vm04.stdout:5/750: rename d7/d12/d2b/d3e/d3f/c7c to d7/d12/d2b/d3e/d3f/da6/cf2 0 2026-03-10T14:08:03.331 INFO:tasks.workunit.client.0.vm03.stdout:8/176: unlink da/c11 0 2026-03-10T14:08:03.331 INFO:tasks.workunit.client.0.vm03.stdout:8/177: readlink da/d24/l2f 0 2026-03-10T14:08:03.331 INFO:tasks.workunit.client.1.vm04.stdout:5/751: dread - d7/d9/db5/fd1 zero size 2026-03-10T14:08:03.331 INFO:tasks.workunit.client.1.vm04.stdout:5/752: stat d7/f3c 0 2026-03-10T14:08:03.332 INFO:tasks.workunit.client.1.vm04.stdout:5/753: readlink d7/d12/d45/l85 0 2026-03-10T14:08:03.333 INFO:tasks.workunit.client.0.vm03.stdout:5/218: creat d4/d13/d43/f51 x:0 0 0 2026-03-10T14:08:03.336 INFO:tasks.workunit.client.0.vm03.stdout:5/219: readlink d4/d16/l27 0 2026-03-10T14:08:03.355 INFO:tasks.workunit.client.0.vm03.stdout:9/146: mknod d2/c31 0 2026-03-10T14:08:03.357 INFO:tasks.workunit.client.1.vm04.stdout:5/754: read d7/d26/f30 [2442285,11566] 0 2026-03-10T14:08:03.362 INFO:tasks.workunit.client.0.vm03.stdout:5/220: chown d4/d13/l38 9 1 2026-03-10T14:08:03.381 INFO:tasks.workunit.client.1.vm04.stdout:0/668: getdents d0/d2/d15/d22 0 2026-03-10T14:08:03.382 INFO:tasks.workunit.client.0.vm03.stdout:5/221: write d4/d6/fa [4151699,88439] 0 2026-03-10T14:08:03.382 INFO:tasks.workunit.client.0.vm03.stdout:8/178: mkdir da/d35 0 2026-03-10T14:08:03.382 INFO:tasks.workunit.client.0.vm03.stdout:8/179: fsync da/f2a 0 2026-03-10T14:08:03.382 INFO:tasks.workunit.client.0.vm03.stdout:5/222: dwrite d4/d6/fa [0,4194304] 0 
2026-03-10T14:08:03.382 INFO:tasks.workunit.client.0.vm03.stdout:5/223: mknod d4/d16/d19/d23/d3f/c52 0 2026-03-10T14:08:03.382 INFO:tasks.workunit.client.0.vm03.stdout:5/224: chown d4/d16/d19/d4a/l4d 10016843 1 2026-03-10T14:08:03.382 INFO:tasks.workunit.client.0.vm03.stdout:5/225: mkdir d4/d53 0 2026-03-10T14:08:03.419 INFO:tasks.workunit.client.0.vm03.stdout:8/180: dread da/f15 [0,4194304] 0 2026-03-10T14:08:03.420 INFO:tasks.workunit.client.0.vm03.stdout:8/181: truncate da/fe 805542 0 2026-03-10T14:08:03.423 INFO:tasks.workunit.client.0.vm03.stdout:8/182: dwrite da/f16 [4194304,4194304] 0 2026-03-10T14:08:03.427 INFO:tasks.workunit.client.0.vm03.stdout:8/183: rename da/d35 to da/d36 0 2026-03-10T14:08:03.428 INFO:tasks.workunit.client.0.vm03.stdout:8/184: write da/f2e [286878,24669] 0 2026-03-10T14:08:03.439 INFO:tasks.workunit.client.1.vm04.stdout:3/696: write da/dc/d35/dcd/dde/dac/fb4 [893756,99670] 0 2026-03-10T14:08:03.478 INFO:tasks.workunit.client.1.vm04.stdout:1/669: dwrite d3/d22/d63/f69 [0,4194304] 0 2026-03-10T14:08:03.482 INFO:tasks.workunit.client.1.vm04.stdout:1/670: read d3/f14 [706738,119565] 0 2026-03-10T14:08:03.486 INFO:tasks.workunit.client.1.vm04.stdout:1/671: rename d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92/fdd to d3/d22/d63/d35/fe5 0 2026-03-10T14:08:03.488 INFO:tasks.workunit.client.1.vm04.stdout:1/672: mkdir d3/d5c/d79/de6 0 2026-03-10T14:08:03.495 INFO:tasks.workunit.client.1.vm04.stdout:1/673: unlink d3/d22/d63/d35/dd9/d13/d38/ca6 0 2026-03-10T14:08:03.497 INFO:tasks.workunit.client.1.vm04.stdout:1/674: dread d3/d22/d63/d35/fe5 [0,4194304] 0 2026-03-10T14:08:03.525 INFO:tasks.workunit.client.1.vm04.stdout:1/675: sync 2026-03-10T14:08:03.535 INFO:tasks.workunit.client.1.vm04.stdout:4/637: write d4/df/f66 [709710,45624] 0 2026-03-10T14:08:03.538 INFO:tasks.workunit.client.1.vm04.stdout:4/638: dwrite d4/d14/d64/fd6 [0,4194304] 0 2026-03-10T14:08:03.593 INFO:tasks.workunit.client.1.vm04.stdout:8/760: write d0/d3/d73/f98 [2566514,51751] 0 
2026-03-10T14:08:03.596 INFO:tasks.workunit.client.1.vm04.stdout:8/761: mkdir d0/dc1/def 0 2026-03-10T14:08:03.618 INFO:tasks.workunit.client.0.vm03.stdout:4/171: dwrite d5/fe [0,4194304] 0 2026-03-10T14:08:03.625 INFO:tasks.workunit.client.0.vm03.stdout:4/172: getdents d5/d9/db/d19 0 2026-03-10T14:08:03.629 INFO:tasks.workunit.client.1.vm04.stdout:7/707: dwrite d2/dc/de/dae/fd2 [0,4194304] 0 2026-03-10T14:08:03.631 INFO:tasks.workunit.client.0.vm03.stdout:4/173: mknod d5/d9/db/d19/c39 0 2026-03-10T14:08:03.664 INFO:tasks.workunit.client.1.vm04.stdout:7/708: sync 2026-03-10T14:08:03.665 INFO:tasks.workunit.client.1.vm04.stdout:7/709: dread - d2/dc/de/fed zero size 2026-03-10T14:08:03.667 INFO:tasks.workunit.client.1.vm04.stdout:7/710: fsync d2/dc/de/f98 0 2026-03-10T14:08:03.672 INFO:tasks.workunit.client.1.vm04.stdout:7/711: mkdir d2/dc/de/d2d/d60/d7c/d36/d103 0 2026-03-10T14:08:03.676 INFO:tasks.workunit.client.1.vm04.stdout:7/712: dwrite d2/dc/de/d2d/d60/d7c/d3b/f48 [0,4194304] 0 2026-03-10T14:08:03.706 INFO:tasks.workunit.client.1.vm04.stdout:7/713: dread d2/d2a/d42/d86/fbb [0,4194304] 0 2026-03-10T14:08:03.707 INFO:tasks.workunit.client.1.vm04.stdout:7/714: write d2/dc/d4d/dcd/f100 [606063,109886] 0 2026-03-10T14:08:03.741 INFO:tasks.workunit.client.1.vm04.stdout:9/608: dwrite d9/d5c/f6e [0,4194304] 0 2026-03-10T14:08:03.742 INFO:tasks.workunit.client.1.vm04.stdout:9/609: dread - d9/d58/db5/da5/fc9 zero size 2026-03-10T14:08:03.763 INFO:tasks.workunit.client.1.vm04.stdout:6/559: write d3/f8d [2892105,24494] 0 2026-03-10T14:08:03.779 INFO:tasks.workunit.client.1.vm04.stdout:2/671: dwrite d0/d14/d91/d4a/d8c/dab/f36 [0,4194304] 0 2026-03-10T14:08:03.823 INFO:tasks.workunit.client.1.vm04.stdout:5/755: dwrite d7/d2d/d76/f8f [0,4194304] 0 2026-03-10T14:08:03.831 INFO:tasks.workunit.client.1.vm04.stdout:0/669: dwrite d0/d2/d15/d22/d38/d56/f84 [0,4194304] 0 2026-03-10T14:08:03.855 INFO:tasks.workunit.client.1.vm04.stdout:6/560: fsync d3/d1d/d73/f2b 0 
2026-03-10T14:08:03.858 INFO:tasks.workunit.client.1.vm04.stdout:2/672: creat d0/db8/fd3 x:0 0 0 2026-03-10T14:08:03.861 INFO:tasks.workunit.client.1.vm04.stdout:2/673: read d0/d14/d91/d4a/f57 [389269,70128] 0 2026-03-10T14:08:03.862 INFO:tasks.workunit.client.1.vm04.stdout:2/674: readlink d0/d14/d91/d8/d17/d35/l3d 0 2026-03-10T14:08:03.865 INFO:tasks.workunit.client.1.vm04.stdout:9/610: link d9/da/dd/d74/f75 d9/d5c/d93/d96/fce 0 2026-03-10T14:08:03.868 INFO:tasks.workunit.client.1.vm04.stdout:0/670: sync 2026-03-10T14:08:03.879 INFO:tasks.workunit.client.1.vm04.stdout:6/561: rename d3/de/d35/d3f/d2d/d32/d23/f7c to d3/fa9 0 2026-03-10T14:08:03.880 INFO:tasks.workunit.client.1.vm04.stdout:9/611: mknod d9/da/d5d/d81/ccf 0 2026-03-10T14:08:03.881 INFO:tasks.workunit.client.1.vm04.stdout:9/612: write d9/da/f9e [1268298,83613] 0 2026-03-10T14:08:03.885 INFO:tasks.workunit.client.1.vm04.stdout:2/675: symlink d0/d14/d1b/ld4 0 2026-03-10T14:08:03.887 INFO:tasks.workunit.client.1.vm04.stdout:0/671: symlink d0/d2/d15/d22/d38/ld6 0 2026-03-10T14:08:03.888 INFO:tasks.workunit.client.1.vm04.stdout:6/562: read - d3/de/d35/d3f/d2d/f98 zero size 2026-03-10T14:08:03.890 INFO:tasks.workunit.client.1.vm04.stdout:9/613: creat d9/da/dd/d1c/da3/fd0 x:0 0 0 2026-03-10T14:08:03.890 INFO:tasks.workunit.client.1.vm04.stdout:2/676: fdatasync d0/d14/d39/d47/d70/f8d 0 2026-03-10T14:08:03.891 INFO:tasks.workunit.client.1.vm04.stdout:9/614: chown d9/da/d5d/f8b 49 1 2026-03-10T14:08:03.892 INFO:tasks.workunit.client.1.vm04.stdout:0/672: chown d0/d6e/f8b 717432513 1 2026-03-10T14:08:03.897 INFO:tasks.workunit.client.1.vm04.stdout:9/615: sync 2026-03-10T14:08:03.900 INFO:tasks.workunit.client.1.vm04.stdout:6/563: rename d3/de/d35/d3f/d2d/d32/l6b to d3/de/d35/d3a/d43/d9c/laa 0 2026-03-10T14:08:03.904 INFO:tasks.workunit.client.1.vm04.stdout:2/677: dread - d0/d14/d91/d3a/d3e/fbd zero size 2026-03-10T14:08:03.906 INFO:tasks.workunit.client.1.vm04.stdout:0/673: creat d0/d2/dbe/fd7 x:0 0 0 
2026-03-10T14:08:03.909 INFO:tasks.workunit.client.1.vm04.stdout:9/616: symlink d9/da/d5d/ld1 0 2026-03-10T14:08:03.909 INFO:tasks.workunit.client.1.vm04.stdout:9/617: chown d9/da/d5d/l80 440378762 1 2026-03-10T14:08:03.912 INFO:tasks.workunit.client.0.vm03.stdout:5/226: dwrite d4/f17 [0,4194304] 0 2026-03-10T14:08:03.925 INFO:tasks.workunit.client.0.vm03.stdout:8/185: write da/f15 [2817708,44631] 0 2026-03-10T14:08:03.927 INFO:tasks.workunit.client.1.vm04.stdout:2/678: rename d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/ccc to d0/d14/d39/d47/d70/d8b/cd5 0 2026-03-10T14:08:03.928 INFO:tasks.workunit.client.1.vm04.stdout:2/679: write d0/d14/d91/d4a/d8c/dab/fd2 [618335,68034] 0 2026-03-10T14:08:03.928 INFO:tasks.workunit.client.0.vm03.stdout:8/186: dwrite da/f2e [0,4194304] 0 2026-03-10T14:08:03.931 INFO:tasks.workunit.client.1.vm04.stdout:3/697: write da/dc/d35/dcd/dde/dac/fca [1058184,406] 0 2026-03-10T14:08:03.935 INFO:tasks.workunit.client.1.vm04.stdout:0/674: truncate d0/d2/d15/d22/d38/d56/f58 233784 0 2026-03-10T14:08:03.935 INFO:tasks.workunit.client.0.vm03.stdout:8/187: dwrite da/f31 [0,4194304] 0 2026-03-10T14:08:03.940 INFO:tasks.workunit.client.0.vm03.stdout:5/227: read d4/f37 [576473,12734] 0 2026-03-10T14:08:03.940 INFO:tasks.workunit.client.0.vm03.stdout:5/228: truncate d4/d16/d19/d23/f50 56583 0 2026-03-10T14:08:03.952 INFO:tasks.workunit.client.0.vm03.stdout:8/188: fsync da/f1b 0 2026-03-10T14:08:03.959 INFO:tasks.workunit.client.1.vm04.stdout:1/676: write d3/d22/d63/fb3 [489117,73947] 0 2026-03-10T14:08:03.962 INFO:tasks.workunit.client.1.vm04.stdout:3/698: symlink da/dc/d47/d9b/dcb/lef 0 2026-03-10T14:08:03.974 INFO:tasks.workunit.client.1.vm04.stdout:4/639: write d4/df/f60 [795053,59270] 0 2026-03-10T14:08:03.974 INFO:tasks.workunit.client.1.vm04.stdout:4/640: readlink d4/lc 0 2026-03-10T14:08:03.984 INFO:tasks.workunit.client.1.vm04.stdout:8/762: dwrite d0/d75/fa2 [0,4194304] 0 2026-03-10T14:08:03.986 INFO:tasks.workunit.client.1.vm04.stdout:0/675: mkdir 
d0/d2/d15/d49/d50/d5c/dd8 0 2026-03-10T14:08:03.986 INFO:tasks.workunit.client.1.vm04.stdout:0/676: readlink d0/d2/d15/d49/d50/d61/lc5 0 2026-03-10T14:08:03.987 INFO:tasks.workunit.client.0.vm03.stdout:0/181: rename d3/d11/d25/d30/f36 to d3/d16/f39 0 2026-03-10T14:08:04.000 INFO:tasks.workunit.client.0.vm03.stdout:8/189: dwrite da/fe [0,4194304] 0 2026-03-10T14:08:04.004 INFO:tasks.workunit.client.0.vm03.stdout:2/190: rename d5/f1e to d5/d2a/f37 0 2026-03-10T14:08:04.004 INFO:tasks.workunit.client.1.vm04.stdout:7/715: getdents d2/dc/de/d2d/d60/d7c/d36 0 2026-03-10T14:08:04.009 INFO:tasks.workunit.client.0.vm03.stdout:0/182: truncate d3/d11/f18 1362455 0 2026-03-10T14:08:04.009 INFO:tasks.workunit.client.1.vm04.stdout:1/677: chown d3/d22/d2f/d57/f78 2 1 2026-03-10T14:08:04.010 INFO:tasks.workunit.client.1.vm04.stdout:1/678: chown d3/d5c/d79/d98 6480 1 2026-03-10T14:08:04.017 INFO:tasks.workunit.client.1.vm04.stdout:3/699: symlink da/d30/lf0 0 2026-03-10T14:08:04.019 INFO:tasks.workunit.client.1.vm04.stdout:4/641: truncate d4/d14/d3c/d5e/f79 598927 0 2026-03-10T14:08:04.021 INFO:tasks.workunit.client.0.vm03.stdout:8/190: dread da/f1b [0,4194304] 0 2026-03-10T14:08:04.025 INFO:tasks.workunit.client.1.vm04.stdout:8/763: chown d0/d3/d63/d12/d51/l49 3416897 1 2026-03-10T14:08:04.026 INFO:tasks.workunit.client.1.vm04.stdout:8/764: chown d0/d3/d63/d29/fce 287 1 2026-03-10T14:08:04.026 INFO:tasks.workunit.client.1.vm04.stdout:8/765: fsync d0/d3/d63/d29/fe6 0 2026-03-10T14:08:04.027 INFO:tasks.workunit.client.0.vm03.stdout:0/183: mknod d3/d16/c3a 0 2026-03-10T14:08:04.028 INFO:tasks.workunit.client.0.vm03.stdout:0/184: chown d3/fe 0 1 2026-03-10T14:08:04.040 INFO:tasks.workunit.client.0.vm03.stdout:3/119: rename c19 to d1d/c25 0 2026-03-10T14:08:04.040 INFO:tasks.workunit.client.0.vm03.stdout:3/120: fdatasync f1c 0 2026-03-10T14:08:04.058 INFO:tasks.workunit.client.0.vm03.stdout:0/185: creat d3/d11/d2c/f3b x:0 0 0 2026-03-10T14:08:04.059 
INFO:tasks.workunit.client.1.vm04.stdout:4/642: symlink d4/d14/d64/ldd 0 2026-03-10T14:08:04.065 INFO:tasks.workunit.client.1.vm04.stdout:8/766: stat d0/d3/d63/d12/d51/d67/d96 0 2026-03-10T14:08:04.066 INFO:tasks.workunit.client.0.vm03.stdout:3/121: fdatasync f10 0 2026-03-10T14:08:04.068 INFO:tasks.workunit.client.0.vm03.stdout:0/186: mkdir d3/d16/d21/d3c 0 2026-03-10T14:08:04.073 INFO:tasks.workunit.client.1.vm04.stdout:4/643: truncate d4/d14/d64/fab 851921 0 2026-03-10T14:08:04.075 INFO:tasks.workunit.client.0.vm03.stdout:8/191: creat da/f37 x:0 0 0 2026-03-10T14:08:04.079 INFO:tasks.workunit.client.1.vm04.stdout:4/644: symlink d4/df/lde 0 2026-03-10T14:08:04.081 INFO:tasks.workunit.client.1.vm04.stdout:4/645: fdatasync d4/d14/f27 0 2026-03-10T14:08:04.088 INFO:tasks.workunit.client.1.vm04.stdout:4/646: fdatasync d4/f77 0 2026-03-10T14:08:04.092 INFO:tasks.workunit.client.1.vm04.stdout:8/767: dread d0/d3/f80 [0,4194304] 0 2026-03-10T14:08:04.092 INFO:tasks.workunit.client.1.vm04.stdout:8/768: stat d0/d3/f35 0 2026-03-10T14:08:04.092 INFO:tasks.workunit.client.1.vm04.stdout:8/769: dread - d0/d3/d63/d12/d51/d67/f87 zero size 2026-03-10T14:08:04.099 INFO:tasks.workunit.client.0.vm03.stdout:8/192: link da/c10 da/d24/c38 0 2026-03-10T14:08:04.099 INFO:tasks.workunit.client.0.vm03.stdout:8/193: read da/f2e [2820781,67049] 0 2026-03-10T14:08:04.100 INFO:tasks.workunit.client.1.vm04.stdout:8/770: fsync d0/d3/d63/d12/d51/d67/d96/fbf 0 2026-03-10T14:08:04.100 INFO:tasks.workunit.client.1.vm04.stdout:8/771: stat d0/d3/d63/fc9 0 2026-03-10T14:08:04.107 INFO:tasks.workunit.client.1.vm04.stdout:5/756: write d7/d9/fe0 [993075,82344] 0 2026-03-10T14:08:04.108 INFO:tasks.workunit.client.1.vm04.stdout:5/757: readlink d7/d26/l3d 0 2026-03-10T14:08:04.112 INFO:tasks.workunit.client.1.vm04.stdout:8/772: creat d0/d3/d63/d12/d51/d67/ff0 x:0 0 0 2026-03-10T14:08:04.120 INFO:tasks.workunit.client.0.vm03.stdout:8/194: mknod da/c39 0 2026-03-10T14:08:04.120 
INFO:tasks.workunit.client.1.vm04.stdout:5/758: unlink d7/d26/d6b/d6e/fa3 0 2026-03-10T14:08:04.128 INFO:tasks.workunit.client.0.vm03.stdout:8/195: mkdir da/d3a 0 2026-03-10T14:08:04.129 INFO:tasks.workunit.client.1.vm04.stdout:6/564: write d3/de/d35/d3f/d2d/f98 [635641,43426] 0 2026-03-10T14:08:04.132 INFO:tasks.workunit.client.1.vm04.stdout:6/565: dwrite d3/f19 [4194304,4194304] 0 2026-03-10T14:08:04.135 INFO:tasks.workunit.client.1.vm04.stdout:8/773: rename d0/d3/d63/f5b to d0/d3/d63/ff1 0 2026-03-10T14:08:04.151 INFO:tasks.workunit.client.0.vm03.stdout:2/191: write f1 [4613247,10808] 0 2026-03-10T14:08:04.163 INFO:tasks.workunit.client.0.vm03.stdout:2/192: readlink d5/l11 0 2026-03-10T14:08:04.163 INFO:tasks.workunit.client.1.vm04.stdout:9/618: write d9/da/d8c/fbf [824194,117665] 0 2026-03-10T14:08:04.163 INFO:tasks.workunit.client.1.vm04.stdout:2/680: truncate d0/d14/d91/d8/d17/f73 2493805 0 2026-03-10T14:08:04.163 INFO:tasks.workunit.client.1.vm04.stdout:2/681: read - d0/db8/fca zero size 2026-03-10T14:08:04.163 INFO:tasks.workunit.client.1.vm04.stdout:0/677: write d0/d2/f9 [722106,86196] 0 2026-03-10T14:08:04.163 INFO:tasks.workunit.client.1.vm04.stdout:1/679: write d3/d22/d63/d35/f9a [4955189,90380] 0 2026-03-10T14:08:04.163 INFO:tasks.workunit.client.1.vm04.stdout:7/716: dwrite d2/dc/de/d2d/d5c/f63 [0,4194304] 0 2026-03-10T14:08:04.167 INFO:tasks.workunit.client.0.vm03.stdout:2/193: dread d5/f9 [0,4194304] 0 2026-03-10T14:08:04.167 INFO:tasks.workunit.client.1.vm04.stdout:3/700: dwrite da/d3e/f44 [4194304,4194304] 0 2026-03-10T14:08:04.167 INFO:tasks.workunit.client.0.vm03.stdout:2/194: truncate d5/ff 4535193 0 2026-03-10T14:08:04.169 INFO:tasks.workunit.client.0.vm03.stdout:2/195: creat d5/d10/d31/f38 x:0 0 0 2026-03-10T14:08:04.178 INFO:tasks.workunit.client.1.vm04.stdout:6/566: stat d3/de/d35/d3f/d2d/d32/d23/d47/f6e 0 2026-03-10T14:08:04.178 INFO:tasks.workunit.client.1.vm04.stdout:5/759: creat d7/d12/ff3 x:0 0 0 2026-03-10T14:08:04.182 
INFO:tasks.workunit.client.1.vm04.stdout:9/619: read - d9/fae zero size 2026-03-10T14:08:04.193 INFO:tasks.workunit.client.1.vm04.stdout:8/774: mknod d0/d3/d63/d12/d51/d8b/cf2 0 2026-03-10T14:08:04.193 INFO:tasks.workunit.client.1.vm04.stdout:8/775: chown d0 9458778 1 2026-03-10T14:08:04.202 INFO:tasks.workunit.client.1.vm04.stdout:7/717: dread d2/d94/f29 [0,4194304] 0 2026-03-10T14:08:04.226 INFO:tasks.workunit.client.0.vm03.stdout:5/229: creat d4/d35/f54 x:0 0 0 2026-03-10T14:08:04.228 INFO:tasks.workunit.client.1.vm04.stdout:6/567: mknod d3/de/d35/d3f/d2d/d32/cab 0 2026-03-10T14:08:04.231 INFO:tasks.workunit.client.0.vm03.stdout:5/230: read d4/d6/fb [5208077,45726] 0 2026-03-10T14:08:04.232 INFO:tasks.workunit.client.1.vm04.stdout:9/620: creat d9/d5c/d93/fd2 x:0 0 0 2026-03-10T14:08:04.234 INFO:tasks.workunit.client.1.vm04.stdout:8/776: truncate d0/d3/d63/d12/f50 2547418 0 2026-03-10T14:08:04.244 INFO:tasks.workunit.client.1.vm04.stdout:7/718: rename d2/dc/de/d2d/d60/f91 to d2/d2a/d42/d86/f104 0 2026-03-10T14:08:04.245 INFO:tasks.workunit.client.0.vm03.stdout:5/231: truncate d4/f37 102823 0 2026-03-10T14:08:04.245 INFO:tasks.workunit.client.0.vm03.stdout:8/196: dwrite f9 [0,4194304] 0 2026-03-10T14:08:04.246 INFO:tasks.workunit.client.0.vm03.stdout:2/196: write d5/f9 [3058331,98685] 0 2026-03-10T14:08:04.247 INFO:tasks.workunit.client.0.vm03.stdout:2/197: stat d5/d10/l1a 0 2026-03-10T14:08:04.255 INFO:tasks.workunit.client.1.vm04.stdout:1/680: dwrite d3/fa [0,4194304] 0 2026-03-10T14:08:04.259 INFO:tasks.workunit.client.1.vm04.stdout:4/647: dread d4/df/d31/f3d [4194304,4194304] 0 2026-03-10T14:08:04.260 INFO:tasks.workunit.client.1.vm04.stdout:0/678: creat d0/d2/d15/d22/d38/d56/dc1/dd4/fd9 x:0 0 0 2026-03-10T14:08:04.275 INFO:tasks.workunit.client.1.vm04.stdout:5/760: truncate d7/d12/d2b/f46 7438797 0 2026-03-10T14:08:04.275 INFO:tasks.workunit.client.1.vm04.stdout:5/761: stat d7/d26/d6b/d6e/f91 0 2026-03-10T14:08:04.278 
INFO:tasks.workunit.client.0.vm03.stdout:7/184: rename d5/d9/d14/d1c to d5/d9/d14/d26/d36 0 2026-03-10T14:08:04.297 INFO:tasks.workunit.client.0.vm03.stdout:1/183: rename d0/c15 to d0/d2/df/d16/d20/c3c 0 2026-03-10T14:08:04.297 INFO:tasks.workunit.client.0.vm03.stdout:6/158: rename d8/d1b to d8/d1b/d1c/d2e 22 2026-03-10T14:08:04.297 INFO:tasks.workunit.client.0.vm03.stdout:1/184: stat d0/d2/f1a 0 2026-03-10T14:08:04.298 INFO:tasks.workunit.client.0.vm03.stdout:1/185: fdatasync d0/f11 0 2026-03-10T14:08:04.303 INFO:tasks.workunit.client.0.vm03.stdout:7/185: read d5/d9/f1f [2500260,102982] 0 2026-03-10T14:08:04.307 INFO:tasks.workunit.client.1.vm04.stdout:2/682: write d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/f7b [1174497,115587] 0 2026-03-10T14:08:04.309 INFO:tasks.workunit.client.1.vm04.stdout:3/701: truncate f8 3145326 0 2026-03-10T14:08:04.320 INFO:tasks.workunit.client.0.vm03.stdout:9/147: rename d2/d14/l26 to d2/d29/l32 0 2026-03-10T14:08:04.329 INFO:tasks.workunit.client.1.vm04.stdout:3/702: dread da/dc/d3f/d61/f8c [0,4194304] 0 2026-03-10T14:08:04.329 INFO:tasks.workunit.client.1.vm04.stdout:3/703: dread - da/dc/d47/fea zero size 2026-03-10T14:08:04.345 INFO:tasks.workunit.client.1.vm04.stdout:0/679: truncate d0/f72 71401 0 2026-03-10T14:08:04.346 INFO:tasks.workunit.client.1.vm04.stdout:5/762: truncate d7/d12/f42 4407163 0 2026-03-10T14:08:04.347 INFO:tasks.workunit.client.1.vm04.stdout:5/763: readlink d7/d26/l43 0 2026-03-10T14:08:04.348 INFO:tasks.workunit.client.0.vm03.stdout:5/232: dwrite d4/f37 [0,4194304] 0 2026-03-10T14:08:04.348 INFO:tasks.workunit.client.1.vm04.stdout:5/764: truncate d7/d12/d2b/d93/fd9 1040263 0 2026-03-10T14:08:04.348 INFO:tasks.workunit.client.0.vm03.stdout:5/233: write d4/f17 [3039125,84487] 0 2026-03-10T14:08:04.352 INFO:tasks.workunit.client.0.vm03.stdout:8/197: getdents da 0 2026-03-10T14:08:04.352 INFO:tasks.workunit.client.0.vm03.stdout:8/198: stat da/l22 0 2026-03-10T14:08:04.354 INFO:tasks.workunit.client.0.vm03.stdout:7/186: 
dread d5/d9/f17 [4194304,4194304] 0 2026-03-10T14:08:04.364 INFO:tasks.workunit.client.0.vm03.stdout:4/174: rename d5/d9/f36 to d5/d9/db/d19/f3a 0 2026-03-10T14:08:04.439 INFO:tasks.workunit.client.1.vm04.stdout:1/681: write d3/d22/d63/d35/dd9/d13/d1a/f28 [9162530,62701] 0 2026-03-10T14:08:04.440 INFO:tasks.workunit.client.1.vm04.stdout:9/621: truncate d9/f4f 67855 0 2026-03-10T14:08:04.440 INFO:tasks.workunit.client.1.vm04.stdout:9/622: chown d9/d44/d59/f5a 387 1 2026-03-10T14:08:04.442 INFO:tasks.workunit.client.1.vm04.stdout:4/648: dwrite d4/d14/d3c/f41 [0,4194304] 0 2026-03-10T14:08:04.444 INFO:tasks.workunit.client.0.vm03.stdout:9/148: mkdir d2/d29/d33 0 2026-03-10T14:08:04.455 INFO:tasks.workunit.client.0.vm03.stdout:1/186: creat d0/d18/d3b/f3d x:0 0 0 2026-03-10T14:08:04.462 INFO:tasks.workunit.client.0.vm03.stdout:8/199: readlink da/d24/l2f 0 2026-03-10T14:08:04.465 INFO:tasks.workunit.client.1.vm04.stdout:6/568: link d3/de/d35/d3f/l15 d3/de/d35/d3a/d43/d4c/d5e/lac 0 2026-03-10T14:08:04.465 INFO:tasks.workunit.client.1.vm04.stdout:6/569: chown d3/f19 386581 1 2026-03-10T14:08:04.465 INFO:tasks.workunit.client.0.vm03.stdout:1/187: creat d0/d2/d34/f3e x:0 0 0 2026-03-10T14:08:04.466 INFO:tasks.workunit.client.0.vm03.stdout:1/188: write d0/d2/fe [1096986,17592] 0 2026-03-10T14:08:04.467 INFO:tasks.workunit.client.0.vm03.stdout:1/189: truncate d0/d2/df/d16/d20/f38 468137 0 2026-03-10T14:08:04.468 INFO:tasks.workunit.client.1.vm04.stdout:6/570: dwrite d3/f19 [4194304,4194304] 0 2026-03-10T14:08:04.470 INFO:tasks.workunit.client.1.vm04.stdout:1/682: write d3/d22/d63/d35/fe5 [5072155,120009] 0 2026-03-10T14:08:04.473 INFO:tasks.workunit.client.0.vm03.stdout:1/190: fsync d0/d2/f9 0 2026-03-10T14:08:04.477 INFO:tasks.workunit.client.0.vm03.stdout:1/191: creat d0/d2/df/d27/f3f x:0 0 0 2026-03-10T14:08:04.478 INFO:tasks.workunit.client.0.vm03.stdout:1/192: chown d0/d2/df/d16/d20 194 1 2026-03-10T14:08:04.479 INFO:tasks.workunit.client.1.vm04.stdout:5/765: chown 
d7/d2d/d69/db8 1 1 2026-03-10T14:08:04.479 INFO:tasks.workunit.client.0.vm03.stdout:9/149: sync 2026-03-10T14:08:04.479 INFO:tasks.workunit.client.1.vm04.stdout:5/766: truncate d7/d12/ff3 505058 0 2026-03-10T14:08:04.483 INFO:tasks.workunit.client.0.vm03.stdout:1/193: dwrite d0/fa [0,4194304] 0 2026-03-10T14:08:04.483 INFO:tasks.workunit.client.0.vm03.stdout:9/150: dwrite d2/d14/f25 [0,4194304] 0 2026-03-10T14:08:04.486 INFO:tasks.workunit.client.1.vm04.stdout:1/683: truncate d3/d22/d63/d35/dd9/fd 1531713 0 2026-03-10T14:08:04.487 INFO:tasks.workunit.client.0.vm03.stdout:1/194: readlink d0/d2/l2f 0 2026-03-10T14:08:04.492 INFO:tasks.workunit.client.0.vm03.stdout:9/151: dwrite d2/d14/d2b/f2d [0,4194304] 0 2026-03-10T14:08:04.500 INFO:tasks.workunit.client.0.vm03.stdout:9/152: mkdir d2/d14/d2b/d34 0 2026-03-10T14:08:04.502 INFO:tasks.workunit.client.1.vm04.stdout:7/719: getdents d2/dc/de/d2d/d60/d7c/d64 0 2026-03-10T14:08:04.502 INFO:tasks.workunit.client.1.vm04.stdout:6/571: dread d3/de/f46 [0,4194304] 0 2026-03-10T14:08:04.508 INFO:tasks.workunit.client.1.vm04.stdout:5/767: creat d7/d12/d2b/d3e/d3f/ff4 x:0 0 0 2026-03-10T14:08:04.512 INFO:tasks.workunit.client.0.vm03.stdout:9/153: fdatasync d2/f1e 0 2026-03-10T14:08:04.517 INFO:tasks.workunit.client.1.vm04.stdout:4/649: dread d4/df/db2/db4/fb8 [0,4194304] 0 2026-03-10T14:08:04.537 INFO:tasks.workunit.client.0.vm03.stdout:9/154: truncate d2/f15 5370500 0 2026-03-10T14:08:04.541 INFO:tasks.workunit.client.1.vm04.stdout:1/684: dread d3/d22/d63/f6f [0,4194304] 0 2026-03-10T14:08:04.569 INFO:tasks.workunit.client.1.vm04.stdout:1/685: creat d3/d22/d63/d35/dd9/d13/d38/db5/dc4/fe7 x:0 0 0 2026-03-10T14:08:04.569 INFO:tasks.workunit.client.1.vm04.stdout:1/686: readlink d3/d22/d63/d35/dd9/d13/d38/l99 0 2026-03-10T14:08:04.573 INFO:tasks.workunit.client.1.vm04.stdout:6/572: mkdir d3/de/d35/d3f/d2d/d32/d23/d83/dad 0 2026-03-10T14:08:04.581 INFO:tasks.workunit.client.1.vm04.stdout:4/650: truncate d4/df/d31/f5d 1827882 0 
2026-03-10T14:08:04.586 INFO:tasks.workunit.client.1.vm04.stdout:6/573: symlink d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/lae 0 2026-03-10T14:08:04.589 INFO:tasks.workunit.client.1.vm04.stdout:1/687: link d3/d5c/fb2 d3/d22/d63/d35/dd9/fe8 0 2026-03-10T14:08:04.591 INFO:tasks.workunit.client.1.vm04.stdout:4/651: dread d4/df/d34/f7c [0,4194304] 0 2026-03-10T14:08:04.593 INFO:tasks.workunit.client.0.vm03.stdout:7/187: creat d5/d9/d14/d21/d28/f37 x:0 0 0 2026-03-10T14:08:04.645 INFO:tasks.workunit.client.1.vm04.stdout:6/574: sync 2026-03-10T14:08:04.645 INFO:tasks.workunit.client.1.vm04.stdout:1/688: sync 2026-03-10T14:08:04.646 INFO:tasks.workunit.client.1.vm04.stdout:1/689: chown d3/d22/d63/d35/dd9/d13/d38/d58/d5b 1159140601 1 2026-03-10T14:08:04.648 INFO:tasks.workunit.client.1.vm04.stdout:1/690: mkdir d3/d22/d63/d35/dd9/d13/da0/de9 0 2026-03-10T14:08:04.649 INFO:tasks.workunit.client.1.vm04.stdout:1/691: chown d3/d22/d63/f7f 9 1 2026-03-10T14:08:04.649 INFO:tasks.workunit.client.1.vm04.stdout:1/692: readlink d3/d22/d63/d35/la9 0 2026-03-10T14:08:04.653 INFO:tasks.workunit.client.1.vm04.stdout:1/693: creat d3/d22/d63/d35/dd9/fea x:0 0 0 2026-03-10T14:08:04.672 INFO:tasks.workunit.client.1.vm04.stdout:8/777: write d0/d3/d63/d12/d51/f64 [1699279,57373] 0 2026-03-10T14:08:04.672 INFO:tasks.workunit.client.1.vm04.stdout:8/778: chown d0/d3/d63/d29/c7e 354865691 1 2026-03-10T14:08:04.675 INFO:tasks.workunit.client.1.vm04.stdout:8/779: mkdir d0/d3/d63/d12/d51/d67/d96/df3 0 2026-03-10T14:08:04.676 INFO:tasks.workunit.client.1.vm04.stdout:8/780: truncate d0/d3/dd/d78/f8d 1085775 0 2026-03-10T14:08:04.701 INFO:tasks.workunit.client.1.vm04.stdout:2/683: dwrite d0/d14/d91/f24 [0,4194304] 0 2026-03-10T14:08:04.708 INFO:tasks.workunit.client.1.vm04.stdout:2/684: unlink d0/d14/d91/d4a/f7c 0 2026-03-10T14:08:04.709 INFO:tasks.workunit.client.1.vm04.stdout:2/685: write d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/f7b [495273,128500] 0 2026-03-10T14:08:04.717 
INFO:tasks.workunit.client.0.vm03.stdout:6/159: link d8/d11/d18/l1a d8/l2f 0 2026-03-10T14:08:04.725 INFO:tasks.workunit.client.0.vm03.stdout:6/160: sync 2026-03-10T14:08:04.726 INFO:tasks.workunit.client.0.vm03.stdout:6/161: write f2 [9083877,15878] 0 2026-03-10T14:08:04.743 INFO:tasks.workunit.client.1.vm04.stdout:2/686: dread d0/d14/d91/d8/f6 [0,4194304] 0 2026-03-10T14:08:04.749 INFO:tasks.workunit.client.1.vm04.stdout:0/680: write d0/fb0 [750182,64277] 0 2026-03-10T14:08:04.754 INFO:tasks.workunit.client.1.vm04.stdout:2/687: dread d0/d14/d91/d3a/d3e/f61 [0,4194304] 0 2026-03-10T14:08:04.754 INFO:tasks.workunit.client.1.vm04.stdout:2/688: chown d0/d14/d91/d4a/fa6 0 1 2026-03-10T14:08:04.767 INFO:tasks.workunit.client.1.vm04.stdout:9/623: write d9/d58/db5/f67 [30679,45023] 0 2026-03-10T14:08:04.769 INFO:tasks.workunit.client.1.vm04.stdout:0/681: link d0/d2/d15/d22/l69 d0/d2/d15/d22/d38/d56/dc1/lda 0 2026-03-10T14:08:04.773 INFO:tasks.workunit.client.1.vm04.stdout:9/624: write d9/d44/d59/f5a [1002696,128223] 0 2026-03-10T14:08:04.784 INFO:tasks.workunit.client.0.vm03.stdout:3/122: rename f10 to d1d/f26 0 2026-03-10T14:08:04.786 INFO:tasks.workunit.client.1.vm04.stdout:9/625: mkdir d9/dd3 0 2026-03-10T14:08:04.792 INFO:tasks.workunit.client.0.vm03.stdout:0/187: rename d3/d17/c26 to d3/d11/d25/c3d 0 2026-03-10T14:08:04.798 INFO:tasks.workunit.client.1.vm04.stdout:9/626: mknod d9/d58/cd4 0 2026-03-10T14:08:04.800 INFO:tasks.workunit.client.0.vm03.stdout:9/155: rmdir d2/d14/d2b 39 2026-03-10T14:08:04.808 INFO:tasks.workunit.client.1.vm04.stdout:9/627: mknod d9/d5c/d93/d96/d9d/cd5 0 2026-03-10T14:08:04.811 INFO:tasks.workunit.client.0.vm03.stdout:0/188: rmdir d3/d11/d2c 39 2026-03-10T14:08:04.818 INFO:tasks.workunit.client.1.vm04.stdout:9/628: truncate f5 10587825 0 2026-03-10T14:08:04.819 INFO:tasks.workunit.client.1.vm04.stdout:7/720: write d2/dc/d4d/dcd/fb0 [53551,87075] 0 2026-03-10T14:08:04.819 INFO:tasks.workunit.client.1.vm04.stdout:7/721: chown d2/dc/de/f1e 0 
1 2026-03-10T14:08:04.820 INFO:tasks.workunit.client.1.vm04.stdout:7/722: chown d2/dc/de/d2d/d60/d7c/f8f 381713 1 2026-03-10T14:08:04.821 INFO:tasks.workunit.client.1.vm04.stdout:7/723: write d2/dc/de/d2d/d38/d50/fbc [362979,24910] 0 2026-03-10T14:08:04.826 INFO:tasks.workunit.client.0.vm03.stdout:9/156: chown d2/l10 19 1 2026-03-10T14:08:04.850 INFO:tasks.workunit.client.1.vm04.stdout:3/704: rename da/dc/d35/dcd/dde/dac/lee to da/dc/lf1 0 2026-03-10T14:08:04.851 INFO:tasks.workunit.client.1.vm04.stdout:3/705: chown da/dc/d3f/d54/d66/f99 6 1 2026-03-10T14:08:04.853 INFO:tasks.workunit.client.1.vm04.stdout:3/706: dread da/d30/f55 [4194304,4194304] 0 2026-03-10T14:08:04.857 INFO:tasks.workunit.client.1.vm04.stdout:5/768: dwrite d7/d9/db5/fd1 [0,4194304] 0 2026-03-10T14:08:04.884 INFO:tasks.workunit.client.1.vm04.stdout:5/769: sync 2026-03-10T14:08:04.884 INFO:tasks.workunit.client.1.vm04.stdout:5/770: read d7/d2d/d76/f8f [2954344,58077] 0 2026-03-10T14:08:04.885 INFO:tasks.workunit.client.1.vm04.stdout:5/771: readlink d7/d26/l39 0 2026-03-10T14:08:04.889 INFO:tasks.workunit.client.0.vm03.stdout:2/198: rename d5/d10/d17/f2c to d5/d10/f39 0 2026-03-10T14:08:04.894 INFO:tasks.workunit.client.1.vm04.stdout:3/707: truncate da/dc/d35/dcd/dde/dac/fe8 1521352 0 2026-03-10T14:08:04.911 INFO:tasks.workunit.client.1.vm04.stdout:3/708: mkdir da/dc/d35/dcd/dde/df2 0 2026-03-10T14:08:04.914 INFO:tasks.workunit.client.1.vm04.stdout:3/709: dwrite da/d3e/f4c [4194304,4194304] 0 2026-03-10T14:08:04.918 INFO:tasks.workunit.client.1.vm04.stdout:3/710: sync 2026-03-10T14:08:04.928 INFO:tasks.workunit.client.1.vm04.stdout:8/781: read d0/d3/d63/d12/d51/d67/f6e [144301,99987] 0 2026-03-10T14:08:04.928 INFO:tasks.workunit.client.1.vm04.stdout:8/782: stat d0/d3/d63/d12/l5c 0 2026-03-10T14:08:04.930 INFO:tasks.workunit.client.1.vm04.stdout:5/772: unlink d7/d59/d7e/d87/c97 0 2026-03-10T14:08:04.931 INFO:tasks.workunit.client.0.vm03.stdout:4/175: write d5/d9/f11 [1292331,112488] 0 
2026-03-10T14:08:04.941 INFO:tasks.workunit.client.0.vm03.stdout:5/234: rename d4/d6/de/f32 to d4/f55 0 2026-03-10T14:08:04.943 INFO:tasks.workunit.client.0.vm03.stdout:5/235: truncate d4/d16/f1c 5006259 0 2026-03-10T14:08:04.946 INFO:tasks.workunit.client.0.vm03.stdout:5/236: dwrite d4/f17 [4194304,4194304] 0 2026-03-10T14:08:04.948 INFO:tasks.workunit.client.0.vm03.stdout:4/176: mknod d5/d9/db/c3b 0 2026-03-10T14:08:04.953 INFO:tasks.workunit.client.1.vm04.stdout:4/652: write d4/d14/d6d/f8a [548480,77406] 0 2026-03-10T14:08:04.964 INFO:tasks.workunit.client.1.vm04.stdout:8/783: rename d0/d3/d5/fb9 to d0/d3/dd/d76/ff4 0 2026-03-10T14:08:04.968 INFO:tasks.workunit.client.0.vm03.stdout:8/200: rename da/f2a to da/d36/f3b 0 2026-03-10T14:08:04.982 INFO:tasks.workunit.client.0.vm03.stdout:4/177: unlink d5/d9/d2b/f23 0 2026-03-10T14:08:04.985 INFO:tasks.workunit.client.1.vm04.stdout:4/653: dread d4/d14/f27 [0,4194304] 0 2026-03-10T14:08:04.986 INFO:tasks.workunit.client.1.vm04.stdout:4/654: dread - d4/d14/d64/f71 zero size 2026-03-10T14:08:04.994 INFO:tasks.workunit.client.1.vm04.stdout:3/711: creat da/dc/d3f/d54/ff3 x:0 0 0 2026-03-10T14:08:04.995 INFO:tasks.workunit.client.1.vm04.stdout:3/712: fsync da/d3e/fe0 0 2026-03-10T14:08:05.003 INFO:tasks.workunit.client.0.vm03.stdout:1/195: rename d0/d2/l2b to d0/d18/d1d/l40 0 2026-03-10T14:08:05.011 INFO:tasks.workunit.client.1.vm04.stdout:8/784: mkdir d0/d3/d63/d12/df5 0 2026-03-10T14:08:05.011 INFO:tasks.workunit.client.1.vm04.stdout:6/575: write d3/de/d35/d3a/d43/d4c/d5e/d76/f94 [312750,91840] 0 2026-03-10T14:08:05.017 INFO:tasks.workunit.client.0.vm03.stdout:5/237: creat d4/d40/d4e/f56 x:0 0 0 2026-03-10T14:08:05.018 INFO:tasks.workunit.client.0.vm03.stdout:5/238: write d4/d6/fa [2327283,65309] 0 2026-03-10T14:08:05.020 INFO:tasks.workunit.client.1.vm04.stdout:4/655: mknod d4/df/db2/db4/d47/d4f/d8c/cdf 0 2026-03-10T14:08:05.021 INFO:tasks.workunit.client.0.vm03.stdout:4/178: dread d5/d9/db/f10 [0,4194304] 0 
2026-03-10T14:08:05.024 INFO:tasks.workunit.client.0.vm03.stdout:6/162: rename d8/d1b/c25 to d8/d1b/d1c/c30 0 2026-03-10T14:08:05.025 INFO:tasks.workunit.client.0.vm03.stdout:1/196: mkdir d0/d2/df/d16/d41 0 2026-03-10T14:08:05.025 INFO:tasks.workunit.client.0.vm03.stdout:4/179: dread d5/d9/f31 [0,4194304] 0 2026-03-10T14:08:05.027 INFO:tasks.workunit.client.0.vm03.stdout:7/188: creat d5/d9/d14/d26/f38 x:0 0 0 2026-03-10T14:08:05.043 INFO:tasks.workunit.client.1.vm04.stdout:4/656: chown d4/df/d34/d6f/cc7 3 1 2026-03-10T14:08:05.045 INFO:tasks.workunit.client.0.vm03.stdout:6/163: creat d8/d1b/f31 x:0 0 0 2026-03-10T14:08:05.058 INFO:tasks.workunit.client.0.vm03.stdout:5/239: creat d4/d53/f57 x:0 0 0 2026-03-10T14:08:05.059 INFO:tasks.workunit.client.0.vm03.stdout:6/164: fsync f3 0 2026-03-10T14:08:05.059 INFO:tasks.workunit.client.0.vm03.stdout:5/240: readlink d4/d16/d19/d23/d3f/l49 0 2026-03-10T14:08:05.059 INFO:tasks.workunit.client.0.vm03.stdout:6/165: chown d8/d1b/d1c/c1d 11932364 1 2026-03-10T14:08:05.060 INFO:tasks.workunit.client.0.vm03.stdout:1/197: mkdir d0/d42 0 2026-03-10T14:08:05.063 INFO:tasks.workunit.client.0.vm03.stdout:5/241: dread d4/d6/fa [0,4194304] 0 2026-03-10T14:08:05.066 INFO:tasks.workunit.client.0.vm03.stdout:7/189: write d5/d9/f22 [1471744,22902] 0 2026-03-10T14:08:05.073 INFO:tasks.workunit.client.0.vm03.stdout:4/180: creat d5/f3c x:0 0 0 2026-03-10T14:08:05.073 INFO:tasks.workunit.client.0.vm03.stdout:4/181: write d5/d9/f16 [4407124,103915] 0 2026-03-10T14:08:05.073 INFO:tasks.workunit.client.1.vm04.stdout:1/694: mkdir d3/d22/deb 0 2026-03-10T14:08:05.078 INFO:tasks.workunit.client.1.vm04.stdout:6/576: rename d3/c7 to d3/de/d35/d3a/d43/d4c/d5e/caf 0 2026-03-10T14:08:05.080 INFO:tasks.workunit.client.1.vm04.stdout:1/695: creat d3/d22/d63/d35/d6c/fec x:0 0 0 2026-03-10T14:08:05.081 INFO:tasks.workunit.client.1.vm04.stdout:1/696: dread d3/d22/d63/f6f [0,4194304] 0 2026-03-10T14:08:05.089 INFO:tasks.workunit.client.1.vm04.stdout:6/577: dread 
d3/de/d35/d3f/d2d/d32/d23/f33 [4194304,4194304] 0 2026-03-10T14:08:05.107 INFO:tasks.workunit.client.1.vm04.stdout:6/578: truncate d3/d1d/f55 654940 0 2026-03-10T14:08:05.110 INFO:tasks.workunit.client.1.vm04.stdout:6/579: dread d3/de/d35/d3f/d2d/f89 [0,4194304] 0 2026-03-10T14:08:05.116 INFO:tasks.workunit.client.0.vm03.stdout:4/182: creat d5/d9/db/f3d x:0 0 0 2026-03-10T14:08:05.117 INFO:tasks.workunit.client.0.vm03.stdout:4/183: dread d5/d9/f31 [0,4194304] 0 2026-03-10T14:08:05.119 INFO:tasks.workunit.client.0.vm03.stdout:5/242: rename d4/f29 to d4/d13/d43/f58 0 2026-03-10T14:08:05.120 INFO:tasks.workunit.client.0.vm03.stdout:6/166: mkdir d8/db/d2c/d2d/d32 0 2026-03-10T14:08:05.122 INFO:tasks.workunit.client.1.vm04.stdout:6/580: rename d3/l39 to d3/de/d35/d3f/d2d/d32/d23/lb0 0 2026-03-10T14:08:05.123 INFO:tasks.workunit.client.0.vm03.stdout:4/184: mknod d5/d9/c3e 0 2026-03-10T14:08:05.126 INFO:tasks.workunit.client.1.vm04.stdout:6/581: fsync d3/d1d/d73/fa 0 2026-03-10T14:08:05.135 INFO:tasks.workunit.client.1.vm04.stdout:6/582: dread d3/d1d/d73/f2b [0,4194304] 0 2026-03-10T14:08:05.135 INFO:tasks.workunit.client.1.vm04.stdout:6/583: write d3/f19 [4693987,63955] 0 2026-03-10T14:08:05.136 INFO:tasks.workunit.client.1.vm04.stdout:6/584: chown d3/f19 14 1 2026-03-10T14:08:05.146 INFO:tasks.workunit.client.0.vm03.stdout:6/167: unlink d8/d1b/f23 0 2026-03-10T14:08:05.146 INFO:tasks.workunit.client.0.vm03.stdout:1/198: rmdir d0/d2/df/d16/d20/d2e 0 2026-03-10T14:08:05.156 INFO:tasks.workunit.client.1.vm04.stdout:6/585: mknod d3/de/d35/d3f/d2d/d32/d23/d24/cb1 0 2026-03-10T14:08:05.156 INFO:tasks.workunit.client.0.vm03.stdout:4/185: mkdir d5/d9/d3f 0 2026-03-10T14:08:05.158 INFO:tasks.workunit.client.1.vm04.stdout:6/586: write d3/de/d35/d3f/d2d/f9a [2376997,1160] 0 2026-03-10T14:08:05.159 INFO:tasks.workunit.client.0.vm03.stdout:4/186: dread d5/d9/db/f28 [0,4194304] 0 2026-03-10T14:08:05.168 INFO:tasks.workunit.client.0.vm03.stdout:6/168: chown d8/l9 645 1 
2026-03-10T14:08:05.170 INFO:tasks.workunit.client.1.vm04.stdout:6/587: unlink d3/de/d35/d3f/d2d/d32/d23/d47/f6e 0 2026-03-10T14:08:05.177 INFO:tasks.workunit.client.1.vm04.stdout:5/773: dread d7/d12/f42 [0,4194304] 0 2026-03-10T14:08:05.195 INFO:tasks.workunit.client.1.vm04.stdout:5/774: rmdir d7/d12/d2b/d3e/d3f/dc0 39 2026-03-10T14:08:05.197 INFO:tasks.workunit.client.0.vm03.stdout:6/169: mknod d8/db/d12/c33 0 2026-03-10T14:08:05.209 INFO:tasks.workunit.client.1.vm04.stdout:6/588: dread d3/de/d35/d3a/d43/d4c/f4d [0,4194304] 0 2026-03-10T14:08:05.211 INFO:tasks.workunit.client.1.vm04.stdout:7/724: truncate d2/dc/de/d2d/d60/d7c/d44/ff5 456184 0 2026-03-10T14:08:05.213 INFO:tasks.workunit.client.1.vm04.stdout:5/775: creat d7/d2d/ff5 x:0 0 0 2026-03-10T14:08:05.215 INFO:tasks.workunit.client.0.vm03.stdout:4/187: symlink d5/d9/d3f/l40 0 2026-03-10T14:08:05.217 INFO:tasks.workunit.client.1.vm04.stdout:2/689: write d0/d14/d91/d8/d17/d35/f81 [183531,42174] 0 2026-03-10T14:08:05.220 INFO:tasks.workunit.client.1.vm04.stdout:0/682: dwrite d0/d2/d15/d22/f30 [0,4194304] 0 2026-03-10T14:08:05.222 INFO:tasks.workunit.client.1.vm04.stdout:2/690: dwrite d0/d14/d91/d4a/d66/f72 [8388608,4194304] 0 2026-03-10T14:08:05.237 INFO:tasks.workunit.client.1.vm04.stdout:7/725: creat d2/dc/de/d2d/d60/d7c/f105 x:0 0 0 2026-03-10T14:08:05.238 INFO:tasks.workunit.client.1.vm04.stdout:7/726: write d2/dc/de/d2d/d60/d7c/f105 [367213,112551] 0 2026-03-10T14:08:05.241 INFO:tasks.workunit.client.1.vm04.stdout:7/727: dwrite d2/dc/de/f12 [4194304,4194304] 0 2026-03-10T14:08:05.243 INFO:tasks.workunit.client.1.vm04.stdout:7/728: chown d2/dc/de/d11/de7 126021 1 2026-03-10T14:08:05.245 INFO:tasks.workunit.client.0.vm03.stdout:1/199: creat d0/d2/df/f43 x:0 0 0 2026-03-10T14:08:05.252 INFO:tasks.workunit.client.0.vm03.stdout:0/189: getdents d3/d11/d25 0 2026-03-10T14:08:05.258 INFO:tasks.workunit.client.0.vm03.stdout:1/200: dread d0/f10 [0,4194304] 0 2026-03-10T14:08:05.261 
INFO:tasks.workunit.client.0.vm03.stdout:1/201: dwrite d0/d2/d34/f3a [0,4194304] 0 2026-03-10T14:08:05.262 INFO:tasks.workunit.client.1.vm04.stdout:9/629: write d9/da/dd/d1c/f3b [72042,74365] 0 2026-03-10T14:08:05.263 INFO:tasks.workunit.client.0.vm03.stdout:3/123: dread d1d/f26 [0,4194304] 0 2026-03-10T14:08:05.266 INFO:tasks.workunit.client.0.vm03.stdout:0/190: creat d3/d16/f3e x:0 0 0 2026-03-10T14:08:05.266 INFO:tasks.workunit.client.0.vm03.stdout:0/191: fdatasync d3/d16/f3e 0 2026-03-10T14:08:05.268 INFO:tasks.workunit.client.0.vm03.stdout:9/157: write d2/d14/f1b [1065432,58258] 0 2026-03-10T14:08:05.268 INFO:tasks.workunit.client.0.vm03.stdout:9/158: write d2/d14/f27 [764821,60248] 0 2026-03-10T14:08:05.274 INFO:tasks.workunit.client.0.vm03.stdout:0/192: dread d3/f1e [0,4194304] 0 2026-03-10T14:08:05.280 INFO:tasks.workunit.client.1.vm04.stdout:7/729: symlink d2/dc/de/d2d/d60/d7c/d64/l106 0 2026-03-10T14:08:05.300 INFO:tasks.workunit.client.1.vm04.stdout:2/691: getdents d0/d14/d39/d47/d93 0 2026-03-10T14:08:05.300 INFO:tasks.workunit.client.1.vm04.stdout:6/589: link d3/de/d35/d3f/d2d/d32/d23/d24/f36 d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fb2 0 2026-03-10T14:08:05.300 INFO:tasks.workunit.client.1.vm04.stdout:7/730: write d2/d2a/d42/d86/fbb [1490899,40935] 0 2026-03-10T14:08:05.300 INFO:tasks.workunit.client.0.vm03.stdout:1/202: dwrite d0/f10 [0,4194304] 0 2026-03-10T14:08:05.300 INFO:tasks.workunit.client.0.vm03.stdout:2/199: dwrite d5/d10/d1c/f1d [0,4194304] 0 2026-03-10T14:08:05.300 INFO:tasks.workunit.client.0.vm03.stdout:9/159: creat d2/d29/f35 x:0 0 0 2026-03-10T14:08:05.300 INFO:tasks.workunit.client.0.vm03.stdout:9/160: write d2/d14/f1b [1826448,80364] 0 2026-03-10T14:08:05.301 INFO:tasks.workunit.client.1.vm04.stdout:7/731: dwrite d2/dc/de/d2d/d38/f37 [4194304,4194304] 0 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: Active manager daemon vm04.ywwcto restarted 2026-03-10T14:08:05.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: Activating manager daemon vm04.ywwcto 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: osdmap e40: 6 total, 6 up, 6 in 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: mgrmap e21: vm04.ywwcto(active, starting, since 0.0157253s) 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/crt"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:08:05.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr metadata", "who": "vm04.ywwcto", "id": "vm04.ywwcto"}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:08:05.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:08:05.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:05 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:08:05.315 INFO:tasks.workunit.client.1.vm04.stdout:9/630: sync 2026-03-10T14:08:05.315 INFO:tasks.workunit.client.1.vm04.stdout:9/631: stat d9/d5c/f6c 0 2026-03-10T14:08:05.322 INFO:tasks.workunit.client.1.vm04.stdout:2/692: dread - d0/d14/d91/d8/d17/d4e/d85/f88 zero size 2026-03-10T14:08:05.330 INFO:tasks.workunit.client.1.vm04.stdout:6/590: rename d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f9b to d3/de/d35/d3f/d2d/d32/d9e/fb3 0 2026-03-10T14:08:05.336 INFO:tasks.workunit.client.1.vm04.stdout:7/732: symlink d2/dc/de/d11/de7/l107 0 2026-03-10T14:08:05.338 INFO:tasks.workunit.client.1.vm04.stdout:9/632: mkdir d9/da/d5d/dd6 0 2026-03-10T14:08:05.339 INFO:tasks.workunit.client.1.vm04.stdout:8/785: write d0/d3/d63/d12/f2c [716240,45687] 0 2026-03-10T14:08:05.340 INFO:tasks.workunit.client.1.vm04.stdout:8/786: write d0/d3/d63/d29/fce [1097264,107191] 0 2026-03-10T14:08:05.342 INFO:tasks.workunit.client.1.vm04.stdout:3/713: write da/dc/d35/d52/f6f [43064,103954] 0 2026-03-10T14:08:05.342 INFO:tasks.workunit.client.1.vm04.stdout:3/714: stat da/dc/d3f/f85 0 2026-03-10T14:08:05.344 INFO:tasks.workunit.client.1.vm04.stdout:2/693: rename d0/d14/d91/f9 to d0/d14/d91/d4a/d8c/dab/db3/fd6 0 2026-03-10T14:08:05.346 INFO:tasks.workunit.client.1.vm04.stdout:6/591: dread - d3/d1d/f6c zero size 2026-03-10T14:08:05.348 INFO:tasks.workunit.client.0.vm03.stdout:0/193: dread d3/f28 [0,4194304] 0 2026-03-10T14:08:05.349 INFO:tasks.workunit.client.1.vm04.stdout:4/657: write d4/df/f4e [2728058,51266] 0 2026-03-10T14:08:05.349 
INFO:tasks.workunit.client.0.vm03.stdout:0/194: dread d3/f1e [0,4194304] 0 2026-03-10T14:08:05.349 INFO:tasks.workunit.client.0.vm03.stdout:0/195: chown d3/d16/d21/d3c 17 1 2026-03-10T14:08:05.351 INFO:tasks.workunit.client.1.vm04.stdout:7/733: mkdir d2/dc/de/d2d/d60/d7c/d64/d108 0 2026-03-10T14:08:05.352 INFO:tasks.workunit.client.1.vm04.stdout:8/787: sync 2026-03-10T14:08:05.356 INFO:tasks.workunit.client.0.vm03.stdout:3/124: write f18 [973948,87387] 0 2026-03-10T14:08:05.356 INFO:tasks.workunit.client.0.vm03.stdout:6/170: truncate f3 1044274 0 2026-03-10T14:08:05.357 INFO:tasks.workunit.client.0.vm03.stdout:1/203: rename d0/d2/cd to d0/d18/d3b/c44 0 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: Active manager daemon vm04.ywwcto restarted 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: Activating manager daemon vm04.ywwcto 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: osdmap e40: 6 total, 6 up, 6 in 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: mgrmap e21: vm04.ywwcto(active, starting, since 0.0157253s) 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.? 
192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/crt"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: 
from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr metadata", "who": "vm04.ywwcto", "id": "vm04.ywwcto"}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:08:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:05 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:08:05.359 INFO:tasks.workunit.client.1.vm04.stdout:3/715: creat da/d3e/ff4 x:0 0 0 2026-03-10T14:08:05.362 INFO:tasks.workunit.client.1.vm04.stdout:1/697: write 
d3/d22/d63/d35/dd9/d13/d38/d58/d5b/f7c [1338330,98013] 0 2026-03-10T14:08:05.363 INFO:tasks.workunit.client.1.vm04.stdout:1/698: chown d3/d22/d63/d35/dd9/d13/da0/cab 27 1 2026-03-10T14:08:05.368 INFO:tasks.workunit.client.1.vm04.stdout:4/658: symlink d4/df/d34/d6f/le0 0 2026-03-10T14:08:05.379 INFO:tasks.workunit.client.0.vm03.stdout:0/196: mknod d3/d16/d21/c3f 0 2026-03-10T14:08:05.379 INFO:tasks.workunit.client.0.vm03.stdout:3/125: symlink d1d/l27 0 2026-03-10T14:08:05.379 INFO:tasks.workunit.client.0.vm03.stdout:3/126: dwrite fe [0,4194304] 0 2026-03-10T14:08:05.379 INFO:tasks.workunit.client.1.vm04.stdout:7/734: creat d2/d2a/f109 x:0 0 0 2026-03-10T14:08:05.379 INFO:tasks.workunit.client.1.vm04.stdout:9/633: creat d9/dd3/fd7 x:0 0 0 2026-03-10T14:08:05.379 INFO:tasks.workunit.client.1.vm04.stdout:9/634: stat d9/da/dd/l21 0 2026-03-10T14:08:05.379 INFO:tasks.workunit.client.1.vm04.stdout:3/716: read da/dc/d47/fc8 [151989,30668] 0 2026-03-10T14:08:05.381 INFO:tasks.workunit.client.0.vm03.stdout:5/243: dwrite d4/d13/d43/f58 [4194304,4194304] 0 2026-03-10T14:08:05.384 INFO:tasks.workunit.client.0.vm03.stdout:2/200: rename d5/f36 to d5/d10/d1f/f3a 0 2026-03-10T14:08:05.384 INFO:tasks.workunit.client.0.vm03.stdout:2/201: chown d5/d2a 17143 1 2026-03-10T14:08:05.387 INFO:tasks.workunit.client.1.vm04.stdout:7/735: dread d2/dc/de/dae/fb2 [0,4194304] 0 2026-03-10T14:08:05.391 INFO:tasks.workunit.client.1.vm04.stdout:7/736: stat d2/dc/de/d2d/d60/d7c/d36/d8b/fde 0 2026-03-10T14:08:05.391 INFO:tasks.workunit.client.0.vm03.stdout:1/204: dread - d0/d18/f21 zero size 2026-03-10T14:08:05.391 INFO:tasks.workunit.client.0.vm03.stdout:1/205: stat d0/d2/df/d16/l19 0 2026-03-10T14:08:05.392 INFO:tasks.workunit.client.0.vm03.stdout:9/161: mknod d2/d29/d33/c36 0 2026-03-10T14:08:05.394 INFO:tasks.workunit.client.1.vm04.stdout:4/659: rmdir d4/df/db2/db4/d47/d70 39 2026-03-10T14:08:05.399 INFO:tasks.workunit.client.1.vm04.stdout:9/635: dread - d9/d58/fba zero size 
2026-03-10T14:08:05.399 INFO:tasks.workunit.client.0.vm03.stdout:8/201: mkdir da/d3c 0 2026-03-10T14:08:05.402 INFO:tasks.workunit.client.0.vm03.stdout:6/171: link d8/db/df/f10 d8/d11/d18/f34 0 2026-03-10T14:08:05.403 INFO:tasks.workunit.client.1.vm04.stdout:7/737: readlink d2/dc/l1c 0 2026-03-10T14:08:05.403 INFO:tasks.workunit.client.0.vm03.stdout:5/244: symlink d4/d53/l59 0 2026-03-10T14:08:05.404 INFO:tasks.workunit.client.0.vm03.stdout:3/127: rename f18 to d1d/f28 0 2026-03-10T14:08:05.404 INFO:tasks.workunit.client.1.vm04.stdout:1/699: mkdir d3/d22/d63/ded 0 2026-03-10T14:08:05.405 INFO:tasks.workunit.client.0.vm03.stdout:7/190: dwrite d5/d9/d14/d26/f33 [0,4194304] 0 2026-03-10T14:08:05.408 INFO:tasks.workunit.client.0.vm03.stdout:6/172: dwrite d8/db/f1f [4194304,4194304] 0 2026-03-10T14:08:05.414 INFO:tasks.workunit.client.0.vm03.stdout:0/197: sync 2026-03-10T14:08:05.415 INFO:tasks.workunit.client.0.vm03.stdout:3/128: dwrite f1c [0,4194304] 0 2026-03-10T14:08:05.418 INFO:tasks.workunit.client.1.vm04.stdout:1/700: mkdir d3/d22/d63/d35/dd9/d13/d38/db5/dc4/dee 0 2026-03-10T14:08:05.433 INFO:tasks.workunit.client.1.vm04.stdout:9/636: creat d9/d5c/dc2/fd8 x:0 0 0 2026-03-10T14:08:05.436 INFO:tasks.workunit.client.1.vm04.stdout:3/717: link da/d30/l3a da/dc/d3f/d61/dc1/lf5 0 2026-03-10T14:08:05.438 INFO:tasks.workunit.client.0.vm03.stdout:6/173: stat d8/d11/d18/f21 0 2026-03-10T14:08:05.439 INFO:tasks.workunit.client.0.vm03.stdout:1/206: symlink d0/d42/l45 0 2026-03-10T14:08:05.439 INFO:tasks.workunit.client.1.vm04.stdout:5/776: write d7/f11 [3155118,4116] 0 2026-03-10T14:08:05.439 INFO:tasks.workunit.client.0.vm03.stdout:6/174: chown d8/d1b/f31 1148225 1 2026-03-10T14:08:05.439 INFO:tasks.workunit.client.1.vm04.stdout:5/777: write d7/d2d/f64 [2691931,5321] 0 2026-03-10T14:08:05.440 INFO:tasks.workunit.client.0.vm03.stdout:1/207: write d0/d18/d3b/f3d [242335,47377] 0 2026-03-10T14:08:05.440 INFO:tasks.workunit.client.0.vm03.stdout:6/175: stat d8/d11/d18/f34 0 
2026-03-10T14:08:05.441 INFO:tasks.workunit.client.1.vm04.stdout:1/701: creat d3/d5c/fef x:0 0 0 2026-03-10T14:08:05.442 INFO:tasks.workunit.client.0.vm03.stdout:4/188: write d5/d9/db/f2a [146628,55056] 0 2026-03-10T14:08:05.443 INFO:tasks.workunit.client.1.vm04.stdout:9/637: chown d9/l9a 1891155783 1 2026-03-10T14:08:05.448 INFO:tasks.workunit.client.1.vm04.stdout:0/683: dwrite d0/d6e/f8b [0,4194304] 0 2026-03-10T14:08:05.467 INFO:tasks.workunit.client.1.vm04.stdout:3/718: rename da/dc/d35/fdb to da/dc/d35/d37/ff6 0 2026-03-10T14:08:05.468 INFO:tasks.workunit.client.0.vm03.stdout:8/202: fsync da/d24/f32 0 2026-03-10T14:08:05.470 INFO:tasks.workunit.client.1.vm04.stdout:6/592: dwrite d3/d1d/d73/fc [0,4194304] 0 2026-03-10T14:08:05.478 INFO:tasks.workunit.client.0.vm03.stdout:3/129: mkdir d1d/d29 0 2026-03-10T14:08:05.478 INFO:tasks.workunit.client.1.vm04.stdout:5/778: truncate d7/d2d/d76/f84 4389266 0 2026-03-10T14:08:05.481 INFO:tasks.workunit.client.0.vm03.stdout:0/198: mknod d3/d16/d21/c40 0 2026-03-10T14:08:05.481 INFO:tasks.workunit.client.0.vm03.stdout:1/208: rename d0/f17 to d0/d2/f46 0 2026-03-10T14:08:05.482 INFO:tasks.workunit.client.1.vm04.stdout:9/638: creat d9/da/dd/fd9 x:0 0 0 2026-03-10T14:08:05.482 INFO:tasks.workunit.client.0.vm03.stdout:3/130: dwrite f1c [0,4194304] 0 2026-03-10T14:08:05.483 INFO:tasks.workunit.client.0.vm03.stdout:3/131: chown d1d/f26 228428 1 2026-03-10T14:08:05.487 INFO:tasks.workunit.client.0.vm03.stdout:4/189: mknod d5/d9/c41 0 2026-03-10T14:08:05.488 INFO:tasks.workunit.client.0.vm03.stdout:9/162: creat d2/f37 x:0 0 0 2026-03-10T14:08:05.488 INFO:tasks.workunit.client.1.vm04.stdout:0/684: unlink d0/d2/d15/d22/d38/d56/d66/f2c 0 2026-03-10T14:08:05.490 INFO:tasks.workunit.client.0.vm03.stdout:8/203: truncate da/d24/f25 4591949 0 2026-03-10T14:08:05.490 INFO:tasks.workunit.client.1.vm04.stdout:3/719: rmdir da/d8e/db5 39 2026-03-10T14:08:05.491 INFO:tasks.workunit.client.1.vm04.stdout:0/685: dread d0/d2/d15/d22/f88 [0,4194304] 0 
2026-03-10T14:08:05.493 INFO:tasks.workunit.client.0.vm03.stdout:9/163: dread d2/f2c [0,4194304] 0 2026-03-10T14:08:05.497 INFO:tasks.workunit.client.1.vm04.stdout:6/593: chown d3/d1d/ca2 62 1 2026-03-10T14:08:05.506 INFO:tasks.workunit.client.0.vm03.stdout:0/199: write d3/d11/d2c/f3b [212985,96101] 0 2026-03-10T14:08:05.506 INFO:tasks.workunit.client.1.vm04.stdout:9/639: mkdir d9/d58/dda 0 2026-03-10T14:08:05.506 INFO:tasks.workunit.client.1.vm04.stdout:1/702: rename d3/d22/d63/d35/dd9/d13/d38/l7e to d3/d5c/d79/lf0 0 2026-03-10T14:08:05.506 INFO:tasks.workunit.client.1.vm04.stdout:3/720: mkdir da/dc/d3f/d61/df7 0 2026-03-10T14:08:05.506 INFO:tasks.workunit.client.1.vm04.stdout:6/594: rename d3/de/d35/d3f/d2d/d32/d23/d4e to d3/de/d35/d3f/d2d/d32/d23/d4e/db4 22 2026-03-10T14:08:05.506 INFO:tasks.workunit.client.1.vm04.stdout:9/640: creat d9/d44/d70/fdb x:0 0 0 2026-03-10T14:08:05.507 INFO:tasks.workunit.client.0.vm03.stdout:3/132: dwrite d1d/f28 [0,4194304] 0 2026-03-10T14:08:05.520 INFO:tasks.workunit.client.0.vm03.stdout:4/190: mknod d5/c42 0 2026-03-10T14:08:05.521 INFO:tasks.workunit.client.1.vm04.stdout:0/686: mkdir d0/d2/d15/d49/d50/d61/ddb 0 2026-03-10T14:08:05.521 INFO:tasks.workunit.client.0.vm03.stdout:4/191: write d5/f7 [1410298,15603] 0 2026-03-10T14:08:05.525 INFO:tasks.workunit.client.1.vm04.stdout:2/694: write d0/d14/d91/d8/d17/d4e/d85/f89 [1686137,37708] 0 2026-03-10T14:08:05.528 INFO:tasks.workunit.client.1.vm04.stdout:8/788: fsync d0/d3/dd/d89/fa9 0 2026-03-10T14:08:05.529 INFO:tasks.workunit.client.0.vm03.stdout:1/209: truncate d0/f11 1295818 0 2026-03-10T14:08:05.531 INFO:tasks.workunit.client.1.vm04.stdout:0/687: read d0/d2/d15/d22/d38/d56/dc1/fc3 [1776438,65484] 0 2026-03-10T14:08:05.536 INFO:tasks.workunit.client.0.vm03.stdout:2/202: write d5/d10/d1c/f2e [387681,42407] 0 2026-03-10T14:08:05.537 INFO:tasks.workunit.client.0.vm03.stdout:1/210: rmdir d0/d2/df/d27 39 2026-03-10T14:08:05.538 INFO:tasks.workunit.client.0.vm03.stdout:1/211: dread - 
d0/d18/f21 zero size 2026-03-10T14:08:05.540 INFO:tasks.workunit.client.0.vm03.stdout:8/204: dread f2 [0,4194304] 0 2026-03-10T14:08:05.542 INFO:tasks.workunit.client.1.vm04.stdout:3/721: dread da/f22 [0,4194304] 0 2026-03-10T14:08:05.544 INFO:tasks.workunit.client.1.vm04.stdout:4/660: dwrite d4/d14/fa8 [0,4194304] 0 2026-03-10T14:08:05.547 INFO:tasks.workunit.client.1.vm04.stdout:4/661: fsync d4/df/db2/db4/fdb 0 2026-03-10T14:08:05.549 INFO:tasks.workunit.client.1.vm04.stdout:8/789: dread d0/d3/d63/d12/d69/f9d [0,4194304] 0 2026-03-10T14:08:05.549 INFO:tasks.workunit.client.0.vm03.stdout:8/205: dwrite f5 [0,4194304] 0 2026-03-10T14:08:05.552 INFO:tasks.workunit.client.1.vm04.stdout:7/738: dwrite d2/dc/de/d2d/d38/f8a [0,4194304] 0 2026-03-10T14:08:05.554 INFO:tasks.workunit.client.0.vm03.stdout:1/212: rmdir d0/d2/d34 39 2026-03-10T14:08:05.557 INFO:tasks.workunit.client.1.vm04.stdout:0/688: mknod d0/d2/d90/cdc 0 2026-03-10T14:08:05.578 INFO:tasks.workunit.client.1.vm04.stdout:0/689: read d0/d6e/f8b [3414742,58519] 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.1.vm04.stdout:6/595: creat d3/de/d35/d3a/fb5 x:0 0 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.1.vm04.stdout:0/690: stat d0/d2/fa 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.1.vm04.stdout:7/739: dread d2/dc/de/f1e [4194304,4194304] 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.1.vm04.stdout:7/740: fsync d2/dc/de/d2d/d38/d50/fa2 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.1.vm04.stdout:4/662: mkdir d4/df/db2/de1 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.0.vm03.stdout:4/192: dread d5/d9/f25 [0,4194304] 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.0.vm03.stdout:4/193: read d5/fe [3521164,32178] 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.0.vm03.stdout:4/194: chown d5/f7 1279019805 1 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.0.vm03.stdout:1/213: chown d0/d2/df/d27/f32 4177747 1 2026-03-10T14:08:05.579 
INFO:tasks.workunit.client.0.vm03.stdout:2/203: link d5/d10/l1b d5/d35/l3b 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.0.vm03.stdout:5/245: rmdir d4/d35 39 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.0.vm03.stdout:1/214: dwrite d0/f2c [0,4194304] 0 2026-03-10T14:08:05.579 INFO:tasks.workunit.client.0.vm03.stdout:3/133: sync 2026-03-10T14:08:05.584 INFO:tasks.workunit.client.0.vm03.stdout:3/134: dwrite d1d/f28 [0,4194304] 0 2026-03-10T14:08:05.588 INFO:tasks.workunit.client.0.vm03.stdout:4/195: symlink d5/d9/db/d19/d34/l43 0 2026-03-10T14:08:05.591 INFO:tasks.workunit.client.0.vm03.stdout:3/135: write fc [938143,56166] 0 2026-03-10T14:08:05.594 INFO:tasks.workunit.client.0.vm03.stdout:8/206: creat da/d24/f3d x:0 0 0 2026-03-10T14:08:05.595 INFO:tasks.workunit.client.0.vm03.stdout:2/204: rmdir d5/d10/d1c 39 2026-03-10T14:08:05.604 INFO:tasks.workunit.client.0.vm03.stdout:5/246: dread d4/d16/f1c [0,4194304] 0 2026-03-10T14:08:05.609 INFO:tasks.workunit.client.1.vm04.stdout:7/741: rmdir d2/d9f 39 2026-03-10T14:08:05.609 INFO:tasks.workunit.client.1.vm04.stdout:7/742: write d2/dc/de/d2d/d60/faf [3975709,54260] 0 2026-03-10T14:08:05.609 INFO:tasks.workunit.client.1.vm04.stdout:7/743: write d2/dc/de/d2d/d38/d50/fbc [2990984,40319] 0 2026-03-10T14:08:05.613 INFO:tasks.workunit.client.1.vm04.stdout:6/596: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/d71/fb6 x:0 0 0 2026-03-10T14:08:05.618 INFO:tasks.workunit.client.0.vm03.stdout:5/247: sync 2026-03-10T14:08:05.621 INFO:tasks.workunit.client.1.vm04.stdout:1/703: dread d3/d22/f2b [0,4194304] 0 2026-03-10T14:08:05.624 INFO:tasks.workunit.client.0.vm03.stdout:1/215: rename d0/d18/d3b/c44 to d0/d2/df/d16/c47 0 2026-03-10T14:08:05.627 INFO:tasks.workunit.client.1.vm04.stdout:7/744: fdatasync d2/dc/f26 0 2026-03-10T14:08:05.633 INFO:tasks.workunit.client.0.vm03.stdout:5/248: creat d4/d6/de/f5a x:0 0 0 2026-03-10T14:08:05.633 INFO:tasks.workunit.client.1.vm04.stdout:6/597: creat d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/fb7 
x:0 0 0 2026-03-10T14:08:05.633 INFO:tasks.workunit.client.1.vm04.stdout:1/704: creat d3/d22/d63/d35/dd9/d13/d38/ff1 x:0 0 0 2026-03-10T14:08:05.633 INFO:tasks.workunit.client.0.vm03.stdout:5/249: dwrite d4/d13/d43/f58 [4194304,4194304] 0 2026-03-10T14:08:05.634 INFO:tasks.workunit.client.1.vm04.stdout:1/705: fdatasync d3/d22/d63/d35/dd9/d13/d38/db5/dc4/fe7 0 2026-03-10T14:08:05.658 INFO:tasks.workunit.client.1.vm04.stdout:4/663: creat d4/fe2 x:0 0 0 2026-03-10T14:08:05.673 INFO:tasks.workunit.client.1.vm04.stdout:6/598: mknod d3/de/d35/d3a/d43/d4c/d5e/cb8 0 2026-03-10T14:08:05.675 INFO:tasks.workunit.client.1.vm04.stdout:1/706: fsync d3/d22/d63/faa 0 2026-03-10T14:08:05.676 INFO:tasks.workunit.client.0.vm03.stdout:9/164: dread d2/fc [0,4194304] 0 2026-03-10T14:08:05.677 INFO:tasks.workunit.client.0.vm03.stdout:4/196: getdents d5/d9/d3f 0 2026-03-10T14:08:05.680 INFO:tasks.workunit.client.0.vm03.stdout:1/216: creat d0/f48 x:0 0 0 2026-03-10T14:08:05.681 INFO:tasks.workunit.client.0.vm03.stdout:6/176: truncate f5 2833859 0 2026-03-10T14:08:05.683 INFO:tasks.workunit.client.0.vm03.stdout:1/217: dwrite d0/d18/f25 [0,4194304] 0 2026-03-10T14:08:05.685 INFO:tasks.workunit.client.1.vm04.stdout:5/779: dwrite d7/d59/d7e/d87/fe2 [0,4194304] 0 2026-03-10T14:08:05.685 INFO:tasks.workunit.client.0.vm03.stdout:1/218: chown d0/f2c 45930 1 2026-03-10T14:08:05.688 INFO:tasks.workunit.client.1.vm04.stdout:9/641: write d9/d5c/d93/d96/f9c [4260845,76744] 0 2026-03-10T14:08:05.691 INFO:tasks.workunit.client.0.vm03.stdout:0/200: dwrite d3/f10 [4194304,4194304] 0 2026-03-10T14:08:05.692 INFO:tasks.workunit.client.1.vm04.stdout:9/642: dwrite d9/d44/d70/fdb [0,4194304] 0 2026-03-10T14:08:05.703 INFO:tasks.workunit.client.1.vm04.stdout:3/722: dwrite da/dc/d35/dcd/dde/dac/fc9 [0,4194304] 0 2026-03-10T14:08:05.704 INFO:tasks.workunit.client.1.vm04.stdout:8/790: dwrite d0/d3/d63/d12/d51/d67/d96/dc8/dcf/fda [0,4194304] 0 2026-03-10T14:08:05.704 INFO:tasks.workunit.client.1.vm04.stdout:8/791: 
chown d0/d3/d63/d12/d51/d67/d96 215226941 1 2026-03-10T14:08:05.715 INFO:tasks.workunit.client.1.vm04.stdout:4/664: truncate d4/f77 946897 0 2026-03-10T14:08:05.718 INFO:tasks.workunit.client.1.vm04.stdout:0/691: dwrite d0/d2/d15/d49/d50/d61/d75/fc4 [0,4194304] 0 2026-03-10T14:08:05.720 INFO:tasks.workunit.client.0.vm03.stdout:3/136: rename c6 to d1d/c2a 0 2026-03-10T14:08:05.720 INFO:tasks.workunit.client.0.vm03.stdout:8/207: rmdir da/d24 39 2026-03-10T14:08:05.720 INFO:tasks.workunit.client.1.vm04.stdout:0/692: truncate d0/d2/d25/f2a 5447704 0 2026-03-10T14:08:05.725 INFO:tasks.workunit.client.0.vm03.stdout:9/165: mkdir d2/d29/d38 0 2026-03-10T14:08:05.728 INFO:tasks.workunit.client.0.vm03.stdout:2/205: write d5/d10/f39 [3214434,116529] 0 2026-03-10T14:08:05.729 INFO:tasks.workunit.client.1.vm04.stdout:6/599: creat d3/de/fb9 x:0 0 0 2026-03-10T14:08:05.729 INFO:tasks.workunit.client.0.vm03.stdout:9/166: dwrite d2/d14/f27 [0,4194304] 0 2026-03-10T14:08:05.729 INFO:tasks.workunit.client.1.vm04.stdout:6/600: dread - d3/de/d35/d3f/d2d/d38/d40/fa5 zero size 2026-03-10T14:08:05.729 INFO:tasks.workunit.client.0.vm03.stdout:9/167: truncate d2/f1e 1142478 0 2026-03-10T14:08:05.730 INFO:tasks.workunit.client.1.vm04.stdout:6/601: truncate d3/de/d35/d3a/fb5 28849 0 2026-03-10T14:08:05.731 INFO:tasks.workunit.client.1.vm04.stdout:7/745: write d2/dc/f25 [216821,52645] 0 2026-03-10T14:08:05.731 INFO:tasks.workunit.client.1.vm04.stdout:6/602: write d3/d1d/d73/f2b [38749,122847] 0 2026-03-10T14:08:05.732 INFO:tasks.workunit.client.1.vm04.stdout:7/746: chown d2/dc/de/d2d/d60/d7c/d64/d108 63 1 2026-03-10T14:08:05.732 INFO:tasks.workunit.client.1.vm04.stdout:6/603: chown d3/de/d35/d3f/d2d/d32/d5c/f75 2 1 2026-03-10T14:08:05.737 INFO:tasks.workunit.client.0.vm03.stdout:1/219: chown d0/d2/d34 134 1 2026-03-10T14:08:05.762 INFO:tasks.workunit.client.0.vm03.stdout:2/206: fdatasync d5/d10/f13 0 2026-03-10T14:08:05.766 INFO:tasks.workunit.client.0.vm03.stdout:2/207: dwrite d5/ff 
[0,4194304] 0 2026-03-10T14:08:05.773 INFO:tasks.workunit.client.0.vm03.stdout:8/208: fsync da/f20 0 2026-03-10T14:08:05.785 INFO:tasks.workunit.client.1.vm04.stdout:9/643: creat d9/d5c/fdc x:0 0 0 2026-03-10T14:08:05.786 INFO:tasks.workunit.client.1.vm04.stdout:4/665: fsync d4/df/d31/fa7 0 2026-03-10T14:08:05.786 INFO:tasks.workunit.client.0.vm03.stdout:5/250: link d4/f55 d4/d40/f5b 0 2026-03-10T14:08:05.786 INFO:tasks.workunit.client.0.vm03.stdout:8/209: write da/d24/f3d [797315,8324] 0 2026-03-10T14:08:05.786 INFO:tasks.workunit.client.0.vm03.stdout:5/251: dwrite d4/d13/f4b [0,4194304] 0 2026-03-10T14:08:05.787 INFO:tasks.workunit.client.1.vm04.stdout:0/693: unlink d0/d2/d25/c82 0 2026-03-10T14:08:05.788 INFO:tasks.workunit.client.0.vm03.stdout:5/252: dread - d4/d13/d43/f48 zero size 2026-03-10T14:08:05.789 INFO:tasks.workunit.client.0.vm03.stdout:9/168: creat d2/d14/f39 x:0 0 0 2026-03-10T14:08:05.789 INFO:tasks.workunit.client.0.vm03.stdout:0/201: link d3/d11/d2c/c38 d3/d16/d21/c41 0 2026-03-10T14:08:05.789 INFO:tasks.workunit.client.1.vm04.stdout:1/707: unlink d3/d22/d63/d35/dd9/l11 0 2026-03-10T14:08:05.789 INFO:tasks.workunit.client.0.vm03.stdout:9/169: truncate d2/f1e 1730968 0 2026-03-10T14:08:05.790 INFO:tasks.workunit.client.1.vm04.stdout:7/747: dread - d2/d94/f9a zero size 2026-03-10T14:08:05.792 INFO:tasks.workunit.client.0.vm03.stdout:0/202: fdatasync d3/f1e 0 2026-03-10T14:08:05.793 INFO:tasks.workunit.client.1.vm04.stdout:3/723: mkdir da/ded/df8 0 2026-03-10T14:08:05.794 INFO:tasks.workunit.client.0.vm03.stdout:8/210: mknod da/d36/c3e 0 2026-03-10T14:08:05.795 INFO:tasks.workunit.client.1.vm04.stdout:4/666: mknod d4/d14/d3c/d62/ce3 0 2026-03-10T14:08:05.796 INFO:tasks.workunit.client.1.vm04.stdout:0/694: rename d0/d2/d15/d49/d50/d61/d75/fc4 to d0/d2/d15/d49/d50/fdd 0 2026-03-10T14:08:05.796 INFO:tasks.workunit.client.0.vm03.stdout:0/203: mknod d3/d16/c42 0 2026-03-10T14:08:05.797 INFO:tasks.workunit.client.0.vm03.stdout:0/204: write d3/f10 
[3463165,87761] 0 2026-03-10T14:08:05.798 INFO:tasks.workunit.client.0.vm03.stdout:0/205: chown d3/f9 672083287 1 2026-03-10T14:08:05.798 INFO:tasks.workunit.client.1.vm04.stdout:6/604: mkdir d3/de/d35/d3f/dba 0 2026-03-10T14:08:05.799 INFO:tasks.workunit.client.0.vm03.stdout:9/170: mkdir d2/d29/d38/d3a 0 2026-03-10T14:08:05.800 INFO:tasks.workunit.client.1.vm04.stdout:5/780: link d7/d26/d6b/d6e/lbe d7/d12/d2b/d93/d9e/lf6 0 2026-03-10T14:08:05.802 INFO:tasks.workunit.client.0.vm03.stdout:8/211: creat da/d3c/f3f x:0 0 0 2026-03-10T14:08:05.804 INFO:tasks.workunit.client.0.vm03.stdout:9/171: dwrite d2/d14/f1b [0,4194304] 0 2026-03-10T14:08:05.804 INFO:tasks.workunit.client.1.vm04.stdout:0/695: fsync d0/d2/d15/d49/d50/d61/f96 0 2026-03-10T14:08:05.805 INFO:tasks.workunit.client.1.vm04.stdout:6/605: dread - d3/de/d35/d3f/d2d/d38/f60 zero size 2026-03-10T14:08:05.805 INFO:tasks.workunit.client.1.vm04.stdout:6/606: fsync d3/f19 0 2026-03-10T14:08:05.805 INFO:tasks.workunit.client.0.vm03.stdout:9/172: read d2/d14/f25 [1659429,5001] 0 2026-03-10T14:08:05.806 INFO:tasks.workunit.client.1.vm04.stdout:3/724: creat da/d8e/db5/ff9 x:0 0 0 2026-03-10T14:08:05.807 INFO:tasks.workunit.client.0.vm03.stdout:8/212: readlink da/d24/l2c 0 2026-03-10T14:08:05.808 INFO:tasks.workunit.client.1.vm04.stdout:0/696: mkdir d0/d2/d15/d22/d38/d56/dcb/dde 0 2026-03-10T14:08:05.810 INFO:tasks.workunit.client.1.vm04.stdout:6/607: write d3/de/f6d [7991729,33156] 0 2026-03-10T14:08:05.816 INFO:tasks.workunit.client.0.vm03.stdout:9/173: chown d2/ce 1045405455 1 2026-03-10T14:08:05.816 INFO:tasks.workunit.client.0.vm03.stdout:8/213: mkdir da/d36/d40 0 2026-03-10T14:08:05.817 INFO:tasks.workunit.client.0.vm03.stdout:9/174: chown d2/c1f 0 1 2026-03-10T14:08:05.817 INFO:tasks.workunit.client.1.vm04.stdout:9/644: read d9/d58/db5/f1f [2944170,107015] 0 2026-03-10T14:08:05.819 INFO:tasks.workunit.client.1.vm04.stdout:0/697: rename d0/d2/d90/fb2 to d0/d2/d15/d49/d50/d5c/da4/fdf 0 2026-03-10T14:08:05.821 
INFO:tasks.workunit.client.0.vm03.stdout:8/214: mknod da/d3c/c41 0 2026-03-10T14:08:05.821 INFO:tasks.workunit.client.1.vm04.stdout:6/608: dread d3/de/d35/d3a/d43/d4c/d5e/d76/f94 [0,4194304] 0 2026-03-10T14:08:05.822 INFO:tasks.workunit.client.0.vm03.stdout:9/175: symlink d2/d29/d38/l3b 0 2026-03-10T14:08:05.823 INFO:tasks.workunit.client.1.vm04.stdout:5/781: link d7/d2d/fbf d7/d59/d7d/d9a/ff7 0 2026-03-10T14:08:05.823 INFO:tasks.workunit.client.0.vm03.stdout:8/215: creat da/d36/f42 x:0 0 0 2026-03-10T14:08:05.824 INFO:tasks.workunit.client.1.vm04.stdout:0/698: truncate d0/d2/f12 2227898 0 2026-03-10T14:08:05.825 INFO:tasks.workunit.client.0.vm03.stdout:9/176: creat d2/d29/d33/f3c x:0 0 0 2026-03-10T14:08:05.825 INFO:tasks.workunit.client.0.vm03.stdout:8/216: dread - da/f1f zero size 2026-03-10T14:08:05.826 INFO:tasks.workunit.client.1.vm04.stdout:5/782: rename d7/d9/l35 to d7/d59/lf8 0 2026-03-10T14:08:05.827 INFO:tasks.workunit.client.0.vm03.stdout:8/217: read - da/f30 zero size 2026-03-10T14:08:05.827 INFO:tasks.workunit.client.1.vm04.stdout:5/783: symlink d7/d12/d2b/d3e/d3f/da6/lf9 0 2026-03-10T14:08:05.829 INFO:tasks.workunit.client.1.vm04.stdout:6/609: getdents d3/de/d35/d3a 0 2026-03-10T14:08:05.831 INFO:tasks.workunit.client.1.vm04.stdout:0/699: getdents d0 0 2026-03-10T14:08:05.834 INFO:tasks.workunit.client.1.vm04.stdout:5/784: link d7/d59/d7e/d87/fe6 d7/d59/d7e/ffa 0 2026-03-10T14:08:05.834 INFO:tasks.workunit.client.1.vm04.stdout:5/785: rename d7/d2d/d69/c7a to d7/d12/d2b/d93/d9e/cfb 0 2026-03-10T14:08:05.835 INFO:tasks.workunit.client.1.vm04.stdout:5/786: unlink d7/d26/d6b/d6e/d82/ca8 0 2026-03-10T14:08:05.835 INFO:tasks.workunit.client.0.vm03.stdout:8/218: unlink da/f1b 0 2026-03-10T14:08:05.837 INFO:tasks.workunit.client.0.vm03.stdout:2/208: sync 2026-03-10T14:08:05.837 INFO:tasks.workunit.client.0.vm03.stdout:0/206: sync 2026-03-10T14:08:05.838 INFO:tasks.workunit.client.1.vm04.stdout:5/787: symlink d7/d26/d6b/d6e/lfc 0 2026-03-10T14:08:05.840 
INFO:tasks.workunit.client.1.vm04.stdout:5/788: unlink d7/d12/d45/f5c 0 2026-03-10T14:08:05.845 INFO:tasks.workunit.client.1.vm04.stdout:6/610: read d3/de/d35/d3a/d43/f90 [151086,70701] 0 2026-03-10T14:08:05.852 INFO:tasks.workunit.client.1.vm04.stdout:6/611: link d3/de/d35/d3f/d2d/fa6 d3/de/d35/d3f/d2d/d32/d23/d83/dad/fbb 0 2026-03-10T14:08:05.855 INFO:tasks.workunit.client.0.vm03.stdout:2/209: dread d5/d10/d17/f26 [0,4194304] 0 2026-03-10T14:08:05.855 INFO:tasks.workunit.client.1.vm04.stdout:6/612: mknod d3/d1d/d73/cbc 0 2026-03-10T14:08:05.856 INFO:tasks.workunit.client.1.vm04.stdout:6/613: fdatasync d3/de/d35/d3f/d2d/d32/d23/d24/f36 0 2026-03-10T14:08:05.858 INFO:tasks.workunit.client.0.vm03.stdout:2/210: rename d5/d10/d1c/f1d to d5/d10/d1c/f3c 0 2026-03-10T14:08:05.859 INFO:tasks.workunit.client.0.vm03.stdout:2/211: creat d5/d10/d31/f3d x:0 0 0 2026-03-10T14:08:05.859 INFO:tasks.workunit.client.1.vm04.stdout:6/614: mknod d3/de/d35/d3a/d43/cbd 0 2026-03-10T14:08:05.860 INFO:tasks.workunit.client.1.vm04.stdout:6/615: stat d3/de/d35/d3f/d2d/d38/f50 0 2026-03-10T14:08:05.862 INFO:tasks.workunit.client.0.vm03.stdout:2/212: fdatasync d5/d2a/f37 0 2026-03-10T14:08:05.865 INFO:tasks.workunit.client.1.vm04.stdout:0/700: dread d0/fb0 [0,4194304] 0 2026-03-10T14:08:05.868 INFO:tasks.workunit.client.1.vm04.stdout:6/616: symlink d3/de/d35/d3f/d2d/d32/d23/d83/dad/lbe 0 2026-03-10T14:08:05.869 INFO:tasks.workunit.client.1.vm04.stdout:6/617: rmdir d3/de/d35/d3f/d2d/d32/d23/d24/d8e 39 2026-03-10T14:08:05.928 INFO:tasks.workunit.client.0.vm03.stdout:4/197: dwrite d5/d9/db/f10 [0,4194304] 0 2026-03-10T14:08:05.931 INFO:tasks.workunit.client.0.vm03.stdout:6/177: write f3 [1702662,22634] 0 2026-03-10T14:08:05.934 INFO:tasks.workunit.client.0.vm03.stdout:3/137: truncate d1d/f28 1764590 0 2026-03-10T14:08:05.935 INFO:tasks.workunit.client.0.vm03.stdout:4/198: dwrite f3 [0,4194304] 0 2026-03-10T14:08:05.941 INFO:tasks.workunit.client.0.vm03.stdout:6/178: creat d8/d11/f35 x:0 0 0 
2026-03-10T14:08:05.941 INFO:tasks.workunit.client.0.vm03.stdout:4/199: truncate d5/d9/db/f32 676484 0 2026-03-10T14:08:05.945 INFO:tasks.workunit.client.0.vm03.stdout:4/200: truncate d5/d9/db/ff 759224 0 2026-03-10T14:08:05.945 INFO:tasks.workunit.client.0.vm03.stdout:4/201: chown f3 216120 1 2026-03-10T14:08:05.945 INFO:tasks.workunit.client.0.vm03.stdout:4/202: chown d5/d9/db/d19/c39 9269201 1 2026-03-10T14:08:05.946 INFO:tasks.workunit.client.0.vm03.stdout:3/138: getdents d1d 0 2026-03-10T14:08:05.946 INFO:tasks.workunit.client.0.vm03.stdout:4/203: fsync d5/d9/db/f32 0 2026-03-10T14:08:05.947 INFO:tasks.workunit.client.0.vm03.stdout:4/204: write d5/f7 [1691886,55347] 0 2026-03-10T14:08:05.952 INFO:tasks.workunit.client.0.vm03.stdout:6/179: symlink d8/l36 0 2026-03-10T14:08:05.961 INFO:tasks.workunit.client.0.vm03.stdout:1/220: dwrite d0/d2/f46 [0,4194304] 0 2026-03-10T14:08:05.963 INFO:tasks.workunit.client.0.vm03.stdout:4/205: creat d5/d9/f44 x:0 0 0 2026-03-10T14:08:05.968 INFO:tasks.workunit.client.0.vm03.stdout:6/180: creat d8/db/df/f37 x:0 0 0 2026-03-10T14:08:05.968 INFO:tasks.workunit.client.1.vm04.stdout:8/792: dwrite d0/d3/d63/d12/d51/d67/fb2 [4194304,4194304] 0 2026-03-10T14:08:05.969 INFO:tasks.workunit.client.0.vm03.stdout:3/139: truncate d1d/f26 983052 0 2026-03-10T14:08:05.972 INFO:tasks.workunit.client.0.vm03.stdout:6/181: dwrite d8/db/d12/f26 [0,4194304] 0 2026-03-10T14:08:05.972 INFO:tasks.workunit.client.0.vm03.stdout:4/206: dwrite d5/d9/db/f20 [4194304,4194304] 0 2026-03-10T14:08:05.990 INFO:tasks.workunit.client.0.vm03.stdout:1/221: creat d0/d2/df/d27/f49 x:0 0 0 2026-03-10T14:08:05.991 INFO:tasks.workunit.client.0.vm03.stdout:1/222: chown d0/d2/df/d16/f1e 1273365072 1 2026-03-10T14:08:05.991 INFO:tasks.workunit.client.1.vm04.stdout:8/793: unlink d0/d3/d63/d29/fbb 0 2026-03-10T14:08:05.996 INFO:tasks.workunit.client.0.vm03.stdout:3/140: creat d1d/f2b x:0 0 0 2026-03-10T14:08:05.999 INFO:tasks.workunit.client.0.vm03.stdout:1/223: dread 
d0/d2/df/d27/f32 [0,4194304] 0 2026-03-10T14:08:05.999 INFO:tasks.workunit.client.0.vm03.stdout:5/253: dwrite d4/f55 [0,4194304] 0 2026-03-10T14:08:06.009 INFO:tasks.workunit.client.1.vm04.stdout:8/794: rename d0/d82/fa0 to d0/dc1/def/ff6 0 2026-03-10T14:08:06.026 INFO:tasks.workunit.client.1.vm04.stdout:8/795: fdatasync d0/d3/dd/fc0 0 2026-03-10T14:08:06.026 INFO:tasks.workunit.client.1.vm04.stdout:1/708: dwrite d3/d5c/fb2 [0,4194304] 0 2026-03-10T14:08:06.026 INFO:tasks.workunit.client.1.vm04.stdout:7/748: dwrite d2/d94/f9a [0,4194304] 0 2026-03-10T14:08:06.026 INFO:tasks.workunit.client.1.vm04.stdout:4/667: dwrite d4/d14/d1b/f99 [0,4194304] 0 2026-03-10T14:08:06.031 INFO:tasks.workunit.client.0.vm03.stdout:4/207: creat d5/d9/db/d19/f45 x:0 0 0 2026-03-10T14:08:06.031 INFO:tasks.workunit.client.1.vm04.stdout:8/796: chown d0/l4e 98 1 2026-03-10T14:08:06.035 INFO:tasks.workunit.client.1.vm04.stdout:1/709: creat d3/d5c/ff2 x:0 0 0 2026-03-10T14:08:06.037 INFO:tasks.workunit.client.0.vm03.stdout:3/141: creat d1d/f2c x:0 0 0 2026-03-10T14:08:06.042 INFO:tasks.workunit.client.1.vm04.stdout:3/725: dwrite da/dc/f39 [0,4194304] 0 2026-03-10T14:08:06.042 INFO:tasks.workunit.client.0.vm03.stdout:5/254: sync 2026-03-10T14:08:06.042 INFO:tasks.workunit.client.0.vm03.stdout:5/255: read - d4/d6/de/f4f zero size 2026-03-10T14:08:06.043 INFO:tasks.workunit.client.0.vm03.stdout:5/256: truncate d4/d13/d43/f51 50841 0 2026-03-10T14:08:06.044 INFO:tasks.workunit.client.1.vm04.stdout:0/701: dread - d0/d2/d15/d49/d50/d5c/da4/fdf zero size 2026-03-10T14:08:06.049 INFO:tasks.workunit.client.0.vm03.stdout:1/224: unlink d0/d2/l6 0 2026-03-10T14:08:06.049 INFO:tasks.workunit.client.0.vm03.stdout:1/225: stat d0/f24 0 2026-03-10T14:08:06.049 INFO:tasks.workunit.client.1.vm04.stdout:4/668: creat d4/d14/d3c/d62/fe4 x:0 0 0 2026-03-10T14:08:06.050 INFO:tasks.workunit.client.1.vm04.stdout:9/645: write d9/da/d5d/f9b [1588388,10342] 0 2026-03-10T14:08:06.050 
INFO:tasks.workunit.client.1.vm04.stdout:4/669: fsync d4/d14/d3c/f41 0 2026-03-10T14:08:06.053 INFO:tasks.workunit.client.0.vm03.stdout:1/226: dwrite d0/f10 [0,4194304] 0 2026-03-10T14:08:06.057 INFO:tasks.workunit.client.1.vm04.stdout:1/710: mkdir d3/d22/d63/d35/dd9/d13/d38/df3 0 2026-03-10T14:08:06.068 INFO:tasks.workunit.client.0.vm03.stdout:9/177: rmdir d2/d29/d33 39 2026-03-10T14:08:06.071 INFO:tasks.workunit.client.0.vm03.stdout:8/219: getdents da 0 2026-03-10T14:08:06.071 INFO:tasks.workunit.client.0.vm03.stdout:9/178: dread d2/f1e [0,4194304] 0 2026-03-10T14:08:06.071 INFO:tasks.workunit.client.1.vm04.stdout:1/711: chown d3/d22/d63/d35/dd9/d13/d38/d58/dcc 5 1 2026-03-10T14:08:06.071 INFO:tasks.workunit.client.1.vm04.stdout:8/797: dread d0/d3/d63/d12/d51/d67/d96/fbf [0,4194304] 0 2026-03-10T14:08:06.071 INFO:tasks.workunit.client.1.vm04.stdout:5/789: getdents d7/d2d/d69 0 2026-03-10T14:08:06.072 INFO:tasks.workunit.client.0.vm03.stdout:3/142: dread fb [0,4194304] 0 2026-03-10T14:08:06.072 INFO:tasks.workunit.client.0.vm03.stdout:3/143: write d1d/f2c [189174,10341] 0 2026-03-10T14:08:06.072 INFO:tasks.workunit.client.0.vm03.stdout:3/144: fdatasync d1d/f2b 0 2026-03-10T14:08:06.072 INFO:tasks.workunit.client.0.vm03.stdout:5/257: dread - d4/d35/f54 zero size 2026-03-10T14:08:06.075 INFO:tasks.workunit.client.0.vm03.stdout:3/145: dwrite f1b [0,4194304] 0 2026-03-10T14:08:06.076 INFO:tasks.workunit.client.0.vm03.stdout:0/207: truncate d3/fe 2322022 0 2026-03-10T14:08:06.077 INFO:tasks.workunit.client.0.vm03.stdout:0/208: readlink d3/d17/l2f 0 2026-03-10T14:08:06.080 INFO:tasks.workunit.client.0.vm03.stdout:4/208: link d5/d9/db/d19/f3a d5/d9/d3f/f46 0 2026-03-10T14:08:06.084 INFO:tasks.workunit.client.1.vm04.stdout:9/646: creat d9/da/dd/d1c/fdd x:0 0 0 2026-03-10T14:08:06.084 INFO:tasks.workunit.client.1.vm04.stdout:4/670: mknod d4/df/db2/db6/dc9/dd0/ce5 0 2026-03-10T14:08:06.084 INFO:tasks.workunit.client.0.vm03.stdout:4/209: chown d5/d9/db/f10 350874 1 
2026-03-10T14:08:06.084 INFO:tasks.workunit.client.0.vm03.stdout:4/210: read f3 [573445,24164] 0
2026-03-10T14:08:06.088 INFO:tasks.workunit.client.1.vm04.stdout:3/726: read da/d30/f42 [392612,22916] 0
2026-03-10T14:08:06.091 INFO:tasks.workunit.client.1.vm04.stdout:1/712: unlink d3/d22/d2f/f39 0
2026-03-10T14:08:06.094 INFO:tasks.workunit.client.0.vm03.stdout:2/213: write d5/d10/d17/f33 [401950,65739] 0
2026-03-10T14:08:06.094 INFO:tasks.workunit.client.1.vm04.stdout:6/618: truncate d3/d1d/d73/fc 1970301 0
2026-03-10T14:08:06.098 INFO:tasks.workunit.client.1.vm04.stdout:0/702: symlink d0/d2/d15/le0 0
2026-03-10T14:08:06.099 INFO:tasks.workunit.client.0.vm03.stdout:5/258: rmdir d4/d13/d43 39
2026-03-10T14:08:06.100 INFO:tasks.workunit.client.0.vm03.stdout:5/259: chown d4/d16/d19/d23/d3f/l49 7148 1
2026-03-10T14:08:06.100 INFO:tasks.workunit.client.1.vm04.stdout:9/647: fsync d9/da/f6b 0
2026-03-10T14:08:06.102 INFO:tasks.workunit.client.1.vm04.stdout:4/671: mkdir d4/d14/d3c/d62/de6 0
2026-03-10T14:08:06.102 INFO:tasks.workunit.client.0.vm03.stdout:9/179: rmdir d2/d14 39
2026-03-10T14:08:06.103 INFO:tasks.workunit.client.1.vm04.stdout:3/727: dread - da/d3e/fb9 zero size
2026-03-10T14:08:06.103 INFO:tasks.workunit.client.0.vm03.stdout:5/260: dwrite d4/d16/f2d [0,4194304] 0
2026-03-10T14:08:06.107 INFO:tasks.workunit.client.1.vm04.stdout:3/728: dwrite da/d8e/db5/ff9 [0,4194304] 0
2026-03-10T14:08:06.108 INFO:tasks.workunit.client.0.vm03.stdout:5/261: dread d4/f37 [0,4194304] 0
2026-03-10T14:08:06.108 INFO:tasks.workunit.client.1.vm04.stdout:3/729: stat da/d30/fe9 0
2026-03-10T14:08:06.109 INFO:tasks.workunit.client.1.vm04.stdout:3/730: truncate da/dc/d47/fea 764819 0
2026-03-10T14:08:06.109 INFO:tasks.workunit.client.0.vm03.stdout:4/211: fdatasync d5/d9/d3f/f46 0
2026-03-10T14:08:06.117 INFO:tasks.workunit.client.0.vm03.stdout:1/227: symlink d0/d2/df/d16/d41/l4a 0
2026-03-10T14:08:06.125 INFO:tasks.workunit.client.0.vm03.stdout:2/214: creat d5/d10/d1f/f3e x:0 0 0
2026-03-10T14:08:06.125 INFO:tasks.workunit.client.0.vm03.stdout:2/215: fdatasync d5/d10/d17/f18 0
2026-03-10T14:08:06.129 INFO:tasks.workunit.client.1.vm04.stdout:9/648: creat d9/da/d5d/fde x:0 0 0
2026-03-10T14:08:06.135 INFO:tasks.workunit.client.1.vm04.stdout:4/672: truncate d4/d14/d1b/fc0 467449 0
2026-03-10T14:08:06.137 INFO:tasks.workunit.client.1.vm04.stdout:1/713: creat d3/d22/d63/d35/dd9/d13/d38/df3/ff4 x:0 0 0
2026-03-10T14:08:06.138 INFO:tasks.workunit.client.0.vm03.stdout:9/180: dread d2/d14/f1b [0,4194304] 0
2026-03-10T14:08:06.141 INFO:tasks.workunit.client.0.vm03.stdout:9/181: dread d2/d14/f1b [0,4194304] 0
2026-03-10T14:08:06.142 INFO:tasks.workunit.client.1.vm04.stdout:8/798: rename d0/d3/d63/d12/d51/d67/d96/l65 to d0/d3/d63/d12/lf7 0
2026-03-10T14:08:06.146 INFO:tasks.workunit.client.0.vm03.stdout:5/262: creat d4/d40/d4e/f5c x:0 0 0
2026-03-10T14:08:06.146 INFO:tasks.workunit.client.0.vm03.stdout:5/263: write d4/d6/de/f4f [613585,45004] 0
2026-03-10T14:08:06.154 INFO:tasks.workunit.client.1.vm04.stdout:3/731: dread da/dc/d35/d52/d6d/f75 [0,4194304] 0
2026-03-10T14:08:06.154 INFO:tasks.workunit.client.0.vm03.stdout:8/220: rename da/ff to da/d24/f43 0
2026-03-10T14:08:06.155 INFO:tasks.workunit.client.0.vm03.stdout:2/216: chown d5/d10/d17/c27 81550217 1
2026-03-10T14:08:06.159 INFO:tasks.workunit.client.0.vm03.stdout:2/217: dwrite d5/d10/f39 [4194304,4194304] 0
2026-03-10T14:08:06.159 INFO:tasks.workunit.client.1.vm04.stdout:0/703: rename d0/d2/d15/d22/lb5 to d0/d2/dbe/le1 0
2026-03-10T14:08:06.165 INFO:tasks.workunit.client.1.vm04.stdout:1/714: dread d3/d22/d2f/f34 [0,4194304] 0
2026-03-10T14:08:06.170 INFO:tasks.workunit.client.0.vm03.stdout:0/209: link d3/d11/l23 d3/d16/d21/l43 0
2026-03-10T14:08:06.170 INFO:tasks.workunit.client.0.vm03.stdout:0/210: stat d3/d16/d21/c40 0
2026-03-10T14:08:06.170 INFO:tasks.workunit.client.0.vm03.stdout:5/264: symlink d4/d6/de/l5d 0
2026-03-10T14:08:06.170 INFO:tasks.workunit.client.0.vm03.stdout:4/212: sync
2026-03-10T14:08:06.170 INFO:tasks.workunit.client.0.vm03.stdout:1/228: sync
2026-03-10T14:08:06.171 INFO:tasks.workunit.client.1.vm04.stdout:4/673: link d4/df/db2/db6/dc9/dd0/ce5 d4/df/db2/de1/ce7 0
2026-03-10T14:08:06.176 INFO:tasks.workunit.client.1.vm04.stdout:0/704: mknod d0/d2/d15/d49/d50/d61/ce2 0
2026-03-10T14:08:06.176 INFO:tasks.workunit.client.0.vm03.stdout:3/146: link l5 d1d/d29/l2d 0
2026-03-10T14:08:06.177 INFO:tasks.workunit.client.1.vm04.stdout:0/705: stat d0/d2/d15/d49/d50/d5c/fcd 0
2026-03-10T14:08:06.178 INFO:tasks.workunit.client.1.vm04.stdout:0/706: chown d0/d2/d15/d22/d38/d56/d66/cb6 96 1
2026-03-10T14:08:06.178 INFO:tasks.workunit.client.0.vm03.stdout:4/213: dread d5/d9/db/f10 [0,4194304] 0
2026-03-10T14:08:06.178 INFO:tasks.workunit.client.1.vm04.stdout:1/715: symlink d3/d22/d63/d35/lf5 0
2026-03-10T14:08:06.178 INFO:tasks.workunit.client.0.vm03.stdout:3/147: write fe [4708538,38219] 0
2026-03-10T14:08:06.179 INFO:tasks.workunit.client.1.vm04.stdout:0/707: write d0/d2/d15/d22/d38/d56/dc1/dd4/fd9 [663557,389] 0
2026-03-10T14:08:06.181 INFO:tasks.workunit.client.1.vm04.stdout:4/674: mknod d4/d14/dac/db7/ce8 0
2026-03-10T14:08:06.182 INFO:tasks.workunit.client.0.vm03.stdout:1/229: dwrite d0/d2/df/d27/f3f [0,4194304] 0
2026-03-10T14:08:06.182 INFO:tasks.workunit.client.1.vm04.stdout:4/675: chown d4/d14/d3c/d62/lcd 1310495 1
2026-03-10T14:08:06.183 INFO:tasks.workunit.client.0.vm03.stdout:1/230: rename d0/d2/df to d0/d2/df/d16/d41/d4b 22
2026-03-10T14:08:06.183 INFO:tasks.workunit.client.0.vm03.stdout:1/231: fsync d0/d2/df/d16/d20/f38 0
2026-03-10T14:08:06.183 INFO:tasks.workunit.client.1.vm04.stdout:1/716: creat d3/d22/d63/d35/dd9/d13/d38/db5/dc4/ff6 x:0 0 0
2026-03-10T14:08:06.184 INFO:tasks.workunit.client.1.vm04.stdout:1/717: write d3/d22/d6d/fe3 [763266,102106] 0
2026-03-10T14:08:06.186 INFO:tasks.workunit.client.1.vm04.stdout:0/708: rmdir d0/d2/d15/d22/d62 39
2026-03-10T14:08:06.186 INFO:tasks.workunit.client.0.vm03.stdout:1/232: write d0/d18/d3b/f3d [154694,66251] 0
2026-03-10T14:08:06.186 INFO:tasks.workunit.client.1.vm04.stdout:0/709: fdatasync d0/d2/d25/f2a 0
2026-03-10T14:08:06.186 INFO:tasks.workunit.client.0.vm03.stdout:4/214: dwrite d5/d9/db/f29 [4194304,4194304] 0
2026-03-10T14:08:06.187 INFO:tasks.workunit.client.1.vm04.stdout:0/710: stat d0/d2/dbe/fd7 0
2026-03-10T14:08:06.188 INFO:tasks.workunit.client.0.vm03.stdout:1/233: chown d0/d2/l37 595 1
2026-03-10T14:08:06.188 INFO:tasks.workunit.client.0.vm03.stdout:1/234: fdatasync d0/d2/df/d27/f49 0
2026-03-10T14:08:06.192 INFO:tasks.workunit.client.0.vm03.stdout:4/215: chown d5/d9/db/f2a 336996 1
2026-03-10T14:08:06.196 INFO:tasks.workunit.client.0.vm03.stdout:4/216: dwrite d5/f3c [0,4194304] 0
2026-03-10T14:08:06.201 INFO:tasks.workunit.client.0.vm03.stdout:4/217: stat d5/d9/f25 0
2026-03-10T14:08:06.210 INFO:tasks.workunit.client.0.vm03.stdout:4/218: stat d5/lc 0
2026-03-10T14:08:06.211 INFO:tasks.workunit.client.0.vm03.stdout:4/219: dwrite d5/d9/db/f24 [0,4194304] 0
2026-03-10T14:08:06.211 INFO:tasks.workunit.client.0.vm03.stdout:4/220: read f3 [3621944,101610] 0
2026-03-10T14:08:06.223 INFO:tasks.workunit.client.0.vm03.stdout:3/148: chown lf 1599385 1
2026-03-10T14:08:06.238 INFO:tasks.workunit.client.1.vm04.stdout:0/711: mknod d0/d2/d15/d22/ce3 0
2026-03-10T14:08:06.256 INFO:tasks.workunit.client.1.vm04.stdout:7/749: dwrite d2/dc/de/d2d/d60/d7c/d44/f51 [0,4194304] 0
2026-03-10T14:08:06.277 INFO:tasks.workunit.client.0.vm03.stdout:6/182: dread d8/db/df/f10 [0,4194304] 0
2026-03-10T14:08:06.278 INFO:tasks.workunit.client.0.vm03.stdout:9/182: creat d2/d14/f3d x:0 0 0
2026-03-10T14:08:06.278 INFO:tasks.workunit.client.0.vm03.stdout:9/183: stat d2/f2f 0
2026-03-10T14:08:06.280 INFO:tasks.workunit.client.0.vm03.stdout:6/183: dread d8/db/d12/f26 [0,4194304] 0
2026-03-10T14:08:06.283 INFO:tasks.workunit.client.0.vm03.stdout:1/235: creat d0/d18/d3b/f4c x:0 0 0
2026-03-10T14:08:06.297 INFO:tasks.workunit.client.0.vm03.stdout:5/265: mknod d4/c5e 0
2026-03-10T14:08:06.300 INFO:tasks.workunit.client.1.vm04.stdout:5/790: truncate d7/d9/db5/fd1 2220047 0
2026-03-10T14:08:06.301 INFO:tasks.workunit.client.1.vm04.stdout:6/619: write d3/de/d35/d3a/d43/d4c/f53 [1538064,75286] 0
2026-03-10T14:08:06.301 INFO:tasks.workunit.client.1.vm04.stdout:6/620: stat d3/de/f6d 0
2026-03-10T14:08:06.303 INFO:tasks.workunit.client.0.vm03.stdout:3/149: write d1d/f28 [286754,97157] 0
2026-03-10T14:08:06.306 INFO:tasks.workunit.client.1.vm04.stdout:5/791: mknod d7/db7/cfd 0
2026-03-10T14:08:06.316 INFO:tasks.workunit.client.0.vm03.stdout:6/184: symlink d8/d1b/d1c/l38 0
2026-03-10T14:08:06.317 INFO:tasks.workunit.client.0.vm03.stdout:5/266: truncate d4/d6/fa 1882754 0
2026-03-10T14:08:06.319 INFO:tasks.workunit.client.0.vm03.stdout:4/221: rename d5/d9/d3f to d5/d47 0
2026-03-10T14:08:06.320 INFO:tasks.workunit.client.0.vm03.stdout:3/150: creat d1d/d29/f2e x:0 0 0
2026-03-10T14:08:06.320 INFO:tasks.workunit.client.0.vm03.stdout:5/267: symlink d4/d13/d1f/l5f 0
2026-03-10T14:08:06.321 INFO:tasks.workunit.client.0.vm03.stdout:4/222: chown d5/d9/f44 61592 1
2026-03-10T14:08:06.323 INFO:tasks.workunit.client.0.vm03.stdout:1/236: dwrite d0/d2/df/f1f [4194304,4194304] 0
2026-03-10T14:08:06.330 INFO:tasks.workunit.client.0.vm03.stdout:4/223: unlink d5/d9/db/d19/c39 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:4/224: truncate d5/d9/db/f3d 693872 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:4/225: stat d5/d9/c41 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:5/268: write d4/d13/d43/f58 [5462826,44177] 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:1/237: rename d0/d2/df/d16/d20/l33 to d0/d2/df/d27/l4d 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:5/269: symlink d4/d16/d19/d23/l60 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:5/270: fsync d4/d40/d4e/f56 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:1/238: write d0/d2/df/d27/f3f [2722997,45578] 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:1/239: stat d0/d2/f46 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:1/240: write d0/d2/f46 [3302440,39347] 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:5/271: rename d4/d16/l27 to d4/d6/l61 0
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:1/241: dread - d0/d18/f21 zero size
2026-03-10T14:08:06.366 INFO:tasks.workunit.client.0.vm03.stdout:1/242: write d0/f10 [2072271,41975] 0
2026-03-10T14:08:06.367 INFO:tasks.workunit.client.0.vm03.stdout:5/272: dread d4/d6/de/f14 [0,4194304] 0
2026-03-10T14:08:06.367 INFO:tasks.workunit.client.0.vm03.stdout:5/273: dread d4/d6/fb [4194304,4194304] 0
2026-03-10T14:08:06.367 INFO:tasks.workunit.client.0.vm03.stdout:5/274: chown d4/d13 1592837 1
2026-03-10T14:08:06.367 INFO:tasks.workunit.client.0.vm03.stdout:5/275: mknod d4/d6/c62 0
2026-03-10T14:08:06.367 INFO:tasks.workunit.client.0.vm03.stdout:5/276: readlink l1 0
2026-03-10T14:08:06.367 INFO:tasks.workunit.client.0.vm03.stdout:5/277: creat d4/d6/f63 x:0 0 0
2026-03-10T14:08:06.370 INFO:tasks.workunit.client.0.vm03.stdout:4/226: dread f3 [8388608,4194304] 0
2026-03-10T14:08:06.384 INFO:tasks.workunit.client.0.vm03.stdout:4/227: stat d5/d9/f11 0
2026-03-10T14:08:06.384 INFO:tasks.workunit.client.0.vm03.stdout:4/228: link d5/f7 d5/d9/db/f48 0
2026-03-10T14:08:06.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T14:08:06.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T14:08:06.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T14:08:06.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/key"}]: dispatch
2026-03-10T14:08:06.407 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: Manager daemon vm04.ywwcto is now available
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: Migrating agent root cert to cert store
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: Migrating agent root key to cert store
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: Checking for cert/key for grafana.vm03
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: Migrating grafana.vm03 cert to cert store
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: Migrating grafana.vm03 key to cert store
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.408 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:06 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:08:06.410 INFO:tasks.workunit.client.1.vm04.stdout:0/712: sync
2026-03-10T14:08:06.411 INFO:tasks.workunit.client.1.vm04.stdout:7/750: sync
2026-03-10T14:08:06.411 INFO:tasks.workunit.client.1.vm04.stdout:7/751: stat d2/dc/de/d2d/d38/d50/fbc 0
2026-03-10T14:08:06.412 INFO:tasks.workunit.client.1.vm04.stdout:7/752: stat d2/dc/de/d2d/d38/d50/dc8/cdd 0
2026-03-10T14:08:06.414 INFO:tasks.workunit.client.1.vm04.stdout:7/753: rename d2/dc/de/d11/de7/cfa to d2/dc/de/d2d/d60/d7c/d44/d102/c10a 0
2026-03-10T14:08:06.418 INFO:tasks.workunit.client.1.vm04.stdout:0/713: link d0/d2/d15/c73 d0/d2/d15/d22/d38/d56/dcb/ce4 0
2026-03-10T14:08:06.418 INFO:tasks.workunit.client.1.vm04.stdout:0/714: dread d0/fb0 [0,4194304] 0
2026-03-10T14:08:06.430 INFO:tasks.workunit.client.1.vm04.stdout:0/715: fsync d0/d2/d15/d22/d38/d56/d66/f2b 0
2026-03-10T14:08:06.431 INFO:tasks.workunit.client.1.vm04.stdout:0/716: symlink d0/d2/d15/d22/le5 0
2026-03-10T14:08:06.431 INFO:tasks.workunit.client.1.vm04.stdout:0/717: rename d0/d2/d15/d49/d50/d61/f8d to d0/d2/d15/d22/d38/fe6 0
2026-03-10T14:08:06.431 INFO:tasks.workunit.client.0.vm03.stdout:4/229: sync
2026-03-10T14:08:06.434 INFO:tasks.workunit.client.0.vm03.stdout:4/230: creat d5/f49 x:0 0 0
2026-03-10T14:08:06.440 INFO:tasks.workunit.client.0.vm03.stdout:4/231: getdents d5/d9/d2b 0
2026-03-10T14:08:06.444 INFO:tasks.workunit.client.0.vm03.stdout:4/232: dwrite d5/d9/db/f3d [0,4194304] 0
2026-03-10T14:08:06.465 INFO:tasks.workunit.client.1.vm04.stdout:0/718: dread d0/d2/d15/d22/d38/d56/dc1/fc3 [0,4194304] 0
2026-03-10T14:08:06.469 INFO:tasks.workunit.client.1.vm04.stdout:0/719: read d0/d2/d15/d22/d38/f93 [500860,100783] 0
2026-03-10T14:08:06.470 INFO:tasks.workunit.client.1.vm04.stdout:0/720: creat d0/d2/d15/d49/d50/d5c/da4/fe7 x:0 0 0
2026-03-10T14:08:06.471 INFO:tasks.workunit.client.1.vm04.stdout:0/721: creat d0/d2/d90/fe8 x:0 0 0
2026-03-10T14:08:06.472 INFO:tasks.workunit.client.1.vm04.stdout:0/722: fsync d0/d2/d15/d49/d50/d5c/da4/fdf 0
2026-03-10T14:08:06.475 INFO:tasks.workunit.client.1.vm04.stdout:0/723: dwrite d0/d2/f17 [0,4194304] 0
2026-03-10T14:08:06.479 INFO:tasks.workunit.client.1.vm04.stdout:0/724: creat d0/d2/d15/d22/d62/fe9 x:0 0 0
2026-03-10T14:08:06.479 INFO:tasks.workunit.client.1.vm04.stdout:0/725: unlink d0/d2/d15/d22/d38/d56/d66/cb6 0
2026-03-10T14:08:06.481 INFO:tasks.workunit.client.1.vm04.stdout:0/726: truncate d0/d2/d15/d49/f94 1804447 0
2026-03-10T14:08:06.497 INFO:tasks.workunit.client.1.vm04.stdout:9/649: dread d9/da/d5d/f9b [0,4194304] 0
2026-03-10T14:08:06.498 INFO:tasks.workunit.client.1.vm04.stdout:9/650: fdatasync d9/da/dd/f47 0
2026-03-10T14:08:06.498 INFO:tasks.workunit.client.1.vm04.stdout:9/651: fdatasync d9/da/d5d/d81/fcd 0
2026-03-10T14:08:06.500 INFO:tasks.workunit.client.1.vm04.stdout:9/652: rename d9/c46 to d9/d44/cdf 0
2026-03-10T14:08:06.544 INFO:tasks.workunit.client.1.vm04.stdout:8/799: write d0/d3/d63/d12/d51/d67/d96/dc8/fd9 [578281,92707] 0
2026-03-10T14:08:06.546 INFO:tasks.workunit.client.1.vm04.stdout:3/732: dwrite da/dc/d3f/f4d [0,4194304] 0
2026-03-10T14:08:06.551 INFO:tasks.workunit.client.0.vm03.stdout:0/211: dwrite d3/d11/f13 [0,4194304] 0
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/key"}]: dispatch
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: Manager daemon vm04.ywwcto is now available
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: Migrating agent root cert to cert store
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: Migrating agent root key to cert store
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: Checking for cert/key for grafana.vm03
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: Migrating grafana.vm03 cert to cert store
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: Migrating grafana.vm03 key to cert store
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:06.563 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:06 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:08:06.577 INFO:tasks.workunit.client.1.vm04.stdout:8/800: unlink d0/d3/f35 0
2026-03-10T14:08:06.578 INFO:tasks.workunit.client.0.vm03.stdout:0/212: dread - d3/d16/f34 zero size
2026-03-10T14:08:06.579 INFO:tasks.workunit.client.1.vm04.stdout:8/801: mkdir d0/d3/d73/db8/dd5/df8 0
2026-03-10T14:08:06.580 INFO:tasks.workunit.client.1.vm04.stdout:8/802: chown d0/d3/dd/d89/fd0 263 1
2026-03-10T14:08:06.589 INFO:tasks.workunit.client.1.vm04.stdout:8/803: creat d0/d3/d63/d12/df5/ff9 x:0 0 0
2026-03-10T14:08:06.607 INFO:tasks.workunit.client.1.vm04.stdout:8/804: dread d0/d3/d63/f57 [0,4194304] 0
2026-03-10T14:08:06.610 INFO:tasks.workunit.client.1.vm04.stdout:8/805: creat d0/d3/d63/d29/ffa x:0 0 0
2026-03-10T14:08:06.614 INFO:tasks.workunit.client.1.vm04.stdout:1/718: write d3/d22/d63/d35/dd9/d13/da0/dc5/fc6 [1772650,95758] 0
2026-03-10T14:08:06.614 INFO:tasks.workunit.client.1.vm04.stdout:4/676: write d4/df/d31/f5d [1274031,9369] 0
2026-03-10T14:08:06.615 INFO:tasks.workunit.client.1.vm04.stdout:1/719: chown d3/f18 22 1
2026-03-10T14:08:06.620 INFO:tasks.workunit.client.1.vm04.stdout:2/695: dread d0/d14/d91/d4a/d66/f7d [0,4194304] 0
2026-03-10T14:08:06.621 INFO:tasks.workunit.client.1.vm04.stdout:4/677: creat d4/d14/d3c/d5e/fe9 x:0 0 0
2026-03-10T14:08:06.624 INFO:tasks.workunit.client.0.vm03.stdout:2/218: truncate d5/d10/f16 4219601 0
2026-03-10T14:08:06.632 INFO:tasks.workunit.client.1.vm04.stdout:1/720: creat d3/d20/d60/ff7 x:0 0 0
2026-03-10T14:08:06.637 INFO:tasks.workunit.client.0.vm03.stdout:7/191: dread d5/f6 [0,4194304] 0
2026-03-10T14:08:06.637 INFO:tasks.workunit.client.1.vm04.stdout:8/806: mknod d0/d3/dd/cfb 0
2026-03-10T14:08:06.637 INFO:tasks.workunit.client.0.vm03.stdout:7/192: dread - d5/d9/d14/f2d zero size
2026-03-10T14:08:06.638 INFO:tasks.workunit.client.0.vm03.stdout:7/193: chown d5/d9/d14/f2d 297645 1
2026-03-10T14:08:06.644 INFO:tasks.workunit.client.1.vm04.stdout:4/678: unlink d4/d14/d3c/d62/fd5 0
2026-03-10T14:08:06.648 INFO:tasks.workunit.client.1.vm04.stdout:8/807: creat d0/d75/d8a/ffc x:0 0 0
2026-03-10T14:08:06.648 INFO:tasks.workunit.client.1.vm04.stdout:8/808: chown d0/d3/d63/d12/df5/ff9 0 1
2026-03-10T14:08:06.650 INFO:tasks.workunit.client.1.vm04.stdout:1/721: sync
2026-03-10T14:08:06.655 INFO:tasks.workunit.client.1.vm04.stdout:4/679: creat d4/df/db2/db4/d47/d4f/d8c/fea x:0 0 0
2026-03-10T14:08:06.655 INFO:tasks.workunit.client.0.vm03.stdout:7/194: mkdir d5/d9/d14/d26/d39 0
2026-03-10T14:08:06.658 INFO:tasks.workunit.client.1.vm04.stdout:4/680: dwrite d4/d14/d64/fdc [0,4194304] 0
2026-03-10T14:08:06.659 INFO:tasks.workunit.client.1.vm04.stdout:4/681: readlink d4/lc 0
2026-03-10T14:08:06.663 INFO:tasks.workunit.client.1.vm04.stdout:4/682: dread d4/d14/d3c/f41 [0,4194304] 0
2026-03-10T14:08:06.681 INFO:tasks.workunit.client.1.vm04.stdout:1/722: rename d3/d22/l37 to d3/d22/d63/d35/dd9/d13/da0/dc5/lf8 0
2026-03-10T14:08:06.689 INFO:tasks.workunit.client.1.vm04.stdout:4/683: mknod d4/df/d31/ceb 0
2026-03-10T14:08:06.693 INFO:tasks.workunit.client.1.vm04.stdout:1/723: read - d3/d22/d63/d35/dd9/d13/d38/fd3 zero size
2026-03-10T14:08:06.694 INFO:tasks.workunit.client.1.vm04.stdout:1/724: chown d3/d5c/d79/d98 1 1
2026-03-10T14:08:06.701 INFO:tasks.workunit.client.0.vm03.stdout:4/233: dread d5/fd [0,4194304] 0
2026-03-10T14:08:06.703 INFO:tasks.workunit.client.1.vm04.stdout:6/621: write d3/de/d35/d3f/d2d/f21 [196467,128029] 0
2026-03-10T14:08:06.704 INFO:tasks.workunit.client.1.vm04.stdout:6/622: chown d3/de/d35/d3a/d43/cbd 29 1
2026-03-10T14:08:06.705 INFO:tasks.workunit.client.1.vm04.stdout:6/623: chown d3/de/d35/d3a/d43/d4c/f53 1 1
2026-03-10T14:08:06.706 INFO:tasks.workunit.client.0.vm03.stdout:4/234: mknod d5/d9/c4a 0
2026-03-10T14:08:06.713 INFO:tasks.workunit.client.1.vm04.stdout:5/792: dwrite d7/d12/d2b/f46 [4194304,4194304] 0
2026-03-10T14:08:06.715 INFO:tasks.workunit.client.1.vm04.stdout:5/793: stat d7/d26/l39 0
2026-03-10T14:08:06.720 INFO:tasks.workunit.client.0.vm03.stdout:4/235: unlink d5/d9/db/l1d 0
2026-03-10T14:08:06.731 INFO:tasks.workunit.client.0.vm03.stdout:9/184: write d2/f1e [461819,121187] 0
2026-03-10T14:08:06.760 INFO:tasks.workunit.client.0.vm03.stdout:3/151: dwrite fb [0,4194304] 0
2026-03-10T14:08:06.765 INFO:tasks.workunit.client.0.vm03.stdout:3/152: dread fc [0,4194304] 0
2026-03-10T14:08:06.776 INFO:tasks.workunit.client.0.vm03.stdout:1/243: write d0/d2/df/d27/f32 [115995,83473] 0
2026-03-10T14:08:06.777 INFO:tasks.workunit.client.0.vm03.stdout:4/236: link d5/d9/c41 d5/d47/c4b 0
2026-03-10T14:08:06.777 INFO:tasks.workunit.client.0.vm03.stdout:1/244: chown d0/d2/f9 1 1
2026-03-10T14:08:06.780 INFO:tasks.workunit.client.0.vm03.stdout:5/278: truncate d4/d13/d1f/f20 6916255 0
2026-03-10T14:08:06.795 INFO:tasks.workunit.client.1.vm04.stdout:7/754: dwrite d2/dc/f26 [0,4194304] 0
2026-03-10T14:08:06.804 INFO:tasks.workunit.client.0.vm03.stdout:3/153: creat d1d/d29/f2f x:0 0 0
2026-03-10T14:08:06.809 INFO:tasks.workunit.client.0.vm03.stdout:9/185: creat d2/d14/d2b/d34/f3e x:0 0 0
2026-03-10T14:08:06.809 INFO:tasks.workunit.client.0.vm03.stdout:9/186: chown d2/c1f 1 1
2026-03-10T14:08:06.809 INFO:tasks.workunit.client.0.vm03.stdout:9/187: dread - d2/d14/d2b/d34/f3e zero size
2026-03-10T14:08:06.811 INFO:tasks.workunit.client.0.vm03.stdout:4/237: unlink d5/fd 0
2026-03-10T14:08:06.816 INFO:tasks.workunit.client.0.vm03.stdout:4/238: dwrite d5/d9/f44 [0,4194304] 0
2026-03-10T14:08:06.817 INFO:tasks.workunit.client.0.vm03.stdout:1/245: symlink d0/d2/df/d27/l4e 0
2026-03-10T14:08:06.824 INFO:tasks.workunit.client.0.vm03.stdout:5/279: chown d4/d13/d1f/c3d 155416 1
2026-03-10T14:08:06.830 INFO:tasks.workunit.client.1.vm04.stdout:0/727: dwrite d0/d2/d15/d22/d38/f5b [0,4194304] 0
2026-03-10T14:08:06.846 INFO:tasks.workunit.client.0.vm03.stdout:3/154: chown l5 471724524 1
2026-03-10T14:08:06.846 INFO:tasks.workunit.client.0.vm03.stdout:9/188: symlink d2/d14/d2b/d34/l3f 0
2026-03-10T14:08:06.846 INFO:tasks.workunit.client.0.vm03.stdout:9/189: read - d2/d14/f39 zero size
2026-03-10T14:08:06.846 INFO:tasks.workunit.client.1.vm04.stdout:9/653: write d9/da/dd/f31 [1343659,113725] 0
2026-03-10T14:08:06.846 INFO:tasks.workunit.client.1.vm04.stdout:9/654: write d9/da/dd/d1c/f3b [1543807,127110] 0
2026-03-10T14:08:06.846 INFO:tasks.workunit.client.1.vm04.stdout:9/655: fdatasync d9/d5c/dc2/fd8 0
2026-03-10T14:08:06.848 INFO:tasks.workunit.client.0.vm03.stdout:6/185: write f5 [1371589,39472] 0
2026-03-10T14:08:06.849 INFO:tasks.workunit.client.1.vm04.stdout:3/733: write da/dc/d3f/f83 [3741993,103771] 0
2026-03-10T14:08:06.853 INFO:tasks.workunit.client.0.vm03.stdout:6/186: dwrite f3 [0,4194304] 0
2026-03-10T14:08:06.853 INFO:tasks.workunit.client.0.vm03.stdout:6/187: dread - d8/db/df/f27 zero size
2026-03-10T14:08:06.857 INFO:tasks.workunit.client.0.vm03.stdout:4/239: creat d5/d9/db/d19/d34/f4c x:0 0 0
2026-03-10T14:08:06.859 INFO:tasks.workunit.client.0.vm03.stdout:8/221: truncate da/fd 1377528 0
2026-03-10T14:08:06.866 INFO:tasks.workunit.client.1.vm04.stdout:1/725: mknod d3/d22/d63/d35/dd9/d13/d38/db5/dc4/cf9 0
2026-03-10T14:08:06.866 INFO:tasks.workunit.client.0.vm03.stdout:0/213: dwrite d3/f28 [0,4194304] 0
2026-03-10T14:08:06.866 INFO:tasks.workunit.client.1.vm04.stdout:1/726: write d3/d22/fe1 [835759,20644] 0
2026-03-10T14:08:06.867 INFO:tasks.workunit.client.0.vm03.stdout:4/240: dread d5/d9/f25 [0,4194304] 0
2026-03-10T14:08:06.870 INFO:tasks.workunit.client.1.vm04.stdout:5/794: unlink d7/d12/f51 0
2026-03-10T14:08:06.873 INFO:tasks.workunit.client.0.vm03.stdout:4/241: write d5/d9/db/d19/f45 [120122,119164] 0
2026-03-10T14:08:06.875 INFO:tasks.workunit.client.0.vm03.stdout:9/190: chown d2/d14/l2a 253013 1
2026-03-10T14:08:06.876 INFO:tasks.workunit.client.0.vm03.stdout:9/191: read d2/d14/f27 [4168058,103314] 0
2026-03-10T14:08:06.897 INFO:tasks.workunit.client.1.vm04.stdout:0/728: mknod d0/d2/d15/d49/d50/cea 0
2026-03-10T14:08:06.901 INFO:tasks.workunit.client.1.vm04.stdout:2/696: write d0/d14/d91/d4a/d8c/fac [4902279,7359] 0
2026-03-10T14:08:06.913 INFO:tasks.workunit.client.1.vm04.stdout:9/656: rename d9/d5c/f6c to d9/da/d5d/d81/fe0 0
2026-03-10T14:08:06.915 INFO:tasks.workunit.client.0.vm03.stdout:2/219: truncate d5/d10/d1c/f3c 3614319 0
2026-03-10T14:08:06.915 INFO:tasks.workunit.client.0.vm03.stdout:2/220: write d5/d10/d31/f38 [738129,105612] 0
2026-03-10T14:08:06.916 INFO:tasks.workunit.client.0.vm03.stdout:2/221: chown d5/d10/f22 113180570 1
2026-03-10T14:08:06.921 INFO:tasks.workunit.client.1.vm04.stdout:4/684: creat d4/d14/d64/dcb/fec x:0 0 0
2026-03-10T14:08:06.923 INFO:tasks.workunit.client.0.vm03.stdout:4/242: mknod d5/d47/c4d 0
2026-03-10T14:08:06.929 INFO:tasks.workunit.client.1.vm04.stdout:1/727: mkdir d3/d22/d63/d35/dd9/d13/da0/dc5/dfa 0
2026-03-10T14:08:06.930 INFO:tasks.workunit.client.1.vm04.stdout:8/809: dwrite d0/d75/d8a/fa1 [0,4194304] 0
2026-03-10T14:08:06.944 INFO:tasks.workunit.client.1.vm04.stdout:9/657: creat d9/da/dd/d74/fe1 x:0 0 0
2026-03-10T14:08:06.948 INFO:tasks.workunit.client.1.vm04.stdout:3/734: symlink da/dc/d3f/d61/df7/lfa 0
2026-03-10T14:08:06.952 INFO:tasks.workunit.client.1.vm04.stdout:4/685: creat d4/df/db2/db6/dc9/dd0/fed x:0 0 0
2026-03-10T14:08:06.961 INFO:tasks.workunit.client.1.vm04.stdout:0/729: mkdir d0/d2/d15/d49/d50/d5c/dd8/deb 0
2026-03-10T14:08:06.961 INFO:tasks.workunit.client.0.vm03.stdout:2/222: creat d5/d10/d1f/f3f x:0 0 0
2026-03-10T14:08:06.961 INFO:tasks.workunit.client.0.vm03.stdout:4/243: dwrite d5/d9/f25 [0,4194304] 0
2026-03-10T14:08:06.961 INFO:tasks.workunit.client.0.vm03.stdout:8/222: mkdir da/d3a/d44 0
2026-03-10T14:08:06.962 INFO:tasks.workunit.client.0.vm03.stdout:9/192: mknod d2/d29/d33/c40 0
2026-03-10T14:08:06.963 INFO:tasks.workunit.client.1.vm04.stdout:2/697: getdents d0/d14/d1b/d45/da5 0
2026-03-10T14:08:06.969 INFO:tasks.workunit.client.1.vm04.stdout:9/658: rename d9/d5c/dc2/fd8 to d9/d5c/fe2 0
2026-03-10T14:08:06.969 INFO:tasks.workunit.client.1.vm04.stdout:9/659: readlink d9/da/d5d/ld1 0
2026-03-10T14:08:06.972 INFO:tasks.workunit.client.1.vm04.stdout:3/735: rmdir da/dc/d35/dcd/dd7 39
2026-03-10T14:08:06.978 INFO:tasks.workunit.client.0.vm03.stdout:1/246: getdents d0 0
2026-03-10T14:08:06.980 INFO:tasks.workunit.client.0.vm03.stdout:2/223: mkdir d5/d10/d1c/d40 0
2026-03-10T14:08:06.982 INFO:tasks.workunit.client.1.vm04.stdout:4/686: mknod d4/d14/d64/dcb/cee 0
2026-03-10T14:08:06.984 INFO:tasks.workunit.client.0.vm03.stdout:4/244: read f2 [75547,91730] 0
2026-03-10T14:08:06.985 INFO:tasks.workunit.client.0.vm03.stdout:4/245: truncate d5/d9/db/d19/d34/f4c 733330 0
2026-03-10T14:08:07.005 INFO:tasks.workunit.client.0.vm03.stdout:7/195: stat d5/d9/d14/d21/d28/f37 0
2026-03-10T14:08:07.005 INFO:tasks.workunit.client.0.vm03.stdout:1/247: creat d0/d2/df/d16/f4f x:0 0 0
2026-03-10T14:08:07.006 INFO:tasks.workunit.client.0.vm03.stdout:1/248: write d0/d2/f46 [556999,92033] 0
2026-03-10T14:08:07.006 INFO:tasks.workunit.client.1.vm04.stdout:5/795: symlink d7/d12/d2b/d3e/d57/d77/da5/lfe 0
2026-03-10T14:08:07.009 INFO:tasks.workunit.client.0.vm03.stdout:1/249: write d0/d2/df/f31 [4018837,287] 0
2026-03-10T14:08:07.015 INFO:tasks.workunit.client.0.vm03.stdout:2/224: symlink d5/d2a/l41 0
2026-03-10T14:08:07.016 INFO:tasks.workunit.client.1.vm04.stdout:0/730: truncate d0/d2/d15/d49/d50/d61/d75/f80 624413 0
2026-03-10T14:08:07.017 INFO:tasks.workunit.client.0.vm03.stdout:2/225: read d5/d10/d31/f38 [342711,21183] 0
2026-03-10T14:08:07.022 INFO:tasks.workunit.client.0.vm03.stdout:4/246: symlink d5/d9/d2b/l4e 0
2026-03-10T14:08:07.022 INFO:tasks.workunit.client.1.vm04.stdout:2/698: mknod d0/d14/d91/d4a/d8c/dab/cd7 0
2026-03-10T14:08:07.025 INFO:tasks.workunit.client.0.vm03.stdout:8/223: dread - da/d24/f2d zero size
2026-03-10T14:08:07.029 INFO:tasks.workunit.client.1.vm04.stdout:9/660: rmdir d9/da/dd/d1c/da3 39
2026-03-10T14:08:07.033 INFO:tasks.workunit.client.1.vm04.stdout:3/736: fsync da/f14 0
2026-03-10T14:08:07.038 INFO:tasks.workunit.client.0.vm03.stdout:1/250: mkdir d0/d18/d3b/d50 0
2026-03-10T14:08:07.039 INFO:tasks.workunit.client.1.vm04.stdout:8/810: rmdir d0/d3/d73/de8 0
2026-03-10T14:08:07.049 INFO:tasks.workunit.client.1.vm04.stdout:0/731: rmdir d0/d2/d15/d49 39
2026-03-10T14:08:07.051 INFO:tasks.workunit.client.0.vm03.stdout:4/247: mknod d5/d9/c4f 0
2026-03-10T14:08:07.051 INFO:tasks.workunit.client.0.vm03.stdout:4/248: fsync d5/d9/db/f3d 0
2026-03-10T14:08:07.053 INFO:tasks.workunit.client.0.vm03.stdout:7/196: write d5/d9/f30 [4828553,92639] 0
2026-03-10T14:08:07.054 INFO:tasks.workunit.client.1.vm04.stdout:3/737: creat da/ded/ffb x:0 0 0
2026-03-10T14:08:07.054 INFO:tasks.workunit.client.1.vm04.stdout:9/661: write d9/da/dd/f48 [2191571,68263] 0
2026-03-10T14:08:07.054 INFO:tasks.workunit.client.1.vm04.stdout:9/662: chown d9/d44/d4d/f4e 44899 1
2026-03-10T14:08:07.057 INFO:tasks.workunit.client.1.vm04.stdout:4/687: dread - d4/df/db2/db4/d47/d70/fd2 zero size
2026-03-10T14:08:07.057 INFO:tasks.workunit.client.1.vm04.stdout:9/663: chown d9/d5c/d93/d96/f9c 128460 1
2026-03-10T14:08:07.057 INFO:tasks.workunit.client.0.vm03.stdout:4/249: dwrite d5/d9/db/f29 [0,4194304] 0
2026-03-10T14:08:07.074 INFO:tasks.workunit.client.1.vm04.stdout:5/796: symlink d7/d12/d2b/d3e/d3f/dc0/lff 0
2026-03-10T14:08:07.076 INFO:tasks.workunit.client.1.vm04.stdout:3/738: dread da/dc/f2a [0,4194304] 0
2026-03-10T14:08:07.078 INFO:tasks.workunit.client.0.vm03.stdout:8/224: mknod da/d3c/c45 0
2026-03-10T14:08:07.079 INFO:tasks.workunit.client.1.vm04.stdout:2/699: creat d0/d14/d91/d4a/d66/dcd/fd8 x:0 0 0
2026-03-10T14:08:07.081 INFO:tasks.workunit.client.0.vm03.stdout:1/251: mknod d0/d2/c51 0
2026-03-10T14:08:07.082 INFO:tasks.workunit.client.1.vm04.stdout:9/664: truncate d9/da/f13 838657 0
2026-03-10T14:08:07.082 INFO:tasks.workunit.client.1.vm04.stdout:9/665: write d9/d5c/d93/fd2 [114825,8773] 0
2026-03-10T14:08:07.082 INFO:tasks.workunit.client.0.vm03.stdout:8/225: dwrite da/d24/f3d [0,4194304] 0
2026-03-10T14:08:07.083 INFO:tasks.workunit.client.0.vm03.stdout:1/252: read - d0/d2/df/d27/f49 zero size
2026-03-10T14:08:07.084 INFO:tasks.workunit.client.0.vm03.stdout:4/250: mknod d5/d47/c50 0
2026-03-10T14:08:07.085 INFO:tasks.workunit.client.0.vm03.stdout:4/251: chown d5/d9/db/f24 9363742 1
2026-03-10T14:08:07.087 INFO:tasks.workunit.client.1.vm04.stdout:0/732: fdatasync d0/d2/d15/d49/d50/f8a 0
2026-03-10T14:08:07.101 INFO:tasks.workunit.client.1.vm04.stdout:2/700: getdents d0/d14/d91/d8/d17/d4e/d8e 0
2026-03-10T14:08:07.101 INFO:tasks.workunit.client.0.vm03.stdout:8/226: dread - da/f30 zero size
2026-03-10T14:08:07.101 INFO:tasks.workunit.client.0.vm03.stdout:8/227: chown da/c21 19768 1
2026-03-10T14:08:07.101 INFO:tasks.workunit.client.0.vm03.stdout:2/226: link d5/lc d5/d10/d17/l42 0
2026-03-10T14:08:07.102 INFO:tasks.workunit.client.0.vm03.stdout:8/228: rename da/f20 to da/d3a/d44/f46 0
2026-03-10T14:08:07.102 INFO:tasks.workunit.client.1.vm04.stdout:4/688: sync
2026-03-10T14:08:07.103 INFO:tasks.workunit.client.1.vm04.stdout:4/689: chown d4/d14/d1b/caa 2052508 1
2026-03-10T14:08:07.108 INFO:tasks.workunit.client.1.vm04.stdout:4/690: chown d4/df/db2/de1 1 1
2026-03-10T14:08:07.109 INFO:tasks.workunit.client.1.vm04.stdout:0/733: read d0/d2/d25/f7f [2393789,24525] 0
2026-03-10T14:08:07.109 INFO:tasks.workunit.client.1.vm04.stdout:0/734: dread d0/d2/d15/d49/d50/fdd [0,4194304] 0
2026-03-10T14:08:07.112 INFO:tasks.workunit.client.0.vm03.stdout:2/227: dwrite d5/d10/d17/f18 [4194304,4194304] 0
2026-03-10T14:08:07.118 INFO:tasks.workunit.client.1.vm04.stdout:7/755: truncate d2/d2a/fef 367969 0
2026-03-10T14:08:07.129 INFO:tasks.workunit.client.1.vm04.stdout:9/666: dread d9/da/dd/d1c/f27 [0,4194304] 0
2026-03-10T14:08:07.129 INFO:tasks.workunit.client.1.vm04.stdout:9/667: chown d9/dd3 23669062 1
2026-03-10T14:08:07.129 INFO:tasks.workunit.client.1.vm04.stdout:7/756: dwrite d2/dc/f25 [0,4194304] 0
2026-03-10T14:08:07.129 INFO:tasks.workunit.client.1.vm04.stdout:7/757: symlink d2/dc/de/d2d/d38/d50/l10b 0
2026-03-10T14:08:07.129 INFO:tasks.workunit.client.1.vm04.stdout:7/758: chown d2/dc/d4d/l53 1971 1
2026-03-10T14:08:07.136 INFO:tasks.workunit.client.1.vm04.stdout:0/735: sync
2026-03-10T14:08:07.136 INFO:tasks.workunit.client.1.vm04.stdout:0/736: dread - d0/da2/fc7 zero size
2026-03-10T14:08:07.137 INFO:tasks.workunit.client.1.vm04.stdout:0/737: fsync d0/d2/dbe/fd7 0
2026-03-10T14:08:07.139 INFO:tasks.workunit.client.0.vm03.stdout:4/252: sync
2026-03-10T14:08:07.140 INFO:tasks.workunit.client.1.vm04.stdout:0/738: symlink d0/d2/d15/d22/d38/d56/lec 0
2026-03-10T14:08:07.140 INFO:tasks.workunit.client.1.vm04.stdout:0/739: chown d0/d2/d15/d49 0 1
2026-03-10T14:08:07.160 INFO:tasks.workunit.client.1.vm04.stdout:0/740: getdents d0/da2 0
2026-03-10T14:08:07.161 INFO:tasks.workunit.client.1.vm04.stdout:0/741: truncate d0/da2/fc7 951081 0
2026-03-10T14:08:07.175 INFO:tasks.workunit.client.1.vm04.stdout:0/742: dread d0/d2/d15/d49/d50/fb7 [0,4194304] 0
2026-03-10T14:08:07.175 INFO:tasks.workunit.client.1.vm04.stdout:0/743: chown d0/d2/d15/d22/c52 175 1
2026-03-10T14:08:07.183 INFO:tasks.workunit.client.1.vm04.stdout:0/744: dwrite d0/d2/d90/fe8 [0,4194304] 0
2026-03-10T14:08:07.198 INFO:tasks.workunit.client.1.vm04.stdout:0/745: rename d0/d2/d90/fe8 to d0/d2/d15/d49/d50/d5c/dd8/fed 0
2026-03-10T14:08:07.264 INFO:tasks.workunit.client.0.vm03.stdout:5/280: dwrite d4/fd [4194304,4194304] 0
2026-03-10T14:08:07.269 INFO:tasks.workunit.client.0.vm03.stdout:3/155: write f17 [4263366,25698] 0
2026-03-10T14:08:07.269 INFO:tasks.workunit.client.0.vm03.stdout:5/281: write d4/d6/f63 [367158,108987] 0
2026-03-10T14:08:07.270 INFO:tasks.workunit.client.0.vm03.stdout:5/282: write d4/fd [718491,62692] 0
2026-03-10T14:08:07.278 INFO:tasks.workunit.client.0.vm03.stdout:3/156: rmdir d1d/d29 39
2026-03-10T14:08:07.280 INFO:tasks.workunit.client.0.vm03.stdout:5/283: sync
2026-03-10T14:08:07.282 INFO:tasks.workunit.client.0.vm03.stdout:3/157: unlink d1d/f28 0
2026-03-10T14:08:07.289 INFO:tasks.workunit.client.0.vm03.stdout:5/284: rename d4/d16/d19/d23/l60 to d4/d40/d4e/l64 0
2026-03-10T14:08:07.289 INFO:tasks.workunit.client.0.vm03.stdout:5/285: chown d4/d6/de/c28 1865 1
2026-03-10T14:08:07.292 INFO:tasks.workunit.client.0.vm03.stdout:5/286: rename d4/d13/d43/f48 to d4/d6/de/f65 0
2026-03-10T14:08:07.292 INFO:tasks.workunit.client.0.vm03.stdout:5/287: fdatasync d4/d16/d19/f25 0
2026-03-10T14:08:07.293 INFO:tasks.workunit.client.0.vm03.stdout:3/158: symlink d1d/l30 0
2026-03-10T14:08:07.293 INFO:tasks.workunit.client.0.vm03.stdout:5/288: write d4/d13/f4b [1473300,82006] 0
2026-03-10T14:08:07.301 INFO:tasks.workunit.client.0.vm03.stdout:5/289: rename d4/d16/d19/d23/c3b to d4/d40/d4e/c66 0
2026-03-10T14:08:07.301 INFO:tasks.workunit.client.0.vm03.stdout:5/290: chown d4/d40 51645895 1
2026-03-10T14:08:07.302 INFO:tasks.workunit.client.0.vm03.stdout:5/291: readlink d4/d16/l41 0
2026-03-10T14:08:07.307 INFO:tasks.workunit.client.0.vm03.stdout:5/292: dwrite d4/d40/f5b [0,4194304] 0
2026-03-10T14:08:07.310 INFO:tasks.workunit.client.0.vm03.stdout:3/159: creat d1d/f31 x:0 0 0
2026-03-10T14:08:07.311 INFO:tasks.workunit.client.0.vm03.stdout:3/160: fdatasync f17 0
2026-03-10T14:08:07.313 INFO:tasks.workunit.client.0.vm03.stdout:5/293: truncate d4/d13/d43/f51 535822 0
2026-03-10T14:08:07.313 INFO:tasks.workunit.client.0.vm03.stdout:5/294: read d4/d13/d43/f58 [1935084,99849] 0
2026-03-10T14:08:07.314 INFO:tasks.workunit.client.0.vm03.stdout:5/295: stat d4/l2a 0
2026-03-10T14:08:07.333 INFO:tasks.workunit.client.0.vm03.stdout:5/296: mkdir d4/d16/d19/d67 0
2026-03-10T14:08:07.348 INFO:tasks.workunit.client.0.vm03.stdout:5/297: symlink d4/d13/d1f/l68 0
2026-03-10T14:08:07.351 INFO:tasks.workunit.client.0.vm03.stdout:3/161: getdents d1d/d29 0
2026-03-10T14:08:07.354 INFO:tasks.workunit.client.0.vm03.stdout:5/298: dread d4/d16/d19/f25 [0,4194304] 0
2026-03-10T14:08:07.361 INFO:tasks.workunit.client.0.vm03.stdout:5/299: dwrite d4/d40/f5b [0,4194304] 0
2026-03-10T14:08:07.363 INFO:tasks.workunit.client.0.vm03.stdout:5/300: stat d4/d6/de/f4f 0
2026-03-10T14:08:07.376 INFO:tasks.workunit.client.0.vm03.stdout:3/162: creat d1d/f32 x:0 0 0
2026-03-10T14:08:07.376 INFO:tasks.workunit.client.0.vm03.stdout:5/301: dwrite d4/d6/f63 [0,4194304] 0
2026-03-10T14:08:07.377 INFO:tasks.workunit.client.0.vm03.stdout:3/163: truncate d1d/f31 390162 0
2026-03-10T14:08:07.378 INFO:tasks.workunit.client.0.vm03.stdout:3/164: truncate d1d/f2c 666935 0
2026-03-10T14:08:07.387 INFO:tasks.workunit.client.0.vm03.stdout:3/165: dread d1d/f2c [0,4194304] 0
2026-03-10T14:08:07.387 INFO:tasks.workunit.client.0.vm03.stdout:3/166: fdatasync f1c 0
2026-03-10T14:08:07.392 INFO:tasks.workunit.client.0.vm03.stdout:5/302: creat d4/d13/f69 x:0 0 0
2026-03-10T14:08:07.393 INFO:tasks.workunit.client.0.vm03.stdout:5/303: stat d4/d16/c1e 0
2026-03-10T14:08:07.394 INFO:tasks.workunit.client.0.vm03.stdout:3/167: dwrite f17 [0,4194304] 0
2026-03-10T14:08:07.414
INFO:tasks.workunit.client.1.vm04.stdout:6/624: truncate d3/f8d 8317628 0 2026-03-10T14:08:07.416 INFO:tasks.workunit.client.0.vm03.stdout:5/304: mknod d4/d16/d19/d23/c6a 0 2026-03-10T14:08:07.416 INFO:tasks.workunit.client.0.vm03.stdout:5/305: write d4/d16/d19/f25 [2577176,58575] 0 2026-03-10T14:08:07.416 INFO:tasks.workunit.client.1.vm04.stdout:6/625: chown d3/de/d35/d3f/d2d/d32/d23/d47/l8b 113075070 1 2026-03-10T14:08:07.417 INFO:tasks.workunit.client.0.vm03.stdout:3/168: chown d1d/f26 1252 1 2026-03-10T14:08:07.422 INFO:tasks.workunit.client.0.vm03.stdout:5/306: mknod d4/d53/c6b 0 2026-03-10T14:08:07.424 INFO:tasks.workunit.client.0.vm03.stdout:5/307: stat d4/d13/d43/f58 0 2026-03-10T14:08:07.425 INFO:tasks.workunit.client.0.vm03.stdout:5/308: fsync d4/d40/d4e/f5c 0 2026-03-10T14:08:07.445 INFO:tasks.workunit.client.1.vm04.stdout:6/626: sync 2026-03-10T14:08:07.447 INFO:tasks.workunit.client.1.vm04.stdout:6/627: read - d3/f78 zero size 2026-03-10T14:08:07.449 INFO:tasks.workunit.client.0.vm03.stdout:6/188: dwrite d8/fd [0,4194304] 0 2026-03-10T14:08:07.451 INFO:tasks.workunit.client.0.vm03.stdout:6/189: rename d8/db to d8/db/d2c/d39 22 2026-03-10T14:08:07.451 INFO:tasks.workunit.client.0.vm03.stdout:6/190: write d8/db/f1f [3037334,8718] 0 2026-03-10T14:08:07.453 INFO:tasks.workunit.client.0.vm03.stdout:6/191: truncate d8/d1b/f31 774101 0 2026-03-10T14:08:07.453 INFO:tasks.workunit.client.0.vm03.stdout:6/192: readlink d8/l36 0 2026-03-10T14:08:07.459 INFO:tasks.workunit.client.0.vm03.stdout:6/193: mkdir d8/db/d2c/d2d/d32/d3a 0 2026-03-10T14:08:07.463 INFO:tasks.workunit.client.0.vm03.stdout:6/194: dread d8/d1b/f29 [0,4194304] 0 2026-03-10T14:08:07.465 INFO:tasks.workunit.client.0.vm03.stdout:6/195: mkdir d8/d3b 0 2026-03-10T14:08:07.493 INFO:tasks.workunit.client.0.vm03.stdout:0/214: dwrite d3/d11/f18 [0,4194304] 0 2026-03-10T14:08:07.496 INFO:tasks.workunit.client.0.vm03.stdout:0/215: fdatasync d3/fe 0 2026-03-10T14:08:07.499 
INFO:tasks.workunit.client.0.vm03.stdout:0/216: fdatasync d3/d17/f35 0 2026-03-10T14:08:07.508 INFO:tasks.workunit.client.0.vm03.stdout:0/217: fsync d3/d16/f34 0 2026-03-10T14:08:07.515 INFO:tasks.workunit.client.0.vm03.stdout:0/218: rename d3/d11/c37 to d3/d11/c44 0 2026-03-10T14:08:07.524 INFO:tasks.workunit.client.0.vm03.stdout:0/219: truncate d3/d16/f2d 2663580 0 2026-03-10T14:08:07.526 INFO:tasks.workunit.client.1.vm04.stdout:1/728: write d3/d20/f27 [5263024,31090] 0 2026-03-10T14:08:07.532 INFO:tasks.workunit.client.0.vm03.stdout:9/193: write d2/f2c [320496,29492] 0 2026-03-10T14:08:07.539 INFO:tasks.workunit.client.0.vm03.stdout:9/194: rename d2/d29/d38/d3a to d2/d29/d33/d41 0 2026-03-10T14:08:07.539 INFO:tasks.workunit.client.0.vm03.stdout:9/195: truncate d2/d29/d33/f3c 692922 0 2026-03-10T14:08:07.541 INFO:tasks.workunit.client.1.vm04.stdout:1/729: mkdir d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb/dfb 0 2026-03-10T14:08:07.544 INFO:tasks.workunit.client.0.vm03.stdout:9/196: readlink d2/l21 0 2026-03-10T14:08:07.547 INFO:tasks.workunit.client.0.vm03.stdout:4/253: getdents d5/d9 0 2026-03-10T14:08:07.547 INFO:tasks.workunit.client.0.vm03.stdout:9/197: dwrite d2/f15 [0,4194304] 0 2026-03-10T14:08:07.550 INFO:tasks.workunit.client.1.vm04.stdout:8/811: dwrite d0/d3/d63/d29/f45 [4194304,4194304] 0 2026-03-10T14:08:07.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:07 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:08:07.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:07 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.ywwcto/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:08:07.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:07 vm04.local ceph-mon[55966]: mgrmap e22: 
vm04.ywwcto(active, since 1.17031s) 2026-03-10T14:08:07.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:07 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.ywwcto/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:08:07.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:07 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.ywwcto/trash_purge_schedule"}]: dispatch 2026-03-10T14:08:07.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:07 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.ywwcto/trash_purge_schedule"}]: dispatch 2026-03-10T14:08:07.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:07 vm04.local ceph-mon[55966]: pgmap v3: 65 pgs: 65 active+clean; 2.2 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:08:07.569 INFO:tasks.workunit.client.1.vm04.stdout:8/812: read d0/d3/d63/d12/f1d [611299,78592] 0 2026-03-10T14:08:07.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:07 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:08:07.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:07 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.ywwcto/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:08:07.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:07 vm03.local ceph-mon[49718]: mgrmap e22: vm04.ywwcto(active, since 1.17031s) 2026-03-10T14:08:07.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:07 vm03.local ceph-mon[49718]: from='mgr.24413 ' 
entity='mgr.vm04.ywwcto' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.ywwcto/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:08:07.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:07 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.ywwcto/trash_purge_schedule"}]: dispatch 2026-03-10T14:08:07.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:07 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm04.ywwcto/trash_purge_schedule"}]: dispatch 2026-03-10T14:08:07.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:07 vm03.local ceph-mon[49718]: pgmap v3: 65 pgs: 65 active+clean; 2.2 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:08:07.581 INFO:tasks.workunit.client.0.vm03.stdout:4/254: fsync d5/fe 0 2026-03-10T14:08:07.581 INFO:tasks.workunit.client.0.vm03.stdout:4/255: dread d5/d9/db/f3d [0,4194304] 0 2026-03-10T14:08:07.586 INFO:tasks.workunit.client.1.vm04.stdout:8/813: rename d0/d3/d63/d12/d51/cee to d0/d3/d73/cfd 0 2026-03-10T14:08:07.598 INFO:tasks.workunit.client.1.vm04.stdout:8/814: truncate d0/f26 2455283 0 2026-03-10T14:08:07.599 INFO:tasks.workunit.client.0.vm03.stdout:7/197: dwrite d5/d9/d14/d21/f29 [0,4194304] 0 2026-03-10T14:08:07.603 INFO:tasks.workunit.client.0.vm03.stdout:4/256: link d5/d9/db/f3d d5/d9/db/d19/f51 0 2026-03-10T14:08:07.604 INFO:tasks.workunit.client.0.vm03.stdout:4/257: write d5/d9/db/d19/d34/f4c [1619070,44398] 0 2026-03-10T14:08:07.606 INFO:tasks.workunit.client.0.vm03.stdout:4/258: dread d5/f3c [0,4194304] 0 2026-03-10T14:08:07.619 INFO:tasks.workunit.client.1.vm04.stdout:8/815: mknod d0/d3/dd/cfe 0 2026-03-10T14:08:07.619 INFO:tasks.workunit.client.1.vm04.stdout:5/797: write d7/d12/f42 [3498692,63571] 0 2026-03-10T14:08:07.621 
INFO:tasks.workunit.client.0.vm03.stdout:9/198: getdents d2/d14/d2b/d34 0 2026-03-10T14:08:07.633 INFO:tasks.workunit.client.1.vm04.stdout:8/816: symlink d0/d3/d63/d12/d51/d67/d96/lff 0 2026-03-10T14:08:07.639 INFO:tasks.workunit.client.1.vm04.stdout:5/798: fsync d7/d12/d2b/d3e/f4a 0 2026-03-10T14:08:07.639 INFO:tasks.workunit.client.1.vm04.stdout:3/739: write da/dc/d3f/d54/d66/fa7 [763890,58753] 0 2026-03-10T14:08:07.640 INFO:tasks.workunit.client.1.vm04.stdout:3/740: write da/d3e/ff4 [729864,23550] 0 2026-03-10T14:08:07.642 INFO:tasks.workunit.client.1.vm04.stdout:8/817: mknod d0/d3/d63/d12/d51/d67/d96/c100 0 2026-03-10T14:08:07.642 INFO:tasks.workunit.client.0.vm03.stdout:9/199: truncate d2/d14/f1a 4605122 0 2026-03-10T14:08:07.643 INFO:tasks.workunit.client.0.vm03.stdout:9/200: chown d2/f2c 210099773 1 2026-03-10T14:08:07.643 INFO:tasks.workunit.client.1.vm04.stdout:8/818: read - d0/d3/d63/d12/d51/d67/ff0 zero size 2026-03-10T14:08:07.643 INFO:tasks.workunit.client.0.vm03.stdout:9/201: write d2/f37 [140043,87586] 0 2026-03-10T14:08:07.650 INFO:tasks.workunit.client.0.vm03.stdout:4/259: symlink d5/d9/db/d19/d38/l52 0 2026-03-10T14:08:07.653 INFO:tasks.workunit.client.1.vm04.stdout:3/741: sync 2026-03-10T14:08:07.656 INFO:tasks.workunit.client.1.vm04.stdout:8/819: symlink d0/dc1/def/l101 0 2026-03-10T14:08:07.656 INFO:tasks.workunit.client.0.vm03.stdout:1/253: write d0/d18/f21 [728723,74660] 0 2026-03-10T14:08:07.663 INFO:tasks.workunit.client.1.vm04.stdout:5/799: creat d7/d12/d2b/d3e/d57/d8a/dec/f100 x:0 0 0 2026-03-10T14:08:07.664 INFO:tasks.workunit.client.1.vm04.stdout:5/800: dread - d7/d12/d2b/d3e/d57/d77/da5/faa zero size 2026-03-10T14:08:07.665 INFO:tasks.workunit.client.0.vm03.stdout:2/228: rmdir d5 39 2026-03-10T14:08:07.667 INFO:tasks.workunit.client.1.vm04.stdout:2/701: dwrite d0/d14/d91/d8/d17/d35/f5f [0,4194304] 0 2026-03-10T14:08:07.678 INFO:tasks.workunit.client.0.vm03.stdout:4/260: mkdir d5/d9/db/d19/d38/d53 0 2026-03-10T14:08:07.681 
INFO:tasks.workunit.client.0.vm03.stdout:4/261: dwrite d5/d9/db/f29 [4194304,4194304] 0 2026-03-10T14:08:07.696 INFO:tasks.workunit.client.1.vm04.stdout:4/691: write d4/f2c [4222181,2587] 0 2026-03-10T14:08:07.700 INFO:tasks.workunit.client.0.vm03.stdout:1/254: rmdir d0/d18/d3b 39 2026-03-10T14:08:07.700 INFO:tasks.workunit.client.0.vm03.stdout:1/255: write d0/d2/df/f31 [611185,114408] 0 2026-03-10T14:08:07.711 INFO:tasks.workunit.client.0.vm03.stdout:4/262: dread d5/d9/f11 [0,4194304] 0 2026-03-10T14:08:07.716 INFO:tasks.workunit.client.1.vm04.stdout:9/668: write d9/d33/f8f [78532,27458] 0 2026-03-10T14:08:07.716 INFO:tasks.workunit.client.1.vm04.stdout:7/759: write d2/dc/d4d/dcd/f7a [3450731,22021] 0 2026-03-10T14:08:07.717 INFO:tasks.workunit.client.0.vm03.stdout:9/202: getdents d2/d29/d33/d41 0 2026-03-10T14:08:07.727 INFO:tasks.workunit.client.0.vm03.stdout:8/229: dwrite da/f33 [0,4194304] 0 2026-03-10T14:08:07.729 INFO:tasks.workunit.client.0.vm03.stdout:1/256: creat d0/d2/df/d27/f52 x:0 0 0 2026-03-10T14:08:07.731 INFO:tasks.workunit.client.0.vm03.stdout:4/263: symlink d5/d9/db/l54 0 2026-03-10T14:08:07.733 INFO:tasks.workunit.client.0.vm03.stdout:9/203: rename d2/d14/f27 to d2/d14/d2b/f42 0 2026-03-10T14:08:07.733 INFO:tasks.workunit.client.0.vm03.stdout:2/229: symlink d5/d10/d31/l43 0 2026-03-10T14:08:07.734 INFO:tasks.workunit.client.0.vm03.stdout:2/230: chown d5/d10 870 1 2026-03-10T14:08:07.734 INFO:tasks.workunit.client.0.vm03.stdout:4/264: dread d5/d9/f31 [0,4194304] 0 2026-03-10T14:08:07.740 INFO:tasks.workunit.client.0.vm03.stdout:8/230: chown da/d24/f3d 59 1 2026-03-10T14:08:07.741 INFO:tasks.workunit.client.0.vm03.stdout:9/204: mkdir d2/d14/d2b/d43 0 2026-03-10T14:08:07.747 INFO:tasks.workunit.client.0.vm03.stdout:9/205: creat d2/d14/d2b/f44 x:0 0 0 2026-03-10T14:08:07.751 INFO:tasks.workunit.client.0.vm03.stdout:9/206: dwrite d2/d29/d33/f3c [0,4194304] 0 2026-03-10T14:08:07.760 INFO:tasks.workunit.client.0.vm03.stdout:4/265: mkdir 
d5/d9/db/d19/d38/d53/d55 0 2026-03-10T14:08:07.760 INFO:tasks.workunit.client.0.vm03.stdout:4/266: write d5/f49 [1017299,15607] 0 2026-03-10T14:08:07.762 INFO:tasks.workunit.client.0.vm03.stdout:9/207: read d2/d14/d2b/f2d [1776650,129083] 0 2026-03-10T14:08:07.762 INFO:tasks.workunit.client.0.vm03.stdout:9/208: readlink d2/l21 0 2026-03-10T14:08:07.769 INFO:tasks.workunit.client.1.vm04.stdout:0/746: write d0/d2/d15/f59 [824891,75104] 0 2026-03-10T14:08:07.776 INFO:tasks.workunit.client.1.vm04.stdout:9/669: mknod d9/d58/dda/ce3 0 2026-03-10T14:08:07.782 INFO:tasks.workunit.client.1.vm04.stdout:9/670: write d9/d44/d70/fdb [1894847,47116] 0 2026-03-10T14:08:07.782 INFO:tasks.workunit.client.1.vm04.stdout:7/760: symlink d2/df9/l10c 0 2026-03-10T14:08:07.782 INFO:tasks.workunit.client.1.vm04.stdout:8/820: link d0/d3/d63/d12/d51/l49 d0/d3/d73/db8/l102 0 2026-03-10T14:08:07.782 INFO:tasks.workunit.client.1.vm04.stdout:8/821: fdatasync d0/d3/d63/d29/f45 0 2026-03-10T14:08:07.789 INFO:tasks.workunit.client.1.vm04.stdout:4/692: creat d4/df/d34/fef x:0 0 0 2026-03-10T14:08:07.790 INFO:tasks.workunit.client.0.vm03.stdout:4/267: getdents d5/d9/d2b 0 2026-03-10T14:08:07.792 INFO:tasks.workunit.client.1.vm04.stdout:0/747: dread d0/d2/d15/f2f [0,4194304] 0 2026-03-10T14:08:07.796 INFO:tasks.workunit.client.0.vm03.stdout:9/209: sync 2026-03-10T14:08:07.801 INFO:tasks.workunit.client.0.vm03.stdout:4/268: read d5/d9/db/ff [311939,100927] 0 2026-03-10T14:08:07.807 INFO:tasks.workunit.client.1.vm04.stdout:7/761: mknod d2/dc/de/d2d/d60/d7c/d44/dc0/c10d 0 2026-03-10T14:08:07.809 INFO:tasks.workunit.client.0.vm03.stdout:9/210: chown d2/c18 622 1 2026-03-10T14:08:07.820 INFO:tasks.workunit.client.0.vm03.stdout:4/269: write f3 [7188901,74834] 0 2026-03-10T14:08:07.834 INFO:tasks.workunit.client.1.vm04.stdout:4/693: dread d4/d14/f98 [0,4194304] 0 2026-03-10T14:08:07.837 INFO:tasks.workunit.client.0.vm03.stdout:3/169: write f1b [182955,34042] 0 2026-03-10T14:08:07.837 
INFO:tasks.workunit.client.0.vm03.stdout:9/211: fdatasync d2/d14/f1a 0 2026-03-10T14:08:07.838 INFO:tasks.workunit.client.1.vm04.stdout:0/748: readlink d0/d2/d15/d22/d38/d56/l9b 0 2026-03-10T14:08:07.840 INFO:tasks.workunit.client.0.vm03.stdout:4/270: dwrite d5/d9/f11 [0,4194304] 0 2026-03-10T14:08:07.844 INFO:tasks.workunit.client.0.vm03.stdout:4/271: dread d5/d9/db/d19/f45 [0,4194304] 0 2026-03-10T14:08:07.855 INFO:tasks.workunit.client.1.vm04.stdout:6/628: write d3/de/d35/d3f/f96 [402552,43166] 0 2026-03-10T14:08:07.855 INFO:tasks.workunit.client.0.vm03.stdout:3/170: rmdir d1d/d29 39 2026-03-10T14:08:07.855 INFO:tasks.workunit.client.0.vm03.stdout:3/171: write fe [5025905,86696] 0 2026-03-10T14:08:07.879 INFO:tasks.workunit.client.0.vm03.stdout:4/272: dread d5/d9/db/f28 [0,4194304] 0 2026-03-10T14:08:07.880 INFO:tasks.workunit.client.0.vm03.stdout:4/273: write d5/d9/db/d19/d34/f4c [2669633,51323] 0 2026-03-10T14:08:07.883 INFO:tasks.workunit.client.0.vm03.stdout:9/212: creat d2/d14/d2b/d43/f45 x:0 0 0 2026-03-10T14:08:07.889 INFO:tasks.workunit.client.0.vm03.stdout:6/196: write d8/fe [3925287,72089] 0 2026-03-10T14:08:07.904 INFO:tasks.workunit.client.0.vm03.stdout:3/172: rmdir d1d 39 2026-03-10T14:08:07.905 INFO:tasks.workunit.client.0.vm03.stdout:9/213: mkdir d2/d29/d33/d41/d46 0 2026-03-10T14:08:07.905 INFO:tasks.workunit.client.0.vm03.stdout:9/214: chown d2/d14/l1d 62148280 1 2026-03-10T14:08:07.906 INFO:tasks.workunit.client.0.vm03.stdout:6/197: fsync d8/d11/f2a 0 2026-03-10T14:08:07.907 INFO:tasks.workunit.client.0.vm03.stdout:6/198: write d8/db/df/f27 [275989,58553] 0 2026-03-10T14:08:07.910 INFO:tasks.workunit.client.0.vm03.stdout:9/215: dread d2/f2c [0,4194304] 0 2026-03-10T14:08:07.923 INFO:tasks.workunit.client.0.vm03.stdout:6/199: chown d8/db/d12/c17 318061902 1 2026-03-10T14:08:07.927 INFO:tasks.workunit.client.0.vm03.stdout:4/274: getdents d5/d9 0 2026-03-10T14:08:07.928 INFO:tasks.workunit.client.0.vm03.stdout:4/275: truncate d5/d9/db/f20 9273960 
0 2026-03-10T14:08:07.940 INFO:tasks.workunit.client.0.vm03.stdout:5/309: write d4/d6/fa [1566283,22065] 0 2026-03-10T14:08:07.941 INFO:tasks.workunit.client.0.vm03.stdout:5/310: rename d4 to d4/d40/d6c 22 2026-03-10T14:08:07.942 INFO:tasks.workunit.client.1.vm04.stdout:1/730: dwrite d3/d5c/f71 [0,4194304] 0 2026-03-10T14:08:07.943 INFO:tasks.workunit.client.0.vm03.stdout:5/311: dread d4/d6/f63 [0,4194304] 0 2026-03-10T14:08:07.944 INFO:tasks.workunit.client.0.vm03.stdout:5/312: write d4/d16/f2d [3096336,82121] 0 2026-03-10T14:08:07.945 INFO:tasks.workunit.client.0.vm03.stdout:5/313: write d4/d6/de/f5a [446727,84283] 0 2026-03-10T14:08:07.961 INFO:tasks.workunit.client.0.vm03.stdout:3/173: getdents d1d/d29 0 2026-03-10T14:08:07.961 INFO:tasks.workunit.client.0.vm03.stdout:3/174: write fb [853250,112104] 0 2026-03-10T14:08:07.962 INFO:tasks.workunit.client.0.vm03.stdout:3/175: chown d1d/f2c 1 1 2026-03-10T14:08:07.962 INFO:tasks.workunit.client.0.vm03.stdout:3/176: write f1c [4790493,120637] 0 2026-03-10T14:08:07.966 INFO:tasks.workunit.client.0.vm03.stdout:4/276: creat d5/d9/db/d19/d38/f56 x:0 0 0 2026-03-10T14:08:07.970 INFO:tasks.workunit.client.1.vm04.stdout:2/702: write d0/d14/d39/fa2 [140200,13713] 0 2026-03-10T14:08:07.974 INFO:tasks.workunit.client.1.vm04.stdout:2/703: fdatasync d0/d14/d39/f44 0 2026-03-10T14:08:08.003 INFO:tasks.workunit.client.0.vm03.stdout:6/200: creat d8/db/f3c x:0 0 0 2026-03-10T14:08:08.011 INFO:tasks.workunit.client.0.vm03.stdout:1/257: truncate d0/d2/df/f31 818949 0 2026-03-10T14:08:08.012 INFO:tasks.workunit.client.0.vm03.stdout:7/198: creat d5/d9/d14/d26/d36/f3a x:0 0 0 2026-03-10T14:08:08.014 INFO:tasks.workunit.client.0.vm03.stdout:1/258: dwrite d0/d2/fe [0,4194304] 0 2026-03-10T14:08:08.016 INFO:tasks.workunit.client.0.vm03.stdout:9/216: getdents d2/d14/d2b 0 2026-03-10T14:08:08.016 INFO:tasks.workunit.client.0.vm03.stdout:2/231: truncate d5/f9 3217564 0 2026-03-10T14:08:08.016 INFO:tasks.workunit.client.1.vm04.stdout:3/742: 
dwrite f8 [0,4194304] 0 2026-03-10T14:08:08.032 INFO:tasks.workunit.client.0.vm03.stdout:0/220: creat d3/d11/d25/d30/f45 x:0 0 0 2026-03-10T14:08:08.034 INFO:tasks.workunit.client.0.vm03.stdout:6/201: creat d8/d1b/f3d x:0 0 0 2026-03-10T14:08:08.037 INFO:tasks.workunit.client.0.vm03.stdout:6/202: dwrite d8/fe [0,4194304] 0 2026-03-10T14:08:08.041 INFO:tasks.workunit.client.1.vm04.stdout:3/743: unlink da/dc/d35/d52/d6d/f9d 0 2026-03-10T14:08:08.046 INFO:tasks.workunit.client.0.vm03.stdout:3/177: mkdir d1d/d33 0 2026-03-10T14:08:08.047 INFO:tasks.workunit.client.1.vm04.stdout:3/744: readlink da/dc/d35/dcd/ld9 0 2026-03-10T14:08:08.047 INFO:tasks.workunit.client.1.vm04.stdout:5/801: write d7/d12/d2b/d3e/d57/d77/fa2 [1157197,53688] 0 2026-03-10T14:08:08.047 INFO:tasks.workunit.client.1.vm04.stdout:5/802: chown d7/d12/d2b/d3e/d3f/da6/fee 1159218 1 2026-03-10T14:08:08.047 INFO:tasks.workunit.client.0.vm03.stdout:0/221: read d3/f1e [476931,97119] 0 2026-03-10T14:08:08.047 INFO:tasks.workunit.client.0.vm03.stdout:8/231: truncate da/d36/f3b 740118 0 2026-03-10T14:08:08.054 INFO:tasks.workunit.client.1.vm04.stdout:3/745: creat da/dc/d47/d9b/dcb/ffc x:0 0 0 2026-03-10T14:08:08.074 INFO:tasks.workunit.client.0.vm03.stdout:9/217: creat d2/d29/d38/f47 x:0 0 0 2026-03-10T14:08:08.077 INFO:tasks.workunit.client.1.vm04.stdout:3/746: read da/d3e/f44 [6689707,107495] 0 2026-03-10T14:08:08.077 INFO:tasks.workunit.client.0.vm03.stdout:9/218: dwrite d2/d14/d2b/d34/f3e [0,4194304] 0 2026-03-10T14:08:08.084 INFO:tasks.workunit.client.0.vm03.stdout:2/232: creat d5/d10/d1c/f44 x:0 0 0 2026-03-10T14:08:08.090 INFO:tasks.workunit.client.1.vm04.stdout:9/671: dwrite d9/d44/d4d/f66 [0,4194304] 0 2026-03-10T14:08:08.091 INFO:tasks.workunit.client.1.vm04.stdout:8/822: dwrite d0/d3/d5/f70 [0,4194304] 0 2026-03-10T14:08:08.095 INFO:tasks.workunit.client.1.vm04.stdout:7/762: dwrite d2/dc/d4d/fe4 [0,4194304] 0 2026-03-10T14:08:08.106 INFO:tasks.workunit.client.1.vm04.stdout:4/694: write 
d4/df/db2/db6/dc9/dd0/fd3 [964799,99150] 0 2026-03-10T14:08:08.106 INFO:tasks.workunit.client.1.vm04.stdout:0/749: write d0/d2/d15/d49/d50/d61/d75/f98 [337727,113] 0 2026-03-10T14:08:08.113 INFO:tasks.workunit.client.1.vm04.stdout:6/629: dwrite d3/de/d35/d3f/d2d/f89 [0,4194304] 0 2026-03-10T14:08:08.127 INFO:tasks.workunit.client.1.vm04.stdout:1/731: dwrite d3/d22/d63/d35/dd9/d13/d38/fd3 [0,4194304] 0 2026-03-10T14:08:08.130 INFO:tasks.workunit.client.0.vm03.stdout:5/314: rename d4/d13/f69 to d4/d6/f6d 0 2026-03-10T14:08:08.135 INFO:tasks.workunit.client.1.vm04.stdout:3/747: symlink da/dc/d3f/d54/d66/lfd 0 2026-03-10T14:08:08.145 INFO:tasks.workunit.client.1.vm04.stdout:2/704: truncate d0/d14/d91/f1d 8975258 0 2026-03-10T14:08:08.145 INFO:tasks.workunit.client.1.vm04.stdout:2/705: fsync d0/d14/d39/fa2 0 2026-03-10T14:08:08.156 INFO:tasks.workunit.client.0.vm03.stdout:4/277: dwrite d5/d9/db/f28 [0,4194304] 0 2026-03-10T14:08:08.159 INFO:tasks.workunit.client.0.vm03.stdout:4/278: write d5/d9/db/d19/d38/f56 [295181,24099] 0 2026-03-10T14:08:08.163 INFO:tasks.workunit.client.1.vm04.stdout:7/763: mkdir d2/dc/de/d2d/d38/d50/dc8/d10e 0 2026-03-10T14:08:08.166 INFO:tasks.workunit.client.1.vm04.stdout:4/695: chown d4/d14/l73 46930 1 2026-03-10T14:08:08.177 INFO:tasks.workunit.client.1.vm04.stdout:5/803: dwrite d7/d26/d6b/d6e/f81 [0,4194304] 0 2026-03-10T14:08:08.177 INFO:tasks.workunit.client.0.vm03.stdout:3/178: rmdir d1d 39 2026-03-10T14:08:08.177 INFO:tasks.workunit.client.0.vm03.stdout:3/179: write f1c [3631053,87229] 0 2026-03-10T14:08:08.185 INFO:tasks.workunit.client.1.vm04.stdout:0/750: dread d0/d2/d15/d22/d38/d56/d66/f2e [0,4194304] 0 2026-03-10T14:08:08.187 INFO:tasks.workunit.client.1.vm04.stdout:0/751: dread d0/d2/d15/d22/d38/d56/d66/f2e [0,4194304] 0 2026-03-10T14:08:08.201 INFO:tasks.workunit.client.0.vm03.stdout:0/222: rmdir d3/d11/d25 39 2026-03-10T14:08:08.205 INFO:tasks.workunit.client.0.vm03.stdout:1/259: creat d0/d18/d3b/f53 x:0 0 0 
2026-03-10T14:08:08.209 INFO:tasks.workunit.client.0.vm03.stdout:7/199: unlink d5/d9/d14/d21/d28/l31 0 2026-03-10T14:08:08.214 INFO:tasks.workunit.client.0.vm03.stdout:9/219: readlink d2/d29/l32 0 2026-03-10T14:08:08.215 INFO:tasks.workunit.client.1.vm04.stdout:1/732: creat d3/d22/d63/d35/dd9/d13/d38/db5/dc4/ffc x:0 0 0 2026-03-10T14:08:08.215 INFO:tasks.workunit.client.0.vm03.stdout:2/233: creat d5/d2a/f45 x:0 0 0 2026-03-10T14:08:08.215 INFO:tasks.workunit.client.0.vm03.stdout:2/234: readlink d5/d10/l1a 0 2026-03-10T14:08:08.219 INFO:tasks.workunit.client.0.vm03.stdout:5/315: fsync d4/d6/fb 0 2026-03-10T14:08:08.225 INFO:tasks.workunit.client.0.vm03.stdout:6/203: getdents d8/db/d2c/d2d/d32/d3a 0 2026-03-10T14:08:08.228 INFO:tasks.workunit.client.1.vm04.stdout:8/823: truncate d0/d3/fc3 1648711 0 2026-03-10T14:08:08.238 INFO:tasks.workunit.client.1.vm04.stdout:7/764: rmdir d2/d94 39 2026-03-10T14:08:08.238 INFO:tasks.workunit.client.0.vm03.stdout:4/279: unlink d5/d9/l2e 0 2026-03-10T14:08:08.239 INFO:tasks.workunit.client.0.vm03.stdout:4/280: chown d5/d9/db/d19/d38/f56 50867 1 2026-03-10T14:08:08.239 INFO:tasks.workunit.client.0.vm03.stdout:4/281: read d5/d9/f11 [2815524,660] 0 2026-03-10T14:08:08.239 INFO:tasks.workunit.client.0.vm03.stdout:2/235: dread f4 [0,4194304] 0 2026-03-10T14:08:08.245 INFO:tasks.workunit.client.0.vm03.stdout:8/232: creat da/d36/d40/f47 x:0 0 0 2026-03-10T14:08:08.247 INFO:tasks.workunit.client.1.vm04.stdout:0/752: fdatasync d0/d2/d15/d22/d38/fc6 0 2026-03-10T14:08:08.252 INFO:tasks.workunit.client.1.vm04.stdout:1/733: creat d3/d22/d63/d35/dd9/d13/d38/db5/ffd x:0 0 0 2026-03-10T14:08:08.261 INFO:tasks.workunit.client.1.vm04.stdout:9/672: write d9/da/d5d/f8b [775742,35351] 0 2026-03-10T14:08:08.261 INFO:tasks.workunit.client.1.vm04.stdout:9/673: dread - d9/d58/db5/da5/fc9 zero size 2026-03-10T14:08:08.264 INFO:tasks.workunit.client.1.vm04.stdout:6/630: dwrite d3/de/d35/d3a/f51 [0,4194304] 0 2026-03-10T14:08:08.266 
INFO:tasks.workunit.client.0.vm03.stdout:6/204: sync 2026-03-10T14:08:08.272 INFO:tasks.workunit.client.1.vm04.stdout:2/706: write d0/d14/d91/d8/f42 [3315224,64321] 0 2026-03-10T14:08:08.278 INFO:tasks.workunit.client.0.vm03.stdout:5/316: rename d4/d53 to d4/d16/d19/d6e 0 2026-03-10T14:08:08.284 INFO:tasks.workunit.client.1.vm04.stdout:0/753: mkdir d0/dee 0 2026-03-10T14:08:08.296 INFO:tasks.workunit.client.0.vm03.stdout:0/223: mkdir d3/d46 0 2026-03-10T14:08:08.296 INFO:tasks.workunit.client.0.vm03.stdout:1/260: mknod d0/d2/df/c54 0 2026-03-10T14:08:08.296 INFO:tasks.workunit.client.1.vm04.stdout:0/754: write d0/d2/f9 [218439,37391] 0 2026-03-10T14:08:08.296 INFO:tasks.workunit.client.1.vm04.stdout:1/734: creat d3/d22/ffe x:0 0 0 2026-03-10T14:08:08.296 INFO:tasks.workunit.client.1.vm04.stdout:1/735: read - d3/d22/d63/d35/dd9/d13/d38/db5/dc4/ff6 zero size 2026-03-10T14:08:08.296 INFO:tasks.workunit.client.1.vm04.stdout:1/736: chown d3/d22/d63/ded 71796632 1 2026-03-10T14:08:08.296 INFO:tasks.workunit.client.1.vm04.stdout:8/824: truncate d0/d3/d63/d12/d51/f97 1823793 0 2026-03-10T14:08:08.302 INFO:tasks.workunit.client.0.vm03.stdout:0/224: dread d3/f10 [4194304,4194304] 0 2026-03-10T14:08:08.304 INFO:tasks.workunit.client.1.vm04.stdout:5/804: dwrite d7/d12/d2b/d3e/d57/d8a/fb0 [4194304,4194304] 0 2026-03-10T14:08:08.307 INFO:tasks.workunit.client.1.vm04.stdout:6/631: creat d3/de/d35/d3f/d2d/d38/d40/fbf x:0 0 0 2026-03-10T14:08:08.309 INFO:tasks.workunit.client.1.vm04.stdout:4/696: rename d4/df/d34/fc5 to d4/df/db2/db4/d47/d4f/ff0 0 2026-03-10T14:08:08.313 INFO:tasks.workunit.client.1.vm04.stdout:4/697: dwrite d4/d14/d3c/f41 [0,4194304] 0 2026-03-10T14:08:08.317 INFO:tasks.workunit.client.1.vm04.stdout:4/698: chown d4/d14/d3c/d62/fe4 86 1 2026-03-10T14:08:08.326 INFO:tasks.workunit.client.0.vm03.stdout:6/205: creat d8/db/d2c/d2d/d32/f3e x:0 0 0 2026-03-10T14:08:08.327 INFO:tasks.workunit.client.0.vm03.stdout:6/206: chown d8/db/df/f37 39536 1 2026-03-10T14:08:08.330 
INFO:tasks.workunit.client.1.vm04.stdout:1/737: mkdir d3/d22/d63/d35/dd9/d13/d38/db5/dff 0 2026-03-10T14:08:08.330 INFO:tasks.workunit.client.0.vm03.stdout:6/207: dwrite f0 [0,4194304] 0 2026-03-10T14:08:08.330 INFO:tasks.workunit.client.0.vm03.stdout:5/317: readlink d4/d40/d4e/l64 0 2026-03-10T14:08:08.330 INFO:tasks.workunit.client.1.vm04.stdout:3/748: link da/dc/d35/d37/cbc da/d8e/cfe 0 2026-03-10T14:08:08.331 INFO:tasks.workunit.client.0.vm03.stdout:6/208: write d8/db/d2c/d2d/d32/f3e [335382,125338] 0 2026-03-10T14:08:08.331 INFO:tasks.workunit.client.0.vm03.stdout:6/209: chown d8/db/f3c 15948185 1 2026-03-10T14:08:08.345 INFO:tasks.workunit.client.0.vm03.stdout:3/180: mknod d1d/c34 0 2026-03-10T14:08:08.346 INFO:tasks.workunit.client.0.vm03.stdout:9/220: unlink d2/f11 0 2026-03-10T14:08:08.356 INFO:tasks.workunit.client.1.vm04.stdout:2/707: sync 2026-03-10T14:08:08.356 INFO:tasks.workunit.client.0.vm03.stdout:3/181: dwrite f1c [4194304,4194304] 0 2026-03-10T14:08:08.363 INFO:tasks.workunit.client.1.vm04.stdout:2/708: dread d0/d14/d91/d8/d17/d4e/d85/f89 [0,4194304] 0 2026-03-10T14:08:08.370 INFO:tasks.workunit.client.0.vm03.stdout:8/233: unlink da/c29 0 2026-03-10T14:08:08.370 INFO:tasks.workunit.client.0.vm03.stdout:1/261: mkdir d0/d2/d34/d55 0 2026-03-10T14:08:08.371 INFO:tasks.workunit.client.1.vm04.stdout:3/749: mknod da/dc/d35/d52/d53/d78/cff 0 2026-03-10T14:08:08.379 INFO:tasks.workunit.client.0.vm03.stdout:5/318: unlink d4/d40/d4e/f56 0 2026-03-10T14:08:08.383 INFO:tasks.workunit.client.1.vm04.stdout:6/632: fsync d3/de/d35/d3f/d2d/fa6 0 2026-03-10T14:08:08.387 INFO:tasks.workunit.client.0.vm03.stdout:6/210: rename d8/d1b/d1c/l38 to d8/db/d12/l3f 0 2026-03-10T14:08:08.390 INFO:tasks.workunit.client.1.vm04.stdout:9/674: rename d9/d44/d4d/c68 to d9/da/d5d/dd6/ce4 0 2026-03-10T14:08:08.391 INFO:tasks.workunit.client.1.vm04.stdout:9/675: stat d9/d5c/d93/d96/d9d/fa2 0 2026-03-10T14:08:08.392 INFO:tasks.workunit.client.0.vm03.stdout:6/211: dwrite f5 [0,4194304] 
0 2026-03-10T14:08:08.394 INFO:tasks.workunit.client.0.vm03.stdout:4/282: link d5/d9/c4f d5/d9/db/c57 0 2026-03-10T14:08:08.395 INFO:tasks.workunit.client.0.vm03.stdout:4/283: chown d5/d9/db/ff 0 1 2026-03-10T14:08:08.404 INFO:tasks.workunit.client.1.vm04.stdout:7/765: write d2/d94/f29 [3998577,12821] 0 2026-03-10T14:08:08.404 INFO:tasks.workunit.client.1.vm04.stdout:6/633: sync 2026-03-10T14:08:08.406 INFO:tasks.workunit.client.0.vm03.stdout:7/200: dwrite d5/d9/f10 [0,4194304] 0 2026-03-10T14:08:08.407 INFO:tasks.workunit.client.1.vm04.stdout:0/755: write d0/d2/d15/d22/f81 [4417101,63204] 0 2026-03-10T14:08:08.416 INFO:tasks.workunit.client.1.vm04.stdout:5/805: dwrite d7/d12/d2b/f53 [0,4194304] 0 2026-03-10T14:08:08.419 INFO:tasks.workunit.client.1.vm04.stdout:7/766: dread d2/d94/f7e [0,4194304] 0 2026-03-10T14:08:08.422 INFO:tasks.workunit.client.1.vm04.stdout:7/767: read d2/f4 [3261880,20569] 0 2026-03-10T14:08:08.428 INFO:tasks.workunit.client.1.vm04.stdout:8/825: rename d0/d3/d73/db8/dd5/df8 to d0/d3/d73/db8/d103 0 2026-03-10T14:08:08.430 INFO:tasks.workunit.client.0.vm03.stdout:0/225: mkdir d3/d11/d25/d47 0 2026-03-10T14:08:08.431 INFO:tasks.workunit.client.0.vm03.stdout:0/226: write d3/d11/f18 [4708805,84835] 0 2026-03-10T14:08:08.436 INFO:tasks.workunit.client.1.vm04.stdout:2/709: symlink d0/d14/d91/d8/d17/ld9 0 2026-03-10T14:08:08.441 INFO:tasks.workunit.client.0.vm03.stdout:1/262: rename d0/d2/df/d27/f3f to d0/d2/d34/f56 0 2026-03-10T14:08:08.451 INFO:tasks.workunit.client.1.vm04.stdout:0/756: unlink d0/d2/d15/d49/d50/d5c/fcd 0 2026-03-10T14:08:08.453 INFO:tasks.workunit.client.1.vm04.stdout:5/806: mkdir d7/d59/d7e/d101 0 2026-03-10T14:08:08.456 INFO:tasks.workunit.client.0.vm03.stdout:9/221: mknod d2/d14/c48 0 2026-03-10T14:08:08.459 INFO:tasks.workunit.client.0.vm03.stdout:9/222: dwrite d2/d14/d2b/f44 [0,4194304] 0 2026-03-10T14:08:08.461 INFO:tasks.workunit.client.0.vm03.stdout:9/223: write d2/d14/f3d [905611,13266] 0 2026-03-10T14:08:08.461 
INFO:tasks.workunit.client.1.vm04.stdout:8/826: dread - d0/d3/d63/d29/fe3 zero size 2026-03-10T14:08:08.473 INFO:tasks.workunit.client.1.vm04.stdout:2/710: sync 2026-03-10T14:08:08.475 INFO:tasks.workunit.client.0.vm03.stdout:2/236: dwrite d5/d10/d17/f28 [0,4194304] 0 2026-03-10T14:08:08.477 INFO:tasks.workunit.client.1.vm04.stdout:4/699: dwrite d4/d14/f27 [4194304,4194304] 0 2026-03-10T14:08:08.492 INFO:tasks.workunit.client.1.vm04.stdout:1/738: rename f1 to d3/d22/deb/f100 0 2026-03-10T14:08:08.493 INFO:tasks.workunit.client.1.vm04.stdout:1/739: fsync d3/d22/d63/d35/d6c/fec 0 2026-03-10T14:08:08.493 INFO:tasks.workunit.client.1.vm04.stdout:1/740: dread - d3/d5c/ff2 zero size 2026-03-10T14:08:08.494 INFO:tasks.workunit.client.1.vm04.stdout:1/741: stat d3/d22/d63/d35/dd9/d13/d1a/c64 0 2026-03-10T14:08:08.500 INFO:tasks.workunit.client.1.vm04.stdout:2/711: mkdir d0/d14/d91/d4a/d66/dda 0 2026-03-10T14:08:08.504 INFO:tasks.workunit.client.1.vm04.stdout:2/712: dwrite d0/db8/fca [0,4194304] 0 2026-03-10T14:08:08.505 INFO:tasks.workunit.client.1.vm04.stdout:2/713: dread d0/d14/d91/d4a/d66/f7d [0,4194304] 0 2026-03-10T14:08:08.508 INFO:tasks.workunit.client.1.vm04.stdout:4/700: creat d4/d14/d3c/d85/ff1 x:0 0 0 2026-03-10T14:08:08.522 INFO:tasks.workunit.client.0.vm03.stdout:1/263: symlink d0/l57 0 2026-03-10T14:08:08.522 INFO:tasks.workunit.client.1.vm04.stdout:6/634: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/fc0 x:0 0 0 2026-03-10T14:08:08.528 INFO:tasks.workunit.client.1.vm04.stdout:3/750: rename da/f22 to da/dc/d47/f100 0 2026-03-10T14:08:08.532 INFO:tasks.workunit.client.1.vm04.stdout:1/742: creat d3/d8f/db7/dce/f101 x:0 0 0 2026-03-10T14:08:08.532 INFO:tasks.workunit.client.0.vm03.stdout:9/224: rmdir d2/d29/d33/d41 39 2026-03-10T14:08:08.533 INFO:tasks.workunit.client.1.vm04.stdout:1/743: dread d3/d22/d63/f7f [0,4194304] 0 2026-03-10T14:08:08.535 INFO:tasks.workunit.client.0.vm03.stdout:7/201: mknod d5/d9/d14/c3b 0 2026-03-10T14:08:08.535 
INFO:tasks.workunit.client.0.vm03.stdout:7/202: dread - d5/d9/d14/f2d zero size 2026-03-10T14:08:08.541 INFO:tasks.workunit.client.1.vm04.stdout:2/714: symlink d0/d14/d91/d4a/d8c/dab/ldb 0 2026-03-10T14:08:08.543 INFO:tasks.workunit.client.1.vm04.stdout:2/715: write d0/d14/d91/d8/d17/d35/f81 [278472,305] 0 2026-03-10T14:08:08.544 INFO:tasks.workunit.client.0.vm03.stdout:0/227: getdents d3/d16/d21/d3c 0 2026-03-10T14:08:08.548 INFO:tasks.workunit.client.1.vm04.stdout:6/635: symlink d3/de/d35/d3a/d43/d9c/lc1 0 2026-03-10T14:08:08.549 INFO:tasks.workunit.client.1.vm04.stdout:6/636: read d3/f57 [165966,4689] 0 2026-03-10T14:08:08.554 INFO:tasks.workunit.client.1.vm04.stdout:8/827: dread d0/d75/d8a/f9e [0,4194304] 0 2026-03-10T14:08:08.554 INFO:tasks.workunit.client.1.vm04.stdout:8/828: dread - d0/d3/d63/d12/df5/ff9 zero size 2026-03-10T14:08:08.558 INFO:tasks.workunit.client.1.vm04.stdout:8/829: dwrite d0/d3/d63/d12/f2c [0,4194304] 0 2026-03-10T14:08:08.560 INFO:tasks.workunit.client.1.vm04.stdout:7/768: rename d2/dc/de/d2d/d60/d7c/d36/daa/cc9 to d2/dc/de/d2d/d60/d7c/d64/dbf/c10f 0 2026-03-10T14:08:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:08 vm04.local ceph-mon[55966]: pgmap v4: 65 pgs: 65 active+clean; 2.2 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:08:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:08 vm04.local ceph-mon[55966]: Deploying cephadm binary to vm03 2026-03-10T14:08:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:08 vm04.local ceph-mon[55966]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s)) 2026-03-10T14:08:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:08 vm04.local ceph-mon[55966]: Cluster is now healthy 2026-03-10T14:08:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:08 vm04.local ceph-mon[55966]: Deploying cephadm binary to vm04 2026-03-10T14:08:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:08 vm04.local 
ceph-mon[55966]: mgrmap e23: vm04.ywwcto(active, since 2s) 2026-03-10T14:08:08.564 INFO:tasks.workunit.client.0.vm03.stdout:4/284: link d5/d9/db/f48 d5/d9/db/d19/d34/f58 0 2026-03-10T14:08:08.569 INFO:tasks.workunit.client.1.vm04.stdout:5/807: dread d7/d2d/d32/d34/f4e [0,4194304] 0 2026-03-10T14:08:08.575 INFO:tasks.workunit.client.0.vm03.stdout:3/182: write d1d/f2c [1069005,70026] 0 2026-03-10T14:08:08.579 INFO:tasks.workunit.client.0.vm03.stdout:3/183: dwrite d1d/f32 [0,4194304] 0 2026-03-10T14:08:08.589 INFO:tasks.workunit.client.0.vm03.stdout:3/184: fdatasync f1c 0 2026-03-10T14:08:08.589 INFO:tasks.workunit.client.0.vm03.stdout:3/185: dwrite fc [0,4194304] 0 2026-03-10T14:08:08.589 INFO:tasks.workunit.client.0.vm03.stdout:3/186: write fb [3320624,128898] 0 2026-03-10T14:08:08.592 INFO:tasks.workunit.client.0.vm03.stdout:3/187: dwrite d1d/d29/f2f [0,4194304] 0 2026-03-10T14:08:08.594 INFO:tasks.workunit.client.0.vm03.stdout:5/319: getdents d4/d40/d4e 0 2026-03-10T14:08:08.594 INFO:tasks.workunit.client.1.vm04.stdout:6/637: truncate d3/de/d35/d3a/d43/f8a 467002 0 2026-03-10T14:08:08.597 INFO:tasks.workunit.client.0.vm03.stdout:6/212: dwrite d8/d1b/f29 [0,4194304] 0 2026-03-10T14:08:08.603 INFO:tasks.workunit.client.0.vm03.stdout:5/320: dread d4/d16/f2d [0,4194304] 0 2026-03-10T14:08:08.603 INFO:tasks.workunit.client.0.vm03.stdout:5/321: fsync d4/d40/f5b 0 2026-03-10T14:08:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:08 vm03.local ceph-mon[49718]: pgmap v4: 65 pgs: 65 active+clean; 2.2 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:08:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:08 vm03.local ceph-mon[49718]: Deploying cephadm binary to vm03 2026-03-10T14:08:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:08 vm03.local ceph-mon[49718]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s)) 2026-03-10T14:08:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:08 
vm03.local ceph-mon[49718]: Cluster is now healthy 2026-03-10T14:08:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:08 vm03.local ceph-mon[49718]: Deploying cephadm binary to vm04 2026-03-10T14:08:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:08 vm03.local ceph-mon[49718]: mgrmap e23: vm04.ywwcto(active, since 2s) 2026-03-10T14:08:08.612 INFO:tasks.workunit.client.0.vm03.stdout:9/225: mknod d2/d29/d33/c49 0 2026-03-10T14:08:08.620 INFO:tasks.workunit.client.0.vm03.stdout:8/234: write da/f2e [5027394,29954] 0 2026-03-10T14:08:08.620 INFO:tasks.workunit.client.1.vm04.stdout:8/830: creat d0/d3/d63/d29/f104 x:0 0 0 2026-03-10T14:08:08.620 INFO:tasks.workunit.client.1.vm04.stdout:9/676: dwrite d9/d33/f4b [0,4194304] 0 2026-03-10T14:08:08.622 INFO:tasks.workunit.client.1.vm04.stdout:7/769: mkdir d2/d2a/d42/d110 0 2026-03-10T14:08:08.626 INFO:tasks.workunit.client.1.vm04.stdout:7/770: chown d2/d2a/f93 724552699 1 2026-03-10T14:08:08.630 INFO:tasks.workunit.client.0.vm03.stdout:2/237: write d5/f2d [4737510,62735] 0 2026-03-10T14:08:08.630 INFO:tasks.workunit.client.0.vm03.stdout:1/264: unlink d0/d18/f21 0 2026-03-10T14:08:08.631 INFO:tasks.workunit.client.1.vm04.stdout:3/751: symlink da/dc/d35/dcd/l101 0 2026-03-10T14:08:08.632 INFO:tasks.workunit.client.1.vm04.stdout:3/752: write da/dc/d3f/d54/ff3 [95757,26924] 0 2026-03-10T14:08:08.635 INFO:tasks.workunit.client.1.vm04.stdout:1/744: mkdir d3/d20/d102 0 2026-03-10T14:08:08.638 INFO:tasks.workunit.client.0.vm03.stdout:0/228: creat d3/d11/d25/d47/f48 x:0 0 0 2026-03-10T14:08:08.638 INFO:tasks.workunit.client.0.vm03.stdout:3/188: readlink l16 0 2026-03-10T14:08:08.638 INFO:tasks.workunit.client.1.vm04.stdout:4/701: creat d4/d14/ff2 x:0 0 0 2026-03-10T14:08:08.640 INFO:tasks.workunit.client.1.vm04.stdout:6/638: symlink d3/de/d35/d3a/d43/d9c/lc2 0 2026-03-10T14:08:08.641 INFO:tasks.workunit.client.0.vm03.stdout:3/189: dread d1d/f2c [0,4194304] 0 2026-03-10T14:08:08.642 
INFO:tasks.workunit.client.0.vm03.stdout:6/213: creat d8/db/d12/f40 x:0 0 0 2026-03-10T14:08:08.664 INFO:tasks.workunit.client.1.vm04.stdout:9/677: mkdir d9/da/d8c/de5 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.1.vm04.stdout:9/678: write d9/da/dd/d1c/fdd [916302,29932] 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.1.vm04.stdout:7/771: mknod d2/dc/de/d2d/d60/c111 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.1.vm04.stdout:3/753: fdatasync da/dc/d47/d9b/fbe 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.1.vm04.stdout:3/754: chown da/dc/d35/d52 55333 1 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.1.vm04.stdout:4/702: write d4/d14/d64/fdc [233405,90103] 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.1.vm04.stdout:6/639: unlink d3/f78 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.1.vm04.stdout:0/757: rename d0/d2/d15/d49/d50/fb7 to d0/d2/d15/fef 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.1.vm04.stdout:3/755: mkdir da/dc/d3f/d61/d102 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:6/214: write d8/fd [1562168,124393] 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:5/322: creat d4/d16/d19/f6f x:0 0 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:5/323: chown d4/d6/de 1 1 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:5/324: chown d4/d13/c2b 104 1 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:5/325: write d4/d16/d19/f6f [823347,85369] 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:9/226: symlink d2/d14/d2b/d34/l4a 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:9/227: dread - d2/d29/f35 zero size 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:8/235: fdatasync da/d36/f42 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:2/238: rename d5/d10/d31/l43 to d5/d10/d31/l46 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:6/215: creat 
d8/db/d2c/d2d/d32/f41 x:0 0 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:5/326: creat d4/d13/d1f/f70 x:0 0 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:6/216: write d8/db/d2c/d2d/d32/f41 [748259,121592] 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:9/228: readlink d2/l12 0 2026-03-10T14:08:08.665 INFO:tasks.workunit.client.0.vm03.stdout:8/236: chown da/c21 4 1 2026-03-10T14:08:08.674 INFO:tasks.workunit.client.0.vm03.stdout:1/265: creat d0/d2/df/d27/f58 x:0 0 0 2026-03-10T14:08:08.674 INFO:tasks.workunit.client.0.vm03.stdout:4/285: truncate d5/f7 2199236 0 2026-03-10T14:08:08.674 INFO:tasks.workunit.client.0.vm03.stdout:0/229: dread d3/d17/f35 [0,4194304] 0 2026-03-10T14:08:08.675 INFO:tasks.workunit.client.1.vm04.stdout:8/831: rename d0/dc1 to d0/d3/d63/d12/d51/d67/d96/d105 0 2026-03-10T14:08:08.678 INFO:tasks.workunit.client.0.vm03.stdout:3/190: symlink d1d/d33/l35 0 2026-03-10T14:08:08.678 INFO:tasks.workunit.client.0.vm03.stdout:5/327: creat d4/d16/f71 x:0 0 0 2026-03-10T14:08:08.679 INFO:tasks.workunit.client.0.vm03.stdout:6/217: chown d8/db/d12/l3f 4011152 1 2026-03-10T14:08:08.680 INFO:tasks.workunit.client.1.vm04.stdout:5/808: getdents d7/d12/d2b/d93 0 2026-03-10T14:08:08.681 INFO:tasks.workunit.client.0.vm03.stdout:9/229: creat d2/d29/d38/f4b x:0 0 0 2026-03-10T14:08:08.682 INFO:tasks.workunit.client.0.vm03.stdout:6/218: dread d8/d11/d18/f21 [0,4194304] 0 2026-03-10T14:08:08.684 INFO:tasks.workunit.client.1.vm04.stdout:8/832: symlink d0/d3/dd/d76/l106 0 2026-03-10T14:08:08.684 INFO:tasks.workunit.client.0.vm03.stdout:6/219: rename d8/db/d2c/d2d/d32/d3a to d8/db/d2c/d2d/d32/d3a/d42 22 2026-03-10T14:08:08.685 INFO:tasks.workunit.client.0.vm03.stdout:2/239: symlink d5/d35/l47 0 2026-03-10T14:08:08.686 INFO:tasks.workunit.client.0.vm03.stdout:4/286: creat d5/d9/db/d19/d38/d53/f59 x:0 0 0 2026-03-10T14:08:08.689 INFO:tasks.workunit.client.1.vm04.stdout:3/756: mkdir da/dc/d35/d52/d103 0 
2026-03-10T14:08:08.691 INFO:tasks.workunit.client.1.vm04.stdout:5/809: chown d7/d12/d2b/d3e/d57/d77/fdd 3686 1 2026-03-10T14:08:08.695 INFO:tasks.workunit.client.0.vm03.stdout:5/328: read d4/d13/d43/f58 [6188856,54395] 0 2026-03-10T14:08:08.696 INFO:tasks.workunit.client.0.vm03.stdout:8/237: creat da/d3c/f48 x:0 0 0 2026-03-10T14:08:08.696 INFO:tasks.workunit.client.0.vm03.stdout:5/329: read d4/d13/d43/f51 [34046,62582] 0 2026-03-10T14:08:08.700 INFO:tasks.workunit.client.0.vm03.stdout:2/240: dwrite d5/d10/d17/f33 [0,4194304] 0 2026-03-10T14:08:08.702 INFO:tasks.workunit.client.1.vm04.stdout:8/833: dread d0/d3/fc3 [0,4194304] 0 2026-03-10T14:08:08.702 INFO:tasks.workunit.client.0.vm03.stdout:6/220: rename d8/d11/c14 to d8/d3b/c43 0 2026-03-10T14:08:08.715 INFO:tasks.workunit.client.0.vm03.stdout:6/221: creat d8/db/d12/f44 x:0 0 0 2026-03-10T14:08:08.716 INFO:tasks.workunit.client.0.vm03.stdout:3/191: creat d1d/f36 x:0 0 0 2026-03-10T14:08:08.717 INFO:tasks.workunit.client.1.vm04.stdout:3/757: creat da/ded/f104 x:0 0 0 2026-03-10T14:08:08.718 INFO:tasks.workunit.client.0.vm03.stdout:9/230: creat d2/f4c x:0 0 0 2026-03-10T14:08:08.719 INFO:tasks.workunit.client.0.vm03.stdout:9/231: chown d2/l12 7100024 1 2026-03-10T14:08:08.719 INFO:tasks.workunit.client.0.vm03.stdout:9/232: readlink d2/l12 0 2026-03-10T14:08:08.719 INFO:tasks.workunit.client.0.vm03.stdout:1/266: truncate d0/f4 2521092 0 2026-03-10T14:08:08.719 INFO:tasks.workunit.client.1.vm04.stdout:8/834: mkdir d0/d3/d63/d12/df5/d107 0 2026-03-10T14:08:08.720 INFO:tasks.workunit.client.0.vm03.stdout:3/192: dread fb [0,4194304] 0 2026-03-10T14:08:08.720 INFO:tasks.workunit.client.0.vm03.stdout:9/233: fdatasync d2/d29/d38/f47 0 2026-03-10T14:08:08.720 INFO:tasks.workunit.client.0.vm03.stdout:1/267: truncate d0/d2/df/d27/f58 600996 0 2026-03-10T14:08:08.721 INFO:tasks.workunit.client.1.vm04.stdout:5/810: rename d7/d12/d2b/d3e/d57/ld2 to d7/d12/d2b/d3e/d3f/da6/l102 0 2026-03-10T14:08:08.721 
INFO:tasks.workunit.client.0.vm03.stdout:5/330: truncate d4/d13/d1f/f20 3183033 0 2026-03-10T14:08:08.722 INFO:tasks.workunit.client.1.vm04.stdout:5/811: chown d7/d2d/d32/f3b 562655 1 2026-03-10T14:08:08.722 INFO:tasks.workunit.client.0.vm03.stdout:9/234: truncate d2/d14/d2b/d43/f45 195228 0 2026-03-10T14:08:08.722 INFO:tasks.workunit.client.0.vm03.stdout:6/222: symlink d8/db/d2c/d2d/l45 0 2026-03-10T14:08:08.726 INFO:tasks.workunit.client.1.vm04.stdout:8/835: fdatasync d0/d75/f83 0 2026-03-10T14:08:08.726 INFO:tasks.workunit.client.0.vm03.stdout:4/287: dread d5/fe [0,4194304] 0 2026-03-10T14:08:08.734 INFO:tasks.workunit.client.1.vm04.stdout:2/716: write d0/d14/d91/d4a/d66/f7d [1090226,4690] 0 2026-03-10T14:08:08.744 INFO:tasks.workunit.client.1.vm04.stdout:1/745: write d3/d22/f8e [3264681,89351] 0 2026-03-10T14:08:08.744 INFO:tasks.workunit.client.1.vm04.stdout:1/746: readlink d3/d5c/la1 0 2026-03-10T14:08:08.748 INFO:tasks.workunit.client.1.vm04.stdout:1/747: read d3/d22/d63/d35/dd9/d13/da0/dc5/fc6 [2240606,90151] 0 2026-03-10T14:08:08.750 INFO:tasks.workunit.client.1.vm04.stdout:3/758: link da/dc/d35/dcd/dde/dac/fca da/dc/d3f/d61/d102/f105 0 2026-03-10T14:08:08.752 INFO:tasks.workunit.client.0.vm03.stdout:2/241: readlink d5/d10/d31/l46 0 2026-03-10T14:08:08.753 INFO:tasks.workunit.client.1.vm04.stdout:9/679: dwrite d9/d5c/fb6 [0,4194304] 0 2026-03-10T14:08:08.754 INFO:tasks.workunit.client.0.vm03.stdout:3/193: rmdir d1d/d29 39 2026-03-10T14:08:08.759 INFO:tasks.workunit.client.1.vm04.stdout:5/812: truncate d7/d12/d2b/d3e/d57/d77/da5/faa 224567 0 2026-03-10T14:08:08.762 INFO:tasks.workunit.client.0.vm03.stdout:8/238: mkdir da/d24/d49 0 2026-03-10T14:08:08.762 INFO:tasks.workunit.client.1.vm04.stdout:7/772: write d2/dc/de/d11/ffd [1568095,17641] 0 2026-03-10T14:08:08.766 INFO:tasks.workunit.client.1.vm04.stdout:8/836: fdatasync d0/d3/d73/f98 0 2026-03-10T14:08:08.768 INFO:tasks.workunit.client.1.vm04.stdout:6/640: write d3/de/d35/d3a/d43/d9c/fa8 [259584,127146] 0 
2026-03-10T14:08:08.774 INFO:tasks.workunit.client.1.vm04.stdout:0/758: write d0/d2/d15/d22/f88 [2065363,81231] 0 2026-03-10T14:08:08.783 INFO:tasks.workunit.client.1.vm04.stdout:4/703: dwrite d4/d14/d3c/d5e/f7b [4194304,4194304] 0 2026-03-10T14:08:08.787 INFO:tasks.workunit.client.1.vm04.stdout:1/748: creat d3/d22/d63/d35/dd9/d13/da0/f103 x:0 0 0 2026-03-10T14:08:08.792 INFO:tasks.workunit.client.0.vm03.stdout:0/230: dwrite d3/d17/f35 [0,4194304] 0 2026-03-10T14:08:08.793 INFO:tasks.workunit.client.1.vm04.stdout:3/759: write da/dc/d3f/d54/d66/f99 [403150,106565] 0 2026-03-10T14:08:08.797 INFO:tasks.workunit.client.0.vm03.stdout:2/242: symlink d5/d10/d1f/l48 0 2026-03-10T14:08:08.797 INFO:tasks.workunit.client.0.vm03.stdout:2/243: stat d5/ff 0 2026-03-10T14:08:08.798 INFO:tasks.workunit.client.0.vm03.stdout:2/244: write d5/d10/d1f/f3e [815544,19755] 0 2026-03-10T14:08:08.798 INFO:tasks.workunit.client.1.vm04.stdout:5/813: creat d7/d2d/d69/db8/f103 x:0 0 0 2026-03-10T14:08:08.801 INFO:tasks.workunit.client.0.vm03.stdout:2/245: dwrite f1 [0,4194304] 0 2026-03-10T14:08:08.802 INFO:tasks.workunit.client.0.vm03.stdout:2/246: write d5/d10/d31/f3d [893490,108792] 0 2026-03-10T14:08:08.812 INFO:tasks.workunit.client.0.vm03.stdout:3/194: fsync d1d/f26 0 2026-03-10T14:08:08.812 INFO:tasks.workunit.client.0.vm03.stdout:3/195: read fc [3157489,81344] 0 2026-03-10T14:08:08.816 INFO:tasks.workunit.client.0.vm03.stdout:7/203: mknod d5/d9/d14/d21/d28/c3c 0 2026-03-10T14:08:08.822 INFO:tasks.workunit.client.1.vm04.stdout:6/641: creat d3/de/d35/d3a/d43/d4c/d5e/fc3 x:0 0 0 2026-03-10T14:08:08.824 INFO:tasks.workunit.client.0.vm03.stdout:7/204: read d5/d9/f30 [1569215,259] 0 2026-03-10T14:08:08.827 INFO:tasks.workunit.client.0.vm03.stdout:1/268: mknod d0/c59 0 2026-03-10T14:08:08.842 INFO:tasks.workunit.client.0.vm03.stdout:9/235: symlink d2/l4d 0 2026-03-10T14:08:08.846 INFO:tasks.workunit.client.0.vm03.stdout:9/236: chown d2/d29/d33/c40 126290757 1 2026-03-10T14:08:08.846 
INFO:tasks.workunit.client.0.vm03.stdout:9/237: write d2/d29/d38/f4b [455860,69658] 0 2026-03-10T14:08:08.846 INFO:tasks.workunit.client.0.vm03.stdout:9/238: chown d2/d29/d33 1582 1 2026-03-10T14:08:08.846 INFO:tasks.workunit.client.0.vm03.stdout:9/239: truncate d2/d29/f35 1034501 0 2026-03-10T14:08:08.847 INFO:tasks.workunit.client.0.vm03.stdout:9/240: dwrite d2/d14/f3d [0,4194304] 0 2026-03-10T14:08:08.851 INFO:tasks.workunit.client.0.vm03.stdout:9/241: dwrite d2/d14/f3d [0,4194304] 0 2026-03-10T14:08:08.856 INFO:tasks.workunit.client.0.vm03.stdout:0/231: creat d3/d11/d25/f49 x:0 0 0 2026-03-10T14:08:08.858 INFO:tasks.workunit.client.0.vm03.stdout:5/331: dwrite d4/d16/f1c [0,4194304] 0 2026-03-10T14:08:08.864 INFO:tasks.workunit.client.0.vm03.stdout:2/247: creat d5/d35/f49 x:0 0 0 2026-03-10T14:08:08.864 INFO:tasks.workunit.client.0.vm03.stdout:0/232: read d3/fe [688428,32223] 0 2026-03-10T14:08:08.874 INFO:tasks.workunit.client.1.vm04.stdout:8/837: fdatasync d0/d3/d63/d12/d51/d67/fcd 0 2026-03-10T14:08:08.877 INFO:tasks.workunit.client.0.vm03.stdout:3/196: chown d1d/d29/f2f 414578891 1 2026-03-10T14:08:08.877 INFO:tasks.workunit.client.0.vm03.stdout:3/197: write d1d/f2b [193406,84425] 0 2026-03-10T14:08:08.878 INFO:tasks.workunit.client.0.vm03.stdout:3/198: write d1d/f36 [893540,100991] 0 2026-03-10T14:08:08.879 INFO:tasks.workunit.client.0.vm03.stdout:8/239: mknod da/d24/c4a 0 2026-03-10T14:08:08.893 INFO:tasks.workunit.client.1.vm04.stdout:9/680: creat d9/fe6 x:0 0 0 2026-03-10T14:08:08.897 INFO:tasks.workunit.client.1.vm04.stdout:7/773: write d2/dc/d4d/dcd/f45 [455146,33987] 0 2026-03-10T14:08:08.899 INFO:tasks.workunit.client.1.vm04.stdout:2/717: write d0/d14/d91/d8/d17/f73 [3464046,12945] 0 2026-03-10T14:08:08.900 INFO:tasks.workunit.client.0.vm03.stdout:6/223: dwrite d8/db/d12/f26 [0,4194304] 0 2026-03-10T14:08:08.901 INFO:tasks.workunit.client.1.vm04.stdout:2/718: chown d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/l63 1 1 2026-03-10T14:08:08.902 
INFO:tasks.workunit.client.1.vm04.stdout:0/759: dwrite d0/d2/d15/f2f [0,4194304] 0 2026-03-10T14:08:08.910 INFO:tasks.workunit.client.1.vm04.stdout:4/704: write d4/d14/d3c/f46 [740449,42438] 0 2026-03-10T14:08:08.914 INFO:tasks.workunit.client.1.vm04.stdout:5/814: symlink d7/d26/ddf/l104 0 2026-03-10T14:08:08.918 INFO:tasks.workunit.client.1.vm04.stdout:8/838: creat d0/d3/d63/d12/d51/d67/d96/f108 x:0 0 0 2026-03-10T14:08:08.926 INFO:tasks.workunit.client.1.vm04.stdout:9/681: rename d9/d5c/d93 to d9/da/dd/de7 0 2026-03-10T14:08:08.932 INFO:tasks.workunit.client.0.vm03.stdout:3/199: creat d1d/d33/f37 x:0 0 0 2026-03-10T14:08:08.932 INFO:tasks.workunit.client.1.vm04.stdout:0/760: truncate d0/f19 1902485 0 2026-03-10T14:08:08.937 INFO:tasks.workunit.client.1.vm04.stdout:5/815: mkdir d7/d2d/d32/d34/d105 0 2026-03-10T14:08:08.944 INFO:tasks.workunit.client.1.vm04.stdout:6/642: link d3/de/d35/d3f/d2d/d32/d23/d4e/c5b d3/de/d35/d3f/dba/cc4 0 2026-03-10T14:08:08.945 INFO:tasks.workunit.client.1.vm04.stdout:6/643: write d3/de/d35/d3a/d43/d4c/d5e/d76/f94 [4682902,93610] 0 2026-03-10T14:08:08.946 INFO:tasks.workunit.client.0.vm03.stdout:9/242: fdatasync d2/d14/f1b 0 2026-03-10T14:08:08.946 INFO:tasks.workunit.client.0.vm03.stdout:9/243: chown d2/c1f 108 1 2026-03-10T14:08:08.946 INFO:tasks.workunit.client.0.vm03.stdout:9/244: chown d2/f2f 71756 1 2026-03-10T14:08:08.951 INFO:tasks.workunit.client.1.vm04.stdout:7/774: rename d2/d94/f7e to d2/d2a/d42/d86/df4/f112 0 2026-03-10T14:08:08.952 INFO:tasks.workunit.client.0.vm03.stdout:5/332: dread d4/d6/f10 [0,4194304] 0 2026-03-10T14:08:08.959 INFO:tasks.workunit.client.1.vm04.stdout:4/705: dread d4/f96 [0,4194304] 0 2026-03-10T14:08:08.964 INFO:tasks.workunit.client.1.vm04.stdout:9/682: mknod d9/da/d5d/dd6/ce8 0 2026-03-10T14:08:08.966 INFO:tasks.workunit.client.0.vm03.stdout:6/224: creat d8/db/d2c/d2d/f46 x:0 0 0 2026-03-10T14:08:08.971 INFO:tasks.workunit.client.0.vm03.stdout:4/288: truncate d5/d9/f16 4096677 0 
2026-03-10T14:08:08.977 INFO:tasks.workunit.client.1.vm04.stdout:0/761: chown d0/d2/d15/d49/lab 150 1 2026-03-10T14:08:08.982 INFO:tasks.workunit.client.1.vm04.stdout:8/839: truncate d0/d3/dd/d78/f8d 1039865 0 2026-03-10T14:08:08.983 INFO:tasks.workunit.client.1.vm04.stdout:8/840: readlink d0/d3/d5/la6 0 2026-03-10T14:08:08.985 INFO:tasks.workunit.client.1.vm04.stdout:1/749: write d3/f2c [1936960,460] 0 2026-03-10T14:08:08.989 INFO:tasks.workunit.client.1.vm04.stdout:1/750: dwrite d3/f2c [4194304,4194304] 0 2026-03-10T14:08:08.989 INFO:tasks.workunit.client.0.vm03.stdout:6/225: dread d8/db/f1f [0,4194304] 0 2026-03-10T14:08:08.993 INFO:tasks.workunit.client.1.vm04.stdout:1/751: dwrite d3/d5c/f71 [0,4194304] 0 2026-03-10T14:08:08.997 INFO:tasks.workunit.client.1.vm04.stdout:1/752: dread - d3/d22/d63/d35/dd9/d13/d38/ff1 zero size 2026-03-10T14:08:09.007 INFO:tasks.workunit.client.0.vm03.stdout:7/205: write d5/fb [2714164,77899] 0 2026-03-10T14:08:09.016 INFO:tasks.workunit.client.0.vm03.stdout:3/200: creat d1d/d29/f38 x:0 0 0 2026-03-10T14:08:09.017 INFO:tasks.workunit.client.1.vm04.stdout:3/760: write da/dc/d35/d52/f69 [333938,21140] 0 2026-03-10T14:08:09.019 INFO:tasks.workunit.client.1.vm04.stdout:1/753: dread d3/d5c/fb2 [0,4194304] 0 2026-03-10T14:08:09.027 INFO:tasks.workunit.client.0.vm03.stdout:1/269: rmdir d0/d2/d34/d55 0 2026-03-10T14:08:09.030 INFO:tasks.workunit.client.0.vm03.stdout:9/245: creat d2/d29/d38/f4e x:0 0 0 2026-03-10T14:08:09.033 INFO:tasks.workunit.client.0.vm03.stdout:5/333: creat d4/d13/d43/f72 x:0 0 0 2026-03-10T14:08:09.057 INFO:tasks.workunit.client.1.vm04.stdout:8/841: mknod d0/d3/d63/d12/df5/c109 0 2026-03-10T14:08:09.058 INFO:tasks.workunit.client.1.vm04.stdout:8/842: chown d0/d3/dd/fc0 430 1 2026-03-10T14:08:09.063 INFO:tasks.workunit.client.1.vm04.stdout:2/719: write d0/d14/d91/f1d [6163070,125221] 0 2026-03-10T14:08:09.075 INFO:tasks.workunit.client.0.vm03.stdout:0/233: truncate d3/d11/d2c/f3b 247011 0 2026-03-10T14:08:09.076 
INFO:tasks.workunit.client.0.vm03.stdout:1/270: write d0/f48 [1007871,68045] 0 2026-03-10T14:08:09.076 INFO:tasks.workunit.client.1.vm04.stdout:1/754: truncate d3/d22/d63/d35/dd9/d13/d38/db5/fb8 1624097 0 2026-03-10T14:08:09.077 INFO:tasks.workunit.client.0.vm03.stdout:1/271: readlink d0/d2/df/d16/d41/l4a 0 2026-03-10T14:08:09.079 INFO:tasks.workunit.client.0.vm03.stdout:0/234: dwrite d3/d16/f3e [0,4194304] 0 2026-03-10T14:08:09.081 INFO:tasks.workunit.client.1.vm04.stdout:1/755: dwrite d3/d22/d63/d35/dd9/d13/d38/db5/dc4/ff6 [0,4194304] 0 2026-03-10T14:08:09.081 INFO:tasks.workunit.client.0.vm03.stdout:0/235: stat d3/d16/f34 0 2026-03-10T14:08:09.101 INFO:tasks.workunit.client.1.vm04.stdout:5/816: rename d7/d26/d6b/c7f to d7/d59/d7e/dc6/c106 0 2026-03-10T14:08:09.106 INFO:tasks.workunit.client.0.vm03.stdout:5/334: read d4/d6/f63 [263510,75371] 0 2026-03-10T14:08:09.109 INFO:tasks.workunit.client.0.vm03.stdout:8/240: dwrite f7 [0,4194304] 0 2026-03-10T14:08:09.109 INFO:tasks.workunit.client.0.vm03.stdout:8/241: fsync da/d24/f3d 0 2026-03-10T14:08:09.109 INFO:tasks.workunit.client.0.vm03.stdout:8/242: dread - da/d3c/f48 zero size 2026-03-10T14:08:09.114 INFO:tasks.workunit.client.1.vm04.stdout:7/775: dwrite d2/dc/de/d2d/d60/d7c/d36/d8b/fb8 [0,4194304] 0 2026-03-10T14:08:09.128 INFO:tasks.workunit.client.1.vm04.stdout:9/683: dwrite d9/da/dd/d1c/f22 [0,4194304] 0 2026-03-10T14:08:09.130 INFO:tasks.workunit.client.0.vm03.stdout:4/289: dwrite d5/d9/db/d19/f45 [0,4194304] 0 2026-03-10T14:08:09.133 INFO:tasks.workunit.client.0.vm03.stdout:4/290: readlink d5/lc 0 2026-03-10T14:08:09.135 INFO:tasks.workunit.client.1.vm04.stdout:0/762: creat d0/dee/ff0 x:0 0 0 2026-03-10T14:08:09.141 INFO:tasks.workunit.client.0.vm03.stdout:2/248: link d5/d10/c14 d5/c4a 0 2026-03-10T14:08:09.141 INFO:tasks.workunit.client.1.vm04.stdout:0/763: read - d0/d2/d15/d22/d38/f71 zero size 2026-03-10T14:08:09.141 INFO:tasks.workunit.client.1.vm04.stdout:8/843: mknod d0/d3/d63/d12/d51/d67/d96/dc8/c10a 
0 2026-03-10T14:08:09.143 INFO:tasks.workunit.client.1.vm04.stdout:2/720: creat d0/d14/d91/d8/fdc x:0 0 0 2026-03-10T14:08:09.146 INFO:tasks.workunit.client.1.vm04.stdout:5/817: symlink d7/d9/l107 0 2026-03-10T14:08:09.149 INFO:tasks.workunit.client.1.vm04.stdout:4/706: rename d4/df/db2/db4/d47/d70/la2 to d4/d14/d6d/lf3 0 2026-03-10T14:08:09.156 INFO:tasks.workunit.client.1.vm04.stdout:9/684: creat d9/da/dd/de7/d96/d9d/fe9 x:0 0 0 2026-03-10T14:08:09.166 INFO:tasks.workunit.client.0.vm03.stdout:3/201: write d1d/f2c [882119,49263] 0 2026-03-10T14:08:09.168 INFO:tasks.workunit.client.0.vm03.stdout:8/243: chown da/c19 375 1 2026-03-10T14:08:09.169 INFO:tasks.workunit.client.0.vm03.stdout:9/246: read d2/d14/d2b/f42 [1625211,13634] 0 2026-03-10T14:08:09.171 INFO:tasks.workunit.client.1.vm04.stdout:7/776: rename d2/dc/de/d2d/d38/f41 to d2/dc/de/d2d/d60/d81/f113 0 2026-03-10T14:08:09.175 INFO:tasks.workunit.client.1.vm04.stdout:4/707: symlink d4/df/db2/db6/dc9/dd0/lf4 0 2026-03-10T14:08:09.176 INFO:tasks.workunit.client.1.vm04.stdout:4/708: chown d4/df/db2/db4/d47/d4f/d8c/fea 80017 1 2026-03-10T14:08:09.176 INFO:tasks.workunit.client.1.vm04.stdout:4/709: write d4/d14/d64/dcb/fec [359451,18249] 0 2026-03-10T14:08:09.178 INFO:tasks.workunit.client.1.vm04.stdout:4/710: truncate d4/df/db2/db6/dc9/dd0/fd3 1791429 0 2026-03-10T14:08:09.178 INFO:tasks.workunit.client.1.vm04.stdout:4/711: chown d4/df/db2/db6/dc9 720998 1 2026-03-10T14:08:09.184 INFO:tasks.workunit.client.1.vm04.stdout:3/761: write da/dc/f2a [4879680,122377] 0 2026-03-10T14:08:09.187 INFO:tasks.workunit.client.1.vm04.stdout:6/644: dwrite d3/f8d [0,4194304] 0 2026-03-10T14:08:09.187 INFO:tasks.workunit.client.0.vm03.stdout:7/206: dwrite d5/d9/d14/ff [0,4194304] 0 2026-03-10T14:08:09.189 INFO:tasks.workunit.client.0.vm03.stdout:7/207: write d5/d9/f10 [4556380,125641] 0 2026-03-10T14:08:09.209 INFO:tasks.workunit.client.0.vm03.stdout:1/272: link d0/d18/f25 d0/d2/df/d16/d20/f5a 0 2026-03-10T14:08:09.212 
INFO:tasks.workunit.client.0.vm03.stdout:1/273: dread d0/f48 [0,4194304] 0 2026-03-10T14:08:09.213 INFO:tasks.workunit.client.1.vm04.stdout:1/756: rmdir d3/d22/d63/d35/dd9/d13/d38/db5/dc4/dee 0 2026-03-10T14:08:09.214 INFO:tasks.workunit.client.1.vm04.stdout:1/757: chown d3/d22/d63/d35/dd9/d13/da0/dc5/dfa 1925957382 1 2026-03-10T14:08:09.217 INFO:tasks.workunit.client.0.vm03.stdout:0/236: unlink d3/d11/c44 0 2026-03-10T14:08:09.222 INFO:tasks.workunit.client.1.vm04.stdout:0/764: rename d0/d2/d15/fb4 to d0/dee/ff1 0 2026-03-10T14:08:09.225 INFO:tasks.workunit.client.0.vm03.stdout:5/335: mkdir d4/d13/d73 0 2026-03-10T14:08:09.233 INFO:tasks.workunit.client.1.vm04.stdout:4/712: dread d4/d14/d64/fab [0,4194304] 0 2026-03-10T14:08:09.239 INFO:tasks.workunit.client.0.vm03.stdout:8/244: mkdir da/d3c/d4b 0 2026-03-10T14:08:09.266 INFO:tasks.workunit.client.0.vm03.stdout:4/291: getdents d5/d9/d2b 0 2026-03-10T14:08:09.266 INFO:tasks.workunit.client.0.vm03.stdout:2/249: rename d5/d10/f39 to d5/d10/d1c/d40/f4b 0 2026-03-10T14:08:09.266 INFO:tasks.workunit.client.0.vm03.stdout:2/250: stat d5/d35/f49 0 2026-03-10T14:08:09.266 INFO:tasks.workunit.client.1.vm04.stdout:6/645: dread - d3/de/d35/d3f/d2d/d32/d5c/f7f zero size 2026-03-10T14:08:09.266 INFO:tasks.workunit.client.1.vm04.stdout:6/646: readlink d3/de/l2c 0 2026-03-10T14:08:09.266 INFO:tasks.workunit.client.1.vm04.stdout:3/762: dread da/fd [0,4194304] 0 2026-03-10T14:08:09.266 INFO:tasks.workunit.client.1.vm04.stdout:4/713: dread d4/df/d31/f3d [0,4194304] 0 2026-03-10T14:08:09.266 INFO:tasks.workunit.client.1.vm04.stdout:4/714: dwrite d4/f2c [4194304,4194304] 0 2026-03-10T14:08:09.270 INFO:tasks.workunit.client.1.vm04.stdout:6/647: rename d3/de/d35/d3f/d2d/d32/d23/d83/dad/lbe to d3/de/d35/d3f/d2d/d32/d23/d83/dad/lc5 0 2026-03-10T14:08:09.272 INFO:tasks.workunit.client.1.vm04.stdout:6/648: dread d3/de/d35/d3a/d43/d4c/d5e/d76/f94 [0,4194304] 0 2026-03-10T14:08:09.273 INFO:tasks.workunit.client.1.vm04.stdout:1/758: mknod 
d3/d22/d63/d35/dd9/d13/da0/dc5/dfa/c104 0
2026-03-10T14:08:09.278 INFO:tasks.workunit.client.0.vm03.stdout:8/245: fsync da/fe 0
2026-03-10T14:08:09.282 INFO:tasks.workunit.client.1.vm04.stdout:9/685: rename d9/f4f to d9/da/d8c/fea 0
2026-03-10T14:08:09.282 INFO:tasks.workunit.client.1.vm04.stdout:9/686: chown d9/d58/db5/fc6 7654 1
2026-03-10T14:08:09.285 INFO:tasks.workunit.client.0.vm03.stdout:9/247: symlink d2/d14/d2b/d34/l4f 0
2026-03-10T14:08:09.286 INFO:tasks.workunit.client.1.vm04.stdout:5/818: sync
2026-03-10T14:08:09.289 INFO:tasks.workunit.client.1.vm04.stdout:6/649: truncate d3/f57 3483401 0
2026-03-10T14:08:09.291 INFO:tasks.workunit.client.0.vm03.stdout:1/274: unlink d0/f4 0
2026-03-10T14:08:09.294 INFO:tasks.workunit.client.1.vm04.stdout:1/759: mknod d3/d22/d63/d35/dd9/d13/d38/db5/dc4/c105 0
2026-03-10T14:08:09.296 INFO:tasks.workunit.client.1.vm04.stdout:4/715: truncate d4/fb3 1751062 0
2026-03-10T14:08:09.306 INFO:tasks.workunit.client.0.vm03.stdout:8/246: chown da/d3a 77365 1
2026-03-10T14:08:09.306 INFO:tasks.workunit.client.1.vm04.stdout:3/763: rename da/dc/d35/dcd to da/dc/d47/d9b/d106 0
2026-03-10T14:08:09.306 INFO:tasks.workunit.client.1.vm04.stdout:9/687: fsync d9/da/f6b 0
2026-03-10T14:08:09.306 INFO:tasks.workunit.client.1.vm04.stdout:6/650: rmdir d3/de/d35/d3f/d2d/d32/d23/d24/d6f/d71 39
2026-03-10T14:08:09.306 INFO:tasks.workunit.client.0.vm03.stdout:0/237: sync
2026-03-10T14:08:09.307 INFO:tasks.workunit.client.0.vm03.stdout:4/292: sync
2026-03-10T14:08:09.307 INFO:tasks.workunit.client.1.vm04.stdout:5/819: sync
2026-03-10T14:08:09.308 INFO:tasks.workunit.client.1.vm04.stdout:5/820: chown d7/d2d/d69/db8/ldb 53 1
2026-03-10T14:08:09.315 INFO:tasks.workunit.client.1.vm04.stdout:6/651: dread d3/de/d35/d3f/d2d/d32/d23/d83/dad/fbb [0,4194304] 0
2026-03-10T14:08:09.318 INFO:tasks.workunit.client.1.vm04.stdout:1/760: mknod d3/d22/d6d/c106 0
2026-03-10T14:08:09.324 INFO:tasks.workunit.client.0.vm03.stdout:0/238: dread d3/d11/f13 [0,4194304] 0
2026-03-10T14:08:09.325 INFO:tasks.workunit.client.0.vm03.stdout:0/239: chown d3/d11/d25/f22 0 1
2026-03-10T14:08:09.327 INFO:tasks.workunit.client.0.vm03.stdout:6/226: truncate d8/d1b/f31 431423 0
2026-03-10T14:08:09.327 INFO:tasks.workunit.client.0.vm03.stdout:6/227: rename d8/db/df to d8/db/df/d47 22
2026-03-10T14:08:09.328 INFO:tasks.workunit.client.0.vm03.stdout:6/228: fsync f3 0
2026-03-10T14:08:09.328 INFO:tasks.workunit.client.0.vm03.stdout:6/229: write f5 [2050685,82356] 0
2026-03-10T14:08:09.329 INFO:tasks.workunit.client.1.vm04.stdout:4/716: rmdir d4/d14/d3c/d5e 39
2026-03-10T14:08:09.339 INFO:tasks.workunit.client.1.vm04.stdout:9/688: readlink d9/da/dd/d1c/da3/lc8 0
2026-03-10T14:08:09.344 INFO:tasks.workunit.client.0.vm03.stdout:9/248: dwrite d2/d14/f3d [4194304,4194304] 0
2026-03-10T14:08:09.349 INFO:tasks.workunit.client.1.vm04.stdout:5/821: symlink d7/d26/ddf/l108 0
2026-03-10T14:08:09.355 INFO:tasks.workunit.client.1.vm04.stdout:6/652: unlink d3/de/d35/d3a/d43/d4c/d5e/d76/f94 0
2026-03-10T14:08:09.363 INFO:tasks.workunit.client.1.vm04.stdout:1/761: rmdir d3/d22/d63/d35/d6c 39
2026-03-10T14:08:09.385 INFO:tasks.workunit.client.1.vm04.stdout:4/717: mkdir d4/d14/d6d/df5 0
2026-03-10T14:08:09.386 INFO:tasks.workunit.client.0.vm03.stdout:7/208: dread d5/d9/f30 [0,4194304] 0
2026-03-10T14:08:09.389 INFO:tasks.workunit.client.1.vm04.stdout:5/822: mknod d7/d59/d7d/c109 0
2026-03-10T14:08:09.391 INFO:tasks.workunit.client.0.vm03.stdout:0/240: mkdir d3/d11/d2c/d4a 0
2026-03-10T14:08:09.391 INFO:tasks.workunit.client.1.vm04.stdout:6/653: truncate d3/de/d35/d3a/d43/f4b 2619604 0
2026-03-10T14:08:09.393 INFO:tasks.workunit.client.1.vm04.stdout:1/762: creat d3/d22/f107 x:0 0 0
2026-03-10T14:08:09.395 INFO:tasks.workunit.client.1.vm04.stdout:6/654: dwrite d3/de/d35/d3a/d43/d4c/d5e/fc3 [0,4194304] 0
2026-03-10T14:08:09.397 INFO:tasks.workunit.client.1.vm04.stdout:5/823: dread - d7/d9/db5/fcf zero size
2026-03-10T14:08:09.401 INFO:tasks.workunit.client.0.vm03.stdout:6/230: dwrite f2 [8388608,4194304] 0
2026-03-10T14:08:09.414 INFO:tasks.workunit.client.0.vm03.stdout:2/251: creat d5/d10/f4c x:0 0 0
2026-03-10T14:08:09.420 INFO:tasks.workunit.client.0.vm03.stdout:8/247: mkdir da/d3c/d4b/d4c 0
2026-03-10T14:08:09.430 INFO:tasks.workunit.client.1.vm04.stdout:5/824: getdents d7/d12/d45 0
2026-03-10T14:08:09.431 INFO:tasks.workunit.client.0.vm03.stdout:7/209: truncate d5/d9/f17 2278753 0
2026-03-10T14:08:09.433 INFO:tasks.workunit.client.0.vm03.stdout:4/293: mknod d5/c5a 0
2026-03-10T14:08:09.438 INFO:tasks.workunit.client.0.vm03.stdout:9/249: creat d2/d29/d33/d41/f50 x:0 0 0
2026-03-10T14:08:09.438 INFO:tasks.workunit.client.0.vm03.stdout:9/250: write d2/f37 [933917,49174] 0
2026-03-10T14:08:09.444 INFO:tasks.workunit.client.0.vm03.stdout:4/294: chown d5/f7 2130502306 1
2026-03-10T14:08:09.445 INFO:tasks.workunit.client.0.vm03.stdout:4/295: read d5/d9/db/f48 [1277533,48502] 0
2026-03-10T14:08:09.445 INFO:tasks.workunit.client.0.vm03.stdout:0/241: mkdir d3/d11/d2c/d4a/d4b 0
2026-03-10T14:08:09.445 INFO:tasks.workunit.client.0.vm03.stdout:4/296: chown d5/d9 0 1
2026-03-10T14:08:09.448 INFO:tasks.workunit.client.0.vm03.stdout:6/231: mknod d8/c48 0
2026-03-10T14:08:09.450 INFO:tasks.workunit.client.0.vm03.stdout:0/242: dread d3/d11/f18 [0,4194304] 0
2026-03-10T14:08:09.451 INFO:tasks.workunit.client.1.vm04.stdout:5/825: sync
2026-03-10T14:08:09.454 INFO:tasks.workunit.client.0.vm03.stdout:9/251: creat d2/d29/d38/f51 x:0 0 0
2026-03-10T14:08:09.454 INFO:tasks.workunit.client.0.vm03.stdout:0/243: stat d3/f10 0
2026-03-10T14:08:09.455 INFO:tasks.workunit.client.1.vm04.stdout:5/826: chown d7/d9/l68 48428 1
2026-03-10T14:08:09.455 INFO:tasks.workunit.client.0.vm03.stdout:4/297: dwrite d5/d9/db/d19/d38/d53/f59 [0,4194304] 0
2026-03-10T14:08:09.457 INFO:tasks.workunit.client.0.vm03.stdout:4/298: chown d5/d9/db/f10 40854 1
2026-03-10T14:08:09.457 INFO:tasks.workunit.client.0.vm03.stdout:4/299: fsync d5/d9/db/d19/f45 0
2026-03-10T14:08:09.467 INFO:tasks.workunit.client.0.vm03.stdout:0/244: creat d3/d11/d2c/d4a/f4c x:0 0 0
2026-03-10T14:08:09.475 INFO:tasks.workunit.client.0.vm03.stdout:4/300: unlink d5/d9/c3e 0
2026-03-10T14:08:09.479 INFO:tasks.workunit.client.0.vm03.stdout:4/301: chown d5/d9/db/f32 3 1
2026-03-10T14:08:09.481 INFO:tasks.workunit.client.0.vm03.stdout:4/302: mkdir d5/d47/d5b 0
2026-03-10T14:08:09.483 INFO:tasks.workunit.client.0.vm03.stdout:9/252: rename d2/c18 to d2/d14/d2b/c52 0
2026-03-10T14:08:09.484 INFO:tasks.workunit.client.0.vm03.stdout:4/303: creat d5/d9/db/d19/d34/f5c x:0 0 0
2026-03-10T14:08:09.484 INFO:tasks.workunit.client.0.vm03.stdout:4/304: stat d5/d9/db/d19/l30 0
2026-03-10T14:08:09.489 INFO:tasks.workunit.client.0.vm03.stdout:9/253: rename d2/d14/d2b/f42 to d2/d29/d33/d41/f53 0
2026-03-10T14:08:09.489 INFO:tasks.workunit.client.0.vm03.stdout:0/245: getdents d3 0
2026-03-10T14:08:09.490 INFO:tasks.workunit.client.0.vm03.stdout:0/246: dread - d3/d11/d25/f2a zero size
2026-03-10T14:08:09.493 INFO:tasks.workunit.client.0.vm03.stdout:4/305: unlink d5/d9/db/f32 0
2026-03-10T14:08:09.494 INFO:tasks.workunit.client.1.vm04.stdout:8/844: dwrite d0/d3/dd/d89/fd0 [0,4194304] 0
2026-03-10T14:08:09.509 INFO:tasks.workunit.client.0.vm03.stdout:9/254: dwrite d2/d14/f1a [0,4194304] 0
2026-03-10T14:08:09.511 INFO:tasks.workunit.client.1.vm04.stdout:8/845: truncate d0/d3/d63/d12/d69/f8c 438357 0
2026-03-10T14:08:09.512 INFO:tasks.workunit.client.0.vm03.stdout:4/306: creat d5/d9/db/d19/d34/f5d x:0 0 0
2026-03-10T14:08:09.515 INFO:tasks.workunit.client.0.vm03.stdout:4/307: mknod d5/d9/d2b/c5e 0
2026-03-10T14:08:09.519 INFO:tasks.workunit.client.1.vm04.stdout:8/846: truncate d0/d3/d63/d12/d51/f97 347354 0
2026-03-10T14:08:09.520 INFO:tasks.workunit.client.1.vm04.stdout:8/847: write d0/d3/d63/d29/ffa [521239,51646] 0
2026-03-10T14:08:09.531 INFO:tasks.workunit.client.0.vm03.stdout:9/255: creat d2/f54 x:0 0 0
2026-03-10T14:08:09.533 INFO:tasks.workunit.client.1.vm04.stdout:2/721: write d0/d14/d91/d8/d17/f1f [95970,116346] 0
2026-03-10T14:08:09.534 INFO:tasks.workunit.client.1.vm04.stdout:2/722: readlink d0/d14/d91/d4a/d8c/dab/d46/dc8/lcf 0
2026-03-10T14:08:09.535 INFO:tasks.workunit.client.1.vm04.stdout:2/723: chown d0/d14/d91/d4a/d66/dcd/fd8 1060138330 1
2026-03-10T14:08:09.542 INFO:tasks.workunit.client.0.vm03.stdout:9/256: mkdir d2/d29/d33/d55 0
2026-03-10T14:08:09.544 INFO:tasks.workunit.client.0.vm03.stdout:9/257: creat d2/d14/d2b/d43/f56 x:0 0 0
2026-03-10T14:08:09.545 INFO:tasks.workunit.client.0.vm03.stdout:9/258: write d2/f54 [267076,119889] 0
2026-03-10T14:08:09.555 INFO:tasks.workunit.client.0.vm03.stdout:9/259: truncate d2/d14/f1b 1048290 0
2026-03-10T14:08:09.570 INFO:tasks.workunit.client.0.vm03.stdout:9/260: sync
2026-03-10T14:08:09.572 INFO:tasks.workunit.client.0.vm03.stdout:9/261: creat d2/d29/d33/d41/f57 x:0 0 0
2026-03-10T14:08:09.573 INFO:tasks.workunit.client.0.vm03.stdout:9/262: fsync d2/f4c 0
2026-03-10T14:08:09.577 INFO:tasks.workunit.client.0.vm03.stdout:9/263: symlink d2/d29/d33/d41/d46/l58 0
2026-03-10T14:08:09.578 INFO:tasks.workunit.client.0.vm03.stdout:9/264: write d2/d29/d33/d41/f50 [709568,85721] 0
2026-03-10T14:08:09.579 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:09 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:08] ENGINE Bus STARTING
2026-03-10T14:08:09.579 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:09 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:08] ENGINE Serving on http://192.168.123.104:8765
2026-03-10T14:08:09.579 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:09 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:08] ENGINE Serving on https://192.168.123.104:7150
2026-03-10T14:08:09.579 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:09 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:08] ENGINE Bus STARTED
2026-03-10T14:08:09.579 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:09 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:08] ENGINE Client ('192.168.123.104', 34262) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-10T14:08:09.583 INFO:tasks.workunit.client.0.vm03.stdout:9/265: sync
2026-03-10T14:08:09.586 INFO:tasks.workunit.client.0.vm03.stdout:9/266: write d2/d29/d33/d41/f57 [121814,86210] 0
2026-03-10T14:08:09.590 INFO:tasks.workunit.client.0.vm03.stdout:9/267: creat d2/d14/d2b/d34/f59 x:0 0 0
2026-03-10T14:08:09.613 INFO:tasks.workunit.client.1.vm04.stdout:7/777: dwrite d2/dc/de/d2d/d60/d7c/d36/d8b/fde [4194304,4194304] 0
2026-03-10T14:08:09.631 INFO:tasks.workunit.client.0.vm03.stdout:3/202: dwrite fb [0,4194304] 0
2026-03-10T14:08:09.635 INFO:tasks.workunit.client.0.vm03.stdout:3/203: mkdir d1d/d39 0
2026-03-10T14:08:09.635 INFO:tasks.workunit.client.0.vm03.stdout:3/204: readlink d1d/d29/l2d 0
2026-03-10T14:08:09.640 INFO:tasks.workunit.client.0.vm03.stdout:3/205: creat d1d/d33/f3a x:0 0 0
2026-03-10T14:08:09.641 INFO:tasks.workunit.client.0.vm03.stdout:3/206: write d1d/d29/f2f [4342585,80206] 0
2026-03-10T14:08:09.647 INFO:tasks.workunit.client.0.vm03.stdout:3/207: unlink d1d/d29/f38 0
2026-03-10T14:08:09.647 INFO:tasks.workunit.client.0.vm03.stdout:3/208: truncate d1d/d33/f37 883205 0
2026-03-10T14:08:09.709 INFO:tasks.workunit.client.1.vm04.stdout:0/765: dwrite d0/d2/d25/f64 [0,4194304] 0
2026-03-10T14:08:09.711 INFO:tasks.workunit.client.1.vm04.stdout:0/766: mkdir d0/d2/d15/d22/d62/df2 0
2026-03-10T14:08:09.712 INFO:tasks.workunit.client.1.vm04.stdout:0/767: stat d0/d2/d15/d49/d50/d5c/da4/fdf 0
2026-03-10T14:08:09.712 INFO:tasks.workunit.client.1.vm04.stdout:0/768: stat d0/d2/d15/d22/d38/d56/dc1/fc3 0
2026-03-10T14:08:09.749 INFO:tasks.workunit.client.0.vm03.stdout:2/252: rmdir d5/d10/d1c 39
2026-03-10T14:08:09.759 INFO:tasks.workunit.client.0.vm03.stdout:2/253: write d5/d10/f4c [483367,70613] 0
2026-03-10T14:08:09.761 INFO:tasks.workunit.client.0.vm03.stdout:2/254: creat d5/f4d x:0 0 0
2026-03-10T14:08:09.767 INFO:tasks.workunit.client.0.vm03.stdout:2/255: fsync d5/f9 0
2026-03-10T14:08:09.775 INFO:tasks.workunit.client.0.vm03.stdout:5/336: write d4/d6/f63 [1327830,45076] 0
2026-03-10T14:08:09.784 INFO:tasks.workunit.client.0.vm03.stdout:5/337: dread d4/d13/d1f/f21 [0,4194304] 0
2026-03-10T14:08:09.785 INFO:tasks.workunit.client.0.vm03.stdout:5/338: chown d4/d16/d19/d23 11 1
2026-03-10T14:08:09.785 INFO:tasks.workunit.client.0.vm03.stdout:5/339: stat d4/l2a 0
2026-03-10T14:08:09.795 INFO:tasks.workunit.client.0.vm03.stdout:5/340: dwrite d4/d6/fb [0,4194304] 0
2026-03-10T14:08:09.802 INFO:tasks.workunit.client.0.vm03.stdout:5/341: creat d4/d13/d1f/f74 x:0 0 0
2026-03-10T14:08:09.802 INFO:tasks.workunit.client.0.vm03.stdout:5/342: chown d4/l2a 0 1
2026-03-10T14:08:09.804 INFO:tasks.workunit.client.0.vm03.stdout:5/343: unlink d4/c31 0
2026-03-10T14:08:09.812 INFO:tasks.workunit.client.0.vm03.stdout:5/344: link d4/d6/l61 d4/d16/d19/l75 0
2026-03-10T14:08:09.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:09 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:08] ENGINE Bus STARTING
2026-03-10T14:08:09.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:09 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:08] ENGINE Serving on http://192.168.123.104:8765
2026-03-10T14:08:09.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:09 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:08] ENGINE Serving on https://192.168.123.104:7150
2026-03-10T14:08:09.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:09 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:08] ENGINE Bus STARTED
2026-03-10T14:08:09.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:09 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:08] ENGINE Client ('192.168.123.104', 34262) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-10T14:08:09.816 INFO:tasks.workunit.client.0.vm03.stdout:5/345: getdents d4/d13 0
2026-03-10T14:08:09.820 INFO:tasks.workunit.client.0.vm03.stdout:5/346: symlink d4/l76 0
2026-03-10T14:08:09.833 INFO:tasks.workunit.client.0.vm03.stdout:9/268: fsync d2/d14/f30 0
2026-03-10T14:08:09.835 INFO:tasks.workunit.client.0.vm03.stdout:9/269: unlink d2/d14/d2b/f44 0
2026-03-10T14:08:09.839 INFO:tasks.workunit.client.0.vm03.stdout:9/270: rename d2/c31 to d2/d14/d2b/d34/c5a 0
2026-03-10T14:08:09.840 INFO:tasks.workunit.client.1.vm04.stdout:3/764: write da/dc/d3f/d54/f82 [110951,125225] 0
2026-03-10T14:08:09.855 INFO:tasks.workunit.client.1.vm04.stdout:3/765: dwrite da/dc/d35/d52/f79 [0,4194304] 0
2026-03-10T14:08:09.958 INFO:tasks.workunit.client.1.vm04.stdout:6/655: unlink d3/d1d/ca2 0
2026-03-10T14:08:09.962 INFO:tasks.workunit.client.1.vm04.stdout:6/656: rmdir d3/de/d35/d3f/d2d/d32/d23/d4e 39
2026-03-10T14:08:10.104 INFO:tasks.workunit.client.0.vm03.stdout:7/210: unlink d5/d9/f2e 0
2026-03-10T14:08:10.121 INFO:tasks.workunit.client.1.vm04.stdout:1/763: write d3/d22/d63/d35/dd9/d13/fc3 [45986,98636] 0
2026-03-10T14:08:10.130 INFO:tasks.workunit.client.1.vm04.stdout:1/764: read d3/fc [42014,113327] 0
2026-03-10T14:08:10.148 INFO:tasks.workunit.client.0.vm03.stdout:1/275: symlink d0/d18/l5b 0
2026-03-10T14:08:10.151 INFO:tasks.workunit.client.0.vm03.stdout:1/276: dread d0/d2/df/d16/d20/f38 [0,4194304] 0
2026-03-10T14:08:10.216 INFO:tasks.workunit.client.0.vm03.stdout:6/232: truncate f2 2121726 0
2026-03-10T14:08:10.217 INFO:tasks.workunit.client.0.vm03.stdout:6/233: rmdir d8/d1b 39
2026-03-10T14:08:10.219 INFO:tasks.workunit.client.0.vm03.stdout:6/234: chown d8/d1b/d1c 27608 1
2026-03-10T14:08:10.222 INFO:tasks.workunit.client.0.vm03.stdout:6/235: mkdir d8/db/d49 0
2026-03-10T14:08:10.225 INFO:tasks.workunit.client.0.vm03.stdout:6/236: dread d8/d11/f2a [0,4194304] 0
2026-03-10T14:08:10.230 INFO:tasks.workunit.client.0.vm03.stdout:6/237: creat d8/db/d49/f4a x:0 0 0
2026-03-10T14:08:10.231 INFO:tasks.workunit.client.0.vm03.stdout:6/238: write d8/db/d2c/d2d/f46 [43260,66224] 0
2026-03-10T14:08:10.232 INFO:tasks.workunit.client.0.vm03.stdout:6/239: chown d8/l9 1734889 1
2026-03-10T14:08:10.233 INFO:tasks.workunit.client.0.vm03.stdout:6/240: truncate d8/db/d2c/d2d/d32/f3e 704563 0
2026-03-10T14:08:10.236 INFO:tasks.workunit.client.0.vm03.stdout:6/241: creat d8/db/d2c/d2d/d32/f4b x:0 0 0
2026-03-10T14:08:10.293 INFO:tasks.workunit.client.0.vm03.stdout:0/247: write d3/d16/f34 [602840,123074] 0
2026-03-10T14:08:10.306 INFO:tasks.workunit.client.0.vm03.stdout:4/308: write d5/d9/f25 [4317088,21121] 0
2026-03-10T14:08:10.310 INFO:tasks.workunit.client.0.vm03.stdout:4/309: link d5/d9/db/d19/c2f d5/d47/d5b/c5f 0
2026-03-10T14:08:10.312 INFO:tasks.workunit.client.0.vm03.stdout:4/310: getdents d5/d47 0
2026-03-10T14:08:10.313 INFO:tasks.workunit.client.0.vm03.stdout:4/311: fsync d5/d9/f25 0
2026-03-10T14:08:10.314 INFO:tasks.workunit.client.0.vm03.stdout:4/312: mknod d5/d9/db/d19/d38/d53/c60 0
2026-03-10T14:08:10.315 INFO:tasks.workunit.client.0.vm03.stdout:4/313: stat d5/d47/c4d 0
2026-03-10T14:08:10.316 INFO:tasks.workunit.client.0.vm03.stdout:4/314: rmdir d5/d47/d5b 39
2026-03-10T14:08:10.320 INFO:tasks.workunit.client.0.vm03.stdout:4/315: unlink d5/d9/db/c27 0
2026-03-10T14:08:10.320 INFO:tasks.workunit.client.0.vm03.stdout:4/316: chown d5/fe 8378325 1
2026-03-10T14:08:10.320 INFO:tasks.workunit.client.0.vm03.stdout:4/317: readlink d5/d47/l40 0
2026-03-10T14:08:10.321 INFO:tasks.workunit.client.0.vm03.stdout:4/318: chown d5/d9/db/c37 7 1
2026-03-10T14:08:10.326 INFO:tasks.workunit.client.1.vm04.stdout:2/724: dwrite d0/d14/f6b [0,4194304] 0
2026-03-10T14:08:10.332 INFO:tasks.workunit.client.1.vm04.stdout:2/725: dwrite d0/d14/d91/d4a/d66/f7d [0,4194304] 0
2026-03-10T14:08:10.346 INFO:tasks.workunit.client.0.vm03.stdout:4/319: getdents d5/d9/db/d19/d34 0
2026-03-10T14:08:10.370 INFO:tasks.workunit.client.1.vm04.stdout:2/726: dread d0/d14/d91/d4a/d8c/dab/f36 [0,4194304] 0
2026-03-10T14:08:10.412 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:10 vm04.local ceph-mon[55966]: pgmap v5: 65 pgs: 65 active+clean; 2.2 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail
2026-03-10T14:08:10.413 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:10 vm04.local ceph-mon[55966]: mgrmap e24: vm04.ywwcto(active, since 4s)
2026-03-10T14:08:10.413 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:10 vm04.local ceph-mon[55966]: Standby manager daemon vm03.rwbbep started
2026-03-10T14:08:10.413 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:10 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.rwbbep/crt"}]: dispatch
2026-03-10T14:08:10.413 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:10 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T14:08:10.413 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:10 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.rwbbep/key"}]: dispatch
2026-03-10T14:08:10.413 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:10 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T14:08:10.440 INFO:tasks.workunit.client.0.vm03.stdout:3/209: truncate d1d/f26 273548 0
2026-03-10T14:08:10.442 INFO:tasks.workunit.client.0.vm03.stdout:3/210: symlink d1d/l3b 0
2026-03-10T14:08:10.443 INFO:tasks.workunit.client.0.vm03.stdout:3/211: write d1d/d33/f3a [213093,56309] 0
2026-03-10T14:08:10.451 INFO:tasks.workunit.client.0.vm03.stdout:3/212: rename f1c to d1d/f3c 0
2026-03-10T14:08:10.452 INFO:tasks.workunit.client.0.vm03.stdout:3/213: fsync d1d/d33/f3a 0
2026-03-10T14:08:10.452 INFO:tasks.workunit.client.0.vm03.stdout:3/214: truncate d1d/f2b 332383 0
2026-03-10T14:08:10.454 INFO:tasks.workunit.client.0.vm03.stdout:3/215: mknod d1d/d33/c3d 0
2026-03-10T14:08:10.473 INFO:tasks.workunit.client.1.vm04.stdout:0/769: write d0/d2/d15/d22/d38/d56/d66/f2e [4965415,115481] 0
2026-03-10T14:08:10.478 INFO:tasks.workunit.client.1.vm04.stdout:0/770: symlink d0/d2/d15/d22/d38/d56/da7/lf3 0
2026-03-10T14:08:10.482 INFO:tasks.workunit.client.1.vm04.stdout:0/771: mknod d0/d2/d15/d22/d38/d56/dcb/dce/cf4 0
2026-03-10T14:08:10.482 INFO:tasks.workunit.client.1.vm04.stdout:0/772: stat d0/d2/d15/d22/d38/d56/dc1/dd4/fd9 0
2026-03-10T14:08:10.486 INFO:tasks.workunit.client.0.vm03.stdout:2/256: write d5/d2a/f37 [2830054,69024] 0
2026-03-10T14:08:10.505 INFO:tasks.workunit.client.0.vm03.stdout:2/257: dread d5/d10/d31/f3d [0,4194304] 0
2026-03-10T14:08:10.529 INFO:tasks.workunit.client.0.vm03.stdout:2/258: dwrite d5/d10/d17/f28 [0,4194304] 0
2026-03-10T14:08:10.546 INFO:tasks.workunit.client.0.vm03.stdout:2/259: getdents d5/d10 0
2026-03-10T14:08:10.549 INFO:tasks.workunit.client.0.vm03.stdout:2/260: fsync d5/d10/d1c/f3c 0
2026-03-10T14:08:10.566 INFO:tasks.workunit.client.0.vm03.stdout:5/347: dwrite d4/d16/f2d [0,4194304] 0
2026-03-10T14:08:10.571 INFO:tasks.workunit.client.0.vm03.stdout:5/348: creat d4/d13/d73/f77 x:0 0 0
2026-03-10T14:08:10.578 INFO:tasks.workunit.client.0.vm03.stdout:5/349: creat d4/d6/f78 x:0 0 0
2026-03-10T14:08:10.580 INFO:tasks.workunit.client.0.vm03.stdout:5/350: rename d4/d6/de/f5a to d4/d16/d19/f79 0
2026-03-10T14:08:10.584 INFO:tasks.workunit.client.0.vm03.stdout:5/351: creat d4/d16/d19/d23/d3f/f7a x:0 0 0
2026-03-10T14:08:10.586 INFO:tasks.workunit.client.0.vm03.stdout:5/352: write d4/fd [6027622,69492] 0
2026-03-10T14:08:10.591 INFO:tasks.workunit.client.0.vm03.stdout:5/353: unlink d4/d35/c44 0
2026-03-10T14:08:10.605 INFO:tasks.workunit.client.0.vm03.stdout:8/248: write da/f15 [2396356,10785] 0
2026-03-10T14:08:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:10 vm03.local ceph-mon[49718]: pgmap v5: 65 pgs: 65 active+clean; 2.2 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail
2026-03-10T14:08:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:10 vm03.local ceph-mon[49718]: mgrmap e24: vm04.ywwcto(active, since 4s)
2026-03-10T14:08:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:10 vm03.local ceph-mon[49718]: Standby manager daemon vm03.rwbbep started
2026-03-10T14:08:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:10 vm03.local ceph-mon[49718]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.rwbbep/crt"}]: dispatch
2026-03-10T14:08:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:10 vm03.local ceph-mon[49718]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T14:08:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:10 vm03.local ceph-mon[49718]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.rwbbep/key"}]: dispatch
2026-03-10T14:08:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:10 vm03.local ceph-mon[49718]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T14:08:10.609 INFO:tasks.workunit.client.0.vm03.stdout:8/249: mkdir da/d36/d4d 0
2026-03-10T14:08:10.611 INFO:tasks.workunit.client.0.vm03.stdout:8/250: truncate da/f30 91410 0
2026-03-10T14:08:10.614 INFO:tasks.workunit.client.0.vm03.stdout:8/251: getdents da/d3c/d4b 0
2026-03-10T14:08:10.615 INFO:tasks.workunit.client.0.vm03.stdout:9/271: write d2/f2c [1067581,96992] 0
2026-03-10T14:08:10.621 INFO:tasks.workunit.client.0.vm03.stdout:8/252: creat da/d3a/d44/f4e x:0 0 0
2026-03-10T14:08:10.621 INFO:tasks.workunit.client.0.vm03.stdout:9/272: creat d2/d14/d2b/d43/f5b x:0 0 0
2026-03-10T14:08:10.625 INFO:tasks.workunit.client.0.vm03.stdout:9/273: mkdir d2/d29/d33/d41/d5c 0
2026-03-10T14:08:10.631 INFO:tasks.workunit.client.0.vm03.stdout:8/253: unlink da/d24/c38 0
2026-03-10T14:08:10.631 INFO:tasks.workunit.client.0.vm03.stdout:9/274: mknod d2/d14/d2b/d43/c5d 0
2026-03-10T14:08:10.637 INFO:tasks.workunit.client.0.vm03.stdout:9/275: stat d2/d14/c48 0
2026-03-10T14:08:10.638 INFO:tasks.workunit.client.0.vm03.stdout:9/276: chown d2/f4c 0 1
2026-03-10T14:08:10.640 INFO:tasks.workunit.client.0.vm03.stdout:8/254: dwrite f7 [0,4194304] 0
2026-03-10T14:08:10.646 INFO:tasks.workunit.client.0.vm03.stdout:9/277: dwrite d2/d14/d2b/d43/f5b [0,4194304] 0
2026-03-10T14:08:10.656 INFO:tasks.workunit.client.1.vm04.stdout:5/827: symlink d7/d26/l10a 0
2026-03-10T14:08:10.660 INFO:tasks.workunit.client.1.vm04.stdout:5/828: unlink d7/d12/d2b/d3e/d57/d8a/f94 0
2026-03-10T14:08:10.717 INFO:tasks.workunit.client.1.vm04.stdout:3/766: dwrite da/dc/d35/f50 [0,4194304] 0
2026-03-10T14:08:10.751 INFO:tasks.workunit.client.1.vm04.stdout:6/657: rmdir d3/de/d35/d3f 39
2026-03-10T14:08:10.753 INFO:tasks.workunit.client.1.vm04.stdout:6/658: creat d3/d1d/d73/fc6 x:0 0 0
2026-03-10T14:08:10.758 INFO:tasks.workunit.client.1.vm04.stdout:6/659: readlink d3/de/d35/d3f/d2d/d32/d23/d83/dad/lc5 0
2026-03-10T14:08:10.762 INFO:tasks.workunit.client.1.vm04.stdout:6/660: creat d3/de/d35/d3f/d2d/fc7 x:0 0 0
2026-03-10T14:08:10.765 INFO:tasks.workunit.client.1.vm04.stdout:6/661: rmdir d3/de/d35/d3a/d43/d4c/d5e/d76 39
2026-03-10T14:08:10.769 INFO:tasks.workunit.client.1.vm04.stdout:6/662: mknod d3/de/d35/d3f/d2d/d32/d23/cc8 0
2026-03-10T14:08:10.773 INFO:tasks.workunit.client.1.vm04.stdout:6/663: symlink d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/lc9 0
2026-03-10T14:08:10.774 INFO:tasks.workunit.client.1.vm04.stdout:6/664: write d3/de/d35/d3f/d2d/d32/d23/d24/d6f/fc0 [993896,55362] 0
2026-03-10T14:08:10.778 INFO:tasks.workunit.client.1.vm04.stdout:6/665: fdatasync d3/de/d35/d3f/fa3 0
2026-03-10T14:08:10.782 INFO:tasks.workunit.client.1.vm04.stdout:6/666: symlink d3/de/d35/d3f/d2d/lca 0
2026-03-10T14:08:10.784 INFO:tasks.workunit.client.1.vm04.stdout:6/667: write d3/de/d35/d3f/d2d/d38/d40/fbf [755650,18655] 0
2026-03-10T14:08:10.785 INFO:tasks.workunit.client.1.vm04.stdout:8/848: creat d0/f10b x:0 0 0
2026-03-10T14:08:10.788 INFO:tasks.workunit.client.1.vm04.stdout:6/668: truncate d3/de/d35/d3f/d2d/d32/d23/d24/f36 1020402 0
2026-03-10T14:08:10.790 INFO:tasks.workunit.client.1.vm04.stdout:6/669: fsync d3/f4 0
2026-03-10T14:08:10.811 INFO:tasks.workunit.client.1.vm04.stdout:9/689: rename d9/d58/db5/l2c to d9/da/d8c/leb 0
2026-03-10T14:08:10.812 INFO:tasks.workunit.client.1.vm04.stdout:6/670: dread d3/d1d/d73/fc [0,4194304] 0
2026-03-10T14:08:10.816 INFO:tasks.workunit.client.1.vm04.stdout:9/690: mkdir d9/da/dd/d1c/da3/dec 0
2026-03-10T14:08:10.816 INFO:tasks.workunit.client.1.vm04.stdout:4/718: rename d4/df/db2/db4/d47/cd1 to d4/d14/d3c/d62/cf6 0
2026-03-10T14:08:10.818 INFO:tasks.workunit.client.1.vm04.stdout:9/691: fsync d9/da/dd/d1c/f27 0
2026-03-10T14:08:10.819 INFO:tasks.workunit.client.1.vm04.stdout:4/719: truncate d4/fa 2263017 0
2026-03-10T14:08:10.820 INFO:tasks.workunit.client.1.vm04.stdout:1/765: rename d3/d20/d102 to d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb/d108 0
2026-03-10T14:08:10.821 INFO:tasks.workunit.client.1.vm04.stdout:6/671: getdents d3/de/d35/d3a/d43 0
2026-03-10T14:08:10.823 INFO:tasks.workunit.client.1.vm04.stdout:8/849: rename d0/d3/d63/d29/f104 to d0/d3/dd/d78/f10c 0
2026-03-10T14:08:10.825 INFO:tasks.workunit.client.1.vm04.stdout:1/766: mkdir d3/d22/d2f/d109 0
2026-03-10T14:08:10.828 INFO:tasks.workunit.client.1.vm04.stdout:9/692: creat d9/d44/d4d/fed x:0 0 0
2026-03-10T14:08:10.829 INFO:tasks.workunit.client.1.vm04.stdout:4/720: rename d4/d14/d64/dcb to d4/df7 0
2026-03-10T14:08:10.832 INFO:tasks.workunit.client.1.vm04.stdout:4/721: dread - d4/d14/d3c/d62/fe4 zero size
2026-03-10T14:08:10.832 INFO:tasks.workunit.client.1.vm04.stdout:9/693: creat d9/d44/d70/fee x:0 0 0
2026-03-10T14:08:10.834 INFO:tasks.workunit.client.1.vm04.stdout:1/767: creat d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb/dfb/f10a x:0 0 0
2026-03-10T14:08:10.842 INFO:tasks.workunit.client.1.vm04.stdout:1/768: fdatasync d3/d22/d63/f97 0
2026-03-10T14:08:10.843 INFO:tasks.workunit.client.1.vm04.stdout:9/694: dwrite d9/d33/f4b [0,4194304] 0
2026-03-10T14:08:10.844 INFO:tasks.workunit.client.1.vm04.stdout:9/695: chown d9/da/d5d/c95 1295852 1
2026-03-10T14:08:10.848 INFO:tasks.workunit.client.1.vm04.stdout:9/696: rename d9/d44/d70/f71 to d9/d58/dda/fef 0
2026-03-10T14:08:10.849 INFO:tasks.workunit.client.1.vm04.stdout:1/769: symlink d3/d22/d63/d35/dd9/d13/d38/d58/d5b/l10b 0
2026-03-10T14:08:10.849 INFO:tasks.workunit.client.1.vm04.stdout:9/697: fdatasync d9/d5c/fdc 0
2026-03-10T14:08:10.856 INFO:tasks.workunit.client.1.vm04.stdout:9/698: fsync d9/da/dd/d1c/da3/fb3 0
2026-03-10T14:08:10.856 INFO:tasks.workunit.client.1.vm04.stdout:1/770: write d3/d22/d63/d35/dd9/d13/d38/df3/ff4 [189398,27512] 0
2026-03-10T14:08:10.857 INFO:tasks.workunit.client.1.vm04.stdout:9/699: stat d9/da/dd/d74/fe1 0
2026-03-10T14:08:10.862 INFO:tasks.workunit.client.1.vm04.stdout:9/700: creat d9/d44/d70/ff0 x:0 0 0
2026-03-10T14:08:10.864 INFO:tasks.workunit.client.1.vm04.stdout:1/771: mknod d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb/dfb/c10c 0
2026-03-10T14:08:10.865 INFO:tasks.workunit.client.0.vm03.stdout:1/277: write d0/f24 [692028,4085] 0
2026-03-10T14:08:10.869 INFO:tasks.workunit.client.1.vm04.stdout:9/701: symlink d9/d44/d4d/lf1 0
2026-03-10T14:08:10.878 INFO:tasks.workunit.client.0.vm03.stdout:1/278: symlink d0/d2/df/d27/l5c 0
2026-03-10T14:08:10.878 INFO:tasks.workunit.client.1.vm04.stdout:4/722: dread d4/d14/d3c/d5e/f92 [0,4194304] 0
2026-03-10T14:08:10.878 INFO:tasks.workunit.client.1.vm04.stdout:1/772: rmdir d3/d22/d63/d35/dd9/d13/d38/db5/dc4/dda 0
2026-03-10T14:08:10.878 INFO:tasks.workunit.client.1.vm04.stdout:1/773: dread - d3/d20/d60/ff7 zero size
2026-03-10T14:08:10.881 INFO:tasks.workunit.client.1.vm04.stdout:4/723: dread d4/df/f60 [0,4194304] 0
2026-03-10T14:08:10.882 INFO:tasks.workunit.client.1.vm04.stdout:4/724: fdatasync d4/fa 0
2026-03-10T14:08:10.882 INFO:tasks.workunit.client.0.vm03.stdout:1/279: unlink d0/d2/df/d16/c47 0
2026-03-10T14:08:10.883 INFO:tasks.workunit.client.0.vm03.stdout:1/280: fsync d0/d2/df/d27/f49 0
2026-03-10T14:08:10.883 INFO:tasks.workunit.client.0.vm03.stdout:1/281: dread - d0/d2/df/d27/f52 zero size
2026-03-10T14:08:10.884 INFO:tasks.workunit.client.0.vm03.stdout:7/211: write d5/d9/f17 [1281789,64636] 0
2026-03-10T14:08:10.885 INFO:tasks.workunit.client.0.vm03.stdout:7/212: readlink d5/l25 0
2026-03-10T14:08:10.895 INFO:tasks.workunit.client.0.vm03.stdout:6/242: dwrite d8/db/df/f10 [0,4194304] 0
2026-03-10T14:08:10.901 INFO:tasks.workunit.client.0.vm03.stdout:1/282: symlink d0/d2/df/l5d 0
2026-03-10T14:08:10.905 INFO:tasks.workunit.client.0.vm03.stdout:6/243: rename l4 to d8/db/d2c/d2d/l4c 0
2026-03-10T14:08:10.905 INFO:tasks.workunit.client.0.vm03.stdout:1/283: dwrite d0/d2/df/f1f [4194304,4194304] 0
2026-03-10T14:08:10.907 INFO:tasks.workunit.client.0.vm03.stdout:6/244: chown d8/db/d49 3 1
2026-03-10T14:08:10.930 INFO:tasks.workunit.client.0.vm03.stdout:6/245: link d8/db/d12/c17 d8/d1b/d1c/c4d 0
2026-03-10T14:08:10.934 INFO:tasks.workunit.client.0.vm03.stdout:1/284: getdents d0/d18/d1d 0
2026-03-10T14:08:10.934 INFO:tasks.workunit.client.0.vm03.stdout:6/246: mknod d8/db/d49/c4e 0
2026-03-10T14:08:10.935 INFO:tasks.workunit.client.0.vm03.stdout:6/247: truncate d8/db/d2c/d2d/f46 788646 0
2026-03-10T14:08:10.937 INFO:tasks.workunit.client.0.vm03.stdout:1/285: creat d0/d18/d1d/f5e x:0 0 0
2026-03-10T14:08:10.939 INFO:tasks.workunit.client.0.vm03.stdout:6/248: dwrite d8/db/d2c/d2d/f46 [0,4194304] 0
2026-03-10T14:08:10.945 INFO:tasks.workunit.client.0.vm03.stdout:1/286: dwrite d0/d2/df/f1f [4194304,4194304] 0
2026-03-10T14:08:10.945 INFO:tasks.workunit.client.0.vm03.stdout:6/249: dread - d8/d1b/f3d zero size
2026-03-10T14:08:10.947 INFO:tasks.workunit.client.0.vm03.stdout:1/287: chown d0/d2/f1a 299423 1
2026-03-10T14:08:10.951 INFO:tasks.workunit.client.0.vm03.stdout:1/288: rename d0/f2c to d0/d2/d34/f5f 0
2026-03-10T14:08:10.953 INFO:tasks.workunit.client.0.vm03.stdout:6/250: dread f0 [0,4194304] 0
2026-03-10T14:08:10.953 INFO:tasks.workunit.client.0.vm03.stdout:1/289: creat d0/d18/f60 x:0 0 0
2026-03-10T14:08:10.953 INFO:tasks.workunit.client.0.vm03.stdout:6/251: read f5 [1346506,62633] 0
2026-03-10T14:08:10.954 INFO:tasks.workunit.client.0.vm03.stdout:1/290: write d0/d18/d3b/f53 [676833,62652] 0
2026-03-10T14:08:10.958 INFO:tasks.workunit.client.0.vm03.stdout:6/252: dwrite d8/d1b/f29 [0,4194304] 0
2026-03-10T14:08:10.964 INFO:tasks.workunit.client.0.vm03.stdout:1/291: fsync d0/d2/f14 0
2026-03-10T14:08:10.971 INFO:tasks.workunit.client.0.vm03.stdout:1/292: dread - d0/d2/df/d27/f52 zero size
2026-03-10T14:08:10.971 INFO:tasks.workunit.client.0.vm03.stdout:1/293: chown d0 27984 1
2026-03-10T14:08:10.971 INFO:tasks.workunit.client.0.vm03.stdout:6/253: dwrite d8/d1b/f3d [0,4194304] 0
2026-03-10T14:08:10.975 INFO:tasks.workunit.client.0.vm03.stdout:6/254: creat d8/db/d49/f4f x:0 0 0
2026-03-10T14:08:10.975 INFO:tasks.workunit.client.0.vm03.stdout:1/294: unlink d0/f30 0
2026-03-10T14:08:10.975 INFO:tasks.workunit.client.0.vm03.stdout:6/255: truncate d8/db/d49/f4a 706565 0
2026-03-10T14:08:10.976 INFO:tasks.workunit.client.0.vm03.stdout:6/256: fdatasync d8/db/d12/f40 0
2026-03-10T14:08:10.976 INFO:tasks.workunit.client.0.vm03.stdout:6/257: chown d8/d3b 548405 1
2026-03-10T14:08:10.981 INFO:tasks.workunit.client.0.vm03.stdout:1/295: creat d0/d2/df/d16/f61 x:0 0 0
2026-03-10T14:08:10.991 INFO:tasks.workunit.client.0.vm03.stdout:6/258: rmdir d8 39
2026-03-10T14:08:10.994 INFO:tasks.workunit.client.0.vm03.stdout:6/259: creat d8/d1b/d1c/f50 x:0 0 0
2026-03-10T14:08:10.996 INFO:tasks.workunit.client.0.vm03.stdout:6/260: unlink d8/d3b/c43 0
2026-03-10T14:08:10.996 INFO:tasks.workunit.client.0.vm03.stdout:6/261: write d8/db/d12/f26 [3930901,108396] 0
2026-03-10T14:08:10.998 INFO:tasks.workunit.client.0.vm03.stdout:6/262: readlink d8/l1e 0
2026-03-10T14:08:11.022 INFO:tasks.workunit.client.0.vm03.stdout:0/248: rmdir d3/d11/d25/d30 39
2026-03-10T14:08:11.027 INFO:tasks.workunit.client.0.vm03.stdout:0/249: dwrite d3/d16/f34 [0,4194304] 0
2026-03-10T14:08:11.028 INFO:tasks.workunit.client.0.vm03.stdout:0/250: stat d3/d16/c42 0
2026-03-10T14:08:11.030 INFO:tasks.workunit.client.0.vm03.stdout:0/251: chown d3/d11/l23 4184 1
2026-03-10T14:08:11.036 INFO:tasks.workunit.client.0.vm03.stdout:0/252: dwrite d3/d11/d2c/d4a/f4c [0,4194304] 0
2026-03-10T14:08:11.037 INFO:tasks.workunit.client.0.vm03.stdout:0/253: readlink d3/l5 0
2026-03-10T14:08:11.045 INFO:tasks.workunit.client.0.vm03.stdout:0/254: dwrite d3/d16/f34 [0,4194304] 0
2026-03-10T14:08:11.053 INFO:tasks.workunit.client.0.vm03.stdout:0/255: read d3/d11/d25/f22 [3323790,81045] 0
2026-03-10T14:08:11.128 INFO:tasks.workunit.client.0.vm03.stdout:4/320: truncate d5/d9/f16 4242295 0
2026-03-10T14:08:11.140 INFO:tasks.workunit.client.0.vm03.stdout:4/321: rmdir d5/d9/db/d19 39
2026-03-10T14:08:11.146 INFO:tasks.workunit.client.0.vm03.stdout:4/322: unlink d5/d9/db/d19/d38/f56 0
2026-03-10T14:08:11.150 INFO:tasks.workunit.client.0.vm03.stdout:4/323: creat d5/d9/db/d19/d38/d53/d55/f61 x:0 0 0
2026-03-10T14:08:11.150 INFO:tasks.workunit.client.0.vm03.stdout:4/324: stat d5/d9/db/f20 0
2026-03-10T14:08:11.151 INFO:tasks.workunit.client.0.vm03.stdout:4/325: chown d5/d9/db/d19/d34/f5c 26 1
2026-03-10T14:08:11.155 INFO:tasks.workunit.client.0.vm03.stdout:4/326: dwrite d5/d9/db/f24 [0,4194304] 0
2026-03-10T14:08:11.165 INFO:tasks.workunit.client.1.vm04.stdout:2/727: truncate d0/d14/d91/d4a/d8c/fac 4724107 0
2026-03-10T14:08:11.167 INFO:tasks.workunit.client.0.vm03.stdout:4/327: mkdir d5/d47/d62 0
2026-03-10T14:08:11.172 INFO:tasks.workunit.client.0.vm03.stdout:4/328: creat d5/d9/d2b/f63 x:0 0 0
2026-03-10T14:08:11.188 INFO:tasks.workunit.client.0.vm03.stdout:4/329: dread f3 [4194304,4194304] 0
2026-03-10T14:08:11.191 INFO:tasks.workunit.client.0.vm03.stdout:4/330: mkdir d5/d47/d5b/d64 0
2026-03-10T14:08:11.219 INFO:tasks.workunit.client.1.vm04.stdout:0/773: write d0/d2/d15/d22/d38/fc6 [894170,17321] 0
2026-03-10T14:08:11.223 INFO:tasks.workunit.client.1.vm04.stdout:0/774: mkdir d0/d2/d15/d22/d38/d56/dcb/dce/df5 0
2026-03-10T14:08:11.252 INFO:tasks.workunit.client.0.vm03.stdout:2/261: dwrite d5/d10/d1c/f3c [0,4194304] 0
2026-03-10T14:08:11.269 INFO:tasks.workunit.client.0.vm03.stdout:5/354: dwrite d4/d6/de/f14 [0,4194304] 0
2026-03-10T14:08:11.309 INFO:tasks.workunit.client.0.vm03.stdout:8/255: getdents da/d24 0
2026-03-10T14:08:11.310 INFO:tasks.workunit.client.0.vm03.stdout:9/278: rmdir d2/d14/d2b/d43 39
2026-03-10T14:08:11.313 INFO:tasks.workunit.client.0.vm03.stdout:8/256: creat da/d3c/f4f x:0 0 0
2026-03-10T14:08:11.314 INFO:tasks.workunit.client.0.vm03.stdout:8/257: mkdir da/d36/d40/d50 0
2026-03-10T14:08:11.317 INFO:tasks.workunit.client.0.vm03.stdout:8/258: mkdir da/d3c/d51 0
2026-03-10T14:08:11.318 INFO:tasks.workunit.client.0.vm03.stdout:8/259: stat da/d36/f42 0
2026-03-10T14:08:11.322 INFO:tasks.workunit.client.0.vm03.stdout:9/279: dwrite d2/d14/d2b/d43/f45 [0,4194304] 0
2026-03-10T14:08:11.327 INFO:tasks.workunit.client.0.vm03.stdout:9/280: write d2/f4c [106683,31050] 0
2026-03-10T14:08:11.332 INFO:tasks.workunit.client.0.vm03.stdout:8/260: rename da/f37 to da/d24/f52 0
2026-03-10T14:08:11.340 INFO:tasks.workunit.client.1.vm04.stdout:5/829: dwrite d7/d12/d2b/d8c/fe5 [0,4194304] 0
2026-03-10T14:08:11.342 INFO:tasks.workunit.client.0.vm03.stdout:9/281: symlink d2/d29/d33/d41/l5e 0
2026-03-10T14:08:11.342 INFO:tasks.workunit.client.0.vm03.stdout:8/261: fsync da/d24/f3d 0
2026-03-10T14:08:11.348 INFO:tasks.workunit.client.0.vm03.stdout:9/282: dread d2/f1e [0,4194304] 0
2026-03-10T14:08:11.348 INFO:tasks.workunit.client.1.vm04.stdout:5/830: creat d7/d9/db5/f10b x:0 0 0
2026-03-10T14:08:11.350 INFO:tasks.workunit.client.0.vm03.stdout:8/262: read da/fd [1297833,35596] 0
2026-03-10T14:08:11.354 INFO:tasks.workunit.client.1.vm04.stdout:5/831: truncate d7/d59/fad 620103 0
2026-03-10T14:08:11.355 INFO:tasks.workunit.client.0.vm03.stdout:9/283: getdents d2 0
2026-03-10T14:08:11.359 INFO:tasks.workunit.client.1.vm04.stdout:5/832: mkdir d7/d12/d2b/d8c/d10c 0
2026-03-10T14:08:11.360 INFO:tasks.workunit.client.0.vm03.stdout:9/284: creat d2/d29/d33/d55/f5f x:0 0 0
2026-03-10T14:08:11.363 INFO:tasks.workunit.client.0.vm03.stdout:9/285: mkdir d2/d29/d33/d60 0
2026-03-10T14:08:11.368 INFO:tasks.workunit.client.0.vm03.stdout:9/286: creat d2/d14/f61 x:0 0 0
2026-03-10T14:08:11.372 INFO:tasks.workunit.client.1.vm04.stdout:5/833: getdents d7/d59/d7d/d9a 0
2026-03-10T14:08:11.383 INFO:tasks.workunit.client.1.vm04.stdout:7/778: mknod d2/c114 0
2026-03-10T14:08:11.383 INFO:tasks.workunit.client.1.vm04.stdout:5/834: symlink d7/d12/d2b/d93/d9e/l10d 0
2026-03-10T14:08:11.383 INFO:tasks.workunit.client.0.vm03.stdout:9/287: creat d2/d14/d2b/f62 x:0 0 0
2026-03-10T14:08:11.385 INFO:tasks.workunit.client.1.vm04.stdout:5/835: dread - d7/d59/d7d/fb9 zero size
2026-03-10T14:08:11.386 INFO:tasks.workunit.client.0.vm03.stdout:9/288: write d2/d29/d38/f4b [86816,16005] 0
2026-03-10T14:08:11.386 INFO:tasks.workunit.client.1.vm04.stdout:7/779: mkdir d2/dac/d115 0
2026-03-10T14:08:11.386 INFO:tasks.workunit.client.0.vm03.stdout:9/289: stat d2/d29/d38/f47 0
2026-03-10T14:08:11.386 INFO:tasks.workunit.client.0.vm03.stdout:9/290: chown d2/d14/d2b 281652 1
2026-03-10T14:08:11.387 INFO:tasks.workunit.client.0.vm03.stdout:9/291: stat d2/d29/d33/d41/d46 0
2026-03-10T14:08:11.403 INFO:tasks.workunit.client.1.vm04.stdout:3/767: write da/dc/d35/d37/fec [115984,112810] 0
2026-03-10T14:08:11.405 INFO:tasks.workunit.client.1.vm04.stdout:3/768: creat da/dc/d35/f107 x:0 0 0
2026-03-10T14:08:11.405 INFO:tasks.workunit.client.1.vm04.stdout:3/769: read da/dc/d35/d52/f79 [3831524,91372] 0
2026-03-10T14:08:11.406 INFO:tasks.workunit.client.0.vm03.stdout:9/292: mknod d2/d14/d2b/d34/c63 0
2026-03-10T14:08:11.407 INFO:tasks.workunit.client.1.vm04.stdout:5/836: rmdir d7/d2d/d32/d34/d105 0
2026-03-10T14:08:11.408 INFO:tasks.workunit.client.1.vm04.stdout:3/770: chown da/dc/d35/d52/d70/c88 824318 1
2026-03-10T14:08:11.415 INFO:tasks.workunit.client.1.vm04.stdout:3/771: rename da/dc/d47/fc8 to da/dc/d47/d9b/d106/f108 0
2026-03-10T14:08:11.435 INFO:tasks.workunit.client.0.vm03.stdout:9/293: rename d2/d14/d2b/f62 to d2/d14/f64 0
2026-03-10T14:08:11.435 INFO:tasks.workunit.client.0.vm03.stdout:9/294: creat d2/d29/d33/d60/f65 x:0 0 0
2026-03-10T14:08:11.435 INFO:tasks.workunit.client.1.vm04.stdout:3/772: getdents da/dc/d35/d52/d53/d78 0
2026-03-10T14:08:11.435 INFO:tasks.workunit.client.1.vm04.stdout:3/773: creat da/dc/d3f/d61/df7/f109 x:0 0 0
2026-03-10T14:08:11.469 INFO:tasks.workunit.client.1.vm04.stdout:7/780: sync
2026-03-10T14:08:11.472 INFO:tasks.workunit.client.1.vm04.stdout:7/781: fdatasync d2/dc/de/d2d/d60/d7c/d64/f6a 0
2026-03-10T14:08:11.477 INFO:tasks.workunit.client.1.vm04.stdout:7/782: dread d2/dc/de/d11/f19 [0,4194304] 0
2026-03-10T14:08:11.479 INFO:tasks.workunit.client.1.vm04.stdout:7/783: truncate d2/dc/de/d2d/d60/ff2 183985 0 2026-03-10T14:08:11.481 INFO:tasks.workunit.client.1.vm04.stdout:7/784: rename d2/dc/de/d2d/d60/d7c/lb7 to d2/dc/de/d2d/d60/d7c/d64/l116 0 2026-03-10T14:08:11.568 INFO:tasks.workunit.client.0.vm03.stdout:9/295: sync 2026-03-10T14:08:11.572 INFO:tasks.workunit.client.0.vm03.stdout:9/296: write d2/d29/d38/f47 [728370,103043] 0 2026-03-10T14:08:11.576 INFO:tasks.workunit.client.1.vm04.stdout:6/672: write d3/de/d35/d3a/d43/f8a [859403,44550] 0 2026-03-10T14:08:11.576 INFO:tasks.workunit.client.1.vm04.stdout:8/850: write d0/d3/dd/d78/fd2 [225134,39941] 0 2026-03-10T14:08:11.577 INFO:tasks.workunit.client.1.vm04.stdout:6/673: chown d3/de/d35/d3a/d43/f8a 6 1 2026-03-10T14:08:11.577 INFO:tasks.workunit.client.1.vm04.stdout:8/851: readlink d0/d3/d63/d12/d69/l7a 0 2026-03-10T14:08:11.581 INFO:tasks.workunit.client.1.vm04.stdout:8/852: dwrite d0/d3/d63/d12/d51/d67/d96/dc8/dcf/fda [0,4194304] 0 2026-03-10T14:08:11.600 INFO:tasks.workunit.client.0.vm03.stdout:9/297: dread d2/f15 [0,4194304] 0 2026-03-10T14:08:11.600 INFO:tasks.workunit.client.1.vm04.stdout:4/725: getdents d4 0 2026-03-10T14:08:11.605 INFO:tasks.workunit.client.0.vm03.stdout:9/298: rename d2/d14/d2b/d43/c5d to d2/d29/d38/c66 0 2026-03-10T14:08:11.606 INFO:tasks.workunit.client.1.vm04.stdout:4/726: creat d4/df/d34/d6f/ff8 x:0 0 0 2026-03-10T14:08:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:11 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:08:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:11 vm03.local ceph-mon[49718]: mgrmap e25: vm04.ywwcto(active, since 5s), standbys: vm03.rwbbep 2026-03-10T14:08:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:11 vm03.local ceph-mon[49718]: from='mgr.24413 ' 
entity='mgr.vm04.ywwcto' 2026-03-10T14:08:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:11 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:11.608 INFO:tasks.workunit.client.0.vm03.stdout:9/299: mkdir d2/d67 0 2026-03-10T14:08:11.608 INFO:tasks.workunit.client.1.vm04.stdout:4/727: mkdir d4/df/db2/db4/d47/d4f/df9 0 2026-03-10T14:08:11.611 INFO:tasks.workunit.client.1.vm04.stdout:4/728: fsync d4/df/d34/f7c 0 2026-03-10T14:08:11.612 INFO:tasks.workunit.client.1.vm04.stdout:4/729: write d4/d14/ff2 [536430,98318] 0 2026-03-10T14:08:11.612 INFO:tasks.workunit.client.0.vm03.stdout:9/300: write d2/f2c [1673838,88218] 0 2026-03-10T14:08:11.618 INFO:tasks.workunit.client.1.vm04.stdout:4/730: rmdir d4/df/db2/db4/d47 39 2026-03-10T14:08:11.619 INFO:tasks.workunit.client.1.vm04.stdout:4/731: write d4/fe [4625501,28750] 0 2026-03-10T14:08:11.623 INFO:tasks.workunit.client.1.vm04.stdout:8/853: dread d0/d3/d63/d12/d51/f4f [0,4194304] 0 2026-03-10T14:08:11.627 INFO:tasks.workunit.client.0.vm03.stdout:9/301: truncate d2/d14/f25 490072 0 2026-03-10T14:08:11.628 INFO:tasks.workunit.client.1.vm04.stdout:8/854: creat d0/d3/d5/f10d x:0 0 0 2026-03-10T14:08:11.647 INFO:tasks.workunit.client.0.vm03.stdout:7/213: truncate d5/f32 46708 0 2026-03-10T14:08:11.647 INFO:tasks.workunit.client.1.vm04.stdout:1/774: write d3/d22/d63/d35/f9a [761488,112605] 0 2026-03-10T14:08:11.649 INFO:tasks.workunit.client.1.vm04.stdout:9/702: write d9/d58/db5/fc6 [956272,87552] 0 2026-03-10T14:08:11.651 INFO:tasks.workunit.client.1.vm04.stdout:1/775: write d3/d5c/f71 [3647612,2056] 0 2026-03-10T14:08:11.654 INFO:tasks.workunit.client.1.vm04.stdout:9/703: write d9/d5c/fb6 [4171477,35802] 0 2026-03-10T14:08:11.664 INFO:tasks.workunit.client.1.vm04.stdout:1/776: creat d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb/d108/f10d x:0 0 0 2026-03-10T14:08:11.669 INFO:tasks.workunit.client.0.vm03.stdout:1/296: dwrite d0/f11 [0,4194304] 0 2026-03-10T14:08:11.669 
INFO:tasks.workunit.client.1.vm04.stdout:9/704: unlink d9/da/dd/f31 0 2026-03-10T14:08:11.681 INFO:tasks.workunit.client.0.vm03.stdout:6/263: dwrite d8/d11/d18/f21 [0,4194304] 0 2026-03-10T14:08:11.688 INFO:tasks.workunit.client.0.vm03.stdout:1/297: mknod d0/d2/df/d16/d41/c62 0 2026-03-10T14:08:11.695 INFO:tasks.workunit.client.1.vm04.stdout:9/705: fsync d9/da/dd/de7/d96/f9c 0 2026-03-10T14:08:11.696 INFO:tasks.workunit.client.1.vm04.stdout:2/728: dwrite d0/d14/d91/d3a/fb5 [0,4194304] 0 2026-03-10T14:08:11.697 INFO:tasks.workunit.client.0.vm03.stdout:0/256: dwrite d3/f1e [0,4194304] 0 2026-03-10T14:08:11.701 INFO:tasks.workunit.client.0.vm03.stdout:3/216: write d1d/f26 [1138311,27709] 0 2026-03-10T14:08:11.702 INFO:tasks.workunit.client.0.vm03.stdout:4/331: write d5/d9/db/f3d [203435,111331] 0 2026-03-10T14:08:11.714 INFO:tasks.workunit.client.1.vm04.stdout:0/775: dwrite d0/d2/d15/d22/d38/fe6 [0,4194304] 0 2026-03-10T14:08:11.714 INFO:tasks.workunit.client.0.vm03.stdout:2/262: write d5/fa [1747597,110583] 0 2026-03-10T14:08:11.714 INFO:tasks.workunit.client.0.vm03.stdout:5/355: dwrite d4/f17 [0,4194304] 0 2026-03-10T14:08:11.724 INFO:tasks.workunit.client.0.vm03.stdout:1/298: rmdir d0/d18/d3b 39 2026-03-10T14:08:11.740 INFO:tasks.workunit.client.0.vm03.stdout:1/299: read d0/f24 [353631,22305] 0 2026-03-10T14:08:11.753 INFO:tasks.workunit.client.1.vm04.stdout:2/729: mknod d0/d14/d91/d4a/d8c/dab/d46/dc8/cdd 0 2026-03-10T14:08:11.759 INFO:tasks.workunit.client.0.vm03.stdout:3/217: fsync f1b 0 2026-03-10T14:08:11.759 INFO:tasks.workunit.client.0.vm03.stdout:8/263: dwrite da/f31 [0,4194304] 0 2026-03-10T14:08:11.760 INFO:tasks.workunit.client.0.vm03.stdout:0/257: write d3/d11/d25/d30/f45 [645189,53589] 0 2026-03-10T14:08:11.761 INFO:tasks.workunit.client.1.vm04.stdout:3/774: getdents da/dc/d35 0 2026-03-10T14:08:11.768 INFO:tasks.workunit.client.0.vm03.stdout:3/218: dwrite d1d/f31 [0,4194304] 0 2026-03-10T14:08:11.775 INFO:tasks.workunit.client.0.vm03.stdout:5/356: 
creat d4/d16/f7b x:0 0 0 2026-03-10T14:08:11.775 INFO:tasks.workunit.client.0.vm03.stdout:0/258: dread d3/f1e [0,4194304] 0 2026-03-10T14:08:11.782 INFO:tasks.workunit.client.1.vm04.stdout:5/837: dwrite d7/f21 [0,4194304] 0 2026-03-10T14:08:11.787 INFO:tasks.workunit.client.0.vm03.stdout:1/300: dread d0/d2/df/f1b [0,4194304] 0 2026-03-10T14:08:11.810 INFO:tasks.workunit.client.0.vm03.stdout:7/214: symlink d5/d9/d14/d26/d36/l3d 0 2026-03-10T14:08:11.811 INFO:tasks.workunit.client.0.vm03.stdout:7/215: dread - d5/d9/d14/d26/f38 zero size 2026-03-10T14:08:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:11 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:08:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:11 vm04.local ceph-mon[55966]: mgrmap e25: vm04.ywwcto(active, since 5s), standbys: vm03.rwbbep 2026-03-10T14:08:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:11 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:11.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:11 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:11.814 INFO:tasks.workunit.client.1.vm04.stdout:7/785: write d2/dc/f8d [3173951,126956] 0 2026-03-10T14:08:11.818 INFO:tasks.workunit.client.1.vm04.stdout:7/786: read - d2/dc/de/d2d/d38/d50/dc8/f7b zero size 2026-03-10T14:08:11.839 INFO:tasks.workunit.client.1.vm04.stdout:6/674: dwrite d3/de/d35/d3f/f22 [0,4194304] 0 2026-03-10T14:08:11.881 INFO:tasks.workunit.client.0.vm03.stdout:5/357: symlink d4/d6/de/l7c 0 2026-03-10T14:08:11.881 INFO:tasks.workunit.client.0.vm03.stdout:5/358: readlink d4/d6/l2f 0 2026-03-10T14:08:11.881 INFO:tasks.workunit.client.0.vm03.stdout:5/359: read - d4/d6/f6d zero size 2026-03-10T14:08:11.882 INFO:tasks.workunit.client.0.vm03.stdout:5/360: chown d4/d6/c3e 
6480 1 2026-03-10T14:08:11.884 INFO:tasks.workunit.client.0.vm03.stdout:9/302: truncate d2/d29/d33/f3c 3467284 0 2026-03-10T14:08:11.895 INFO:tasks.workunit.client.1.vm04.stdout:4/732: dwrite d4/d14/d1b/f9d [0,4194304] 0 2026-03-10T14:08:11.895 INFO:tasks.workunit.client.0.vm03.stdout:4/332: creat d5/d47/f65 x:0 0 0 2026-03-10T14:08:11.895 INFO:tasks.workunit.client.0.vm03.stdout:4/333: chown d5/d9/f11 517481 1 2026-03-10T14:08:11.895 INFO:tasks.workunit.client.0.vm03.stdout:8/264: mknod da/d3c/d51/c53 0 2026-03-10T14:08:11.895 INFO:tasks.workunit.client.0.vm03.stdout:8/265: stat da/d3c 0 2026-03-10T14:08:11.895 INFO:tasks.workunit.client.0.vm03.stdout:6/264: dwrite d8/d11/f2a [0,4194304] 0 2026-03-10T14:08:11.902 INFO:tasks.workunit.client.1.vm04.stdout:1/777: dwrite d3/d22/d63/d35/dd9/fd [0,4194304] 0 2026-03-10T14:08:11.904 INFO:tasks.workunit.client.1.vm04.stdout:8/855: dwrite d0/d3/d63/d12/d51/f97 [0,4194304] 0 2026-03-10T14:08:11.904 INFO:tasks.workunit.client.0.vm03.stdout:2/263: link d5/d10/f13 d5/d10/d31/f4e 0 2026-03-10T14:08:11.959 INFO:tasks.workunit.client.0.vm03.stdout:4/334: mknod d5/d9/d2b/c66 0 2026-03-10T14:08:11.963 INFO:tasks.workunit.client.0.vm03.stdout:8/266: mknod da/d3c/d4b/c54 0 2026-03-10T14:08:11.966 INFO:tasks.workunit.client.0.vm03.stdout:4/335: dwrite d5/d9/db/f29 [0,4194304] 0 2026-03-10T14:08:11.974 INFO:tasks.workunit.client.0.vm03.stdout:6/265: unlink d8/db/d49/f4f 0 2026-03-10T14:08:11.978 INFO:tasks.workunit.client.1.vm04.stdout:0/776: chown d0/c23 3 1 2026-03-10T14:08:11.978 INFO:tasks.workunit.client.0.vm03.stdout:2/264: mkdir d5/d10/d1f/d4f 0 2026-03-10T14:08:11.989 INFO:tasks.workunit.client.1.vm04.stdout:3/775: readlink da/dc/lf1 0 2026-03-10T14:08:11.992 INFO:tasks.workunit.client.1.vm04.stdout:2/730: dread d0/d14/d54/f9c [0,4194304] 0 2026-03-10T14:08:11.995 INFO:tasks.workunit.client.0.vm03.stdout:2/265: dwrite f1 [0,4194304] 0 2026-03-10T14:08:11.999 INFO:tasks.workunit.client.1.vm04.stdout:2/731: dwrite d0/d14/d39/fa2 
[0,4194304] 0 2026-03-10T14:08:12.000 INFO:tasks.workunit.client.1.vm04.stdout:3/776: dwrite da/ded/ffb [0,4194304] 0 2026-03-10T14:08:12.000 INFO:tasks.workunit.client.0.vm03.stdout:2/266: dread d5/d10/d1c/f3c [0,4194304] 0 2026-03-10T14:08:12.004 INFO:tasks.workunit.client.0.vm03.stdout:6/266: dread d8/db/d2c/d2d/d32/f3e [0,4194304] 0 2026-03-10T14:08:12.008 INFO:tasks.workunit.client.1.vm04.stdout:2/732: dread - d0/d14/d91/d8/d17/d4e/d85/f88 zero size 2026-03-10T14:08:12.009 INFO:tasks.workunit.client.0.vm03.stdout:6/267: dwrite d8/db/d2c/d2d/d32/f4b [0,4194304] 0 2026-03-10T14:08:12.022 INFO:tasks.workunit.client.0.vm03.stdout:9/303: link d2/d14/d2b/d34/f59 d2/d14/d2b/f68 0 2026-03-10T14:08:12.023 INFO:tasks.workunit.client.1.vm04.stdout:4/733: fdatasync d4/d14/d64/fd6 0 2026-03-10T14:08:12.023 INFO:tasks.workunit.client.0.vm03.stdout:9/304: fsync d2/d29/d38/f51 0 2026-03-10T14:08:12.025 INFO:tasks.workunit.client.0.vm03.stdout:1/301: link d0/d18/l5b d0/l63 0 2026-03-10T14:08:12.026 INFO:tasks.workunit.client.0.vm03.stdout:1/302: write d0/f11 [3681351,130849] 0 2026-03-10T14:08:12.027 INFO:tasks.workunit.client.0.vm03.stdout:1/303: chown d0/d2/df/d16/f61 7 1 2026-03-10T14:08:12.050 INFO:tasks.workunit.client.1.vm04.stdout:8/856: creat d0/d3/d73/f10e x:0 0 0 2026-03-10T14:08:12.066 INFO:tasks.workunit.client.0.vm03.stdout:6/268: mkdir d8/db/d12/d51 0 2026-03-10T14:08:12.067 INFO:tasks.workunit.client.1.vm04.stdout:0/777: truncate d0/d2/d15/d22/d38/f7d 401879 0 2026-03-10T14:08:12.068 INFO:tasks.workunit.client.0.vm03.stdout:8/267: mknod da/d36/d40/d50/c55 0 2026-03-10T14:08:12.069 INFO:tasks.workunit.client.0.vm03.stdout:8/268: chown da/d3c/d4b/d4c 1 1 2026-03-10T14:08:12.069 INFO:tasks.workunit.client.1.vm04.stdout:9/706: link d9/da/dd/de7/fd2 d9/d44/d4d/ff2 0 2026-03-10T14:08:12.070 INFO:tasks.workunit.client.1.vm04.stdout:9/707: truncate d9/da/dd/f48 4935192 0 2026-03-10T14:08:12.070 INFO:tasks.workunit.client.0.vm03.stdout:6/269: dread d8/d1b/f3d [0,4194304] 
0 2026-03-10T14:08:12.072 INFO:tasks.workunit.client.0.vm03.stdout:8/269: dread da/f31 [0,4194304] 0 2026-03-10T14:08:12.076 INFO:tasks.workunit.client.1.vm04.stdout:7/787: dread d2/dc/de/d2d/d60/d7c/d3b/f48 [0,4194304] 0 2026-03-10T14:08:12.080 INFO:tasks.workunit.client.1.vm04.stdout:0/778: creat d0/d2/d15/d22/d38/d56/dcb/ff6 x:0 0 0 2026-03-10T14:08:12.085 INFO:tasks.workunit.client.1.vm04.stdout:6/675: link d3/f4 d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fcb 0 2026-03-10T14:08:12.085 INFO:tasks.workunit.client.1.vm04.stdout:7/788: rmdir d2/d2a/d42/d86/df4 39 2026-03-10T14:08:12.086 INFO:tasks.workunit.client.0.vm03.stdout:4/336: creat d5/d9/db/f67 x:0 0 0 2026-03-10T14:08:12.086 INFO:tasks.workunit.client.0.vm03.stdout:6/270: mknod d8/db/d2c/d2d/d32/c52 0 2026-03-10T14:08:12.086 INFO:tasks.workunit.client.0.vm03.stdout:6/271: write d8/db/d12/f26 [4654695,87610] 0 2026-03-10T14:08:12.087 INFO:tasks.workunit.client.1.vm04.stdout:4/734: sync 2026-03-10T14:08:12.088 INFO:tasks.workunit.client.0.vm03.stdout:6/272: chown d8/db/d2c/d2d/d32/f41 65 1 2026-03-10T14:08:12.096 INFO:tasks.workunit.client.0.vm03.stdout:6/273: dwrite f0 [0,4194304] 0 2026-03-10T14:08:12.103 INFO:tasks.workunit.client.0.vm03.stdout:8/270: mkdir da/d36/d40/d56 0 2026-03-10T14:08:12.108 INFO:tasks.workunit.client.1.vm04.stdout:6/676: dread - d3/de/fb9 zero size 2026-03-10T14:08:12.108 INFO:tasks.workunit.client.0.vm03.stdout:8/271: chown da/d36/d40 1697073 1 2026-03-10T14:08:12.108 INFO:tasks.workunit.client.0.vm03.stdout:8/272: stat da/d3c/f48 0 2026-03-10T14:08:12.117 INFO:tasks.workunit.client.1.vm04.stdout:7/789: fsync d2/dc/de/d2d/d60/d7c/f58 0 2026-03-10T14:08:12.123 INFO:tasks.workunit.client.1.vm04.stdout:0/779: creat d0/d2/d15/d49/d50/d5c/dd8/deb/ff7 x:0 0 0 2026-03-10T14:08:12.124 INFO:tasks.workunit.client.1.vm04.stdout:0/780: truncate d0/d2/d25/f64 4609863 0 2026-03-10T14:08:12.143 INFO:tasks.workunit.client.0.vm03.stdout:6/274: read - d8/d11/f35 zero size 2026-03-10T14:08:12.165 
INFO:tasks.workunit.client.1.vm04.stdout:7/790: rename d2/dc/d4d/dcd/dc6 to d2/dc/de/d2d/d60/d7c/d64/d108/d117 0 2026-03-10T14:08:12.165 INFO:tasks.workunit.client.1.vm04.stdout:7/791: readlink d2/d2a/d42/l72 0 2026-03-10T14:08:12.171 INFO:tasks.workunit.client.1.vm04.stdout:0/781: rmdir d0/d2/d15/d22/d38 39 2026-03-10T14:08:12.172 INFO:tasks.workunit.client.1.vm04.stdout:0/782: dread - d0/dee/ff0 zero size 2026-03-10T14:08:12.178 INFO:tasks.workunit.client.0.vm03.stdout:3/219: truncate d1d/f2c 383352 0 2026-03-10T14:08:12.183 INFO:tasks.workunit.client.0.vm03.stdout:7/216: dread d5/d9/d14/d26/f27 [0,4194304] 0 2026-03-10T14:08:12.183 INFO:tasks.workunit.client.1.vm04.stdout:2/733: dread d0/d14/d91/d4a/d8c/fac [0,4194304] 0 2026-03-10T14:08:12.185 INFO:tasks.workunit.client.0.vm03.stdout:5/361: write d4/d13/f4b [3508038,75485] 0 2026-03-10T14:08:12.187 INFO:tasks.workunit.client.0.vm03.stdout:3/220: symlink d1d/d33/l3e 0 2026-03-10T14:08:12.188 INFO:tasks.workunit.client.0.vm03.stdout:5/362: readlink d4/d6/de/l7c 0 2026-03-10T14:08:12.188 INFO:tasks.workunit.client.0.vm03.stdout:5/363: dread - d4/d16/d19/d23/d3f/f7a zero size 2026-03-10T14:08:12.195 INFO:tasks.workunit.client.1.vm04.stdout:7/792: getdents d2/dc/de/d2d/d60/d7c/d36/d103 0 2026-03-10T14:08:12.199 INFO:tasks.workunit.client.0.vm03.stdout:3/221: creat d1d/d29/f3f x:0 0 0 2026-03-10T14:08:12.203 INFO:tasks.workunit.client.0.vm03.stdout:3/222: dwrite d1d/d29/f2e [0,4194304] 0 2026-03-10T14:08:12.204 INFO:tasks.workunit.client.0.vm03.stdout:7/217: mkdir d5/d9/d3e 0 2026-03-10T14:08:12.212 INFO:tasks.workunit.client.1.vm04.stdout:7/793: fdatasync d2/dc/de/d2d/d38/d50/dc8/f7b 0 2026-03-10T14:08:12.214 INFO:tasks.workunit.client.1.vm04.stdout:2/734: dread d0/d14/d1b/f55 [0,4194304] 0 2026-03-10T14:08:12.216 INFO:tasks.workunit.client.1.vm04.stdout:2/735: fdatasync d0/d14/f6b 0 2026-03-10T14:08:12.218 INFO:tasks.workunit.client.1.vm04.stdout:0/783: dread d0/d2/d15/d22/d38/d56/d66/fba [0,4194304] 0 
2026-03-10T14:08:12.221 INFO:tasks.workunit.client.0.vm03.stdout:3/223: mkdir d1d/d40 0 2026-03-10T14:08:12.221 INFO:tasks.workunit.client.0.vm03.stdout:3/224: stat fb 0 2026-03-10T14:08:12.221 INFO:tasks.workunit.client.1.vm04.stdout:7/794: creat d2/dc/de/d2d/d60/f118 x:0 0 0 2026-03-10T14:08:12.224 INFO:tasks.workunit.client.0.vm03.stdout:3/225: mkdir d1d/d29/d41 0 2026-03-10T14:08:12.224 INFO:tasks.workunit.client.0.vm03.stdout:0/259: rename d3/d11/d25 to d3/d4d 0 2026-03-10T14:08:12.227 INFO:tasks.workunit.client.0.vm03.stdout:1/304: rename d0 to d0/d18/d3b/d50/d64 22 2026-03-10T14:08:12.227 INFO:tasks.workunit.client.0.vm03.stdout:1/305: stat d0/d18/d1d/l40 0 2026-03-10T14:08:12.228 INFO:tasks.workunit.client.0.vm03.stdout:1/306: truncate d0/d2/df/f1f 9154297 0 2026-03-10T14:08:12.229 INFO:tasks.workunit.client.1.vm04.stdout:5/838: dwrite d7/d2d/d32/f9d [4194304,4194304] 0 2026-03-10T14:08:12.229 INFO:tasks.workunit.client.0.vm03.stdout:1/307: chown d0/d2/d34/f3e 483220885 1 2026-03-10T14:08:12.232 INFO:tasks.workunit.client.1.vm04.stdout:0/784: mknod d0/d6e/cf8 0 2026-03-10T14:08:12.238 INFO:tasks.workunit.client.1.vm04.stdout:2/736: mkdir d0/d14/d39/d47/d93/dde 0 2026-03-10T14:08:12.238 INFO:tasks.workunit.client.1.vm04.stdout:2/737: dread - d0/d14/d91/d8/d17/d4e/d85/f88 zero size 2026-03-10T14:08:12.247 INFO:tasks.workunit.client.0.vm03.stdout:8/273: rename l3 to da/d24/d49/l57 0 2026-03-10T14:08:12.247 INFO:tasks.workunit.client.0.vm03.stdout:3/226: creat d1d/d39/f42 x:0 0 0 2026-03-10T14:08:12.250 INFO:tasks.workunit.client.0.vm03.stdout:8/274: dread - da/d24/f52 zero size 2026-03-10T14:08:12.265 INFO:tasks.workunit.client.0.vm03.stdout:0/260: sync 2026-03-10T14:08:12.268 INFO:tasks.workunit.client.0.vm03.stdout:5/364: rename l1 to d4/d16/d19/d67/l7d 0 2026-03-10T14:08:12.269 INFO:tasks.workunit.client.0.vm03.stdout:1/308: rename d0/d2/df/d16/d20 to d0/d2/df/d16/d20/d65 22 2026-03-10T14:08:12.270 INFO:tasks.workunit.client.0.vm03.stdout:1/309: write 
d0/d2/df/d16/f61 [831047,33414] 0 2026-03-10T14:08:12.277 INFO:tasks.workunit.client.0.vm03.stdout:1/310: dwrite d0/d18/d3b/f53 [0,4194304] 0 2026-03-10T14:08:12.302 INFO:tasks.workunit.client.1.vm04.stdout:5/839: symlink d7/d12/d2b/d3e/l10e 0 2026-03-10T14:08:12.306 INFO:tasks.workunit.client.0.vm03.stdout:2/267: write d5/d10/f13 [214926,76535] 0 2026-03-10T14:08:12.307 INFO:tasks.workunit.client.1.vm04.stdout:3/777: dwrite da/dc/d35/f5b [0,4194304] 0 2026-03-10T14:08:12.307 INFO:tasks.workunit.client.1.vm04.stdout:1/778: truncate d3/d22/fe1 411918 0 2026-03-10T14:08:12.307 INFO:tasks.workunit.client.1.vm04.stdout:8/857: write d0/d3/d73/fdd [392429,106771] 0 2026-03-10T14:08:12.309 INFO:tasks.workunit.client.1.vm04.stdout:8/858: chown d0/d3/d63/d12/d69/l7a 313 1 2026-03-10T14:08:12.310 INFO:tasks.workunit.client.1.vm04.stdout:9/708: write d9/da/dd/d1c/da3/fb3 [2987779,45063] 0 2026-03-10T14:08:12.313 INFO:tasks.workunit.client.1.vm04.stdout:8/859: write d0/d3/d5/f70 [277165,44658] 0 2026-03-10T14:08:12.314 INFO:tasks.workunit.client.1.vm04.stdout:8/860: chown d0/d3/dd/c22 16540837 1 2026-03-10T14:08:12.318 INFO:tasks.workunit.client.0.vm03.stdout:9/305: dwrite d2/d29/d33/d41/f53 [0,4194304] 0 2026-03-10T14:08:12.327 INFO:tasks.workunit.client.0.vm03.stdout:3/227: symlink d1d/d33/l43 0 2026-03-10T14:08:12.335 INFO:tasks.workunit.client.1.vm04.stdout:6/677: dwrite d3/de/d35/d3f/f17 [0,4194304] 0 2026-03-10T14:08:12.336 INFO:tasks.workunit.client.1.vm04.stdout:5/840: mknod d7/d2d/d76/c10f 0 2026-03-10T14:08:12.337 INFO:tasks.workunit.client.0.vm03.stdout:4/337: truncate f3 8735466 0 2026-03-10T14:08:12.338 INFO:tasks.workunit.client.1.vm04.stdout:5/841: chown d7/d12/f42 12932 1 2026-03-10T14:08:12.338 INFO:tasks.workunit.client.1.vm04.stdout:3/778: mkdir da/dc/d35/d37/d10a 0 2026-03-10T14:08:12.338 INFO:tasks.workunit.client.0.vm03.stdout:6/275: truncate d8/db/df/f10 1230431 0 2026-03-10T14:08:12.342 INFO:tasks.workunit.client.1.vm04.stdout:4/735: dwrite 
d4/df/db2/db4/d47/d4f/f84 [0,4194304] 0 2026-03-10T14:08:12.353 INFO:tasks.workunit.client.1.vm04.stdout:9/709: symlink d9/da/dd/d1c/da3/lf3 0 2026-03-10T14:08:12.353 INFO:tasks.workunit.client.1.vm04.stdout:8/861: symlink d0/d3/d63/d12/d51/d67/d96/d105/def/l10f 0 2026-03-10T14:08:12.354 INFO:tasks.workunit.client.1.vm04.stdout:9/710: dread - d9/da/dd/d74/fe1 zero size 2026-03-10T14:08:12.354 INFO:tasks.workunit.client.1.vm04.stdout:7/795: write d2/dc/de/d2d/d60/d81/db3/fb9 [692213,24013] 0 2026-03-10T14:08:12.366 INFO:tasks.workunit.client.0.vm03.stdout:2/268: mkdir d5/d10/d1c/d50 0 2026-03-10T14:08:12.372 INFO:tasks.workunit.client.0.vm03.stdout:5/365: symlink d4/d13/l7e 0 2026-03-10T14:08:12.372 INFO:tasks.workunit.client.0.vm03.stdout:9/306: stat d2/d14/c1c 0 2026-03-10T14:08:12.373 INFO:tasks.workunit.client.0.vm03.stdout:8/275: mkdir da/d58 0 2026-03-10T14:08:12.373 INFO:tasks.workunit.client.0.vm03.stdout:3/228: creat d1d/d29/f44 x:0 0 0 2026-03-10T14:08:12.374 INFO:tasks.workunit.client.0.vm03.stdout:3/229: chown d1d/d29/f3f 521840 1 2026-03-10T14:08:12.375 INFO:tasks.workunit.client.1.vm04.stdout:2/738: dwrite d0/d14/d91/f24 [0,4194304] 0 2026-03-10T14:08:12.378 INFO:tasks.workunit.client.0.vm03.stdout:3/230: dwrite d1d/f32 [0,4194304] 0 2026-03-10T14:08:12.385 INFO:tasks.workunit.client.0.vm03.stdout:8/276: sync 2026-03-10T14:08:12.392 INFO:tasks.workunit.client.0.vm03.stdout:6/276: creat d8/d3b/f53 x:0 0 0 2026-03-10T14:08:12.397 INFO:tasks.workunit.client.1.vm04.stdout:5/842: rename d7/d12/d2b/d3e/d3f/ff4 to d7/d9/db5/f110 0 2026-03-10T14:08:12.408 INFO:tasks.workunit.client.0.vm03.stdout:2/269: creat d5/d2a/f51 x:0 0 0 2026-03-10T14:08:12.409 INFO:tasks.workunit.client.0.vm03.stdout:2/270: fsync f4 0 2026-03-10T14:08:12.420 INFO:tasks.workunit.client.1.vm04.stdout:0/785: link d0/d2/d15/d22/d38/d56/da7/lf3 d0/d2/d15/lf9 0 2026-03-10T14:08:12.420 INFO:tasks.workunit.client.1.vm04.stdout:0/786: chown d0/d2/d15/d22/c6d 83238755 1 2026-03-10T14:08:12.421 
INFO:tasks.workunit.client.1.vm04.stdout:0/787: write d0/d2/dbe/fd7 [500911,55604] 0 2026-03-10T14:08:12.430 INFO:tasks.workunit.client.1.vm04.stdout:1/779: write d3/d22/d2f/f34 [4680285,4815] 0 2026-03-10T14:08:12.433 INFO:tasks.workunit.client.1.vm04.stdout:1/780: dwrite d3/d22/d63/d35/dd9/d13/d38/db5/dc4/ffc [0,4194304] 0 2026-03-10T14:08:12.445 INFO:tasks.workunit.client.0.vm03.stdout:1/311: dwrite d0/d2/df/f43 [0,4194304] 0 2026-03-10T14:08:12.451 INFO:tasks.workunit.client.1.vm04.stdout:3/779: dwrite da/d3e/fe0 [0,4194304] 0 2026-03-10T14:08:12.451 INFO:tasks.workunit.client.1.vm04.stdout:3/780: chown da/c91 4271 1 2026-03-10T14:08:12.454 INFO:tasks.workunit.client.0.vm03.stdout:1/312: sync 2026-03-10T14:08:12.459 INFO:tasks.workunit.client.0.vm03.stdout:9/307: dread d2/f1e [0,4194304] 0 2026-03-10T14:08:12.462 INFO:tasks.workunit.client.0.vm03.stdout:9/308: dread d2/d29/d38/f4b [0,4194304] 0 2026-03-10T14:08:12.464 INFO:tasks.workunit.client.0.vm03.stdout:0/261: write d3/d16/f31 [657363,73900] 0 2026-03-10T14:08:12.469 INFO:tasks.workunit.client.1.vm04.stdout:5/843: truncate d7/d26/f30 1770778 0 2026-03-10T14:08:12.478 INFO:tasks.workunit.client.0.vm03.stdout:7/218: mkdir d5/d9/d14/d21/d28/d3f 0 2026-03-10T14:08:12.478 INFO:tasks.workunit.client.0.vm03.stdout:7/219: readlink d5/l8 0 2026-03-10T14:08:12.481 INFO:tasks.workunit.client.1.vm04.stdout:0/788: mkdir d0/d2/d25/dfa 0 2026-03-10T14:08:12.501 INFO:tasks.workunit.client.1.vm04.stdout:8/862: write d0/d3/d63/f5f [689203,24021] 0 2026-03-10T14:08:12.501 INFO:tasks.workunit.client.1.vm04.stdout:9/711: write d9/d44/d59/f5a [1220483,110298] 0 2026-03-10T14:08:12.503 INFO:tasks.workunit.client.1.vm04.stdout:7/796: dwrite d2/d2a/f93 [0,4194304] 0 2026-03-10T14:08:12.534 INFO:tasks.workunit.client.0.vm03.stdout:2/271: rmdir d5/d10/d1f 39 2026-03-10T14:08:12.534 INFO:tasks.workunit.client.0.vm03.stdout:2/272: fsync d5/f2d 0 2026-03-10T14:08:12.544 INFO:tasks.workunit.client.1.vm04.stdout:3/781: rename 
da/dc/d3f/d61/l89 to da/dc/d3f/d61/df7/l10b 0 2026-03-10T14:08:12.545 INFO:tasks.workunit.client.1.vm04.stdout:3/782: chown da/dc/l49 94379 1 2026-03-10T14:08:12.568 INFO:tasks.workunit.client.1.vm04.stdout:2/739: write d0/d14/d91/d4a/d8c/fac [3012492,63503] 0 2026-03-10T14:08:12.569 INFO:tasks.workunit.client.1.vm04.stdout:2/740: chown d0/db8 50760969 1 2026-03-10T14:08:12.571 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:12 vm03.local ceph-mon[49718]: pgmap v6: 65 pgs: 65 active+clean; 2.2 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:08:12.571 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:12 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:12.571 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:12 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:12.571 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:12 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:12.571 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:12 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:12.575 INFO:tasks.workunit.client.0.vm03.stdout:3/231: mkdir d1d/d29/d41/d45 0 2026-03-10T14:08:12.575 INFO:tasks.workunit.client.0.vm03.stdout:3/232: read - d1d/d29/f44 zero size 2026-03-10T14:08:12.580 INFO:tasks.workunit.client.0.vm03.stdout:3/233: sync 2026-03-10T14:08:12.588 INFO:tasks.workunit.client.1.vm04.stdout:5/844: symlink d7/d12/d2b/d3e/d3f/dc0/l111 0 2026-03-10T14:08:12.589 INFO:tasks.workunit.client.0.vm03.stdout:6/277: write d8/db/f1f [6870820,93270] 0 2026-03-10T14:08:12.592 INFO:tasks.workunit.client.1.vm04.stdout:1/781: write d3/d22/f3b [1663004,99341] 0 2026-03-10T14:08:12.596 INFO:tasks.workunit.client.1.vm04.stdout:8/863: mknod d0/d3/d73/c110 0 2026-03-10T14:08:12.598 INFO:tasks.workunit.client.0.vm03.stdout:4/338: link d5/d47/c4d d5/d47/d5b/c68 0 2026-03-10T14:08:12.600 
INFO:tasks.workunit.client.0.vm03.stdout:9/309: dwrite d2/d14/d2b/d34/f59 [0,4194304] 0 2026-03-10T14:08:12.601 INFO:tasks.workunit.client.1.vm04.stdout:7/797: fsync d2/dc/de/d2d/d60/d7c/d36/d8b/fe3 0 2026-03-10T14:08:12.613 INFO:tasks.workunit.client.1.vm04.stdout:4/736: getdents d4/d14/d3c/d62 0 2026-03-10T14:08:12.625 INFO:tasks.workunit.client.0.vm03.stdout:3/234: mknod d1d/d29/c46 0 2026-03-10T14:08:12.625 INFO:tasks.workunit.client.0.vm03.stdout:7/220: symlink d5/d9/d3e/l40 0 2026-03-10T14:08:12.625 INFO:tasks.workunit.client.0.vm03.stdout:6/278: mkdir d8/d11/d18/d54 0 2026-03-10T14:08:12.625 INFO:tasks.workunit.client.1.vm04.stdout:6/678: link d3/de/d35/d3f/d2d/d32/f81 d3/d1d/fcc 0 2026-03-10T14:08:12.625 INFO:tasks.workunit.client.1.vm04.stdout:5/845: creat d7/d59/d7e/f112 x:0 0 0 2026-03-10T14:08:12.625 INFO:tasks.workunit.client.1.vm04.stdout:5/846: readlink d7/d2d/d32/d34/lc1 0 2026-03-10T14:08:12.625 INFO:tasks.workunit.client.1.vm04.stdout:1/782: rmdir d3/d5c/d79/d98 39 2026-03-10T14:08:12.636 INFO:tasks.workunit.client.1.vm04.stdout:7/798: creat d2/dc/de/d2d/d5c/f119 x:0 0 0 2026-03-10T14:08:12.640 INFO:tasks.workunit.client.1.vm04.stdout:7/799: dwrite d2/dc/de/d2d/d60/d7c/d36/d8b/fb8 [4194304,4194304] 0 2026-03-10T14:08:12.647 INFO:tasks.workunit.client.0.vm03.stdout:1/313: link d0/f11 d0/d42/f66 0 2026-03-10T14:08:12.652 INFO:tasks.workunit.client.0.vm03.stdout:1/314: dwrite d0/d2/df/d27/f58 [0,4194304] 0 2026-03-10T14:08:12.660 INFO:tasks.workunit.client.0.vm03.stdout:9/310: unlink c0 0 2026-03-10T14:08:12.668 INFO:tasks.workunit.client.1.vm04.stdout:4/737: symlink d4/d14/d3c/d5e/lfa 0 2026-03-10T14:08:12.675 INFO:tasks.workunit.client.1.vm04.stdout:4/738: stat d4/d14/d64 0 2026-03-10T14:08:12.675 INFO:tasks.workunit.client.1.vm04.stdout:2/741: dread d0/d14/d91/d3a/d3e/f61 [4194304,4194304] 0 2026-03-10T14:08:12.675 INFO:tasks.workunit.client.1.vm04.stdout:2/742: readlink d0/d14/d91/d4a/d8c/dab/l77 0 2026-03-10T14:08:12.675 
INFO:tasks.workunit.client.0.vm03.stdout:5/366: write d4/d13/d43/f51 [1354602,30516] 0 2026-03-10T14:08:12.681 INFO:tasks.workunit.client.1.vm04.stdout:6/679: unlink d3/de/d35/d3f/d2d/d32/d5c/f75 0 2026-03-10T14:08:12.681 INFO:tasks.workunit.client.1.vm04.stdout:6/680: write d3/f8d [3180520,3243] 0 2026-03-10T14:08:12.685 INFO:tasks.workunit.client.0.vm03.stdout:3/235: mkdir d1d/d33/d47 0 2026-03-10T14:08:12.690 INFO:tasks.workunit.client.1.vm04.stdout:9/712: link d9/da/dd/l21 d9/d44/lf4 0 2026-03-10T14:08:12.690 INFO:tasks.workunit.client.1.vm04.stdout:7/800: symlink d2/dc/de/d2d/d60/d7c/d36/daa/l11a 0 2026-03-10T14:08:12.690 INFO:tasks.workunit.client.0.vm03.stdout:3/236: fsync f1b 0 2026-03-10T14:08:12.690 INFO:tasks.workunit.client.0.vm03.stdout:6/279: mkdir d8/d1b/d55 0 2026-03-10T14:08:12.690 INFO:tasks.workunit.client.0.vm03.stdout:6/280: chown d8/d11/d18/d54 1300787922 1 2026-03-10T14:08:12.691 INFO:tasks.workunit.client.0.vm03.stdout:1/315: truncate d0/f48 1683983 0 2026-03-10T14:08:12.691 INFO:tasks.workunit.client.0.vm03.stdout:1/316: chown d0/d2/d34/f56 486113245 1 2026-03-10T14:08:12.692 INFO:tasks.workunit.client.0.vm03.stdout:6/281: sync 2026-03-10T14:08:12.696 INFO:tasks.workunit.client.1.vm04.stdout:2/743: mknod d0/d14/d91/d8/dd/cdf 0 2026-03-10T14:08:12.698 INFO:tasks.workunit.client.0.vm03.stdout:9/311: unlink d2/d14/d2b/d34/f3e 0 2026-03-10T14:08:12.698 INFO:tasks.workunit.client.0.vm03.stdout:5/367: truncate d4/d16/d19/d6e/f57 29595 0 2026-03-10T14:08:12.701 INFO:tasks.workunit.client.0.vm03.stdout:2/273: fsync d5/d10/d31/f3d 0 2026-03-10T14:08:12.702 INFO:tasks.workunit.client.1.vm04.stdout:5/847: rename d7/d12/c15 to d7/d59/d7d/c113 0 2026-03-10T14:08:12.711 INFO:tasks.workunit.client.0.vm03.stdout:8/277: truncate f7 1881290 0 2026-03-10T14:08:12.712 INFO:tasks.workunit.client.0.vm03.stdout:1/317: stat d0/d2/f9 0 2026-03-10T14:08:12.712 INFO:tasks.workunit.client.0.vm03.stdout:6/282: symlink d8/d1b/l56 0 2026-03-10T14:08:12.713 
INFO:tasks.workunit.client.0.vm03.stdout:0/262: dread d3/f28 [0,4194304] 0 2026-03-10T14:08:12.714 INFO:tasks.workunit.client.1.vm04.stdout:0/789: dwrite d0/d2/d15/d22/d38/d56/d66/f2b [0,4194304] 0 2026-03-10T14:08:12.716 INFO:tasks.workunit.client.1.vm04.stdout:3/783: dwrite f4 [0,4194304] 0 2026-03-10T14:08:12.718 INFO:tasks.workunit.client.1.vm04.stdout:3/784: readlink da/dc/d3f/d54/l7c 0 2026-03-10T14:08:12.728 INFO:tasks.workunit.client.0.vm03.stdout:9/312: mknod d2/d29/d33/d55/c69 0 2026-03-10T14:08:12.729 INFO:tasks.workunit.client.1.vm04.stdout:2/744: truncate d0/d14/d91/d8/d17/d4e/f78 3606921 0 2026-03-10T14:08:12.730 INFO:tasks.workunit.client.1.vm04.stdout:2/745: dread - d0/d14/d91/d8/d17/d4e/d85/f88 zero size 2026-03-10T14:08:12.741 INFO:tasks.workunit.client.0.vm03.stdout:5/368: fsync d4/d16/d19/d23/f50 0 2026-03-10T14:08:12.744 INFO:tasks.workunit.client.0.vm03.stdout:5/369: dwrite d4/d16/f2d [0,4194304] 0 2026-03-10T14:08:12.744 INFO:tasks.workunit.client.1.vm04.stdout:6/681: stat d3/de/d35/d3f/d2d/d32/d23/d4e/c5b 0 2026-03-10T14:08:12.747 INFO:tasks.workunit.client.0.vm03.stdout:3/237: mkdir d1d/d40/d48 0 2026-03-10T14:08:12.757 INFO:tasks.workunit.client.0.vm03.stdout:1/318: mknod d0/d18/d1d/c67 0 2026-03-10T14:08:12.758 INFO:tasks.workunit.client.0.vm03.stdout:0/263: truncate d3/d4d/f22 1181804 0 2026-03-10T14:08:12.758 INFO:tasks.workunit.client.0.vm03.stdout:1/319: stat d0/d18/d1d/f5e 0 2026-03-10T14:08:12.760 INFO:tasks.workunit.client.0.vm03.stdout:9/313: chown d2/d14/c17 7795 1 2026-03-10T14:08:12.763 INFO:tasks.workunit.client.0.vm03.stdout:5/370: mkdir d4/d16/d19/d6e/d7f 0 2026-03-10T14:08:12.764 INFO:tasks.workunit.client.0.vm03.stdout:5/371: dread d4/d16/d19/d23/f50 [0,4194304] 0 2026-03-10T14:08:12.765 INFO:tasks.workunit.client.0.vm03.stdout:2/274: fdatasync d5/d10/d1f/f3a 0 2026-03-10T14:08:12.765 INFO:tasks.workunit.client.0.vm03.stdout:6/283: truncate f2 1312946 0 2026-03-10T14:08:12.765 
INFO:tasks.workunit.client.0.vm03.stdout:6/284: fdatasync d8/db/d49/f4a 0 2026-03-10T14:08:12.776 INFO:tasks.workunit.client.1.vm04.stdout:1/783: link d3/d22/d63/d35/dd9/cad d3/d22/d2f/d109/c10e 0 2026-03-10T14:08:12.776 INFO:tasks.workunit.client.0.vm03.stdout:0/264: sync 2026-03-10T14:08:12.778 INFO:tasks.workunit.client.0.vm03.stdout:9/314: dread d2/f37 [0,4194304] 0 2026-03-10T14:08:12.779 INFO:tasks.workunit.client.0.vm03.stdout:1/320: dwrite d0/d42/f66 [0,4194304] 0 2026-03-10T14:08:12.780 INFO:tasks.workunit.client.1.vm04.stdout:9/713: dread d9/d5c/fb6 [0,4194304] 0 2026-03-10T14:08:12.782 INFO:tasks.workunit.client.0.vm03.stdout:1/321: dread - d0/d18/d1d/f5e zero size 2026-03-10T14:08:12.793 INFO:tasks.workunit.client.0.vm03.stdout:4/339: write f2 [608843,21884] 0 2026-03-10T14:08:12.798 INFO:tasks.workunit.client.0.vm03.stdout:4/340: dwrite d5/d9/db/d19/d38/d53/f59 [0,4194304] 0 2026-03-10T14:08:12.807 INFO:tasks.workunit.client.0.vm03.stdout:5/372: dread d4/d13/f4b [0,4194304] 0 2026-03-10T14:08:12.807 INFO:tasks.workunit.client.0.vm03.stdout:4/341: stat d5/l2d 0 2026-03-10T14:08:12.811 INFO:tasks.workunit.client.0.vm03.stdout:2/275: creat d5/d10/d1c/d40/f52 x:0 0 0 2026-03-10T14:08:12.812 INFO:tasks.workunit.client.1.vm04.stdout:7/801: symlink d2/dc/de/d2d/d5c/da9/df6/dfe/l11b 0 2026-03-10T14:08:12.814 INFO:tasks.workunit.client.1.vm04.stdout:7/802: dread d2/dc/de/d2d/d60/d7c/d36/f87 [0,4194304] 0 2026-03-10T14:08:12.815 INFO:tasks.workunit.client.0.vm03.stdout:4/342: sync 2026-03-10T14:08:12.815 INFO:tasks.workunit.client.0.vm03.stdout:5/373: sync 2026-03-10T14:08:12.821 INFO:tasks.workunit.client.1.vm04.stdout:0/790: creat d0/d2/d15/d22/d38/d56/dcb/dce/ffb x:0 0 0 2026-03-10T14:08:12.824 INFO:tasks.workunit.client.0.vm03.stdout:9/315: truncate d2/d29/d38/f4b 541621 0 2026-03-10T14:08:12.826 INFO:tasks.workunit.client.1.vm04.stdout:3/785: creat da/dc/d47/f10c x:0 0 0 2026-03-10T14:08:12.826 INFO:tasks.workunit.client.1.vm04.stdout:3/786: chown 
da/dc/d3f/d61/fc7 3743 1 2026-03-10T14:08:12.831 INFO:tasks.workunit.client.0.vm03.stdout:0/265: truncate d3/d4d/d30/f45 585662 0 2026-03-10T14:08:12.839 INFO:tasks.workunit.client.1.vm04.stdout:2/746: creat d0/d14/d39/d47/d70/dc3/fe0 x:0 0 0 2026-03-10T14:08:12.840 INFO:tasks.workunit.client.1.vm04.stdout:4/739: write d4/df/d34/f95 [2738603,45121] 0 2026-03-10T14:08:12.842 INFO:tasks.workunit.client.1.vm04.stdout:4/740: dread d4/f77 [0,4194304] 0 2026-03-10T14:08:12.846 INFO:tasks.workunit.client.1.vm04.stdout:8/864: dread d0/d3/f6b [0,4194304] 0 2026-03-10T14:08:12.858 INFO:tasks.workunit.client.1.vm04.stdout:5/848: write d7/d12/d2b/d93/fca [586062,22449] 0 2026-03-10T14:08:12.861 INFO:tasks.workunit.client.0.vm03.stdout:2/276: mknod d5/d10/d1c/c53 0 2026-03-10T14:08:12.861 INFO:tasks.workunit.client.0.vm03.stdout:2/277: write d5/d10/d1f/f3e [294258,36000] 0 2026-03-10T14:08:12.861 INFO:tasks.workunit.client.1.vm04.stdout:9/714: creat d9/da/dd/d1c/da3/ff5 x:0 0 0 2026-03-10T14:08:12.872 INFO:tasks.workunit.client.0.vm03.stdout:5/374: truncate d4/d13/d1f/f74 155259 0 2026-03-10T14:08:12.875 INFO:tasks.workunit.client.0.vm03.stdout:8/278: dwrite f5 [0,4194304] 0 2026-03-10T14:08:12.876 INFO:tasks.workunit.client.1.vm04.stdout:7/803: read d2/dc/de/d2d/d60/d7c/f105 [409021,8109] 0 2026-03-10T14:08:12.877 INFO:tasks.workunit.client.0.vm03.stdout:5/375: dread d4/d6/de/f14 [0,4194304] 0 2026-03-10T14:08:12.877 INFO:tasks.workunit.client.0.vm03.stdout:9/316: creat d2/d14/d2b/d34/f6a x:0 0 0 2026-03-10T14:08:12.878 INFO:tasks.workunit.client.1.vm04.stdout:2/747: symlink d0/d14/d91/d4a/d8c/d92/le1 0 2026-03-10T14:08:12.879 INFO:tasks.workunit.client.1.vm04.stdout:2/748: chown d0/d14/d91/d4a/d8c/dab/d46/c5a 3146718 1 2026-03-10T14:08:12.883 INFO:tasks.workunit.client.1.vm04.stdout:8/865: mkdir d0/d3/d63/d12/d51/d67/d96/dc8/d111 0 2026-03-10T14:08:12.886 INFO:tasks.workunit.client.1.vm04.stdout:6/682: fdatasync d3/de/d35/d3f/d2d/d32/d23/d24/f36 0 2026-03-10T14:08:12.888 
INFO:tasks.workunit.client.0.vm03.stdout:2/278: mkdir d5/d10/d1c/d54 0 2026-03-10T14:08:12.888 INFO:tasks.workunit.client.0.vm03.stdout:2/279: chown d5/d35/f49 1165 1 2026-03-10T14:08:12.900 INFO:tasks.workunit.client.0.vm03.stdout:8/279: dread - da/d3c/f3f zero size 2026-03-10T14:08:12.902 INFO:tasks.workunit.client.0.vm03.stdout:9/317: fdatasync d2/f4c 0 2026-03-10T14:08:12.910 INFO:tasks.workunit.client.1.vm04.stdout:7/804: stat d2/dc/de/d2d/c80 0 2026-03-10T14:08:12.912 INFO:tasks.workunit.client.0.vm03.stdout:5/376: dwrite d4/d6/de/f65 [0,4194304] 0 2026-03-10T14:08:12.913 INFO:tasks.workunit.client.0.vm03.stdout:2/280: chown d5/d10/l1a 164 1 2026-03-10T14:08:12.927 INFO:tasks.workunit.client.1.vm04.stdout:2/749: creat d0/d14/d91/d8/d17/d4e/d85/d86/fe2 x:0 0 0 2026-03-10T14:08:12.930 INFO:tasks.workunit.client.0.vm03.stdout:0/266: symlink d3/l4e 0 2026-03-10T14:08:12.933 INFO:tasks.workunit.client.1.vm04.stdout:8/866: chown d0/d3/d63/d12/lf7 336544767 1 2026-03-10T14:08:12.933 INFO:tasks.workunit.client.1.vm04.stdout:8/867: chown d0/d3/d63/d29/fba 964137 1 2026-03-10T14:08:12.934 INFO:tasks.workunit.client.1.vm04.stdout:8/868: chown d0/d3/fc3 6 1 2026-03-10T14:08:12.936 INFO:tasks.workunit.client.1.vm04.stdout:8/869: dread d0/d3/d63/d12/d51/f4f [4194304,4194304] 0 2026-03-10T14:08:12.938 INFO:tasks.workunit.client.1.vm04.stdout:8/870: chown d0/d3/dd/d89/db5/cc5 22494576 1 2026-03-10T14:08:12.938 INFO:tasks.workunit.client.1.vm04.stdout:8/871: chown d0/d3/d63/d12/c3e 0 1 2026-03-10T14:08:12.943 INFO:tasks.workunit.client.0.vm03.stdout:5/377: creat d4/d40/d4e/f80 x:0 0 0 2026-03-10T14:08:12.960 INFO:tasks.workunit.client.0.vm03.stdout:0/267: symlink d3/d11/d2c/d4a/l4f 0 2026-03-10T14:08:12.962 INFO:tasks.workunit.client.1.vm04.stdout:9/715: mkdir d9/da/d8c/de5/df6 0 2026-03-10T14:08:12.966 INFO:tasks.workunit.client.0.vm03.stdout:6/285: write d8/d1b/f3d [3246980,129510] 0 2026-03-10T14:08:12.972 INFO:tasks.workunit.client.1.vm04.stdout:2/750: creat 
d0/d14/d91/d4a/d8c/dab/db3/fe3 x:0 0 0 2026-03-10T14:08:12.974 INFO:tasks.workunit.client.0.vm03.stdout:5/378: dwrite d4/d13/d1f/f20 [4194304,4194304] 0 2026-03-10T14:08:12.974 INFO:tasks.workunit.client.0.vm03.stdout:5/379: chown d4/d16/d19/d23/c6a 6 1 2026-03-10T14:08:12.984 INFO:tasks.workunit.client.0.vm03.stdout:2/281: fsync d5/f23 0 2026-03-10T14:08:12.985 INFO:tasks.workunit.client.1.vm04.stdout:6/683: rename d3/d1d/fcc to d3/de/d35/d3f/d2d/d38/fcd 0 2026-03-10T14:08:12.986 INFO:tasks.workunit.client.1.vm04.stdout:6/684: chown d3/de/d35/d3f/d2d/d32/d23/d24/c4f 5358539 1 2026-03-10T14:08:12.996 INFO:tasks.workunit.client.0.vm03.stdout:6/286: stat d8/db/d12/c17 0 2026-03-10T14:08:12.996 INFO:tasks.workunit.client.1.vm04.stdout:8/872: mknod d0/d3/d63/c112 0 2026-03-10T14:08:12.999 INFO:tasks.workunit.client.0.vm03.stdout:6/287: dwrite d8/d1b/f29 [0,4194304] 0 2026-03-10T14:08:13.014 INFO:tasks.workunit.client.1.vm04.stdout:5/849: link d7/d2d/d32/d34/c80 d7/d12/d2b/c114 0 2026-03-10T14:08:13.014 INFO:tasks.workunit.client.0.vm03.stdout:5/380: readlink d4/d6/l61 0 2026-03-10T14:08:13.015 INFO:tasks.workunit.client.1.vm04.stdout:5/850: chown d7/d12/d2b/d3e/d3f/l44 9856018 1 2026-03-10T14:08:13.017 INFO:tasks.workunit.client.0.vm03.stdout:5/381: dread d4/d13/d43/f51 [0,4194304] 0 2026-03-10T14:08:13.018 INFO:tasks.workunit.client.0.vm03.stdout:5/382: readlink d4/d6/l2f 0 2026-03-10T14:08:13.018 INFO:tasks.workunit.client.0.vm03.stdout:5/383: chown d4/d6/de/f4f 440 1 2026-03-10T14:08:13.021 INFO:tasks.workunit.client.0.vm03.stdout:2/282: readlink d5/ld 0 2026-03-10T14:08:13.022 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:12 vm04.local ceph-mon[55966]: pgmap v6: 65 pgs: 65 active+clean; 2.2 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:08:13.022 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:12 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:13.022 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 14:08:12 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:13.022 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:12 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:13.022 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:12 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:13.031 INFO:tasks.workunit.client.1.vm04.stdout:9/716: rename d9/da/lb7 to d9/da/dd/d1c/da3/lf7 0 2026-03-10T14:08:13.032 INFO:tasks.workunit.client.0.vm03.stdout:0/268: creat d3/d16/d21/f50 x:0 0 0 2026-03-10T14:08:13.034 INFO:tasks.workunit.client.0.vm03.stdout:8/280: getdents da/d24 0 2026-03-10T14:08:13.035 INFO:tasks.workunit.client.0.vm03.stdout:8/281: chown da/d36 1495 1 2026-03-10T14:08:13.043 INFO:tasks.workunit.client.1.vm04.stdout:7/805: getdents d2/d2a/d42/d86 0 2026-03-10T14:08:13.045 INFO:tasks.workunit.client.0.vm03.stdout:9/318: getdents d2/d14/d2b 0 2026-03-10T14:08:13.068 INFO:tasks.workunit.client.1.vm04.stdout:3/787: write da/dc/d47/d9b/fd2 [15256,24470] 0 2026-03-10T14:08:13.074 INFO:tasks.workunit.client.1.vm04.stdout:2/751: getdents d0/d14/d91/d4a 0 2026-03-10T14:08:13.080 INFO:tasks.workunit.client.1.vm04.stdout:6/685: link d3/de/d35/d3f/d2d/la1 d3/de/d35/d3a/d43/d4c/d5e/lce 0 2026-03-10T14:08:13.081 INFO:tasks.workunit.client.1.vm04.stdout:4/741: write d4/df/d34/f7c [539801,95064] 0 2026-03-10T14:08:13.106 INFO:tasks.workunit.client.0.vm03.stdout:0/269: mkdir d3/d51 0 2026-03-10T14:08:13.112 INFO:tasks.workunit.client.0.vm03.stdout:9/319: truncate d2/d29/d38/f4b 250882 0 2026-03-10T14:08:13.117 INFO:tasks.workunit.client.1.vm04.stdout:2/752: unlink d0/d14/d91/d8/d17/d4e/l65 0 2026-03-10T14:08:13.119 INFO:tasks.workunit.client.0.vm03.stdout:9/320: sync 2026-03-10T14:08:13.119 INFO:tasks.workunit.client.1.vm04.stdout:1/784: dwrite d3/d22/d63/d35/dd9/d13/d38/db5/fb8 [0,4194304] 0 2026-03-10T14:08:13.125 
INFO:tasks.workunit.client.0.vm03.stdout:5/384: link d4/f37 d4/d16/d19/d4a/f81 0 2026-03-10T14:08:13.128 INFO:tasks.workunit.client.1.vm04.stdout:6/686: symlink d3/de/d35/d3f/d2d/d32/lcf 0 2026-03-10T14:08:13.128 INFO:tasks.workunit.client.1.vm04.stdout:4/742: truncate d4/df/db2/db4/d47/d70/d74/f76 4356383 0 2026-03-10T14:08:13.130 INFO:tasks.workunit.client.1.vm04.stdout:4/743: write d4/df/d34/fef [674349,11411] 0 2026-03-10T14:08:13.145 INFO:tasks.workunit.client.1.vm04.stdout:0/791: write d0/d2/d15/d49/d50/d61/d75/f80 [170114,93490] 0 2026-03-10T14:08:13.145 INFO:tasks.workunit.client.0.vm03.stdout:8/282: link da/d36/d40/d50/c55 da/d24/c59 0 2026-03-10T14:08:13.146 INFO:tasks.workunit.client.0.vm03.stdout:8/283: write da/f30 [263140,102974] 0 2026-03-10T14:08:13.147 INFO:tasks.workunit.client.0.vm03.stdout:8/284: chown da/d3c/d4b 1 1 2026-03-10T14:08:13.149 INFO:tasks.workunit.client.0.vm03.stdout:5/385: dread d4/d6/fa [0,4194304] 0 2026-03-10T14:08:13.151 INFO:tasks.workunit.client.0.vm03.stdout:8/285: dwrite da/d36/d40/f47 [0,4194304] 0 2026-03-10T14:08:13.160 INFO:tasks.workunit.client.1.vm04.stdout:6/687: mkdir d3/de/d35/d3f/d2d/d32/dd0 0 2026-03-10T14:08:13.160 INFO:tasks.workunit.client.0.vm03.stdout:9/321: symlink d2/d29/l6b 0 2026-03-10T14:08:13.164 INFO:tasks.workunit.client.0.vm03.stdout:8/286: mkdir da/d36/d40/d5a 0 2026-03-10T14:08:13.166 INFO:tasks.workunit.client.1.vm04.stdout:6/688: mknod d3/de/d35/d3a/d43/d9c/cd1 0 2026-03-10T14:08:13.170 INFO:tasks.workunit.client.1.vm04.stdout:4/744: creat d4/df/db2/db4/d47/d4f/d8c/dc8/ffb x:0 0 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.0.vm03.stdout:9/322: mknod d2/d29/d33/c6c 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.0.vm03.stdout:8/287: creat da/d36/d40/d5a/f5b x:0 0 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.0.vm03.stdout:8/288: write da/f33 [2195400,100650] 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.0.vm03.stdout:8/289: readlink da/d24/d49/l57 0 
2026-03-10T14:08:13.195 INFO:tasks.workunit.client.0.vm03.stdout:8/290: symlink da/d58/l5c 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.0.vm03.stdout:8/291: write da/fe [1576307,116554] 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.0.vm03.stdout:8/292: creat da/d36/d40/d56/f5d x:0 0 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.1.vm04.stdout:4/745: mknod d4/df/db2/db4/d47/cfc 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.1.vm04.stdout:0/792: link d0/f19 d0/d2/d15/d49/d50/d5c/da4/ffc 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.1.vm04.stdout:0/793: chown d0/dee/ff0 163 1 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.1.vm04.stdout:4/746: rmdir d4/df/db2/de1 39 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.1.vm04.stdout:4/747: truncate d4/d14/fad 1600217 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.1.vm04.stdout:0/794: link d0/d2/d15/d22/d38/l41 d0/d2/d15/d22/d62/df2/lfd 0 2026-03-10T14:08:13.195 INFO:tasks.workunit.client.1.vm04.stdout:0/795: mkdir d0/d2/d25/dfe 0 2026-03-10T14:08:13.197 INFO:tasks.workunit.client.1.vm04.stdout:4/748: getdents d4/d14/dac/db7 0 2026-03-10T14:08:13.198 INFO:tasks.workunit.client.1.vm04.stdout:0/796: dread d0/d2/d15/d22/f81 [0,4194304] 0 2026-03-10T14:08:13.202 INFO:tasks.workunit.client.1.vm04.stdout:0/797: dwrite d0/d2/f9 [0,4194304] 0 2026-03-10T14:08:13.202 INFO:tasks.workunit.client.0.vm03.stdout:8/293: truncate da/d24/f52 163632 0 2026-03-10T14:08:13.202 INFO:tasks.workunit.client.0.vm03.stdout:8/294: fdatasync da/d3a/d44/f4e 0 2026-03-10T14:08:13.209 INFO:tasks.workunit.client.1.vm04.stdout:4/749: creat d4/d14/d3c/d85/ffd x:0 0 0 2026-03-10T14:08:13.210 INFO:tasks.workunit.client.1.vm04.stdout:0/798: truncate d0/d2/d15/d22/d38/f93 58123 0 2026-03-10T14:08:13.211 INFO:tasks.workunit.client.1.vm04.stdout:0/799: readlink d0/da2/lb3 0 2026-03-10T14:08:13.211 INFO:tasks.workunit.client.1.vm04.stdout:0/800: write d0/dee/ff0 [145349,112774] 0 2026-03-10T14:08:13.212 
INFO:tasks.workunit.client.1.vm04.stdout:4/750: rmdir d4/df/db2/de1 39 2026-03-10T14:08:13.212 INFO:tasks.workunit.client.1.vm04.stdout:0/801: readlink d0/d2/d15/d22/d38/d56/l9b 0 2026-03-10T14:08:13.216 INFO:tasks.workunit.client.1.vm04.stdout:4/751: creat d4/d14/d1b/ffe x:0 0 0 2026-03-10T14:08:13.217 INFO:tasks.workunit.client.1.vm04.stdout:0/802: stat d0/d2/d15/d22/d62/df2/lfd 0 2026-03-10T14:08:13.245 INFO:tasks.workunit.client.1.vm04.stdout:1/785: sync 2026-03-10T14:08:13.249 INFO:tasks.workunit.client.0.vm03.stdout:7/221: rename d5/d9/d14/d26/f33 to d5/d9/d14/f41 0 2026-03-10T14:08:13.250 INFO:tasks.workunit.client.1.vm04.stdout:1/786: read d3/d5c/fbf [3133468,18403] 0 2026-03-10T14:08:13.251 INFO:tasks.workunit.client.0.vm03.stdout:8/295: sync 2026-03-10T14:08:13.256 INFO:tasks.workunit.client.1.vm04.stdout:1/787: creat d3/d22/d2f/f10f x:0 0 0 2026-03-10T14:08:13.258 INFO:tasks.workunit.client.0.vm03.stdout:8/296: dread - da/f1f zero size 2026-03-10T14:08:13.262 INFO:tasks.workunit.client.0.vm03.stdout:3/238: rename d1d/f2c to d1d/d33/f49 0 2026-03-10T14:08:13.265 INFO:tasks.workunit.client.1.vm04.stdout:1/788: read d3/d22/d63/f72 [1825519,85860] 0 2026-03-10T14:08:13.272 INFO:tasks.workunit.client.0.vm03.stdout:8/297: rmdir da/d3a 39 2026-03-10T14:08:13.273 INFO:tasks.workunit.client.0.vm03.stdout:8/298: write da/d36/f42 [854953,122777] 0 2026-03-10T14:08:13.290 INFO:tasks.workunit.client.0.vm03.stdout:7/222: creat d5/d9/f42 x:0 0 0 2026-03-10T14:08:13.291 INFO:tasks.workunit.client.0.vm03.stdout:7/223: readlink d5/l8 0 2026-03-10T14:08:13.292 INFO:tasks.workunit.client.0.vm03.stdout:8/299: symlink da/d36/l5e 0 2026-03-10T14:08:13.293 INFO:tasks.workunit.client.0.vm03.stdout:8/300: write da/f33 [914226,127068] 0 2026-03-10T14:08:13.310 INFO:tasks.workunit.client.0.vm03.stdout:1/322: write d0/d2/f1a [584914,30863] 0 2026-03-10T14:08:13.313 INFO:tasks.workunit.client.0.vm03.stdout:8/301: dread da/f2e [0,4194304] 0 2026-03-10T14:08:13.325 
INFO:tasks.workunit.client.1.vm04.stdout:5/851: dwrite d7/d9/f20 [0,4194304] 0 2026-03-10T14:08:13.337 INFO:tasks.workunit.client.1.vm04.stdout:8/873: write d0/d3/d63/d12/d51/f8e [893449,77416] 0 2026-03-10T14:08:13.354 INFO:tasks.workunit.client.1.vm04.stdout:7/806: write d2/dc/de/f73 [4485540,53882] 0 2026-03-10T14:08:13.357 INFO:tasks.workunit.client.0.vm03.stdout:3/239: creat d1d/f4a x:0 0 0 2026-03-10T14:08:13.361 INFO:tasks.workunit.client.0.vm03.stdout:3/240: dwrite fc [0,4194304] 0 2026-03-10T14:08:13.362 INFO:tasks.workunit.client.0.vm03.stdout:4/343: rename d5/d9/l35 to d5/d47/l69 0 2026-03-10T14:08:13.363 INFO:tasks.workunit.client.1.vm04.stdout:7/807: mkdir d2/d11c 0 2026-03-10T14:08:13.372 INFO:tasks.workunit.client.1.vm04.stdout:9/717: write d9/da/dd/de7/d96/fce [3676969,94637] 0 2026-03-10T14:08:13.373 INFO:tasks.workunit.client.1.vm04.stdout:9/718: chown d9/da/dd/de7/d96/d9d/fe9 230430 1 2026-03-10T14:08:13.381 INFO:tasks.workunit.client.1.vm04.stdout:9/719: dread d9/d44/d4d/f66 [0,4194304] 0 2026-03-10T14:08:13.382 INFO:tasks.workunit.client.0.vm03.stdout:8/302: dread da/f31 [0,4194304] 0 2026-03-10T14:08:13.397 INFO:tasks.workunit.client.1.vm04.stdout:7/808: sync 2026-03-10T14:08:13.401 INFO:tasks.workunit.client.0.vm03.stdout:6/288: rename f2 to d8/db/d12/f57 0 2026-03-10T14:08:13.402 INFO:tasks.workunit.client.0.vm03.stdout:6/289: write d8/d1b/d1c/f50 [843188,87895] 0 2026-03-10T14:08:13.404 INFO:tasks.workunit.client.0.vm03.stdout:6/290: stat d8/db/f1f 0 2026-03-10T14:08:13.406 INFO:tasks.workunit.client.0.vm03.stdout:2/283: write d5/d10/f12 [565320,119185] 0 2026-03-10T14:08:13.411 INFO:tasks.workunit.client.0.vm03.stdout:4/344: symlink d5/l6a 0 2026-03-10T14:08:13.416 INFO:tasks.workunit.client.0.vm03.stdout:8/303: mkdir da/d58/d5f 0 2026-03-10T14:08:13.431 INFO:tasks.workunit.client.1.vm04.stdout:3/788: dwrite da/dc/d3f/d54/f97 [0,4194304] 0 2026-03-10T14:08:13.450 INFO:tasks.workunit.client.0.vm03.stdout:0/270: rename d3/d16/d21/c33 to 
d3/d51/c52 0 2026-03-10T14:08:13.451 INFO:tasks.workunit.client.1.vm04.stdout:3/789: fsync da/dc/d35/f6a 0 2026-03-10T14:08:13.454 INFO:tasks.workunit.client.0.vm03.stdout:6/291: fsync d8/db/df/f27 0 2026-03-10T14:08:13.455 INFO:tasks.workunit.client.1.vm04.stdout:4/752: rename d4/df/db2/db4/d47/d70 to d4/df/db2/db4/d47/d4f/d8c/dc8/dff 0 2026-03-10T14:08:13.458 INFO:tasks.workunit.client.1.vm04.stdout:4/753: chown d4/df/d34/f8f 60124359 1 2026-03-10T14:08:13.467 INFO:tasks.workunit.client.0.vm03.stdout:2/284: dread d5/d10/f22 [0,4194304] 0 2026-03-10T14:08:13.479 INFO:tasks.workunit.client.1.vm04.stdout:2/753: dwrite d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/f7b [0,4194304] 0 2026-03-10T14:08:13.486 INFO:tasks.workunit.client.1.vm04.stdout:2/754: dread d0/d14/d91/d8/d17/d35/f81 [0,4194304] 0 2026-03-10T14:08:13.486 INFO:tasks.workunit.client.1.vm04.stdout:2/755: stat d0/d14/d91/d8 0 2026-03-10T14:08:13.487 INFO:tasks.workunit.client.1.vm04.stdout:2/756: fsync d0/d14/d91/d8/d17/f1f 0 2026-03-10T14:08:13.490 INFO:tasks.workunit.client.1.vm04.stdout:3/790: symlink da/dc/d3f/l10d 0 2026-03-10T14:08:13.497 INFO:tasks.workunit.client.0.vm03.stdout:3/241: link d1d/l30 d1d/d33/l4b 0 2026-03-10T14:08:13.497 INFO:tasks.workunit.client.0.vm03.stdout:9/323: write d2/d14/f1b [552573,63230] 0 2026-03-10T14:08:13.498 INFO:tasks.workunit.client.1.vm04.stdout:6/689: dwrite d3/de/d35/d3a/fb5 [0,4194304] 0 2026-03-10T14:08:13.506 INFO:tasks.workunit.client.0.vm03.stdout:0/271: stat d3/f9 0 2026-03-10T14:08:13.517 INFO:tasks.workunit.client.0.vm03.stdout:6/292: mkdir d8/db/d49/d58 0 2026-03-10T14:08:13.526 INFO:tasks.workunit.client.0.vm03.stdout:8/304: unlink da/d3a/d44/f4e 0 2026-03-10T14:08:13.530 INFO:tasks.workunit.client.0.vm03.stdout:8/305: chown da/d3c/d4b/d4c 0 1 2026-03-10T14:08:13.531 INFO:tasks.workunit.client.1.vm04.stdout:0/803: dwrite d0/d2/d15/d22/d38/d56/dc1/fc2 [0,4194304] 0 2026-03-10T14:08:13.547 INFO:tasks.workunit.client.1.vm04.stdout:5/852: rename d7/d12/c19 to 
d7/d2d/d69/db8/c115 0 2026-03-10T14:08:13.548 INFO:tasks.workunit.client.1.vm04.stdout:5/853: read - d7/d59/d7d/fb9 zero size 2026-03-10T14:08:13.548 INFO:tasks.workunit.client.1.vm04.stdout:6/690: creat d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fd2 x:0 0 0 2026-03-10T14:08:13.551 INFO:tasks.workunit.client.1.vm04.stdout:4/754: truncate d4/df/db2/db4/d47/d4f/d8c/dc8/dff/f87 85804 0 2026-03-10T14:08:13.555 INFO:tasks.workunit.client.0.vm03.stdout:6/293: mknod d8/db/d2c/d2d/c59 0 2026-03-10T14:08:13.567 INFO:tasks.workunit.client.1.vm04.stdout:1/789: truncate d3/f2c 5876741 0 2026-03-10T14:08:13.572 INFO:tasks.workunit.client.1.vm04.stdout:2/757: rename d0/d14/d91/d8/d17/ld9 to d0/d14/d39/d47/d93/le4 0 2026-03-10T14:08:13.580 INFO:tasks.workunit.client.1.vm04.stdout:8/874: write d0/d3/dd/d76/f9c [1716292,19282] 0 2026-03-10T14:08:13.587 INFO:tasks.workunit.client.0.vm03.stdout:3/242: mkdir d1d/d29/d41/d45/d4c 0 2026-03-10T14:08:13.587 INFO:tasks.workunit.client.0.vm03.stdout:3/243: chown d1d/d29/d41 13383 1 2026-03-10T14:08:13.587 INFO:tasks.workunit.client.0.vm03.stdout:3/244: read fc [2136643,110154] 0 2026-03-10T14:08:13.590 INFO:tasks.workunit.client.1.vm04.stdout:6/691: rmdir d3/de/d35 39 2026-03-10T14:08:13.592 INFO:tasks.workunit.client.0.vm03.stdout:9/324: truncate d2/d29/d33/f3c 2970673 0 2026-03-10T14:08:13.594 INFO:tasks.workunit.client.1.vm04.stdout:4/755: read d4/d14/d6d/f8a [467581,39215] 0 2026-03-10T14:08:13.595 INFO:tasks.workunit.client.1.vm04.stdout:4/756: dread d4/df/f60 [0,4194304] 0 2026-03-10T14:08:13.598 INFO:tasks.workunit.client.0.vm03.stdout:5/386: rename d4/d6/f10 to d4/f82 0 2026-03-10T14:08:13.611 INFO:tasks.workunit.client.1.vm04.stdout:1/790: creat d3/d22/d6d/f110 x:0 0 0 2026-03-10T14:08:13.611 INFO:tasks.workunit.client.1.vm04.stdout:1/791: stat d3/d22/d63/d35/dd9/d13/d1a/f62 0 2026-03-10T14:08:13.615 INFO:tasks.workunit.client.1.vm04.stdout:2/758: chown d0/d14/d1b/c21 52899742 1 2026-03-10T14:08:13.616 
INFO:tasks.workunit.client.0.vm03.stdout:1/323: write d0/f10 [3878546,100046] 0 2026-03-10T14:08:13.620 INFO:tasks.workunit.client.0.vm03.stdout:3/245: write d1d/f3c [7910618,70159] 0 2026-03-10T14:08:13.624 INFO:tasks.workunit.client.1.vm04.stdout:9/720: dwrite d9/da/d5d/fde [0,4194304] 0 2026-03-10T14:08:13.630 INFO:tasks.workunit.client.0.vm03.stdout:9/325: chown d2/d29/d38/c66 14 1 2026-03-10T14:08:13.633 INFO:tasks.workunit.client.1.vm04.stdout:7/809: dwrite d2/dc/de/d2d/d60/d7c/f15 [4194304,4194304] 0 2026-03-10T14:08:13.635 INFO:tasks.workunit.client.0.vm03.stdout:4/345: rename d5/f49 to d5/d9/db/d19/d34/f6b 0 2026-03-10T14:08:13.641 INFO:tasks.workunit.client.0.vm03.stdout:9/326: read d2/f2c [1400851,16091] 0 2026-03-10T14:08:13.644 INFO:tasks.workunit.client.1.vm04.stdout:7/810: dwrite d2/dc/de/d2d/d60/d7c/d36/d8b/fb8 [4194304,4194304] 0 2026-03-10T14:08:13.662 INFO:tasks.workunit.client.0.vm03.stdout:0/272: rmdir d3/d16 39 2026-03-10T14:08:13.692 INFO:tasks.workunit.client.0.vm03.stdout:3/246: dread d1d/d33/f49 [0,4194304] 0 2026-03-10T14:08:13.692 INFO:tasks.workunit.client.1.vm04.stdout:2/759: symlink d0/db8/le5 0 2026-03-10T14:08:13.693 INFO:tasks.workunit.client.0.vm03.stdout:4/346: fdatasync d5/f7 0 2026-03-10T14:08:13.693 INFO:tasks.workunit.client.1.vm04.stdout:2/760: write d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/f7b [194917,36243] 0 2026-03-10T14:08:13.696 INFO:tasks.workunit.client.0.vm03.stdout:3/247: dwrite d1d/f2b [0,4194304] 0 2026-03-10T14:08:13.700 INFO:tasks.workunit.client.1.vm04.stdout:9/721: mkdir d9/dd3/df8 0 2026-03-10T14:08:13.700 INFO:tasks.workunit.client.0.vm03.stdout:8/306: rename da/d3c/d51/c53 to da/d36/d40/c60 0 2026-03-10T14:08:13.700 INFO:tasks.workunit.client.0.vm03.stdout:8/307: write da/f33 [1551902,45629] 0 2026-03-10T14:08:13.722 INFO:tasks.workunit.client.0.vm03.stdout:0/273: dwrite d3/d16/f34 [0,4194304] 0 2026-03-10T14:08:13.736 INFO:tasks.workunit.client.1.vm04.stdout:5/854: rmdir d7/d2d/d69/dce 0 2026-03-10T14:08:13.742 
INFO:tasks.workunit.client.1.vm04.stdout:6/692: mknod d3/de/d35/d3f/d2d/d38/cd3 0 2026-03-10T14:08:13.746 INFO:tasks.workunit.client.0.vm03.stdout:1/324: dread d0/d2/f46 [0,4194304] 0 2026-03-10T14:08:13.748 INFO:tasks.workunit.client.0.vm03.stdout:5/387: rmdir d4/d16 39 2026-03-10T14:08:13.758 INFO:tasks.workunit.client.0.vm03.stdout:2/285: write d5/d10/d17/f33 [4800150,90463] 0 2026-03-10T14:08:13.764 INFO:tasks.workunit.client.0.vm03.stdout:7/224: symlink d5/d9/d14/d26/l43 0 2026-03-10T14:08:13.769 INFO:tasks.workunit.client.1.vm04.stdout:0/804: truncate d0/d2/d25/f2a 3695925 0 2026-03-10T14:08:13.770 INFO:tasks.workunit.client.1.vm04.stdout:8/875: write d0/d75/d8a/fdf [392106,101132] 0 2026-03-10T14:08:13.773 INFO:tasks.workunit.client.1.vm04.stdout:0/805: dread d0/d2/d15/d22/d38/d56/dc1/fc2 [0,4194304] 0 2026-03-10T14:08:13.773 INFO:tasks.workunit.client.1.vm04.stdout:1/792: creat d3/d22/d63/d35/dd9/d13/d38/db5/dff/f111 x:0 0 0 2026-03-10T14:08:13.775 INFO:tasks.workunit.client.0.vm03.stdout:3/248: write f1b [704198,43992] 0 2026-03-10T14:08:13.775 INFO:tasks.workunit.client.0.vm03.stdout:3/249: dread - d1d/d29/f3f zero size 2026-03-10T14:08:13.776 INFO:tasks.workunit.client.1.vm04.stdout:2/761: symlink d0/d14/d91/d4a/d8c/dab/d46/dc8/le6 0 2026-03-10T14:08:13.796 INFO:tasks.workunit.client.1.vm04.stdout:9/722: mknod d9/d5c/cf9 0 2026-03-10T14:08:13.800 INFO:tasks.workunit.client.0.vm03.stdout:9/327: mkdir d2/d29/d33/d6d 0 2026-03-10T14:08:13.811 INFO:tasks.workunit.client.0.vm03.stdout:6/294: dwrite d8/db/d12/f57 [0,4194304] 0 2026-03-10T14:08:13.818 INFO:tasks.workunit.client.0.vm03.stdout:7/225: dread d5/d9/d14/ff [0,4194304] 0 2026-03-10T14:08:13.821 INFO:tasks.workunit.client.0.vm03.stdout:6/295: dwrite d8/d11/d18/f34 [0,4194304] 0 2026-03-10T14:08:13.828 INFO:tasks.workunit.client.0.vm03.stdout:5/388: read d4/d16/d19/d23/f50 [10986,98711] 0 2026-03-10T14:08:13.830 INFO:tasks.workunit.client.0.vm03.stdout:2/286: truncate d5/d2a/f45 984754 0 
2026-03-10T14:08:13.844 INFO:tasks.workunit.client.0.vm03.stdout:4/347: symlink d5/d47/d62/l6c 0 2026-03-10T14:08:13.847 INFO:tasks.workunit.client.0.vm03.stdout:3/250: truncate d1d/d33/f49 999264 0 2026-03-10T14:08:13.852 INFO:tasks.workunit.client.0.vm03.stdout:8/308: fsync da/d24/f28 0 2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:08:13.858 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:08:13.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:13 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:08:13.868 INFO:tasks.workunit.client.0.vm03.stdout:9/328: rmdir d2/d14/d2b/d43 39
2026-03-10T14:08:13.869 INFO:tasks.workunit.client.0.vm03.stdout:1/325: creat d0/d2/df/d16/d41/f68 x:0 0 0
2026-03-10T14:08:13.885 INFO:tasks.workunit.client.0.vm03.stdout:6/296: mknod d8/db/d2c/c5a 0
2026-03-10T14:08:13.889 INFO:tasks.workunit.client.0.vm03.stdout:6/297: dwrite d8/db/d49/f4a [0,4194304] 0
2026-03-10T14:08:13.912 INFO:tasks.workunit.client.0.vm03.stdout:2/287: read d5/d10/d1c/f2e [11769,57016] 0
2026-03-10T14:08:13.930 INFO:tasks.workunit.client.0.vm03.stdout:7/226: readlink d5/d9/d14/d26/l43 0
2026-03-10T14:08:13.930 INFO:tasks.workunit.client.0.vm03.stdout:0/274: mknod d3/d4d/c53 0
2026-03-10T14:08:13.930 INFO:tasks.workunit.client.0.vm03.stdout:9/329: creat d2/d29/d33/d41/f6e x:0 0 0
2026-03-10T14:08:13.934 INFO:tasks.workunit.client.0.vm03.stdout:1/326: rmdir d0/d18/d1d 39
2026-03-10T14:08:13.941 INFO:tasks.workunit.client.0.vm03.stdout:3/251: symlink d1d/l4d 0
2026-03-10T14:08:13.964 INFO:tasks.workunit.client.0.vm03.stdout:8/309: symlink da/l61 0
2026-03-10T14:08:13.964 INFO:tasks.workunit.client.0.vm03.stdout:8/310: chown da/d58/d5f 8063 1
2026-03-10T14:08:13.964 INFO:tasks.workunit.client.1.vm04.stdout:7/811: symlink d2/dc/de/d2d/d5c/da9/dee/l11d 0
2026-03-10T14:08:13.974 INFO:tasks.workunit.client.0.vm03.stdout:4/348: write d5/d9/f16 [4585214,91779] 0
2026-03-10T14:08:13.980 INFO:tasks.workunit.client.1.vm04.stdout:3/791: dread da/dc/d3f/d54/ff3 [0,4194304] 0
2026-03-10T14:08:13.984 INFO:tasks.workunit.client.0.vm03.stdout:0/275: write d3/d4d/f2a [72497,28339] 0
2026-03-10T14:08:13.987 INFO:tasks.workunit.client.0.vm03.stdout:1/327: dread - d0/d2/df/d16/f4f zero size
2026-03-10T14:08:13.991 INFO:tasks.workunit.client.0.vm03.stdout:2/288: creat d5/d10/d1f/d4f/f55 x:0 0 0
2026-03-10T14:08:13.991 INFO:tasks.workunit.client.1.vm04.stdout:8/876: dread - d0/d75/d8a/ffc zero size
2026-03-10T14:08:13.991 INFO:tasks.workunit.client.0.vm03.stdout:3/252: creat d1d/d29/d41/f4e x:0 0 0
2026-03-10T14:08:13.991 INFO:tasks.workunit.client.0.vm03.stdout:3/253: read fc [105952,80222] 0
2026-03-10T14:08:13.996 INFO:tasks.workunit.client.1.vm04.stdout:0/806: dread d0/d2/d15/d22/d38/f93 [0,4194304] 0
2026-03-10T14:08:13.997 INFO:tasks.workunit.client.0.vm03.stdout:7/227: symlink d5/d9/d14/l44 0
2026-03-10T14:08:13.997 INFO:tasks.workunit.client.0.vm03.stdout:9/330: symlink d2/d29/l6f 0
2026-03-10T14:08:13.997 INFO:tasks.workunit.client.1.vm04.stdout:0/807: fdatasync d0/d2/d15/f57 0
2026-03-10T14:08:14.000 INFO:tasks.workunit.client.1.vm04.stdout:7/812: creat d2/d2a/d42/d86/f11e x:0 0 0
2026-03-10T14:08:14.004 INFO:tasks.workunit.client.0.vm03.stdout:8/311: write da/f1f [395442,118893] 0
2026-03-10T14:08:14.005 INFO:tasks.workunit.client.0.vm03.stdout:5/389: getdents d4/d16/d19/d23 0
2026-03-10T14:08:14.011 INFO:tasks.workunit.client.0.vm03.stdout:0/276: dread d3/f28 [0,4194304] 0
2026-03-10T14:08:14.011 INFO:tasks.workunit.client.1.vm04.stdout:9/723: dwrite d9/da/dd/d1c/f2e [0,4194304] 0
2026-03-10T14:08:14.012 INFO:tasks.workunit.client.0.vm03.stdout:0/277: dread d3/d4d/f2a [0,4194304] 0
2026-03-10T14:08:14.015 INFO:tasks.workunit.client.1.vm04.stdout:3/792: unlink da/dc/d3f/d54/fe2 0
2026-03-10T14:08:14.015 INFO:tasks.workunit.client.1.vm04.stdout:4/757: link d4/d14/dac/db7/lc4 d4/d14/d6d/l100 0
2026-03-10T14:08:14.021 INFO:tasks.workunit.client.1.vm04.stdout:8/877: dread - d0/d3/dd/d89/db5/fea zero size
2026-03-10T14:08:14.025 INFO:tasks.workunit.client.0.vm03.stdout:2/289: unlink d5/d10/d1c/f2e 0
2026-03-10T14:08:14.025 INFO:tasks.workunit.client.0.vm03.stdout:3/254: mknod d1d/d29/c4f 0
2026-03-10T14:08:14.026 INFO:tasks.workunit.client.1.vm04.stdout:2/762: mkdir d0/d14/d39/d47/de7 0
2026-03-10T14:08:14.038 INFO:tasks.workunit.client.1.vm04.stdout:5/855: write d7/d2d/d69/fc9 [4925645,38359] 0
2026-03-10T14:08:14.038 INFO:tasks.workunit.client.1.vm04.stdout:0/808: fsync d0/d2/d15/d22/d38/d56/dc1/dd4/fd9 0
2026-03-10T14:08:14.038 INFO:tasks.workunit.client.0.vm03.stdout:1/328: read d0/d2/f9 [2156444,50497] 0
2026-03-10T14:08:14.038 INFO:tasks.workunit.client.0.vm03.stdout:3/255: write d1d/d33/f3a [158470,86887] 0
2026-03-10T14:08:14.039 INFO:tasks.workunit.client.0.vm03.stdout:1/329: dwrite d0/d2/df/d16/d41/f68 [0,4194304] 0
2026-03-10T14:08:14.040 INFO:tasks.workunit.client.1.vm04.stdout:0/809: chown d0/dee/ff0 701324 1
2026-03-10T14:08:14.040 INFO:tasks.workunit.client.0.vm03.stdout:8/312: readlink da/d24/l27 0
2026-03-10T14:08:14.059 INFO:tasks.workunit.client.0.vm03.stdout:6/298: getdents d8/d11 0
2026-03-10T14:08:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:08:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:08:14.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:14.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:14.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:08:14.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:08:14.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:08:14.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:13 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:08:14.076 INFO:tasks.workunit.client.1.vm04.stdout:6/693: rename d3/de/d35/d3f/d2d/d32/d23/d83 to d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4 0
2026-03-10T14:08:14.078 INFO:tasks.workunit.client.1.vm04.stdout:7/813: dread d2/dc/de/d2d/d60/faf [0,4194304] 0
2026-03-10T14:08:14.083 INFO:tasks.workunit.client.1.vm04.stdout:9/724: creat d9/da/d8c/de5/ffa x:0 0 0
2026-03-10T14:08:14.092 INFO:tasks.workunit.client.1.vm04.stdout:8/878: symlink d0/d3/dd/d78/l113 0
2026-03-10T14:08:14.098 INFO:tasks.workunit.client.1.vm04.stdout:2/763: mknod d0/db8/ce8 0
2026-03-10T14:08:14.098 INFO:tasks.workunit.client.1.vm04.stdout:9/725: dwrite d9/da/dd/d1c/f22 [0,4194304] 0
2026-03-10T14:08:14.099 INFO:tasks.workunit.client.1.vm04.stdout:2/764: write d0/d14/d91/d8/fdc [412365,45437] 0
2026-03-10T14:08:14.107 INFO:tasks.workunit.client.0.vm03.stdout:2/290: sync
2026-03-10T14:08:14.108 INFO:tasks.workunit.client.1.vm04.stdout:0/810: sync
2026-03-10T14:08:14.108 INFO:tasks.workunit.client.1.vm04.stdout:4/758: dread d4/df/db2/db4/d47/d4f/f6a [0,4194304] 0
2026-03-10T14:08:14.122 INFO:tasks.workunit.client.0.vm03.stdout:2/291: sync
2026-03-10T14:08:14.128 INFO:tasks.workunit.client.1.vm04.stdout:1/793: rename d3/d5c/cde to d3/d8f/db7/dce/c112 0
2026-03-10T14:08:14.132 INFO:tasks.workunit.client.0.vm03.stdout:7/228: dwrite d5/d9/f30 [4194304,4194304] 0
2026-03-10T14:08:14.133 INFO:tasks.workunit.client.1.vm04.stdout:1/794: readlink d3/d22/d63/d35/dd9/d13/d38/d58/d5b/l10b 0
2026-03-10T14:08:14.136 INFO:tasks.workunit.client.1.vm04.stdout:1/795: chown d3/d22/d63/d35/dd9/d13/da0/dc5/dfa/c104 7844276 1
2026-03-10T14:08:14.148 INFO:tasks.workunit.client.1.vm04.stdout:6/694: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/f36 [0,4194304] 0
2026-03-10T14:08:14.148 INFO:tasks.workunit.client.0.vm03.stdout:8/313: dread f7 [0,4194304] 0
2026-03-10T14:08:14.159 INFO:tasks.workunit.client.0.vm03.stdout:0/278: rename d3/d51 to d3/d46/d54 0
2026-03-10T14:08:14.166 INFO:tasks.workunit.client.1.vm04.stdout:3/793: fsync da/dc/d3f/d61/f8c 0
2026-03-10T14:08:14.166 INFO:tasks.workunit.client.0.vm03.stdout:9/331: creat d2/d29/d33/f70 x:0 0 0
2026-03-10T14:08:14.166 INFO:tasks.workunit.client.0.vm03.stdout:3/256: symlink d1d/d33/d47/l50 0
2026-03-10T14:08:14.166 INFO:tasks.workunit.client.0.vm03.stdout:3/257: fsync d1d/d29/f3f 0
2026-03-10T14:08:14.166 INFO:tasks.workunit.client.0.vm03.stdout:4/349: getdents d5/d9/db/d19/d38/d53/d55 0
2026-03-10T14:08:14.166 INFO:tasks.workunit.client.0.vm03.stdout:4/350: chown d5/d9/db 3 1
2026-03-10T14:08:14.168 INFO:tasks.workunit.client.1.vm04.stdout:9/726: dread d9/da/dd/d1c/fdd [0,4194304] 0
2026-03-10T14:08:14.172 INFO:tasks.workunit.client.0.vm03.stdout:1/330: fdatasync d0/d2/d34/f5f 0
2026-03-10T14:08:14.174 INFO:tasks.workunit.client.1.vm04.stdout:7/814: mknod d2/dc/de/c11f 0
2026-03-10T14:08:14.180 INFO:tasks.workunit.client.1.vm04.stdout:7/815: chown d2/dc/de/d2d/d60/d7c/d64/l106 1 1
2026-03-10T14:08:14.180 INFO:tasks.workunit.client.0.vm03.stdout:6/299: creat d8/d11/d18/d54/f5b x:0 0 0
2026-03-10T14:08:14.180 INFO:tasks.workunit.client.0.vm03.stdout:6/300: stat f0 0
2026-03-10T14:08:14.180 INFO:tasks.workunit.client.0.vm03.stdout:0/279: creat d3/d46/f55 x:0 0 0
2026-03-10T14:08:14.180 INFO:tasks.workunit.client.1.vm04.stdout:8/879: mknod d0/d3/dd/d89/db5/ddc/c114 0
2026-03-10T14:08:14.182 INFO:tasks.workunit.client.0.vm03.stdout:6/301: sync
2026-03-10T14:08:14.183 INFO:tasks.workunit.client.0.vm03.stdout:6/302: write d8/db/f1f [5017009,130409] 0
2026-03-10T14:08:14.191 INFO:tasks.workunit.client.0.vm03.stdout:3/258: mkdir d1d/d39/d51 0
2026-03-10T14:08:14.192 INFO:tasks.workunit.client.1.vm04.stdout:3/794: creat da/dc/d47/d9b/d106/dde/f10e x:0 0 0
2026-03-10T14:08:14.193 INFO:tasks.workunit.client.0.vm03.stdout:2/292: symlink d5/d10/l56 0
2026-03-10T14:08:14.195 INFO:tasks.workunit.client.0.vm03.stdout:5/390: write d4/f82 [848217,58854] 0
2026-03-10T14:08:14.197 INFO:tasks.workunit.client.1.vm04.stdout:2/765: write d0/d14/d91/d4a/d8c/dab/f36 [845665,79418] 0
2026-03-10T14:08:14.199 INFO:tasks.workunit.client.1.vm04.stdout:1/796: write d3/d22/d63/d35/dd9/d13/da0/fb9 [289842,82579] 0
2026-03-10T14:08:14.200 INFO:tasks.workunit.client.0.vm03.stdout:5/391: dread d4/d16/d19/d23/f50 [0,4194304] 0
2026-03-10T14:08:14.201 INFO:tasks.workunit.client.1.vm04.stdout:4/759: dwrite d4/df/db2/db4/d47/d4f/d8c/dc8/dff/fd2 [0,4194304] 0
2026-03-10T14:08:14.202 INFO:tasks.workunit.client.0.vm03.stdout:5/392: sync
2026-03-10T14:08:14.203 INFO:tasks.workunit.client.1.vm04.stdout:4/760: readlink d4/d14/d3c/la3 0
2026-03-10T14:08:14.205 INFO:tasks.workunit.client.0.vm03.stdout:1/331: creat d0/d2/d34/f69 x:0 0 0
2026-03-10T14:08:14.212 INFO:tasks.workunit.client.1.vm04.stdout:5/856: creat d7/d26/f116 x:0 0 0
2026-03-10T14:08:14.214 INFO:tasks.workunit.client.0.vm03.stdout:9/332: mknod d2/d29/d33/d6d/c71 0
2026-03-10T14:08:14.214 INFO:tasks.workunit.client.1.vm04.stdout:7/816: chown d2/dc/d4d/d7f/ldf 31171309 1
2026-03-10T14:08:14.216 INFO:tasks.workunit.client.1.vm04.stdout:6/695: mknod d3/de/d35/d3f/dba/cd5 0
2026-03-10T14:08:14.220 INFO:tasks.workunit.client.1.vm04.stdout:0/811: creat d0/d2/fff x:0 0 0
2026-03-10T14:08:14.234 INFO:tasks.workunit.client.0.vm03.stdout:8/314: rename da/d36/d40/d5a/f5b to da/f62 0
2026-03-10T14:08:14.235 INFO:tasks.workunit.client.0.vm03.stdout:7/229: truncate d5/d9/d14/f2f 1599414 0
2026-03-10T14:08:14.238 INFO:tasks.workunit.client.1.vm04.stdout:9/727: dwrite d9/d58/db5/f3d [0,4194304] 0
2026-03-10T14:08:14.248 INFO:tasks.workunit.client.1.vm04.stdout:3/795: write da/dc/d3f/d61/f94 [3787894,6835] 0
2026-03-10T14:08:14.250 INFO:tasks.workunit.client.1.vm04.stdout:2/766: dwrite d0/d14/d91/d8/f42 [4194304,4194304] 0
2026-03-10T14:08:14.256 INFO:tasks.workunit.client.0.vm03.stdout:9/333: mkdir d2/d29/d33/d55/d72 0
2026-03-10T14:08:14.257 INFO:tasks.workunit.client.0.vm03.stdout:9/334: truncate d2/d29/d33/d41/f6e 100581 0
2026-03-10T14:08:14.261 INFO:tasks.workunit.client.0.vm03.stdout:9/335: dwrite d2/d14/f61 [0,4194304] 0
2026-03-10T14:08:14.267 INFO:tasks.workunit.client.0.vm03.stdout:6/303: getdents d8/db/d2c/d2d/d32/d3a 0
2026-03-10T14:08:14.267 INFO:tasks.workunit.client.0.vm03.stdout:6/304: chown d8/db/d12/f40 261668993 1
2026-03-10T14:08:14.267 INFO:tasks.workunit.client.0.vm03.stdout:6/305: chown d8/d1b/d1c 3860849 1
2026-03-10T14:08:14.275 INFO:tasks.workunit.client.1.vm04.stdout:1/797: write d3/f8 [530990,53215] 0
2026-03-10T14:08:14.280 INFO:tasks.workunit.client.0.vm03.stdout:3/259: creat d1d/d39/d51/f52 x:0 0 0
2026-03-10T14:08:14.280 INFO:tasks.workunit.client.0.vm03.stdout:3/260: stat d1d/d29/f3f 0
2026-03-10T14:08:14.288 INFO:tasks.workunit.client.0.vm03.stdout:4/351: creat d5/d9/db/f6d x:0 0 0
2026-03-10T14:08:14.293 INFO:tasks.workunit.client.0.vm03.stdout:4/352: stat d5/d47/d5b 0
2026-03-10T14:08:14.293 INFO:tasks.workunit.client.1.vm04.stdout:8/880: mkdir d0/d3/d63/d12/d51/d67/d96/dc8/d111/d115 0
2026-03-10T14:08:14.293 INFO:tasks.workunit.client.1.vm04.stdout:0/812: mknod d0/d2/dbe/c100 0
2026-03-10T14:08:14.294 INFO:tasks.workunit.client.0.vm03.stdout:8/315: rename da/d36/d40/d56/f5d to da/d3c/d4b/f63 0
2026-03-10T14:08:14.303 INFO:tasks.workunit.client.1.vm04.stdout:4/761: dwrite d4/df/db2/db4/d47/d4f/fa6 [0,4194304] 0
2026-03-10T14:08:14.303 INFO:tasks.workunit.client.0.vm03.stdout:2/293: dread d5/d10/d17/f28 [0,4194304] 0
2026-03-10T14:08:14.309 INFO:tasks.workunit.client.0.vm03.stdout:9/336: stat d2/f15 0
2026-03-10T14:08:14.310 INFO:tasks.workunit.client.0.vm03.stdout:6/306: rmdir d8/db/d49 39
2026-03-10T14:08:14.312 INFO:tasks.workunit.client.1.vm04.stdout:2/767: fsync d0/d14/d54/f9c 0
2026-03-10T14:08:14.314 INFO:tasks.workunit.client.0.vm03.stdout:3/261: readlink d1d/d33/l4b 0
2026-03-10T14:08:14.314 INFO:tasks.workunit.client.0.vm03.stdout:3/262: dread - d1d/d29/f44 zero size
2026-03-10T14:08:14.315 INFO:tasks.workunit.client.1.vm04.stdout:1/798: truncate d3/d5c/fbf 4166513 0
2026-03-10T14:08:14.315 INFO:tasks.workunit.client.0.vm03.stdout:3/263: truncate d1d/d39/d51/f52 256658 0
2026-03-10T14:08:14.315 INFO:tasks.workunit.client.1.vm04.stdout:1/799: chown d3/d22/d63/d35/dd9/fea 37323074 1
2026-03-10T14:08:14.316 INFO:tasks.workunit.client.1.vm04.stdout:1/800: chown d3/d22/ffe 25222 1
2026-03-10T14:08:14.317 INFO:tasks.workunit.client.1.vm04.stdout:6/696: rename d3/de/d35/d3f/d2d/d32/lcf to d3/de/d35/d3a/d43/d4c/d5e/d76/ld6 0
2026-03-10T14:08:14.318 INFO:tasks.workunit.client.1.vm04.stdout:8/881: read - d0/d3/dd/d78/fab zero size
2026-03-10T14:08:14.320 INFO:tasks.workunit.client.1.vm04.stdout:8/882: chown d0/d3/dd/d89/db5/ddc 1 1
2026-03-10T14:08:14.320 INFO:tasks.workunit.client.1.vm04.stdout:8/883: readlink d0/d3/d5/lcb 0
2026-03-10T14:08:14.321 INFO:tasks.workunit.client.1.vm04.stdout:8/884: chown d0/d3/dd/d78/fae 215112 1
2026-03-10T14:08:14.321 INFO:tasks.workunit.client.1.vm04.stdout:8/885: rename d0/d3/d63/d12/d51/d67/d96/dc8/d111 to d0/d3/d63/d12/d51/d67/d96/dc8/d111/d116 22
2026-03-10T14:08:14.321 INFO:tasks.workunit.client.1.vm04.stdout:8/886: readlink d0/d3/d5/la6 0
2026-03-10T14:08:14.329 INFO:tasks.workunit.client.0.vm03.stdout:4/353: dread d5/f7 [0,4194304] 0
2026-03-10T14:08:14.333 INFO:tasks.workunit.client.1.vm04.stdout:5/857: rmdir d7/d12/d2b/d8c/d10c 0
2026-03-10T14:08:14.339 INFO:tasks.workunit.client.0.vm03.stdout:5/393: dread - d4/d35/f54 zero size
2026-03-10T14:08:14.349 INFO:tasks.workunit.client.1.vm04.stdout:7/817: dwrite d2/dc/de/d2d/d38/f37 [4194304,4194304] 0
2026-03-10T14:08:14.349 INFO:tasks.workunit.client.1.vm04.stdout:9/728: write d9/da/d5d/f8b [1828873,104001] 0
2026-03-10T14:08:14.427 INFO:tasks.workunit.client.1.vm04.stdout:6/697: creat d3/de/d35/d3f/d2d/d32/d5c/fd7 x:0 0 0
2026-03-10T14:08:14.431 INFO:tasks.workunit.client.1.vm04.stdout:0/813: truncate d0/d2/d15/d22/d38/d56/f58 683786 0
2026-03-10T14:08:14.432 INFO:tasks.workunit.client.1.vm04.stdout:0/814: chown d0/d2/d15/d22/d38/d56/dc1/dd4/fd9 3 1
2026-03-10T14:08:14.434 INFO:tasks.workunit.client.1.vm04.stdout:3/796: dwrite da/dc/d47/d9b/d106/dd7/fa0 [0,4194304] 0
2026-03-10T14:08:14.457 INFO:tasks.workunit.client.1.vm04.stdout:5/858: truncate d7/d12/fef 748901 0
2026-03-10T14:08:14.459 INFO:tasks.workunit.client.1.vm04.stdout:7/818: truncate d2/fbe 533729 0
2026-03-10T14:08:14.460 INFO:tasks.workunit.client.1.vm04.stdout:7/819: stat d2/dc/de/d2d/d38/d50/l10b 0
2026-03-10T14:08:14.464 INFO:tasks.workunit.client.0.vm03.stdout:1/332: creat d0/d2/df/f6a x:0 0 0
2026-03-10T14:08:14.464 INFO:tasks.workunit.client.1.vm04.stdout:9/729: read - d9/d58/db5/da5/fc9 zero size
2026-03-10T14:08:14.467 INFO:tasks.workunit.client.1.vm04.stdout:9/730: dwrite d9/d33/f4b [0,4194304] 0
2026-03-10T14:08:14.484 INFO:tasks.workunit.client.0.vm03.stdout:8/316: mkdir da/d3a/d44/d64 0
2026-03-10T14:08:14.484 INFO:tasks.workunit.client.1.vm04.stdout:0/815: mknod d0/d2/d15/d22/d38/d56/dcb/c101 0
2026-03-10T14:08:14.485 INFO:tasks.workunit.client.0.vm03.stdout:7/230: fsync d5/d9/d14/f41 0
2026-03-10T14:08:14.490 INFO:tasks.workunit.client.0.vm03.stdout:3/264: mkdir d1d/d33/d47/d53 0
2026-03-10T14:08:14.495 INFO:tasks.workunit.client.1.vm04.stdout:7/820: mkdir d2/dc/de/d2d/d60/d7c/d64/d108/d117/d120 0
2026-03-10T14:08:14.503 INFO:tasks.workunit.client.1.vm04.stdout:8/887: mkdir d0/d3/d63/d12/d51/d67/d96/d105/d117 0
2026-03-10T14:08:14.504 INFO:tasks.workunit.client.0.vm03.stdout:5/394: dread d4/d6/de/f4f [0,4194304] 0
2026-03-10T14:08:14.515 INFO:tasks.workunit.client.0.vm03.stdout:1/333: creat d0/d2/df/d16/d20/f6b x:0 0 0
2026-03-10T14:08:14.523 INFO:tasks.workunit.client.1.vm04.stdout:3/797: mkdir da/dc/d35/d37/d10a/d10f 0
2026-03-10T14:08:14.527 INFO:tasks.workunit.client.1.vm04.stdout:2/768: getdents d0/d14/d91/d4a/d8c/dab/db3 0
2026-03-10T14:08:14.530 INFO:tasks.workunit.client.1.vm04.stdout:5/859: mkdir d7/d2d/d117 0
2026-03-10T14:08:14.536 INFO:tasks.workunit.client.1.vm04.stdout:7/821: chown d2/dc/cf 35 1
2026-03-10T14:08:14.542 INFO:tasks.workunit.client.1.vm04.stdout:8/888: fdatasync d0/d3/d63/d12/d51/d67/d96/f71 0
2026-03-10T14:08:14.551 INFO:tasks.workunit.client.0.vm03.stdout:4/354: dread d5/d9/f31 [0,4194304] 0
2026-03-10T14:08:14.551 INFO:tasks.workunit.client.0.vm03.stdout:1/334: readlink d0/d2/df/d27/l4d 0
2026-03-10T14:08:14.593 INFO:tasks.workunit.client.1.vm04.stdout:3/798: rename da/dc/d3f/d61/fc7 to da/dc/d47/d9b/d106/dde/dac/f110 0
2026-03-10T14:08:14.597 INFO:tasks.workunit.client.1.vm04.stdout:2/769: creat d0/d14/d91/d8/fe9 x:0 0 0
2026-03-10T14:08:14.598 INFO:tasks.workunit.client.1.vm04.stdout:2/770: chown d0/d14/d39/d47/d70/dad 1396 1
2026-03-10T14:08:14.599 INFO:tasks.workunit.client.1.vm04.stdout:2/771: stat d0/d14/d91/d4a/d8c/dab/d46 0
2026-03-10T14:08:14.608 INFO:tasks.workunit.client.1.vm04.stdout:7/822: dread - d2/dc/de/d2d/d60/ffb zero size
2026-03-10T14:08:14.615 INFO:tasks.workunit.client.1.vm04.stdout:4/762: write d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/f76 [3552231,85390] 0
2026-03-10T14:08:14.618 INFO:tasks.workunit.client.0.vm03.stdout:0/280: link d3/f10 d3/d17/f56 0
2026-03-10T14:08:14.620 INFO:tasks.workunit.client.1.vm04.stdout:0/816: rmdir d0/d2/d25/dfa 0
2026-03-10T14:08:14.628 INFO:tasks.workunit.client.1.vm04.stdout:3/799: truncate da/dc/d35/d52/d53/d78/f98 4608994 0
2026-03-10T14:08:14.635 INFO:tasks.workunit.client.1.vm04.stdout:2/772: rename d0/d14/d39/d47/d93 to d0/d14/d91/d8/d17/d4e/dea 0
2026-03-10T14:08:14.645 INFO:tasks.workunit.client.1.vm04.stdout:5/860: truncate d7/d26/f30 500897 0
2026-03-10T14:08:14.649 INFO:tasks.workunit.client.0.vm03.stdout:0/281: getdents d3/d4d/d30 0
2026-03-10T14:08:14.657 INFO:tasks.workunit.client.1.vm04.stdout:0/817: truncate d0/d2/d15/d22/d38/d56/d66/f54 122321 0
2026-03-10T14:08:14.669 INFO:tasks.workunit.client.1.vm04.stdout:4/763: rename d4/f57 to d4/d14/dac/f101 0
2026-03-10T14:08:14.672 INFO:tasks.workunit.client.1.vm04.stdout:5/861: symlink d7/d26/d6b/d6e/d82/l118 0
2026-03-10T14:08:14.680 INFO:tasks.workunit.client.1.vm04.stdout:5/862: dread d7/d2d/d69/fc9 [4194304,4194304] 0
2026-03-10T14:08:14.692 INFO:tasks.workunit.client.1.vm04.stdout:0/818: unlink d0/da2/fca 0
2026-03-10T14:08:14.699 INFO:tasks.workunit.client.1.vm04.stdout:1/801: dwrite d3/d22/d63/d35/d6c/fec [0,4194304] 0
2026-03-10T14:08:14.703 INFO:tasks.workunit.client.0.vm03.stdout:1/335: rmdir d0/d42 39
2026-03-10T14:08:14.704 INFO:tasks.workunit.client.0.vm03.stdout:9/337: rmdir d2/d29 39
2026-03-10T14:08:14.714 INFO:tasks.workunit.client.0.vm03.stdout:4/355: mkdir d5/d6e 0
2026-03-10T14:08:14.715 INFO:tasks.workunit.client.0.vm03.stdout:4/356: write d5/d9/db/d19/d34/f58 [2282512,38621] 0
2026-03-10T14:08:14.719 INFO:tasks.workunit.client.0.vm03.stdout:6/307: dwrite d8/db/d2c/d2d/d32/f41 [0,4194304] 0
2026-03-10T14:08:14.722 INFO:tasks.workunit.client.0.vm03.stdout:6/308: write d8/d1b/f29 [3081139,70255] 0
2026-03-10T14:08:14.725 INFO:tasks.workunit.client.1.vm04.stdout:3/800: creat da/dc/d3f/d61/d102/f111 x:0 0 0
2026-03-10T14:08:14.727 INFO:tasks.workunit.client.0.vm03.stdout:9/338: readlink d2/d14/l1d 0
2026-03-10T14:08:14.737 INFO:tasks.workunit.client.0.vm03.stdout:4/357: symlink d5/d9/db/d19/d38/d53/d55/l6f 0
2026-03-10T14:08:14.743 INFO:tasks.workunit.client.1.vm04.stdout:2/773: mkdir d0/d14/deb 0
2026-03-10T14:08:14.747 INFO:tasks.workunit.client.1.vm04.stdout:5/863: mknod d7/d12/d45/c119 0
2026-03-10T14:08:14.752 INFO:tasks.workunit.client.0.vm03.stdout:8/317: mknod da/c65 0
2026-03-10T14:08:14.771 INFO:tasks.workunit.client.0.vm03.stdout:3/265: rename d1d/l4d to d1d/d33/d47/l54 0
2026-03-10T14:08:14.773 INFO:tasks.workunit.client.1.vm04.stdout:9/731: write d9/da/f13 [737388,9754] 0
2026-03-10T14:08:14.777 INFO:tasks.workunit.client.0.vm03.stdout:5/395: truncate d4/d16/f2d 853456 0
2026-03-10T14:08:14.781 INFO:tasks.workunit.client.1.vm04.stdout:6/698: creat d3/de/d35/d3f/fd8 x:0 0 0
2026-03-10T14:08:14.787 INFO:tasks.workunit.client.1.vm04.stdout:8/889: dwrite d0/d3/d63/d12/d51/f64 [4194304,4194304] 0
2026-03-10T14:08:14.795 INFO:tasks.workunit.client.0.vm03.stdout:4/358: symlink d5/d9/db/d19/d38/d53/l70 0
2026-03-10T14:08:14.800 INFO:tasks.workunit.client.0.vm03.stdout:6/309: mkdir d8/db/d12/d51/d5c 0
2026-03-10T14:08:14.801 INFO:tasks.workunit.client.0.vm03.stdout:6/310: read d8/db/d12/f57 [3801493,25661] 0
2026-03-10T14:08:14.801 INFO:tasks.workunit.client.0.vm03.stdout:7/231: write d5/d9/f10 [5180159,62287] 0
2026-03-10T14:08:14.802 INFO:tasks.workunit.client.0.vm03.stdout:8/318: creat da/d24/d49/f66 x:0 0 0
2026-03-10T14:08:14.805 INFO:tasks.workunit.client.0.vm03.stdout:9/339: write d2/d14/d2b/d43/f45 [4448175,16869] 0
2026-03-10T14:08:14.811 INFO:tasks.workunit.client.0.vm03.stdout:0/282: write d3/d4d/f22 [2131199,84997] 0
2026-03-10T14:08:14.812 INFO:tasks.workunit.client.1.vm04.stdout:7/823: dwrite d2/dc/de/d2d/d60/d7c/f84 [4194304,4194304] 0
2026-03-10T14:08:14.813 INFO:tasks.workunit.client.1.vm04.stdout:7/824: stat d2/dc/de/d2d/d60/d81/fa6 0
2026-03-10T14:08:14.818 INFO:tasks.workunit.client.0.vm03.stdout:3/266: dread d1d/f36 [0,4194304] 0
2026-03-10T14:08:14.819 INFO:tasks.workunit.client.0.vm03.stdout:3/267: chown d1d/l30 4073 1
2026-03-10T14:08:14.826 INFO:tasks.workunit.client.0.vm03.stdout:4/359: mkdir d5/d9/db/d19/d38/d53/d71 0
2026-03-10T14:08:14.830 INFO:tasks.workunit.client.0.vm03.stdout:6/311: creat d8/db/d12/f5d x:0 0 0
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:6/312: chown d8/db/d12/d51/d5c 366 1
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:8/319: fdatasync f7 0
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:7/232: dread d5/d9/d14/ff [0,4194304] 0
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:5/396: rmdir d4/d16 39
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:5/397: stat d4/d13/d1f/f74 0
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:5/398: readlink d4/d13/d1f/l5f 0
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:7/233: creat d5/d9/d14/d26/d39/f45 x:0 0 0
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:0/283: mknod d3/d16/d21/c57 0
2026-03-10T14:08:14.840 INFO:tasks.workunit.client.0.vm03.stdout:4/360: dread d5/d9/f16 [0,4194304] 0
2026-03-10T14:08:14.841 INFO:tasks.workunit.client.0.vm03.stdout:1/336: dwrite d0/fa [0,4194304] 0
2026-03-10T14:08:14.841 INFO:tasks.workunit.client.0.vm03.stdout:2/294: link d5/ld d5/d10/d1c/l57 0
2026-03-10T14:08:14.843 INFO:tasks.workunit.client.0.vm03.stdout:3/268: readlink l14 0
2026-03-10T14:08:14.843 INFO:tasks.workunit.client.0.vm03.stdout:3/269: write d1d/d39/d51/f52 [53445,79213] 0
2026-03-10T14:08:14.849 INFO:tasks.workunit.client.0.vm03.stdout:8/320: rename da/d36/d40/d5a to da/d58/d5f/d67 0
2026-03-10T14:08:14.861 INFO:tasks.workunit.client.0.vm03.stdout:0/284: truncate d3/f9 4540020 0
2026-03-10T14:08:14.862 INFO:tasks.workunit.client.0.vm03.stdout:4/361: dread - d5/d9/db/d19/f3a zero size
2026-03-10T14:08:14.864 INFO:tasks.workunit.client.0.vm03.stdout:0/285: dread d3/d4d/d30/f45 [0,4194304] 0
2026-03-10T14:08:14.864 INFO:tasks.workunit.client.0.vm03.stdout:0/286: readlink d3/d11/l23 0
2026-03-10T14:08:14.867 INFO:tasks.workunit.client.1.vm04.stdout:2/774: creat d0/d14/d91/d8/d17/d4e/d85/d86/d96/fec x:0 0 0
2026-03-10T14:08:14.868 INFO:tasks.workunit.client.0.vm03.stdout:2/295: creat d5/d35/f58 x:0 0 0
2026-03-10T14:08:14.868 INFO:tasks.workunit.client.0.vm03.stdout:3/270: fdatasync fc 0
2026-03-10T14:08:14.868 INFO:tasks.workunit.client.0.vm03.stdout:4/362: dwrite d5/d9/db/d19/f45 [4194304,4194304] 0
2026-03-10T14:08:14.871 INFO:tasks.workunit.client.0.vm03.stdout:1/337: truncate d0/d42/f66 4703462 0
2026-03-10T14:08:14.877 INFO:tasks.workunit.client.0.vm03.stdout:3/271: dread fc [0,4194304] 0
2026-03-10T14:08:14.881 INFO:tasks.workunit.client.1.vm04.stdout:8/890: write d0/d3/d63/d12/d51/d67/d96/f108 [368853,67920] 0
2026-03-10T14:08:14.884 INFO:tasks.workunit.client.0.vm03.stdout:0/287: unlink d3/d11/d2c/d4a/f4c 0
2026-03-10T14:08:14.895 INFO:tasks.workunit.client.0.vm03.stdout:5/399: rename d4/d6/fb to d4/d13/d1f/f83 0
2026-03-10T14:08:14.896 INFO:tasks.workunit.client.1.vm04.stdout:7/825: mknod d2/dc/de/d2d/d60/c121 0
2026-03-10T14:08:14.896 INFO:tasks.workunit.client.0.vm03.stdout:1/338: dread - d0/d18/d1d/f5e zero size
2026-03-10T14:08:14.899 INFO:tasks.workunit.client.0.vm03.stdout:4/363: fdatasync d5/d9/db/f2a 0
2026-03-10T14:08:14.901 INFO:tasks.workunit.client.0.vm03.stdout:2/296: sync
2026-03-10T14:08:14.902 INFO:tasks.workunit.client.0.vm03.stdout:2/297: chown d5/d10 13614460 1
2026-03-10T14:08:14.904 INFO:tasks.workunit.client.0.vm03.stdout:4/364: dwrite f2 [0,4194304] 0
2026-03-10T14:08:14.909 INFO:tasks.workunit.client.1.vm04.stdout:2/775: dread d0/d14/d91/d8/d17/d35/f81 [0,4194304] 0
2026-03-10T14:08:14.912 INFO:tasks.workunit.client.0.vm03.stdout:3/272: mkdir d1d/d29/d41/d45/d55 0
2026-03-10T14:08:14.913 INFO:tasks.workunit.client.1.vm04.stdout:5/864: mknod d7/d12/d2b/d3e/c11a 0
2026-03-10T14:08:14.914 INFO:tasks.workunit.client.0.vm03.stdout:3/273: dread d1d/f2b [0,4194304] 0
2026-03-10T14:08:14.916 INFO:tasks.workunit.client.0.vm03.stdout:8/321: creat da/d3a/d44/d64/f68 x:0 0 0
2026-03-10T14:08:14.918 INFO:tasks.workunit.client.1.vm04.stdout:0/819: creat d0/d2/d15/d49/d50/d61/f102 x:0 0 0
2026-03-10T14:08:14.918 INFO:tasks.workunit.client.0.vm03.stdout:8/322: dread da/d36/d40/f47 [0,4194304] 0
2026-03-10T14:08:14.920 INFO:tasks.workunit.client.0.vm03.stdout:9/340: creat d2/d29/f73 x:0 0 0
2026-03-10T14:08:14.922 INFO:tasks.workunit.client.1.vm04.stdout:8/891: creat d0/d3/d63/d12/df5/f118 x:0 0 0
2026-03-10T14:08:14.923 INFO:tasks.workunit.client.1.vm04.stdout:8/892: readlink d0/d3/d63/d12/d69/l88 0
2026-03-10T14:08:14.924 INFO:tasks.workunit.client.1.vm04.stdout:8/893: read d0/d75/d8a/fdf [357746,63247] 0
2026-03-10T14:08:14.925 INFO:tasks.workunit.client.1.vm04.stdout:6/699: dwrite d3/f9 [0,4194304] 0
2026-03-10T14:08:14.930 INFO:tasks.workunit.client.0.vm03.stdout:0/288: mknod d3/d46/d54/c58 0
2026-03-10T14:08:14.939 INFO:tasks.workunit.client.1.vm04.stdout:1/802: rename d3/d8f to d3/d22/d63/d35/dd9/d13/d38/d58/d113 0
2026-03-10T14:08:14.952 INFO:tasks.workunit.client.0.vm03.stdout:6/313: truncate d8/db/d49/f4a 602195 0
2026-03-10T14:08:14.953 INFO:tasks.workunit.client.0.vm03.stdout:6/314: chown d8/d11/d18/d54/f5b 11588 1
2026-03-10T14:08:14.954 INFO:tasks.workunit.client.0.vm03.stdout:6/315: read - d8/d11/f35 zero size
2026-03-10T14:08:14.954 INFO:tasks.workunit.client.0.vm03.stdout:6/316: chown d8/d3b/f53 648799502 1
2026-03-10T14:08:14.955 INFO:tasks.workunit.client.0.vm03.stdout:5/400: mkdir d4/d13/d1f/d84 0
2026-03-10T14:08:14.958 INFO:tasks.workunit.client.1.vm04.stdout:7/826: dread d2/dc/de/d2d/d38/fa5 [0,4194304] 0
2026-03-10T14:08:14.959 INFO:tasks.workunit.client.1.vm04.stdout:7/827: read d2/dc/de/dae/fb2 [2034475,7672] 0
2026-03-10T14:08:14.960 INFO:tasks.workunit.client.0.vm03.stdout:1/339: rmdir d0/d42 39
2026-03-10T14:08:14.968 INFO:tasks.workunit.client.0.vm03.stdout:2/298: mkdir d5/d10/d1c/d40/d59 0
2026-03-10T14:08:14.971 INFO:tasks.workunit.client.1.vm04.stdout:5/865: dread d7/d9/fe0 [0,4194304] 0
2026-03-10T14:08:14.972 INFO:tasks.workunit.client.1.vm04.stdout:0/820: symlink d0/d2/d15/d49/l103 0
2026-03-10T14:08:14.973 INFO:tasks.workunit.client.1.vm04.stdout:3/801: rename da/dc/d47/ce1 to da/dc/d35/d52/d53/d78/c112 0
2026-03-10T14:08:14.979 INFO:tasks.workunit.client.1.vm04.stdout:7/828: mknod d2/d2a/c122 0
2026-03-10T14:08:14.980 INFO:tasks.workunit.client.1.vm04.stdout:7/829: stat d2/dc/de/d2d/d60/d7c/d44/dc0/lfc 0
2026-03-10T14:08:14.982 INFO:tasks.workunit.client.0.vm03.stdout:3/274: creat d1d/d29/d41/f56 x:0 0 0
2026-03-10T14:08:14.982 INFO:tasks.workunit.client.1.vm04.stdout:4/764: creat d4/d14/d3c/d62/de6/f102 x:0 0 0
2026-03-10T14:08:14.987 INFO:tasks.workunit.client.0.vm03.stdout:8/323: mkdir da/d3c/d4b/d69 0
2026-03-10T14:08:14.988 INFO:tasks.workunit.client.0.vm03.stdout:8/324: dread - da/f62 zero size
2026-03-10T14:08:14.995 INFO:tasks.workunit.client.1.vm04.stdout:5/866: dread d7/f24 [4194304,4194304] 0
2026-03-10T14:08:14.996 INFO:tasks.workunit.client.1.vm04.stdout:5/867: read d7/d9/fe0 [883692,37603] 0
2026-03-10T14:08:14.996 INFO:tasks.workunit.client.1.vm04.stdout:5/868: chown d7/d2d 44540 1
2026-03-10T14:08:14.999 INFO:tasks.workunit.client.1.vm04.stdout:0/821: unlink d0/d2/d15/d49/d50/cea 0
2026-03-10T14:08:15.000 INFO:tasks.workunit.client.1.vm04.stdout:0/822: stat d0/d2/d15/d22/d62/ca5 0
2026-03-10T14:08:15.001 INFO:tasks.workunit.client.0.vm03.stdout:6/317: unlink d8/db/d2c/d2d/f46 0
2026-03-10T14:08:15.001 INFO:tasks.workunit.client.1.vm04.stdout:6/700: truncate d3/de/d35/d3f/d2d/d32/d23/f5a 1365995 0
2026-03-10T14:08:15.004 INFO:tasks.workunit.client.1.vm04.stdout:9/732: rename d9/d58/db5/da5/fa8 to d9/da/d8c/de5/df6/ffb 0
2026-03-10T14:08:15.004 INFO:tasks.workunit.client.0.vm03.stdout:5/401: rmdir d4/d13/d43 39
2026-03-10T14:08:15.006 INFO:tasks.workunit.client.1.vm04.stdout:3/802: symlink da/d30/l113 0
2026-03-10T14:08:15.012 INFO:tasks.workunit.client.0.vm03.stdout:1/340: truncate d0/d2/df/d16/f4f 6298 0
2026-03-10T14:08:15.013 INFO:tasks.workunit.client.1.vm04.stdout:1/803: truncate d3/d22/d63/f89 325579 0
2026-03-10T14:08:15.013 INFO:tasks.workunit.client.0.vm03.stdout:1/341: write d0/f11 [2835144,60402] 0
2026-03-10T14:08:15.013 INFO:tasks.workunit.client.0.vm03.stdout:0/289: dwrite d3/d4d/f49 [0,4194304] 0
2026-03-10T14:08:15.017 INFO:tasks.workunit.client.0.vm03.stdout:1/342: chown d0/d2/df/d27 34 1
2026-03-10T14:08:15.020 INFO:tasks.workunit.client.0.vm03.stdout:0/290: dread d3/d16/f34 [0,4194304] 0
2026-03-10T14:08:15.027 INFO:tasks.workunit.client.1.vm04.stdout:5/869: creat d7/d59/f11b x:0 0 0
2026-03-10T14:08:15.033 INFO:tasks.workunit.client.1.vm04.stdout:4/765: dwrite d4/d14/d1b/f5c [0,4194304] 0
2026-03-10T14:08:15.033 INFO:tasks.workunit.client.1.vm04.stdout:7/830: dwrite d2/dc/f4a [0,4194304] 0
2026-03-10T14:08:15.055 INFO:tasks.workunit.client.1.vm04.stdout:6/701: mknod d3/d1d/d73/cd9 0
2026-03-10T14:08:15.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: pgmap v7: 65 pgs: 65 active+clean; 2.3 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 26 MiB/s rd, 77 MiB/s wr, 192 op/s
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: Updating vm03:/etc/ceph/ceph.conf
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: Updating vm04:/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: Updating vm03:/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:14 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.069 INFO:tasks.workunit.client.0.vm03.stdout:9/341: mknod d2/d29/d33/d41/d5c/c74 0
2026-03-10T14:08:15.069 INFO:tasks.workunit.client.1.vm04.stdout:2/776: rename d0/d14/d91/d4a/d8c/dab/db3/fe3 to d0/d14/d91/d4a/fed 0
2026-03-10T14:08:15.070 INFO:tasks.workunit.client.0.vm03.stdout:9/342: write d2/d14/d2b/d43/f45 [2578305,88244] 0
2026-03-10T14:08:15.071 INFO:tasks.workunit.client.0.vm03.stdout:9/343: write d2/d14/f1b [894305,112891] 0
2026-03-10T14:08:15.072 INFO:tasks.workunit.client.1.vm04.stdout:3/803: unlink c7 0
2026-03-10T14:08:15.072 INFO:tasks.workunit.client.1.vm04.stdout:3/804: stat da/dc/d47/f10c 0
2026-03-10T14:08:15.081 INFO:tasks.workunit.client.1.vm04.stdout:0/823: dread d0/d2/d15/d22/d38/f5b [0,4194304] 0
2026-03-10T14:08:15.086 INFO:tasks.workunit.client.1.vm04.stdout:7/831: creat d2/dc/de/d2d/d5c/da9/f123 x:0 0 0
2026-03-10T14:08:15.090 INFO:tasks.workunit.client.1.vm04.stdout:8/894: rename d0/d75/c7b to d0/d3/d63/d12/d51/d67/d96/d105/def/c119 0
2026-03-10T14:08:15.095 INFO:tasks.workunit.client.1.vm04.stdout:8/895: chown d0/d3/d63/d12/d51/d67/d96/fad 6 1
2026-03-10T14:08:15.095 INFO:tasks.workunit.client.1.vm04.stdout:8/896: write d0/d3/d63/f5f [4645,74622] 0
2026-03-10T14:08:15.096 INFO:tasks.workunit.client.1.vm04.stdout:1/804: creat d3/d5c/d79/d98/f114 x:0 0 0
2026-03-10T14:08:15.096 INFO:tasks.workunit.client.1.vm04.stdout:1/805: readlink d3/lf 0
2026-03-10T14:08:15.096 INFO:tasks.workunit.client.1.vm04.stdout:8/897: read d0/d3/d63/d12/d69/f81 [1303670,13889] 0
2026-03-10T14:08:15.098 INFO:tasks.workunit.client.1.vm04.stdout:3/805: creat da/dc/d35/d37/f114 x:0 0 0
2026-03-10T14:08:15.099 INFO:tasks.workunit.client.0.vm03.stdout:5/402: dread d4/d16/f2d [0,4194304] 0
2026-03-10T14:08:15.100 INFO:tasks.workunit.client.0.vm03.stdout:5/403: read d4/d6/de/f65 [531116,29670] 0
2026-03-10T14:08:15.101 INFO:tasks.workunit.client.0.vm03.stdout:5/404: readlink d4/d13/d1f/l68 0
2026-03-10T14:08:15.101 INFO:tasks.workunit.client.0.vm03.stdout:3/275: rmdir d1d/d29/d41/d45/d4c 0
2026-03-10T14:08:15.102 INFO:tasks.workunit.client.1.vm04.stdout:7/832: mknod d2/dac/c124 0
2026-03-10T14:08:15.102 INFO:tasks.workunit.client.0.vm03.stdout:3/276: read d1d/f3c [2645292,78141] 0
2026-03-10T14:08:15.103 INFO:tasks.workunit.client.1.vm04.stdout:6/702: mknod d3/de/d35/d3f/d2d/cda 0
2026-03-10T14:08:15.105 INFO:tasks.workunit.client.1.vm04.stdout:9/733: rename d9/d44/d59/c8e to d9/d44/cfc 0
2026-03-10T14:08:15.106 INFO:tasks.workunit.client.1.vm04.stdout:9/734: stat d9/da/d5d/lb9 0
2026-03-10T14:08:15.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: pgmap v7: 65 pgs: 65 active+clean; 2.3 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 26 MiB/s rd, 77 MiB/s wr, 192 op/s
2026-03-10T14:08:15.112 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.conf
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: Updating vm04:/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:14 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:15.113 INFO:tasks.workunit.client.1.vm04.stdout:1/806: creat d3/d22/d63/d35/dd9/d13/d38/db5/dff/f115 x:0 0 0
2026-03-10T14:08:15.113
INFO:tasks.workunit.client.0.vm03.stdout:5/405: stat d4/d13/l38 0 2026-03-10T14:08:15.113 INFO:tasks.workunit.client.0.vm03.stdout:3/277: write fc [620613,31804] 0 2026-03-10T14:08:15.114 INFO:tasks.workunit.client.0.vm03.stdout:5/406: dwrite d4/d16/f7b [0,4194304] 0 2026-03-10T14:08:15.126 INFO:tasks.workunit.client.1.vm04.stdout:8/898: rmdir d0/d3/d63/d12/df5 39 2026-03-10T14:08:15.128 INFO:tasks.workunit.client.1.vm04.stdout:8/899: read d0/d3/d73/fdd [159709,82985] 0 2026-03-10T14:08:15.129 INFO:tasks.workunit.client.1.vm04.stdout:3/806: mkdir da/dc/d115 0 2026-03-10T14:08:15.130 INFO:tasks.workunit.client.0.vm03.stdout:6/318: getdents d8/db/d12 0 2026-03-10T14:08:15.131 INFO:tasks.workunit.client.1.vm04.stdout:7/833: unlink d2/dc/f25 0 2026-03-10T14:08:15.134 INFO:tasks.workunit.client.0.vm03.stdout:5/407: creat d4/d16/d19/d6e/f85 x:0 0 0 2026-03-10T14:08:15.137 INFO:tasks.workunit.client.1.vm04.stdout:8/900: fsync d0/d3/f6b 0 2026-03-10T14:08:15.138 INFO:tasks.workunit.client.1.vm04.stdout:3/807: mkdir da/dc/d47/d116 0 2026-03-10T14:08:15.139 INFO:tasks.workunit.client.1.vm04.stdout:0/824: truncate d0/d2/d15/f57 3745918 0 2026-03-10T14:08:15.140 INFO:tasks.workunit.client.0.vm03.stdout:0/291: dread d3/d17/f35 [0,4194304] 0 2026-03-10T14:08:15.140 INFO:tasks.workunit.client.0.vm03.stdout:3/278: getdents d1d/d33 0 2026-03-10T14:08:15.140 INFO:tasks.workunit.client.1.vm04.stdout:7/834: mkdir d2/dc/d4d/d125 0 2026-03-10T14:08:15.143 INFO:tasks.workunit.client.1.vm04.stdout:9/735: mknod d9/da/dd/d1c/da3/dec/cfd 0 2026-03-10T14:08:15.158 INFO:tasks.workunit.client.1.vm04.stdout:9/736: dread d9/d58/db5/f3d [0,4194304] 0 2026-03-10T14:08:15.158 INFO:tasks.workunit.client.1.vm04.stdout:8/901: creat d0/d3/dd/d78/f11a x:0 0 0 2026-03-10T14:08:15.158 INFO:tasks.workunit.client.0.vm03.stdout:5/408: dwrite d4/d6/f6d [0,4194304] 0 2026-03-10T14:08:15.159 INFO:tasks.workunit.client.0.vm03.stdout:3/279: unlink d1d/d29/f2f 0 2026-03-10T14:08:15.159 
INFO:tasks.workunit.client.0.vm03.stdout:5/409: mknod d4/d16/d19/d6e/c86 0 2026-03-10T14:08:15.159 INFO:tasks.workunit.client.0.vm03.stdout:3/280: write d1d/d29/d41/f56 [386065,88585] 0 2026-03-10T14:08:15.159 INFO:tasks.workunit.client.0.vm03.stdout:5/410: mknod d4/d13/d1f/c87 0 2026-03-10T14:08:15.159 INFO:tasks.workunit.client.0.vm03.stdout:5/411: write d4/d13/d1f/f74 [1033974,17692] 0 2026-03-10T14:08:15.165 INFO:tasks.workunit.client.0.vm03.stdout:3/281: unlink d1d/d33/f49 0 2026-03-10T14:08:15.165 INFO:tasks.workunit.client.0.vm03.stdout:9/344: mknod d2/d14/d2b/c75 0 2026-03-10T14:08:15.166 INFO:tasks.workunit.client.0.vm03.stdout:3/282: write d1d/d29/d41/f4e [375911,51883] 0 2026-03-10T14:08:15.167 INFO:tasks.workunit.client.1.vm04.stdout:6/703: getdents d3/de/d35/d3f/d2d/d32/d23/d24 0 2026-03-10T14:08:15.170 INFO:tasks.workunit.client.1.vm04.stdout:7/835: getdents d2/dc/de/d2d/d38/d50/dc8/d10e 0 2026-03-10T14:08:15.173 INFO:tasks.workunit.client.0.vm03.stdout:9/345: dwrite d2/d14/d2b/d43/f45 [0,4194304] 0 2026-03-10T14:08:15.173 INFO:tasks.workunit.client.1.vm04.stdout:9/737: mknod d9/dd3/df8/cfe 0 2026-03-10T14:08:15.177 INFO:tasks.workunit.client.0.vm03.stdout:0/292: dread d3/d4d/f22 [0,4194304] 0 2026-03-10T14:08:15.180 INFO:tasks.workunit.client.0.vm03.stdout:2/299: rmdir d5/d10/d1c/d40 39 2026-03-10T14:08:15.180 INFO:tasks.workunit.client.1.vm04.stdout:4/766: sync 2026-03-10T14:08:15.185 INFO:tasks.workunit.client.1.vm04.stdout:9/738: symlink d9/da/dd/de7/d96/lff 0 2026-03-10T14:08:15.191 INFO:tasks.workunit.client.1.vm04.stdout:6/704: mkdir d3/de/d35/d3f/d2d/ddb 0 2026-03-10T14:08:15.191 INFO:tasks.workunit.client.1.vm04.stdout:6/705: chown d3/de/l2c 34 1 2026-03-10T14:08:15.193 INFO:tasks.workunit.client.1.vm04.stdout:7/836: mkdir d2/dc/de/d2d/d60/d7c/d64/d108/d117/d120/d126 0 2026-03-10T14:08:15.198 INFO:tasks.workunit.client.0.vm03.stdout:2/300: symlink d5/d10/d1c/d54/l5a 0 2026-03-10T14:08:15.204 INFO:tasks.workunit.client.0.vm03.stdout:0/293: 
symlink d3/d4d/l59 0 2026-03-10T14:08:15.204 INFO:tasks.workunit.client.1.vm04.stdout:8/902: rmdir d0/d3/d63/d12/df5/d107 0 2026-03-10T14:08:15.204 INFO:tasks.workunit.client.1.vm04.stdout:7/837: creat d2/d6b/f127 x:0 0 0 2026-03-10T14:08:15.204 INFO:tasks.workunit.client.1.vm04.stdout:7/838: readlink d2/dc/de/l1b 0 2026-03-10T14:08:15.208 INFO:tasks.workunit.client.0.vm03.stdout:9/346: dread d2/d14/f1b [0,4194304] 0 2026-03-10T14:08:15.212 INFO:tasks.workunit.client.1.vm04.stdout:9/739: mknod d9/d44/d4d/c100 0 2026-03-10T14:08:15.214 INFO:tasks.workunit.client.1.vm04.stdout:3/808: dread da/d3e/ff4 [0,4194304] 0 2026-03-10T14:08:15.216 INFO:tasks.workunit.client.1.vm04.stdout:8/903: rmdir d0/d3/d63/d12/d69 39 2026-03-10T14:08:15.218 INFO:tasks.workunit.client.0.vm03.stdout:0/294: dwrite d3/f1e [4194304,4194304] 0 2026-03-10T14:08:15.226 INFO:tasks.workunit.client.0.vm03.stdout:8/325: truncate da/f33 2980118 0 2026-03-10T14:08:15.228 INFO:tasks.workunit.client.1.vm04.stdout:2/777: dwrite d0/d14/d91/d8/d17/d4e/d85/fcb [0,4194304] 0 2026-03-10T14:08:15.230 INFO:tasks.workunit.client.1.vm04.stdout:2/778: chown d0/d14/d91/d8/d17/d4e 2355 1 2026-03-10T14:08:15.230 INFO:tasks.workunit.client.1.vm04.stdout:5/870: write d7/d12/d2b/d3e/d57/d8a/fd6 [2679315,104980] 0 2026-03-10T14:08:15.231 INFO:tasks.workunit.client.1.vm04.stdout:5/871: write d7/d26/f116 [646911,45793] 0 2026-03-10T14:08:15.250 INFO:tasks.workunit.client.1.vm04.stdout:1/807: write d3/d22/d63/d35/dd9/d13/d38/db5/ffd [1011407,123735] 0 2026-03-10T14:08:15.256 INFO:tasks.workunit.client.0.vm03.stdout:3/283: truncate d1d/f3c 6631908 0 2026-03-10T14:08:15.257 INFO:tasks.workunit.client.0.vm03.stdout:2/301: creat d5/d10/d1c/f5b x:0 0 0 2026-03-10T14:08:15.262 INFO:tasks.workunit.client.0.vm03.stdout:3/284: dwrite d1d/d29/d41/f4e [0,4194304] 0 2026-03-10T14:08:15.275 INFO:tasks.workunit.client.1.vm04.stdout:5/872: sync 2026-03-10T14:08:15.276 INFO:tasks.workunit.client.1.vm04.stdout:7/839: dread 
d2/dc/de/d2d/d60/d7c/d3b/f49 [0,4194304] 0 2026-03-10T14:08:15.285 INFO:tasks.workunit.client.1.vm04.stdout:0/825: write d0/d2/d15/d22/d38/d56/d66/f7e [78018,28532] 0 2026-03-10T14:08:15.285 INFO:tasks.workunit.client.0.vm03.stdout:2/302: chown d5/d10/d17/l42 6 1 2026-03-10T14:08:15.285 INFO:tasks.workunit.client.1.vm04.stdout:0/826: read - d0/d2/d15/d49/d50/f8a zero size 2026-03-10T14:08:15.290 INFO:tasks.workunit.client.1.vm04.stdout:2/779: mknod d0/d14/d91/d4a/d8c/dab/db3/cee 0 2026-03-10T14:08:15.293 INFO:tasks.workunit.client.0.vm03.stdout:3/285: mknod d1d/d29/d41/d45/c57 0 2026-03-10T14:08:15.296 INFO:tasks.workunit.client.1.vm04.stdout:6/706: write d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f86 [1011987,16399] 0 2026-03-10T14:08:15.298 INFO:tasks.workunit.client.0.vm03.stdout:5/412: dwrite d4/f37 [0,4194304] 0 2026-03-10T14:08:15.301 INFO:tasks.workunit.client.1.vm04.stdout:1/808: fsync d3/d22/d63/d35/dd9/d13/da0/dc5/fc6 0 2026-03-10T14:08:15.305 INFO:tasks.workunit.client.0.vm03.stdout:0/295: unlink d3/fe 0 2026-03-10T14:08:15.305 INFO:tasks.workunit.client.0.vm03.stdout:0/296: write d3/d4d/f49 [4760175,84585] 0 2026-03-10T14:08:15.310 INFO:tasks.workunit.client.1.vm04.stdout:9/740: truncate d9/da/d8c/fea 1061878 0 2026-03-10T14:08:15.310 INFO:tasks.workunit.client.1.vm04.stdout:3/809: fdatasync da/dc/d47/d9b/d106/dde/dac/fe8 0 2026-03-10T14:08:15.321 INFO:tasks.workunit.client.1.vm04.stdout:2/780: creat d0/d14/d91/d8/d17/d4e/d85/d86/fef x:0 0 0 2026-03-10T14:08:15.321 INFO:tasks.workunit.client.0.vm03.stdout:4/365: dwrite f3 [0,4194304] 0 2026-03-10T14:08:15.321 INFO:tasks.workunit.client.0.vm03.stdout:3/286: dwrite d1d/d39/d51/f52 [0,4194304] 0 2026-03-10T14:08:15.323 INFO:tasks.workunit.client.1.vm04.stdout:2/781: dwrite d0/d14/f6b [0,4194304] 0 2026-03-10T14:08:15.326 INFO:tasks.workunit.client.1.vm04.stdout:2/782: write d0/d14/d91/d8/d17/d4e/d85/d86/fef [952061,26286] 0 2026-03-10T14:08:15.326 INFO:tasks.workunit.client.1.vm04.stdout:1/809: sync 
2026-03-10T14:08:15.327 INFO:tasks.workunit.client.1.vm04.stdout:1/810: dread - d3/d20/d60/ff7 zero size 2026-03-10T14:08:15.329 INFO:tasks.workunit.client.0.vm03.stdout:3/287: dread d1d/d39/d51/f52 [0,4194304] 0 2026-03-10T14:08:15.331 INFO:tasks.workunit.client.0.vm03.stdout:3/288: chown f1b 4677 1 2026-03-10T14:08:15.335 INFO:tasks.workunit.client.1.vm04.stdout:1/811: dwrite d3/d5c/f71 [4194304,4194304] 0 2026-03-10T14:08:15.339 INFO:tasks.workunit.client.1.vm04.stdout:4/767: write d4/df/d34/f8f [6815687,117784] 0 2026-03-10T14:08:15.345 INFO:tasks.workunit.client.1.vm04.stdout:1/812: dread d3/d22/d63/d35/dd9/f56 [0,4194304] 0 2026-03-10T14:08:15.355 INFO:tasks.workunit.client.0.vm03.stdout:4/366: dread d5/fe [0,4194304] 0 2026-03-10T14:08:15.361 INFO:tasks.workunit.client.1.vm04.stdout:8/904: rename d0/d3/d63/cbe to d0/d3/d63/c11b 0 2026-03-10T14:08:15.366 INFO:tasks.workunit.client.0.vm03.stdout:8/326: dwrite da/fd [0,4194304] 0 2026-03-10T14:08:15.366 INFO:tasks.workunit.client.1.vm04.stdout:0/827: creat d0/d2/d25/dfe/f104 x:0 0 0 2026-03-10T14:08:15.376 INFO:tasks.workunit.client.1.vm04.stdout:6/707: mkdir d3/de/d35/d3f/d2d/d32/d23/ddc 0 2026-03-10T14:08:15.379 INFO:tasks.workunit.client.0.vm03.stdout:9/347: dwrite d2/d14/f39 [0,4194304] 0 2026-03-10T14:08:15.382 INFO:tasks.workunit.client.0.vm03.stdout:2/303: symlink d5/d10/d31/l5c 0 2026-03-10T14:08:15.382 INFO:tasks.workunit.client.1.vm04.stdout:4/768: chown d4/df/l25 939143074 1 2026-03-10T14:08:15.384 INFO:tasks.workunit.client.1.vm04.stdout:6/708: mknod d3/de/d35/d3f/d2d/d38/d40/cdd 0 2026-03-10T14:08:15.385 INFO:tasks.workunit.client.0.vm03.stdout:4/367: symlink d5/d9/db/d19/d38/d53/l72 0 2026-03-10T14:08:15.385 INFO:tasks.workunit.client.0.vm03.stdout:3/289: write d1d/f2b [1534602,124324] 0 2026-03-10T14:08:15.385 INFO:tasks.workunit.client.1.vm04.stdout:8/905: dread d0/d3/d63/d12/d51/d67/d96/d105/fcc [0,4194304] 0 2026-03-10T14:08:15.391 INFO:tasks.workunit.client.1.vm04.stdout:2/783: rename 
d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/lb0 to d0/d14/d91/d4a/d8c/dab/d95/lf0 0 2026-03-10T14:08:15.393 INFO:tasks.workunit.client.1.vm04.stdout:7/840: getdents d2/df9 0 2026-03-10T14:08:15.394 INFO:tasks.workunit.client.1.vm04.stdout:7/841: dread d2/dc/de/d2d/d38/fa5 [0,4194304] 0 2026-03-10T14:08:15.397 INFO:tasks.workunit.client.1.vm04.stdout:8/906: mknod d0/d3/d63/d12/d51/d67/d96/c11c 0 2026-03-10T14:08:15.397 INFO:tasks.workunit.client.1.vm04.stdout:8/907: fdatasync d0/d3/dd/d78/f11a 0 2026-03-10T14:08:15.399 INFO:tasks.workunit.client.0.vm03.stdout:9/348: dwrite d2/f15 [4194304,4194304] 0 2026-03-10T14:08:15.401 INFO:tasks.workunit.client.0.vm03.stdout:0/297: mknod d3/d17/c5a 0 2026-03-10T14:08:15.405 INFO:tasks.workunit.client.1.vm04.stdout:1/813: rename d3/d22/d63/d35/dd9/f56 to d3/d22/d63/d35/dd9/d13/d38/db5/dc4/f116 0 2026-03-10T14:08:15.410 INFO:tasks.workunit.client.1.vm04.stdout:2/784: unlink d0/d14/d91/d8/dd/cdf 0 2026-03-10T14:08:15.410 INFO:tasks.workunit.client.0.vm03.stdout:5/413: mknod d4/d6/c88 0 2026-03-10T14:08:15.410 INFO:tasks.workunit.client.0.vm03.stdout:5/414: stat d4/l76 0 2026-03-10T14:08:15.411 INFO:tasks.workunit.client.1.vm04.stdout:4/769: unlink d4/df/l80 0 2026-03-10T14:08:15.417 INFO:tasks.workunit.client.1.vm04.stdout:6/709: sync 2026-03-10T14:08:15.428 INFO:tasks.workunit.client.0.vm03.stdout:3/290: unlink d1d/d33/f37 0 2026-03-10T14:08:15.428 INFO:tasks.workunit.client.1.vm04.stdout:5/873: write d7/f24 [310429,115749] 0 2026-03-10T14:08:15.430 INFO:tasks.workunit.client.0.vm03.stdout:5/415: creat d4/d6/de/f89 x:0 0 0 2026-03-10T14:08:15.435 INFO:tasks.workunit.client.0.vm03.stdout:4/368: mknod d5/d47/c73 0 2026-03-10T14:08:15.436 INFO:tasks.workunit.client.1.vm04.stdout:2/785: dread d0/d14/d91/d4a/d66/f72 [4194304,4194304] 0 2026-03-10T14:08:15.438 INFO:tasks.workunit.client.1.vm04.stdout:0/828: rename d0/d2/d15/d22/d38/f7d to d0/d2/d15/d49/d50/d5c/dd8/f105 0 2026-03-10T14:08:15.444 INFO:tasks.workunit.client.0.vm03.stdout:9/349: 
mkdir d2/d14/d2b/d76 0 2026-03-10T14:08:15.447 INFO:tasks.workunit.client.1.vm04.stdout:5/874: fdatasync d7/d2d/d69/fc9 0 2026-03-10T14:08:15.450 INFO:tasks.workunit.client.0.vm03.stdout:0/298: creat d3/d16/d21/d3c/f5b x:0 0 0 2026-03-10T14:08:15.454 INFO:tasks.workunit.client.0.vm03.stdout:0/299: dwrite d3/d16/f31 [0,4194304] 0 2026-03-10T14:08:15.456 INFO:tasks.workunit.client.1.vm04.stdout:4/770: unlink d4/d14/f98 0 2026-03-10T14:08:15.464 INFO:tasks.workunit.client.1.vm04.stdout:9/741: dwrite d9/d44/d70/fdb [0,4194304] 0 2026-03-10T14:08:15.465 INFO:tasks.workunit.client.1.vm04.stdout:3/810: dwrite da/dc/f39 [4194304,4194304] 0 2026-03-10T14:08:15.477 INFO:tasks.workunit.client.1.vm04.stdout:3/811: dread da/dc/d3f/d54/f82 [0,4194304] 0 2026-03-10T14:08:15.478 INFO:tasks.workunit.client.1.vm04.stdout:8/908: link d0/d3/d5/fd7 d0/d3/d63/d12/d51/d67/d96/d105/d117/f11d 0 2026-03-10T14:08:15.484 INFO:tasks.workunit.client.0.vm03.stdout:4/369: mknod d5/d9/c74 0 2026-03-10T14:08:15.484 INFO:tasks.workunit.client.1.vm04.stdout:7/842: dwrite d2/d2a/f9c [0,4194304] 0 2026-03-10T14:08:15.487 INFO:tasks.workunit.client.1.vm04.stdout:6/710: creat d3/de/d35/d3f/d2d/d32/dd0/fde x:0 0 0 2026-03-10T14:08:15.488 INFO:tasks.workunit.client.1.vm04.stdout:1/814: dwrite d3/d22/d63/f2d [0,4194304] 0 2026-03-10T14:08:15.491 INFO:tasks.workunit.client.0.vm03.stdout:2/304: write d5/d10/d1f/f3f [586874,88800] 0 2026-03-10T14:08:15.491 INFO:tasks.workunit.client.0.vm03.stdout:5/416: rmdir d4 39 2026-03-10T14:08:15.498 INFO:tasks.workunit.client.0.vm03.stdout:3/291: creat d1d/d40/d48/f58 x:0 0 0 2026-03-10T14:08:15.501 INFO:tasks.workunit.client.0.vm03.stdout:8/327: dwrite da/d36/f3b [0,4194304] 0 2026-03-10T14:08:15.501 INFO:tasks.workunit.client.1.vm04.stdout:6/711: dread d3/de/d35/d3f/f22 [0,4194304] 0 2026-03-10T14:08:15.512 INFO:tasks.workunit.client.1.vm04.stdout:0/829: creat d0/f106 x:0 0 0 2026-03-10T14:08:15.512 INFO:tasks.workunit.client.1.vm04.stdout:0/830: dread 
d0/d2/d15/d22/d38/f93 [0,4194304] 0 2026-03-10T14:08:15.513 INFO:tasks.workunit.client.1.vm04.stdout:0/831: chown d0/d2/d15/d22/ca9 1193531842 1 2026-03-10T14:08:15.516 INFO:tasks.workunit.client.1.vm04.stdout:5/875: symlink d7/d12/d2b/d3e/dae/l11c 0 2026-03-10T14:08:15.523 INFO:tasks.workunit.client.0.vm03.stdout:0/300: chown d3/c2e 70942983 1 2026-03-10T14:08:15.525 INFO:tasks.workunit.client.0.vm03.stdout:2/305: mkdir d5/d10/d1f/d5d 0 2026-03-10T14:08:15.528 INFO:tasks.workunit.client.1.vm04.stdout:9/742: truncate d9/d44/d4d/fa9 1321971 0 2026-03-10T14:08:15.537 INFO:tasks.workunit.client.0.vm03.stdout:8/328: mkdir da/d36/d6a 0 2026-03-10T14:08:15.537 INFO:tasks.workunit.client.1.vm04.stdout:8/909: mknod d0/d82/c11e 0 2026-03-10T14:08:15.539 INFO:tasks.workunit.client.0.vm03.stdout:9/350: write d2/d14/d2b/f2d [4872769,3419] 0 2026-03-10T14:08:15.540 INFO:tasks.workunit.client.0.vm03.stdout:8/329: dwrite da/d24/d49/f66 [0,4194304] 0 2026-03-10T14:08:15.541 INFO:tasks.workunit.client.1.vm04.stdout:7/843: fsync d2/dc/de/dae/fb2 0 2026-03-10T14:08:15.554 INFO:tasks.workunit.client.1.vm04.stdout:1/815: truncate d3/d5c/ff2 59875 0 2026-03-10T14:08:15.555 INFO:tasks.workunit.client.0.vm03.stdout:2/306: fsync d5/d10/f13 0 2026-03-10T14:08:15.561 INFO:tasks.workunit.client.1.vm04.stdout:2/786: creat d0/d14/d39/d47/d70/dc3/dd0/ff1 x:0 0 0 2026-03-10T14:08:15.565 INFO:tasks.workunit.client.0.vm03.stdout:9/351: mknod d2/d29/d33/d41/d5c/c77 0 2026-03-10T14:08:15.579 INFO:tasks.workunit.client.0.vm03.stdout:2/307: dread d5/d10/d31/f4e [0,4194304] 0 2026-03-10T14:08:15.581 INFO:tasks.workunit.client.1.vm04.stdout:9/743: symlink d9/da/dd/de7/l101 0 2026-03-10T14:08:15.582 INFO:tasks.workunit.client.1.vm04.stdout:9/744: stat d9/d58/db5/c64 0 2026-03-10T14:08:15.585 INFO:tasks.workunit.client.0.vm03.stdout:8/330: creat da/d36/d6a/f6b x:0 0 0 2026-03-10T14:08:15.595 INFO:tasks.workunit.client.1.vm04.stdout:8/910: rename d0/d75 to d0/d3/d73/db8/d103/d11f 0 2026-03-10T14:08:15.595 
INFO:tasks.workunit.client.1.vm04.stdout:7/844: mknod d2/d2a/d42/c128 0 2026-03-10T14:08:15.595 INFO:tasks.workunit.client.0.vm03.stdout:2/308: creat d5/d10/d1f/f5e x:0 0 0 2026-03-10T14:08:15.595 INFO:tasks.workunit.client.0.vm03.stdout:8/331: mkdir da/d58/d6c 0 2026-03-10T14:08:15.597 INFO:tasks.workunit.client.0.vm03.stdout:2/309: mkdir d5/d10/d1c/d54/d5f 0 2026-03-10T14:08:15.598 INFO:tasks.workunit.client.0.vm03.stdout:2/310: write d5/d10/d17/f33 [199879,42953] 0 2026-03-10T14:08:15.599 INFO:tasks.workunit.client.1.vm04.stdout:1/816: mkdir d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/dce/d117 0 2026-03-10T14:08:15.623 INFO:tasks.workunit.client.1.vm04.stdout:9/745: rmdir d9/da/dd/d1c/da3 39 2026-03-10T14:08:15.627 INFO:tasks.workunit.client.1.vm04.stdout:7/845: creat d2/dc/de/d2d/d60/d7c/d64/d108/f129 x:0 0 0 2026-03-10T14:08:15.628 INFO:tasks.workunit.client.1.vm04.stdout:7/846: chown d2/dc/de/d2d/d60/d7c/d36/d103 257396165 1 2026-03-10T14:08:15.628 INFO:tasks.workunit.client.0.vm03.stdout:7/234: rename d5/d9/d14/ff to d5/d9/d14/d21/d28/f46 0 2026-03-10T14:08:15.629 INFO:tasks.workunit.client.1.vm04.stdout:3/812: write da/d3e/fba [365035,73240] 0 2026-03-10T14:08:15.632 INFO:tasks.workunit.client.1.vm04.stdout:2/787: link d0/d14/d91/d8/f42 d0/d14/d91/d8/d17/d4e/d85/d86/ff2 0 2026-03-10T14:08:15.634 INFO:tasks.workunit.client.0.vm03.stdout:3/292: dwrite d1d/f36 [0,4194304] 0 2026-03-10T14:08:15.634 INFO:tasks.workunit.client.0.vm03.stdout:3/293: readlink d1d/l21 0 2026-03-10T14:08:15.639 INFO:tasks.workunit.client.0.vm03.stdout:0/301: dwrite d3/d4d/d30/f32 [0,4194304] 0 2026-03-10T14:08:15.651 INFO:tasks.workunit.client.0.vm03.stdout:5/417: dwrite d4/f55 [4194304,4194304] 0 2026-03-10T14:08:15.657 INFO:tasks.workunit.client.0.vm03.stdout:3/294: dread d1d/d29/d41/f4e [0,4194304] 0 2026-03-10T14:08:15.658 INFO:tasks.workunit.client.1.vm04.stdout:4/771: dwrite d4/df/f2e [0,4194304] 0 2026-03-10T14:08:15.658 INFO:tasks.workunit.client.1.vm04.stdout:4/772: dread - 
d4/df/d31/fa7 zero size 2026-03-10T14:08:15.658 INFO:tasks.workunit.client.1.vm04.stdout:4/773: fsync d4/df/d34/fef 0 2026-03-10T14:08:15.671 INFO:tasks.workunit.client.0.vm03.stdout:9/352: dwrite d2/d29/d38/f4b [0,4194304] 0 2026-03-10T14:08:15.672 INFO:tasks.workunit.client.1.vm04.stdout:9/746: rename d9/d5c/fa6 to d9/da/d5d/dd6/f102 0 2026-03-10T14:08:15.679 INFO:tasks.workunit.client.1.vm04.stdout:8/911: symlink d0/d3/d63/d12/d69/l120 0 2026-03-10T14:08:15.681 INFO:tasks.workunit.client.0.vm03.stdout:8/332: write f7 [1666615,1397] 0 2026-03-10T14:08:15.682 INFO:tasks.workunit.client.0.vm03.stdout:8/333: write f5 [4202074,48644] 0 2026-03-10T14:08:15.682 INFO:tasks.workunit.client.1.vm04.stdout:0/832: truncate d0/dee/ff0 55760 0 2026-03-10T14:08:15.700 INFO:tasks.workunit.client.1.vm04.stdout:7/847: creat d2/dc/de/d2d/d60/f12a x:0 0 0 2026-03-10T14:08:15.700 INFO:tasks.workunit.client.1.vm04.stdout:7/848: chown d2/dc/de/d11/c59 878 1 2026-03-10T14:08:15.700 INFO:tasks.workunit.client.1.vm04.stdout:3/813: fdatasync da/ded/f104 0 2026-03-10T14:08:15.700 INFO:tasks.workunit.client.1.vm04.stdout:6/712: getdents d3/de/d35/d3f/d2d/d32/dd0 0 2026-03-10T14:08:15.700 INFO:tasks.workunit.client.1.vm04.stdout:2/788: mknod d0/d14/d39/d47/cf3 0 2026-03-10T14:08:15.700 INFO:tasks.workunit.client.1.vm04.stdout:6/713: unlink d3/de/d35/d3a/d43/cbd 0 2026-03-10T14:08:15.700 INFO:tasks.workunit.client.1.vm04.stdout:9/747: link d9/da/d5d/f8b d9/dd3/f103 0 2026-03-10T14:08:15.700 INFO:tasks.workunit.client.1.vm04.stdout:7/849: mknod d2/dac/d115/c12b 0 2026-03-10T14:08:15.701 INFO:tasks.workunit.client.1.vm04.stdout:2/789: mkdir d0/d14/d91/d4a/d66/dda/df4 0 2026-03-10T14:08:15.702 INFO:tasks.workunit.client.1.vm04.stdout:2/790: symlink d0/d14/d91/d4a/d8c/dab/d46/dc8/lf5 0 2026-03-10T14:08:15.732 INFO:tasks.workunit.client.1.vm04.stdout:9/748: dread d9/d58/db5/f67 [0,4194304] 0 2026-03-10T14:08:15.732 INFO:tasks.workunit.client.1.vm04.stdout:9/749: write d9/da/d5d/fde [4523080,25] 0 
2026-03-10T14:08:15.738 INFO:tasks.workunit.client.1.vm04.stdout:9/750: read d9/da/dd/f8a [3674599,77809] 0 2026-03-10T14:08:15.740 INFO:tasks.workunit.client.1.vm04.stdout:9/751: creat d9/da/dd/d1c/da3/dec/f104 x:0 0 0 2026-03-10T14:08:15.742 INFO:tasks.workunit.client.1.vm04.stdout:9/752: rename d9/d44/d59/f5a to d9/dd3/f105 0 2026-03-10T14:08:15.748 INFO:tasks.workunit.client.1.vm04.stdout:9/753: dwrite d9/d44/d70/fee [0,4194304] 0 2026-03-10T14:08:15.752 INFO:tasks.workunit.client.1.vm04.stdout:9/754: read d9/da/dd/de7/d96/d9d/fa2 [121171,48494] 0 2026-03-10T14:08:15.755 INFO:tasks.workunit.client.1.vm04.stdout:9/755: dwrite d9/d44/d4d/fed [0,4194304] 0 2026-03-10T14:08:15.758 INFO:tasks.workunit.client.1.vm04.stdout:9/756: creat d9/d44/d70/f106 x:0 0 0 2026-03-10T14:08:15.760 INFO:tasks.workunit.client.1.vm04.stdout:9/757: mknod d9/da/dd/de7/c107 0 2026-03-10T14:08:15.761 INFO:tasks.workunit.client.1.vm04.stdout:9/758: read - d9/da/dd/d1c/da3/fd0 zero size 2026-03-10T14:08:15.765 INFO:tasks.workunit.client.1.vm04.stdout:4/774: sync 2026-03-10T14:08:15.765 INFO:tasks.workunit.client.1.vm04.stdout:0/833: sync 2026-03-10T14:08:15.767 INFO:tasks.workunit.client.0.vm03.stdout:8/334: sync 2026-03-10T14:08:15.773 INFO:tasks.workunit.client.1.vm04.stdout:4/775: chown d4/db9 2350657 1 2026-03-10T14:08:15.779 INFO:tasks.workunit.client.1.vm04.stdout:5/876: write d7/d2d/d69/f95 [1467828,89833] 0 2026-03-10T14:08:15.782 INFO:tasks.workunit.client.1.vm04.stdout:9/759: link d9/d58/f62 d9/dd3/f108 0 2026-03-10T14:08:15.783 INFO:tasks.workunit.client.1.vm04.stdout:9/760: mknod d9/d58/dda/c109 0 2026-03-10T14:08:15.783 INFO:tasks.workunit.client.1.vm04.stdout:5/877: link d7/d59/d7e/d87/lde d7/d26/ddf/l11d 0 2026-03-10T14:08:15.786 INFO:tasks.workunit.client.1.vm04.stdout:5/878: mkdir d7/d12/d2b/d3e/d57/d8a/dec/d11e 0 2026-03-10T14:08:15.786 INFO:tasks.workunit.client.1.vm04.stdout:5/879: readlink d7/d12/d2b/d3e/l10e 0 2026-03-10T14:08:15.788 
INFO:tasks.workunit.client.1.vm04.stdout:5/880: mknod d7/d26/d6b/d6e/d82/c11f 0 2026-03-10T14:08:15.789 INFO:tasks.workunit.client.1.vm04.stdout:5/881: mkdir d7/d12/d2b/d3e/d57/d9f/d120 0 2026-03-10T14:08:15.791 INFO:tasks.workunit.client.1.vm04.stdout:5/882: symlink d7/d12/d2b/d3e/d3f/l121 0 2026-03-10T14:08:15.797 INFO:tasks.workunit.client.1.vm04.stdout:5/883: dread - d7/d12/d2b/d93/fa1 zero size 2026-03-10T14:08:15.797 INFO:tasks.workunit.client.1.vm04.stdout:5/884: creat d7/d12/f122 x:0 0 0 2026-03-10T14:08:15.797 INFO:tasks.workunit.client.1.vm04.stdout:5/885: readlink d7/d26/l10a 0 2026-03-10T14:08:15.797 INFO:tasks.workunit.client.1.vm04.stdout:9/761: dread d9/da/dd/d74/fa0 [0,4194304] 0 2026-03-10T14:08:15.799 INFO:tasks.workunit.client.1.vm04.stdout:9/762: creat d9/dd3/df8/f10a x:0 0 0 2026-03-10T14:08:15.800 INFO:tasks.workunit.client.1.vm04.stdout:9/763: readlink d9/d44/lf4 0 2026-03-10T14:08:15.802 INFO:tasks.workunit.client.1.vm04.stdout:9/764: mkdir d9/d58/db5/da5/d10b 0 2026-03-10T14:08:15.812 INFO:tasks.workunit.client.1.vm04.stdout:9/765: dread d9/da/dd/f8a [0,4194304] 0 2026-03-10T14:08:15.813 INFO:tasks.workunit.client.1.vm04.stdout:9/766: write d9/d5c/fb6 [2630085,81961] 0 2026-03-10T14:08:15.821 INFO:tasks.workunit.client.1.vm04.stdout:0/834: dread d0/d2/d25/f64 [0,4194304] 0 2026-03-10T14:08:15.821 INFO:tasks.workunit.client.1.vm04.stdout:5/886: sync 2026-03-10T14:08:15.833 INFO:tasks.workunit.client.1.vm04.stdout:9/767: getdents d9/d44/d4d 0 2026-03-10T14:08:15.833 INFO:tasks.workunit.client.1.vm04.stdout:9/768: dread - d9/da/d8c/de5/ffa zero size 2026-03-10T14:08:15.839 INFO:tasks.workunit.client.1.vm04.stdout:0/835: mkdir d0/d2/d15/d22/d38/d56/dc1/d107 0 2026-03-10T14:08:15.840 INFO:tasks.workunit.client.1.vm04.stdout:9/769: fsync d9/da/dd/de7/d96/d9d/fa2 0 2026-03-10T14:08:15.848 INFO:tasks.workunit.client.1.vm04.stdout:5/887: rename d7/d59/d7d/c113 to d7/d59/d7d/c123 0 2026-03-10T14:08:15.849 
INFO:tasks.workunit.client.1.vm04.stdout:1/817: write d3/f2c [2195130,53518] 0 2026-03-10T14:08:15.854 INFO:tasks.workunit.client.1.vm04.stdout:8/912: write d0/d3/dd/d89/fd0 [2400892,70498] 0 2026-03-10T14:08:15.854 INFO:tasks.workunit.client.1.vm04.stdout:8/913: stat d0/d3/dd/d78 0 2026-03-10T14:08:15.869 INFO:tasks.workunit.client.1.vm04.stdout:9/770: dread d9/da/d5d/dd6/f102 [0,4194304] 0 2026-03-10T14:08:15.871 INFO:tasks.workunit.client.1.vm04.stdout:5/888: fsync d7/d59/d7e/ffa 0 2026-03-10T14:08:15.879 INFO:tasks.workunit.client.1.vm04.stdout:3/814: dwrite da/dc/d47/d9b/d106/f108 [0,4194304] 0 2026-03-10T14:08:15.935 INFO:tasks.workunit.client.1.vm04.stdout:8/914: symlink d0/d3/d63/d12/d51/d67/d96/dc8/dcf/l121 0 2026-03-10T14:08:15.943 INFO:tasks.workunit.client.1.vm04.stdout:7/850: dwrite d2/dc/d4d/dcd/f100 [0,4194304] 0 2026-03-10T14:08:15.948 INFO:tasks.workunit.client.1.vm04.stdout:7/851: dread d2/dc/d4d/dcd/f100 [0,4194304] 0 2026-03-10T14:08:15.954 INFO:tasks.workunit.client.1.vm04.stdout:2/791: dwrite d0/d14/d91/d8/d17/d4e/f78 [0,4194304] 0 2026-03-10T14:08:15.967 INFO:tasks.workunit.client.1.vm04.stdout:5/889: fsync d7/d12/d45/fb2 0 2026-03-10T14:08:15.968 INFO:tasks.workunit.client.1.vm04.stdout:1/818: mknod d3/d20/c118 0 2026-03-10T14:08:15.973 INFO:tasks.workunit.client.1.vm04.stdout:3/815: creat da/dc/d3f/d54/d66/f117 x:0 0 0 2026-03-10T14:08:15.974 INFO:tasks.workunit.client.1.vm04.stdout:8/915: rename d0/d3/dd/fa7 to d0/d3/d63/d12/d51/d67/d96/df3/f122 0 2026-03-10T14:08:15.974 INFO:tasks.workunit.client.1.vm04.stdout:8/916: dread - d0/d3/d5/f10d zero size 2026-03-10T14:08:15.975 INFO:tasks.workunit.client.1.vm04.stdout:6/714: chown d3/de/d35/d3a/d43/d4c/d5e/lac 5 1 2026-03-10T14:08:15.975 INFO:tasks.workunit.client.1.vm04.stdout:7/852: mknod d2/dac/d115/c12c 0 2026-03-10T14:08:15.975 INFO:tasks.workunit.client.1.vm04.stdout:2/792: dread - d0/db8/fd3 zero size 2026-03-10T14:08:15.976 INFO:tasks.workunit.client.1.vm04.stdout:2/793: chown 
d0/d14/d91/f1d 837598243 1 2026-03-10T14:08:15.977 INFO:tasks.workunit.client.1.vm04.stdout:2/794: stat d0/d14/d91/d8/d17/d4e/l9f 0 2026-03-10T14:08:15.977 INFO:tasks.workunit.client.1.vm04.stdout:2/795: chown d0/d14/d39/fa2 55006243 1 2026-03-10T14:08:15.978 INFO:tasks.workunit.client.1.vm04.stdout:5/890: dread - d7/d12/d2b/d3e/d3f/da6/fe9 zero size 2026-03-10T14:08:15.987 INFO:tasks.workunit.client.1.vm04.stdout:3/816: rmdir da/dc/d3f/d54 39 2026-03-10T14:08:15.987 INFO:tasks.workunit.client.1.vm04.stdout:3/817: chown da/d3e/fb9 344 1 2026-03-10T14:08:15.988 INFO:tasks.workunit.client.1.vm04.stdout:8/917: mkdir d0/d3/d73/db8/dd5/d123 0 2026-03-10T14:08:15.988 INFO:tasks.workunit.client.1.vm04.stdout:9/771: rmdir d9/da/dd 39 2026-03-10T14:08:15.990 INFO:tasks.workunit.client.1.vm04.stdout:7/853: dread d2/dc/d4d/dcd/fc5 [0,4194304] 0 2026-03-10T14:08:15.997 INFO:tasks.workunit.client.1.vm04.stdout:5/891: mkdir d7/d2d/d76/d124 0 2026-03-10T14:08:15.998 INFO:tasks.workunit.client.1.vm04.stdout:5/892: readlink d7/d26/d6b/d6e/d82/l118 0 2026-03-10T14:08:16.000 INFO:tasks.workunit.client.1.vm04.stdout:2/796: dread d0/d14/d91/d4a/d66/f7d [0,4194304] 0 2026-03-10T14:08:16.000 INFO:tasks.workunit.client.1.vm04.stdout:4/776: dwrite d4/fb3 [0,4194304] 0 2026-03-10T14:08:16.017 INFO:tasks.workunit.client.1.vm04.stdout:9/772: dread d9/da/dd/f85 [0,4194304] 0 2026-03-10T14:08:16.017 INFO:tasks.workunit.client.1.vm04.stdout:0/836: write d0/d2/f13 [2988435,106953] 0 2026-03-10T14:08:16.017 INFO:tasks.workunit.client.1.vm04.stdout:9/773: chown d9/da/dd/d1c/da3/lc8 11247 1 2026-03-10T14:08:16.024 INFO:tasks.workunit.client.1.vm04.stdout:5/893: symlink d7/d12/d45/l125 0 2026-03-10T14:08:16.030 INFO:tasks.workunit.client.1.vm04.stdout:4/777: unlink d4/df/d34/d6f/l75 0 2026-03-10T14:08:16.033 INFO:tasks.workunit.client.1.vm04.stdout:1/819: dwrite d3/d22/d63/f69 [0,4194304] 0 2026-03-10T14:08:16.039 INFO:tasks.workunit.client.1.vm04.stdout:8/918: dwrite d0/d3/d63/d12/f2c 
[4194304,4194304] 0 2026-03-10T14:08:16.044 INFO:tasks.workunit.client.1.vm04.stdout:7/854: dwrite d2/dc/de/d2d/d60/d7c/d44/f51 [0,4194304] 0 2026-03-10T14:08:16.048 INFO:tasks.workunit.client.1.vm04.stdout:9/774: truncate d9/d58/db5/f23 4578596 0 2026-03-10T14:08:16.054 INFO:tasks.workunit.client.1.vm04.stdout:6/715: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/fdf x:0 0 0 2026-03-10T14:08:16.055 INFO:tasks.workunit.client.1.vm04.stdout:1/820: dwrite d3/d22/d63/d35/dd9/fe8 [0,4194304] 0 2026-03-10T14:08:16.059 INFO:tasks.workunit.client.1.vm04.stdout:7/855: rename d2/dc/de/d11/ffd to d2/df9/f12d 0 2026-03-10T14:08:16.074 INFO:tasks.workunit.client.1.vm04.stdout:0/837: dread d0/d2/d15/f2f [0,4194304] 0 2026-03-10T14:08:16.077 INFO:tasks.workunit.client.1.vm04.stdout:8/919: dread d0/d3/d63/d12/d51/d67/d96/d105/d117/f11d [0,4194304] 0 2026-03-10T14:08:16.077 INFO:tasks.workunit.client.1.vm04.stdout:8/920: chown d0/d3/l72 14070 1 2026-03-10T14:08:16.079 INFO:tasks.workunit.client.1.vm04.stdout:3/818: getdents da/d8e/db5 0 2026-03-10T14:08:16.090 INFO:tasks.workunit.client.1.vm04.stdout:2/797: write d0/d14/d91/d8/d17/d4e/d85/d86/ff2 [5387936,119435] 0 2026-03-10T14:08:16.093 INFO:tasks.workunit.client.1.vm04.stdout:0/838: mknod d0/d2/d15/d22/d38/d56/c108 0 2026-03-10T14:08:16.095 INFO:tasks.workunit.client.1.vm04.stdout:8/921: chown d0/d3/d73/db8/l102 27 1 2026-03-10T14:08:16.098 INFO:tasks.workunit.client.1.vm04.stdout:5/894: dwrite d7/d12/d2b/f53 [4194304,4194304] 0 2026-03-10T14:08:16.100 INFO:tasks.workunit.client.1.vm04.stdout:3/819: fdatasync da/d30/f55 0 2026-03-10T14:08:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:15 vm03.local ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:08:16.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:15 vm03.local ceph-mon[49718]: Reconfiguring prometheus.vm03 (dependencies changed)... 
2026-03-10T14:08:16.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:15 vm03.local ceph-mon[49718]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-10T14:08:16.110 INFO:tasks.workunit.client.1.vm04.stdout:0/839: chown d0/dee/ff1 124412 1 2026-03-10T14:08:16.121 INFO:tasks.workunit.client.1.vm04.stdout:3/820: unlink da/dc/d47/d9b/d106/dde/f10e 0 2026-03-10T14:08:16.127 INFO:tasks.workunit.client.1.vm04.stdout:5/895: dread d7/d12/d2b/d3e/d57/d8a/fd6 [0,4194304] 0 2026-03-10T14:08:16.131 INFO:tasks.workunit.client.1.vm04.stdout:1/821: creat d3/d22/d63/d35/dd9/d13/d38/d58/d5b/f119 x:0 0 0 2026-03-10T14:08:16.144 INFO:tasks.workunit.client.1.vm04.stdout:1/822: chown d3/d22/d2f/l43 25 1 2026-03-10T14:08:16.144 INFO:tasks.workunit.client.1.vm04.stdout:3/821: chown da/l23 494282 1 2026-03-10T14:08:16.144 INFO:tasks.workunit.client.1.vm04.stdout:5/896: creat d7/d12/f126 x:0 0 0 2026-03-10T14:08:16.144 INFO:tasks.workunit.client.1.vm04.stdout:1/823: creat d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/f11a x:0 0 0 2026-03-10T14:08:16.145 INFO:tasks.workunit.client.1.vm04.stdout:1/824: dwrite d3/d22/d2f/f34 [0,4194304] 0 2026-03-10T14:08:16.145 INFO:tasks.workunit.client.1.vm04.stdout:3/822: creat da/dc/d3f/d61/df7/f118 x:0 0 0 2026-03-10T14:08:16.150 INFO:tasks.workunit.client.1.vm04.stdout:5/897: truncate d7/d12/d2b/f4d 2796834 0 2026-03-10T14:08:16.158 INFO:tasks.workunit.client.1.vm04.stdout:1/825: dread d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92/f93 [0,4194304] 0 2026-03-10T14:08:16.170 INFO:tasks.workunit.client.1.vm04.stdout:1/826: dread - d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/dce/f101 zero size 2026-03-10T14:08:16.170 INFO:tasks.workunit.client.1.vm04.stdout:1/827: truncate d3/d22/d63/d35/dd9/d13/d38/db5/dff/f111 961636 0 2026-03-10T14:08:16.170 INFO:tasks.workunit.client.1.vm04.stdout:1/828: readlink d3/d5c/la1 0 2026-03-10T14:08:16.170 INFO:tasks.workunit.client.1.vm04.stdout:1/829: dwrite d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/f11a [0,4194304] 0 
2026-03-10T14:08:16.170 INFO:tasks.workunit.client.1.vm04.stdout:1/830: symlink d3/d22/d2f/d57/l11b 0 2026-03-10T14:08:16.180 INFO:tasks.workunit.client.1.vm04.stdout:9/775: dwrite d9/d44/d4d/ff2 [0,4194304] 0 2026-03-10T14:08:16.180 INFO:tasks.workunit.client.1.vm04.stdout:7/856: fdatasync d2/df9/f12d 0 2026-03-10T14:08:16.183 INFO:tasks.workunit.client.1.vm04.stdout:7/857: dread - d2/d94/fe5 zero size 2026-03-10T14:08:16.186 INFO:tasks.workunit.client.1.vm04.stdout:9/776: fsync d9/d44/d4d/f66 0 2026-03-10T14:08:16.189 INFO:tasks.workunit.client.1.vm04.stdout:7/858: mknod d2/dc/de/d2d/d60/d7c/d44/d102/c12e 0 2026-03-10T14:08:16.191 INFO:tasks.workunit.client.1.vm04.stdout:1/831: dread d3/f14 [0,4194304] 0 2026-03-10T14:08:16.200 INFO:tasks.workunit.client.1.vm04.stdout:1/832: write d3/d22/d2f/f34 [744662,122750] 0 2026-03-10T14:08:16.200 INFO:tasks.workunit.client.1.vm04.stdout:1/833: dread - d3/d22/d6d/fbd zero size 2026-03-10T14:08:16.200 INFO:tasks.workunit.client.1.vm04.stdout:9/777: rename d9/da/dd/de7/cb0 to d9/da/d8c/de5/c10c 0 2026-03-10T14:08:16.200 INFO:tasks.workunit.client.1.vm04.stdout:7/859: read d2/dc/de/f1e [3388491,105948] 0 2026-03-10T14:08:16.200 INFO:tasks.workunit.client.1.vm04.stdout:9/778: creat d9/d58/f10d x:0 0 0 2026-03-10T14:08:16.203 INFO:tasks.workunit.client.1.vm04.stdout:1/834: link d3/d22/f107 d3/d22/deb/f11c 0 2026-03-10T14:08:16.226 INFO:tasks.workunit.client.1.vm04.stdout:4/778: dwrite d4/d14/fad [0,4194304] 0 2026-03-10T14:08:16.231 INFO:tasks.workunit.client.1.vm04.stdout:4/779: read d4/fe [4542904,128608] 0 2026-03-10T14:08:16.244 INFO:tasks.workunit.client.1.vm04.stdout:4/780: dread d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/f76 [0,4194304] 0 2026-03-10T14:08:16.248 INFO:tasks.workunit.client.1.vm04.stdout:4/781: chown d4/d14/d3c/d5e/c72 12528267 1 2026-03-10T14:08:16.249 INFO:tasks.workunit.client.1.vm04.stdout:4/782: dread d4/df/d34/f7c [0,4194304] 0 2026-03-10T14:08:16.264 INFO:tasks.workunit.client.1.vm04.stdout:9/779: sync 
2026-03-10T14:08:16.265 INFO:tasks.workunit.client.1.vm04.stdout:9/780: symlink d9/d5c/l10e 0 2026-03-10T14:08:16.267 INFO:tasks.workunit.client.1.vm04.stdout:9/781: truncate d9/d58/db5/da5/fc9 870119 0 2026-03-10T14:08:16.268 INFO:tasks.workunit.client.1.vm04.stdout:9/782: chown d9/da/dd/d1c/da3/dec/f104 174 1 2026-03-10T14:08:16.269 INFO:tasks.workunit.client.1.vm04.stdout:9/783: rmdir d9/da/d8c/de5/df6 39 2026-03-10T14:08:16.272 INFO:tasks.workunit.client.1.vm04.stdout:6/716: dwrite d3/de/d35/d3a/d43/d4c/f53 [0,4194304] 0 2026-03-10T14:08:16.279 INFO:tasks.workunit.client.1.vm04.stdout:6/717: rename d3/f57 to d3/de/d35/d3a/d43/fe0 0 2026-03-10T14:08:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:15 vm04.local ceph-mon[55966]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:08:16.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:15 vm04.local ceph-mon[55966]: Reconfiguring prometheus.vm03 (dependencies changed)... 
2026-03-10T14:08:16.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:15 vm04.local ceph-mon[55966]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-10T14:08:16.319 INFO:tasks.workunit.client.1.vm04.stdout:2/798: dwrite d0/d14/d39/d47/d70/f74 [0,4194304] 0 2026-03-10T14:08:16.324 INFO:tasks.workunit.client.1.vm04.stdout:2/799: creat d0/d14/d91/d8/d17/d35/ff6 x:0 0 0 2026-03-10T14:08:16.328 INFO:tasks.workunit.client.1.vm04.stdout:2/800: dread d0/d14/d1b/f55 [0,4194304] 0 2026-03-10T14:08:16.331 INFO:tasks.workunit.client.1.vm04.stdout:2/801: dwrite d0/d14/d91/d3a/fb5 [4194304,4194304] 0 2026-03-10T14:08:16.336 INFO:tasks.workunit.client.1.vm04.stdout:2/802: creat d0/d14/d91/d8/dd/ff7 x:0 0 0 2026-03-10T14:08:16.350 INFO:tasks.workunit.client.1.vm04.stdout:8/922: write d0/d3/d5/f30 [1438173,57283] 0 2026-03-10T14:08:16.350 INFO:tasks.workunit.client.1.vm04.stdout:8/923: write d0/d3/d73/f10e [1034084,11439] 0 2026-03-10T14:08:16.352 INFO:tasks.workunit.client.1.vm04.stdout:8/924: truncate d0/d3/d63/d12/d51/d67/d96/d105/def/ff6 1679689 0 2026-03-10T14:08:16.357 INFO:tasks.workunit.client.1.vm04.stdout:8/925: creat d0/d3/d63/d12/d51/d67/d96/df3/f124 x:0 0 0 2026-03-10T14:08:16.360 INFO:tasks.workunit.client.1.vm04.stdout:8/926: fdatasync d0/d3/d63/d12/f50 0 2026-03-10T14:08:16.366 INFO:tasks.workunit.client.1.vm04.stdout:8/927: truncate d0/d3/d63/fd3 425239 0 2026-03-10T14:08:16.403 INFO:tasks.workunit.client.1.vm04.stdout:0/840: dwrite d0/d2/d15/f59 [0,4194304] 0 2026-03-10T14:08:16.405 INFO:tasks.workunit.client.1.vm04.stdout:0/841: symlink d0/d2/d15/d22/d38/d56/dcb/l109 0 2026-03-10T14:08:16.412 INFO:tasks.workunit.client.1.vm04.stdout:3/823: write da/dc/d47/d9b/d106/dde/dac/fb4 [719488,41549] 0 2026-03-10T14:08:16.414 INFO:tasks.workunit.client.1.vm04.stdout:5/898: write d7/d12/f42 [5003827,125849] 0 2026-03-10T14:08:16.428 INFO:tasks.workunit.client.1.vm04.stdout:7/860: write d2/dc/de/d2d/d38/d50/fa2 [325841,10702] 0 2026-03-10T14:08:16.451 
INFO:tasks.workunit.client.1.vm04.stdout:7/861: sync 2026-03-10T14:08:16.604 INFO:tasks.workunit.client.1.vm04.stdout:1/835: dwrite d3/d20/fd2 [0,4194304] 0 2026-03-10T14:08:16.633 INFO:tasks.workunit.client.1.vm04.stdout:4/783: write d4/f5f [1325490,11316] 0 2026-03-10T14:08:16.638 INFO:tasks.workunit.client.1.vm04.stdout:9/784: write d9/d58/db5/f3d [1669241,130820] 0 2026-03-10T14:08:16.641 INFO:tasks.workunit.client.1.vm04.stdout:6/718: write d3/de/d35/d3f/f96 [475550,66644] 0 2026-03-10T14:08:16.648 INFO:tasks.workunit.client.1.vm04.stdout:2/803: dwrite d0/d14/d91/d4a/d8c/dab/db3/fd6 [4194304,4194304] 0 2026-03-10T14:08:16.654 INFO:tasks.workunit.client.1.vm04.stdout:2/804: dwrite d0/d14/d91/d8/d17/f1f [0,4194304] 0 2026-03-10T14:08:16.665 INFO:tasks.workunit.client.1.vm04.stdout:8/928: getdents d0/d3/d63/d12/d51/d67/d96/df3 0 2026-03-10T14:08:16.665 INFO:tasks.workunit.client.1.vm04.stdout:8/929: readlink d0/d3/d63/d12/d51/d67/d96/dc8/dcf/l121 0 2026-03-10T14:08:16.673 INFO:tasks.workunit.client.1.vm04.stdout:9/785: dread f5 [0,4194304] 0 2026-03-10T14:08:16.684 INFO:tasks.workunit.client.1.vm04.stdout:9/786: creat d9/da/dd/de7/d96/d9d/f10f x:0 0 0 2026-03-10T14:08:16.684 INFO:tasks.workunit.client.1.vm04.stdout:9/787: read d9/da/dd/f85 [162817,74762] 0 2026-03-10T14:08:16.684 INFO:tasks.workunit.client.1.vm04.stdout:9/788: write d9/d5c/fdc [872183,76244] 0 2026-03-10T14:08:16.684 INFO:tasks.workunit.client.1.vm04.stdout:0/842: write d0/d2/d15/d22/d38/d56/f79 [3227791,60610] 0 2026-03-10T14:08:16.684 INFO:tasks.workunit.client.1.vm04.stdout:2/805: getdents d0/d14/d91/d3a/d3e 0 2026-03-10T14:08:16.684 INFO:tasks.workunit.client.1.vm04.stdout:9/789: fsync d9/da/dd/d1c/da3/fd0 0 2026-03-10T14:08:16.685 INFO:tasks.workunit.client.1.vm04.stdout:3/824: write da/dc/d47/fbd [216225,65251] 0 2026-03-10T14:08:16.692 INFO:tasks.workunit.client.1.vm04.stdout:2/806: truncate d0/d14/d91/d3a/d3e/fbd 140046 0 2026-03-10T14:08:16.695 
INFO:tasks.workunit.client.1.vm04.stdout:9/790: creat d9/da/dd/de7/db1/f110 x:0 0 0 2026-03-10T14:08:16.695 INFO:tasks.workunit.client.1.vm04.stdout:9/791: chown d9/da/dd/d74 4621774 1 2026-03-10T14:08:16.696 INFO:tasks.workunit.client.1.vm04.stdout:3/825: creat da/d30/f119 x:0 0 0 2026-03-10T14:08:16.696 INFO:tasks.workunit.client.1.vm04.stdout:5/899: write d7/d12/d2b/d3e/d57/d77/da5/faa [471829,83913] 0 2026-03-10T14:08:16.699 INFO:tasks.workunit.client.1.vm04.stdout:9/792: dwrite d9/d33/f4b [0,4194304] 0 2026-03-10T14:08:16.709 INFO:tasks.workunit.client.1.vm04.stdout:7/862: write d2/fc7 [222518,104501] 0 2026-03-10T14:08:16.709 INFO:tasks.workunit.client.1.vm04.stdout:2/807: truncate d0/d14/d39/d47/d70/f8d 1236356 0 2026-03-10T14:08:16.709 INFO:tasks.workunit.client.1.vm04.stdout:1/836: write d3/d22/ffe [151159,2573] 0 2026-03-10T14:08:16.712 INFO:tasks.workunit.client.1.vm04.stdout:4/784: write d4/d14/d64/f71 [292556,112623] 0 2026-03-10T14:08:16.713 INFO:tasks.workunit.client.1.vm04.stdout:5/900: unlink d7/d12/d2b/d8c/ceb 0 2026-03-10T14:08:16.714 INFO:tasks.workunit.client.1.vm04.stdout:6/719: write d3/de/d35/d3f/d2d/f21 [123169,121040] 0 2026-03-10T14:08:16.715 INFO:tasks.workunit.client.1.vm04.stdout:3/826: creat da/dc/d3f/d61/d102/f11a x:0 0 0 2026-03-10T14:08:16.715 INFO:tasks.workunit.client.1.vm04.stdout:9/793: truncate d9/da/dd/de7/d96/d9d/fa2 896327 0 2026-03-10T14:08:16.716 INFO:tasks.workunit.client.1.vm04.stdout:6/720: write d3/de/d35/d3f/d2d/d32/d23/d24/d6f/fdf [460309,128420] 0 2026-03-10T14:08:16.717 INFO:tasks.workunit.client.1.vm04.stdout:8/930: write d0/d3/dd/d89/fa5 [4711928,38422] 0 2026-03-10T14:08:16.727 INFO:tasks.workunit.client.1.vm04.stdout:7/863: rename d2/d2a/d42/l72 to d2/dc/de/d2d/d60/l12f 0 2026-03-10T14:08:16.727 INFO:tasks.workunit.client.1.vm04.stdout:7/864: stat d2/dc/de/d2d/d60/d7c/d44 0 2026-03-10T14:08:16.735 INFO:tasks.workunit.client.1.vm04.stdout:4/785: dread d4/d14/d1b/f6c [0,4194304] 0 2026-03-10T14:08:16.736 
INFO:tasks.workunit.client.1.vm04.stdout:9/794: creat d9/d58/db5/da5/f111 x:0 0 0 2026-03-10T14:08:16.737 INFO:tasks.workunit.client.1.vm04.stdout:4/786: dread d4/df/d31/fae [0,4194304] 0 2026-03-10T14:08:16.740 INFO:tasks.workunit.client.1.vm04.stdout:0/843: truncate d0/d2/d15/d22/d38/d56/fbf 1069657 0 2026-03-10T14:08:16.743 INFO:tasks.workunit.client.1.vm04.stdout:0/844: dwrite d0/d2/d15/d49/d50/d61/d75/f80 [0,4194304] 0 2026-03-10T14:08:16.751 INFO:tasks.workunit.client.1.vm04.stdout:9/795: dread d9/d44/d4d/f99 [0,4194304] 0 2026-03-10T14:08:16.761 INFO:tasks.workunit.client.1.vm04.stdout:1/837: dwrite d3/d22/d63/d35/dd9/d13/d1a/f4b [4194304,4194304] 0 2026-03-10T14:08:16.765 INFO:tasks.workunit.client.1.vm04.stdout:5/901: write d7/d12/d2b/d3e/d57/d8a/dec/f100 [175743,45344] 0 2026-03-10T14:08:16.787 INFO:tasks.workunit.client.1.vm04.stdout:7/865: creat d2/d6b/f130 x:0 0 0 2026-03-10T14:08:16.791 INFO:tasks.workunit.client.1.vm04.stdout:8/931: dread d0/d3/dd/d76/f92 [0,4194304] 0 2026-03-10T14:08:16.798 INFO:tasks.workunit.client.1.vm04.stdout:4/787: chown d4/d14/ca0 1748 1 2026-03-10T14:08:16.809 INFO:tasks.workunit.client.1.vm04.stdout:6/721: truncate d3/de/d35/d3f/d2d/d32/d23/f5a 2075612 0 2026-03-10T14:08:16.813 INFO:tasks.workunit.client.1.vm04.stdout:2/808: rmdir d0/d14/d39/d47/de7 0 2026-03-10T14:08:16.818 INFO:tasks.workunit.client.1.vm04.stdout:1/838: mkdir d3/d22/d2f/d11d 0 2026-03-10T14:08:16.820 INFO:tasks.workunit.client.1.vm04.stdout:8/932: dread - d0/d3/d63/d12/df5/ff9 zero size 2026-03-10T14:08:16.828 INFO:tasks.workunit.client.1.vm04.stdout:5/902: dread d7/d2d/d69/fc9 [4194304,4194304] 0 2026-03-10T14:08:16.835 INFO:tasks.workunit.client.1.vm04.stdout:7/866: dread d2/dc/de/d2d/d5c/f8e [0,4194304] 0 2026-03-10T14:08:16.838 INFO:tasks.workunit.client.1.vm04.stdout:7/867: stat d2/dc/de/d2d/d38/d50/fbc 0 2026-03-10T14:08:16.838 INFO:tasks.workunit.client.1.vm04.stdout:7/868: dread - d2/dc/de/d2d/d60/d7c/d44/f66 zero size 2026-03-10T14:08:16.838 
INFO:tasks.workunit.client.1.vm04.stdout:4/788: dread d4/d14/d3c/d5e/f7b [4194304,4194304] 0 2026-03-10T14:08:16.838 INFO:tasks.workunit.client.1.vm04.stdout:2/809: fsync d0/d14/d91/d8/d17/d4e/d85/f89 0 2026-03-10T14:08:16.844 INFO:tasks.workunit.client.1.vm04.stdout:3/827: write da/dc/d3f/f83 [3426142,36636] 0 2026-03-10T14:08:16.844 INFO:tasks.workunit.client.1.vm04.stdout:0/845: write d0/d2/d15/d49/d50/f8a [257516,108075] 0 2026-03-10T14:08:16.845 INFO:tasks.workunit.client.1.vm04.stdout:0/846: readlink d0/d2/d15/d49/l103 0 2026-03-10T14:08:16.847 INFO:tasks.workunit.client.1.vm04.stdout:9/796: dwrite d9/da/dd/d74/f92 [0,4194304] 0 2026-03-10T14:08:16.856 INFO:tasks.workunit.client.1.vm04.stdout:4/789: rmdir d4/d14 39 2026-03-10T14:08:16.862 INFO:tasks.workunit.client.1.vm04.stdout:2/810: mknod d0/d14/d91/d4a/d66/dcd/cf8 0 2026-03-10T14:08:16.862 INFO:tasks.workunit.client.1.vm04.stdout:9/797: creat d9/da/dd/de7/db1/f112 x:0 0 0 2026-03-10T14:08:16.862 INFO:tasks.workunit.client.1.vm04.stdout:2/811: rmdir d0 39 2026-03-10T14:08:16.862 INFO:tasks.workunit.client.1.vm04.stdout:9/798: read d9/da/dd/f47 [3484369,54409] 0 2026-03-10T14:08:16.862 INFO:tasks.workunit.client.1.vm04.stdout:3/828: mknod da/dc/d47/d9b/d106/dde/df2/c11b 0 2026-03-10T14:08:16.865 INFO:tasks.workunit.client.1.vm04.stdout:1/839: getdents d3/d22/d63/d35/dd9/d13/d38 0 2026-03-10T14:08:16.866 INFO:tasks.workunit.client.1.vm04.stdout:9/799: mknod d9/da/dd/de7/db1/c113 0 2026-03-10T14:08:16.867 INFO:tasks.workunit.client.1.vm04.stdout:1/840: creat d3/d22/deb/f11e x:0 0 0 2026-03-10T14:08:16.868 INFO:tasks.workunit.client.1.vm04.stdout:3/829: creat da/dc/d115/f11c x:0 0 0 2026-03-10T14:08:16.871 INFO:tasks.workunit.client.1.vm04.stdout:3/830: dread da/dc/f39 [4194304,4194304] 0 2026-03-10T14:08:16.882 INFO:tasks.workunit.client.1.vm04.stdout:9/800: rmdir d9/da/dd/de7/d96/d9d 39 2026-03-10T14:08:16.882 INFO:tasks.workunit.client.1.vm04.stdout:9/801: read - d9/da/dd/d74/fe1 zero size 
2026-03-10T14:08:16.882 INFO:tasks.workunit.client.1.vm04.stdout:3/831: mknod da/dc/d3f/d61/dc1/c11d 0 2026-03-10T14:08:16.882 INFO:tasks.workunit.client.1.vm04.stdout:1/841: dread d3/d5c/f71 [4194304,4194304] 0 2026-03-10T14:08:16.882 INFO:tasks.workunit.client.1.vm04.stdout:4/790: link d4/d14/d3c/d85/la5 d4/d14/d3c/d85/l103 0 2026-03-10T14:08:16.882 INFO:tasks.workunit.client.1.vm04.stdout:3/832: fdatasync da/dc/d35/d52/d6d/fe4 0 2026-03-10T14:08:16.882 INFO:tasks.workunit.client.1.vm04.stdout:3/833: chown da/l23 2195669 1 2026-03-10T14:08:16.884 INFO:tasks.workunit.client.1.vm04.stdout:2/812: getdents d0/d14/d91/d4a/d8c/dab/d95 0 2026-03-10T14:08:16.885 INFO:tasks.workunit.client.1.vm04.stdout:2/813: write d0/d14/d91/d8/d17/d4e/d85/fcb [783922,116591] 0 2026-03-10T14:08:16.889 INFO:tasks.workunit.client.1.vm04.stdout:1/842: rename d3/d22/d63/c4a to d3/d22/d2f/c11f 0 2026-03-10T14:08:16.899 INFO:tasks.workunit.client.1.vm04.stdout:1/843: dread - d3/d22/d63/d35/dd9/d13/d38/ff1 zero size 2026-03-10T14:08:16.899 INFO:tasks.workunit.client.1.vm04.stdout:2/814: rmdir d0/d14/d91/d4a/d8c/d92 39 2026-03-10T14:08:16.899 INFO:tasks.workunit.client.1.vm04.stdout:2/815: chown d0/d14/d1b 63 1 2026-03-10T14:08:16.899 INFO:tasks.workunit.client.1.vm04.stdout:2/816: chown d0/d14/d91/d3a/fb5 5588476 1 2026-03-10T14:08:16.899 INFO:tasks.workunit.client.1.vm04.stdout:1/844: symlink d3/d22/d63/d35/dd9/d13/d38/d58/d113/l120 0 2026-03-10T14:08:16.899 INFO:tasks.workunit.client.1.vm04.stdout:2/817: symlink d0/d14/d91/d8/d17/d4e/dea/lf9 0 2026-03-10T14:08:16.899 INFO:tasks.workunit.client.1.vm04.stdout:6/722: sync 2026-03-10T14:08:16.899 INFO:tasks.workunit.client.1.vm04.stdout:2/818: dwrite d0/d14/d1b/f8a [0,4194304] 0 2026-03-10T14:08:16.902 INFO:tasks.workunit.client.1.vm04.stdout:1/845: creat d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/d84/f121 x:0 0 0 2026-03-10T14:08:16.915 INFO:tasks.workunit.client.1.vm04.stdout:1/846: creat d3/d22/d63/d35/dd9/d13/da0/dc5/dfa/f122 x:0 0 0 
2026-03-10T14:08:16.926 INFO:tasks.workunit.client.1.vm04.stdout:7/869: dwrite d2/dc/d4d/dcd/f3c [0,4194304] 0 2026-03-10T14:08:16.930 INFO:tasks.workunit.client.1.vm04.stdout:8/933: dwrite d0/d3/d63/d12/d51/d67/d96/df3/f122 [0,4194304] 0 2026-03-10T14:08:16.931 INFO:tasks.workunit.client.1.vm04.stdout:5/903: dwrite d7/d2d/f64 [0,4194304] 0 2026-03-10T14:08:16.933 INFO:tasks.workunit.client.1.vm04.stdout:8/934: stat d0/d3/fa3 0 2026-03-10T14:08:16.934 INFO:tasks.workunit.client.1.vm04.stdout:0/847: write d0/d2/d15/d22/d38/fe6 [4400187,81718] 0 2026-03-10T14:08:16.934 INFO:tasks.workunit.client.1.vm04.stdout:8/935: chown d0/d3/d63/d12/d69 123 1 2026-03-10T14:08:16.949 INFO:tasks.workunit.client.1.vm04.stdout:9/802: dwrite d9/d58/f62 [0,4194304] 0 2026-03-10T14:08:16.954 INFO:tasks.workunit.client.1.vm04.stdout:3/834: write da/d8e/db5/ff9 [160203,23627] 0 2026-03-10T14:08:16.954 INFO:tasks.workunit.client.1.vm04.stdout:1/847: chown d3/c1c 61 1 2026-03-10T14:08:16.955 INFO:tasks.workunit.client.1.vm04.stdout:3/835: chown da/dc/d35/d52/d70/f8f 28521990 1 2026-03-10T14:08:16.964 INFO:tasks.workunit.client.1.vm04.stdout:4/791: dwrite d4/df/db2/db4/fd9 [0,4194304] 0 2026-03-10T14:08:16.965 INFO:tasks.workunit.client.1.vm04.stdout:7/870: sync 2026-03-10T14:08:16.967 INFO:tasks.workunit.client.1.vm04.stdout:7/871: chown d2/dc/de/d11/c59 658180 1 2026-03-10T14:08:16.977 INFO:tasks.workunit.client.1.vm04.stdout:8/936: truncate d0/d3/d63/d29/f9b 360524 0 2026-03-10T14:08:16.983 INFO:tasks.workunit.client.1.vm04.stdout:5/904: mknod d7/d59/d7e/d101/c127 0 2026-03-10T14:08:16.994 INFO:tasks.workunit.client.1.vm04.stdout:4/792: mkdir d4/df/db2/db6/dc9/d104 0 2026-03-10T14:08:16.994 INFO:tasks.workunit.client.1.vm04.stdout:4/793: readlink d4/df/db2/db4/d47/d4f/l90 0 2026-03-10T14:08:16.994 INFO:tasks.workunit.client.1.vm04.stdout:7/872: truncate d2/dc/de/d2d/d60/d81/fd3 800573 0 2026-03-10T14:08:16.994 INFO:tasks.workunit.client.1.vm04.stdout:8/937: symlink 
d0/d3/d63/d12/d51/d67/d96/dc8/d111/l125 0 2026-03-10T14:08:17.015 INFO:tasks.workunit.client.1.vm04.stdout:2/819: dwrite d0/d14/d91/d4a/f57 [0,4194304] 0 2026-03-10T14:08:17.015 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:16 vm03.local ceph-mon[49718]: pgmap v8: 65 pgs: 65 active+clean; 2.3 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 20 MiB/s rd, 59 MiB/s wr, 147 op/s 2026-03-10T14:08:17.016 INFO:tasks.workunit.client.1.vm04.stdout:2/820: dread - d0/d14/d39/d47/d70/dc3/fe0 zero size 2026-03-10T14:08:17.017 INFO:tasks.workunit.client.1.vm04.stdout:2/821: chown d0/d14/d91/d8/d17/d4e/d85/d86/d96/dc2 2 1 2026-03-10T14:08:17.025 INFO:tasks.workunit.client.1.vm04.stdout:3/836: dread da/dc/d47/d9b/d106/dde/dac/fb4 [0,4194304] 0 2026-03-10T14:08:17.027 INFO:tasks.workunit.client.1.vm04.stdout:6/723: write d3/de/d35/d3f/d2d/d32/d23/f2f [484233,20859] 0 2026-03-10T14:08:17.028 INFO:tasks.workunit.client.1.vm04.stdout:3/837: dread da/dc/d35/d52/f6f [0,4194304] 0 2026-03-10T14:08:17.028 INFO:tasks.workunit.client.1.vm04.stdout:8/938: symlink d0/d3/d63/d12/d51/d67/d96/df3/l126 0 2026-03-10T14:08:17.029 INFO:tasks.workunit.client.1.vm04.stdout:8/939: chown d0/d3/dd/d89/lb7 29208 1 2026-03-10T14:08:17.030 INFO:tasks.workunit.client.1.vm04.stdout:8/940: write d0/d3/d73/f10e [516126,103958] 0 2026-03-10T14:08:17.030 INFO:tasks.workunit.client.1.vm04.stdout:8/941: chown d0/d3/dd/d78/f10c 51375 1 2026-03-10T14:08:17.042 INFO:tasks.workunit.client.1.vm04.stdout:9/803: creat d9/da/dd/de7/d96/f114 x:0 0 0 2026-03-10T14:08:17.048 INFO:tasks.workunit.client.1.vm04.stdout:4/794: symlink d4/d14/d6d/df5/l105 0 2026-03-10T14:08:17.049 INFO:tasks.workunit.client.1.vm04.stdout:7/873: creat d2/dc/de/d2d/d60/d7c/df8/f131 x:0 0 0 2026-03-10T14:08:17.051 INFO:tasks.workunit.client.1.vm04.stdout:2/822: symlink d0/d14/d91/d4a/d8c/dab/d95/lfa 0 2026-03-10T14:08:17.063 INFO:tasks.workunit.client.1.vm04.stdout:0/848: getdents d0 0 2026-03-10T14:08:17.065 
INFO:tasks.workunit.client.1.vm04.stdout:7/874: sync 2026-03-10T14:08:17.066 INFO:tasks.workunit.client.1.vm04.stdout:1/848: dwrite d3/d20/f32 [0,4194304] 0 2026-03-10T14:08:17.066 INFO:tasks.workunit.client.1.vm04.stdout:1/849: fsync d3/f8 0 2026-03-10T14:08:17.067 INFO:tasks.workunit.client.1.vm04.stdout:1/850: dread - d3/d20/d60/ff7 zero size 2026-03-10T14:08:17.072 INFO:tasks.workunit.client.1.vm04.stdout:6/724: creat d3/de/d35/d3f/d2d/d32/d9e/fe1 x:0 0 0 2026-03-10T14:08:17.077 INFO:tasks.workunit.client.1.vm04.stdout:3/838: readlink da/dc/d3f/d54/d66/ld6 0 2026-03-10T14:08:17.080 INFO:tasks.workunit.client.1.vm04.stdout:5/905: creat d7/d26/f128 x:0 0 0 2026-03-10T14:08:17.093 INFO:tasks.workunit.client.1.vm04.stdout:7/875: creat d2/dc/de/d2d/d5c/da9/df6/f132 x:0 0 0 2026-03-10T14:08:17.114 INFO:tasks.workunit.client.1.vm04.stdout:4/795: write d4/d14/d64/fab [4855,89660] 0 2026-03-10T14:08:17.117 INFO:tasks.workunit.client.1.vm04.stdout:2/823: dwrite d0/d14/d91/d8/dd/fc0 [0,4194304] 0 2026-03-10T14:08:17.118 INFO:tasks.workunit.client.1.vm04.stdout:2/824: truncate d0/d14/d91/d4a/fed 515221 0 2026-03-10T14:08:17.261 INFO:tasks.workunit.client.1.vm04.stdout:6/725: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/d6f/fc0 [0,4194304] 0 2026-03-10T14:08:17.264 INFO:tasks.workunit.client.1.vm04.stdout:9/804: creat d9/da/dd/de7/d96/d9d/f115 x:0 0 0 2026-03-10T14:08:17.272 INFO:tasks.workunit.client.1.vm04.stdout:5/906: mknod d7/d12/d2b/d3e/d57/c129 0 2026-03-10T14:08:17.273 INFO:tasks.workunit.client.1.vm04.stdout:0/849: creat d0/d2/d15/d22/d38/d56/dc1/d107/f10a x:0 0 0 2026-03-10T14:08:17.275 INFO:tasks.workunit.client.1.vm04.stdout:1/851: unlink d3/d22/d2f/d57/l59 0 2026-03-10T14:08:17.277 INFO:tasks.workunit.client.1.vm04.stdout:2/825: symlink d0/d14/d91/d3a/lfb 0 2026-03-10T14:08:17.279 INFO:tasks.workunit.client.1.vm04.stdout:4/796: read d4/df/d31/f5d [221627,2914] 0 2026-03-10T14:08:17.280 INFO:tasks.workunit.client.1.vm04.stdout:8/942: creat d0/f127 x:0 0 0 
2026-03-10T14:08:17.281 INFO:tasks.workunit.client.1.vm04.stdout:8/943: write d0/d3/d63/d12/d51/d67/d96/df3/f124 [163239,105655] 0 2026-03-10T14:08:17.288 INFO:tasks.workunit.client.1.vm04.stdout:5/907: rename d7/d12/d2b/d3e/d57/l7b to d7/d12/d2b/d3e/d57/d9f/d120/l12a 0 2026-03-10T14:08:17.291 INFO:tasks.workunit.client.1.vm04.stdout:7/876: symlink d2/dc/de/d2d/l133 0 2026-03-10T14:08:17.292 INFO:tasks.workunit.client.1.vm04.stdout:1/852: unlink d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/dce/f101 0 2026-03-10T14:08:17.293 INFO:tasks.workunit.client.1.vm04.stdout:2/826: unlink d0/d14/d91/d4a/d8c/dab/d46/dc8/lcf 0 2026-03-10T14:08:17.294 INFO:tasks.workunit.client.1.vm04.stdout:2/827: read d0/d14/d1b/f55 [1202743,36771] 0 2026-03-10T14:08:17.295 INFO:tasks.workunit.client.1.vm04.stdout:4/797: mkdir d4/d14/d3c/d5e/d106 0 2026-03-10T14:08:17.296 INFO:tasks.workunit.client.1.vm04.stdout:8/944: rmdir d0/d3/d73/db8 39 2026-03-10T14:08:17.299 INFO:tasks.workunit.client.1.vm04.stdout:3/839: dwrite da/d3e/f63 [0,4194304] 0 2026-03-10T14:08:17.300 INFO:tasks.workunit.client.1.vm04.stdout:3/840: chown da/dc/d35/d52/f79 1807846786 1 2026-03-10T14:08:17.310 INFO:tasks.workunit.client.1.vm04.stdout:9/805: dwrite d9/d58/f5e [0,4194304] 0 2026-03-10T14:08:17.312 INFO:tasks.workunit.client.1.vm04.stdout:6/726: mkdir d3/de/d35/d3f/d2d/d32/d23/d24/d6f/de2 0 2026-03-10T14:08:17.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:16 vm04.local ceph-mon[55966]: pgmap v8: 65 pgs: 65 active+clean; 2.3 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 20 MiB/s rd, 59 MiB/s wr, 147 op/s 2026-03-10T14:08:17.316 INFO:tasks.workunit.client.1.vm04.stdout:9/806: dwrite d9/da/dd/d1c/f2e [4194304,4194304] 0 2026-03-10T14:08:17.324 INFO:tasks.workunit.client.0.vm03.stdout:3/295: unlink d1d/d39/d51/f52 0 2026-03-10T14:08:17.326 INFO:tasks.workunit.client.1.vm04.stdout:7/877: rename d2/dc/de/d2d/d60/d7c/c23 to d2/dc/de/d2d/d60/d81/c134 0 2026-03-10T14:08:17.326 
INFO:tasks.workunit.client.0.vm03.stdout:8/335: creat da/d3a/d44/f6d x:0 0 0 2026-03-10T14:08:17.327 INFO:tasks.workunit.client.1.vm04.stdout:2/828: mkdir d0/d14/d39/d47/d70/dad/dfc 0 2026-03-10T14:08:17.330 INFO:tasks.workunit.client.1.vm04.stdout:9/807: mkdir d9/da/dd/de7/db1/d116 0 2026-03-10T14:08:17.331 INFO:tasks.workunit.client.1.vm04.stdout:0/850: link d0/d2/d15/d49/d50/d61/la8 d0/d2/d15/d49/d50/d5c/da4/l10b 0 2026-03-10T14:08:17.333 INFO:tasks.workunit.client.1.vm04.stdout:1/853: rename d3/d22/f2b to d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/d84/f123 0 2026-03-10T14:08:17.340 INFO:tasks.workunit.client.1.vm04.stdout:8/945: symlink d0/d3/d63/d12/d51/l128 0 2026-03-10T14:08:17.343 INFO:tasks.workunit.client.1.vm04.stdout:8/946: dread d0/d3/d63/d29/fba [0,4194304] 0 2026-03-10T14:08:17.345 INFO:tasks.workunit.client.1.vm04.stdout:9/808: rmdir d9/d58/db5/da5/dab 39 2026-03-10T14:08:17.352 INFO:tasks.workunit.client.1.vm04.stdout:0/851: mknod d0/d2/d25/c10c 0 2026-03-10T14:08:17.352 INFO:tasks.workunit.client.1.vm04.stdout:3/841: rename da/dc/d47/l86 to da/dc/l11e 0 2026-03-10T14:08:17.352 INFO:tasks.workunit.client.1.vm04.stdout:2/829: dread d0/d14/d91/d4a/d8c/dab/fd2 [0,4194304] 0 2026-03-10T14:08:17.353 INFO:tasks.workunit.client.1.vm04.stdout:2/830: write d0/d14/d39/d47/d70/dc3/fe0 [65107,24658] 0 2026-03-10T14:08:17.356 INFO:tasks.workunit.client.1.vm04.stdout:9/809: mknod d9/da/dd/d1c/da3/dec/c117 0 2026-03-10T14:08:17.363 INFO:tasks.workunit.client.1.vm04.stdout:0/852: fdatasync d0/d2/d15/d49/d50/d61/f96 0 2026-03-10T14:08:17.367 INFO:tasks.workunit.client.1.vm04.stdout:2/831: creat d0/d14/d91/d3a/ffd x:0 0 0 2026-03-10T14:08:17.371 INFO:tasks.workunit.client.1.vm04.stdout:3/842: mknod da/c11f 0 2026-03-10T14:08:17.373 INFO:tasks.workunit.client.1.vm04.stdout:0/853: fdatasync d0/f72 0 2026-03-10T14:08:17.375 INFO:tasks.workunit.client.1.vm04.stdout:2/832: dread d0/d14/d91/d8/d17/d35/f94 [0,4194304] 0 2026-03-10T14:08:17.382 
INFO:tasks.workunit.client.1.vm04.stdout:5/908: dwrite d7/f3c [0,4194304] 0 2026-03-10T14:08:17.383 INFO:tasks.workunit.client.1.vm04.stdout:5/909: fsync d7/d12/f126 0 2026-03-10T14:08:17.389 INFO:tasks.workunit.client.1.vm04.stdout:6/727: truncate d3/de/d35/d3f/d2d/d32/d23/d24/d6f/fc0 3989824 0 2026-03-10T14:08:17.389 INFO:tasks.workunit.client.1.vm04.stdout:4/798: dwrite d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/f86 [0,4194304] 0 2026-03-10T14:08:17.391 INFO:tasks.workunit.client.1.vm04.stdout:7/878: dwrite d2/d94/f29 [4194304,4194304] 0 2026-03-10T14:08:17.398 INFO:tasks.workunit.client.1.vm04.stdout:0/854: mkdir d0/d2/d15/d49/d50/d10d 0 2026-03-10T14:08:17.403 INFO:tasks.workunit.client.1.vm04.stdout:2/833: fsync d0/d14/fc7 0 2026-03-10T14:08:17.404 INFO:tasks.workunit.client.1.vm04.stdout:2/834: write d0/d14/d1b/f8a [714565,119333] 0 2026-03-10T14:08:17.412 INFO:tasks.workunit.client.1.vm04.stdout:9/810: rename d9/da/d5d/dd6 to d9/da/dd/de7/d96/d118 0 2026-03-10T14:08:17.413 INFO:tasks.workunit.client.1.vm04.stdout:1/854: write d3/d22/d63/f89 [630544,43624] 0 2026-03-10T14:08:17.421 INFO:tasks.workunit.client.1.vm04.stdout:6/728: fsync d3/de/d35/d3f/d2d/f98 0 2026-03-10T14:08:17.425 INFO:tasks.workunit.client.0.vm03.stdout:8/336: mknod da/d3c/d51/c6e 0 2026-03-10T14:08:17.425 INFO:tasks.workunit.client.1.vm04.stdout:7/879: creat d2/dc/de/d2d/d60/d7c/d64/d108/d117/f135 x:0 0 0 2026-03-10T14:08:17.426 INFO:tasks.workunit.client.1.vm04.stdout:7/880: readlink d2/dc/de/d2d/d60/d7c/d44/dc0/lfc 0 2026-03-10T14:08:17.428 INFO:tasks.workunit.client.1.vm04.stdout:7/881: truncate d2/dc/de/d2d/d5c/da9/df6/f132 421176 0 2026-03-10T14:08:17.429 INFO:tasks.workunit.client.1.vm04.stdout:0/855: mkdir d0/d2/d15/d49/d50/d10e 0 2026-03-10T14:08:17.434 INFO:tasks.workunit.client.0.vm03.stdout:1/343: rename d0/d2/df/d27/f49 to d0/d2/df/f6c 0 2026-03-10T14:08:17.439 INFO:tasks.workunit.client.1.vm04.stdout:2/835: rename d0/d14/d91/f24 to d0/d14/d91/d8/d17/d4e/dea/dde/ffe 0 
2026-03-10T14:08:17.444 INFO:tasks.workunit.client.0.vm03.stdout:6/319: rename d8/d3b/f53 to d8/db/d2c/d2d/d32/d3a/f5e 0 2026-03-10T14:08:17.445 INFO:tasks.workunit.client.0.vm03.stdout:1/344: creat d0/d2/d34/f6d x:0 0 0 2026-03-10T14:08:17.450 INFO:tasks.workunit.client.0.vm03.stdout:3/296: dread f17 [0,4194304] 0 2026-03-10T14:08:17.454 INFO:tasks.workunit.client.1.vm04.stdout:0/856: unlink d0/d2/d15/d49/lab 0 2026-03-10T14:08:17.455 INFO:tasks.workunit.client.0.vm03.stdout:3/297: truncate d1d/d29/f3f 709057 0 2026-03-10T14:08:17.457 INFO:tasks.workunit.client.1.vm04.stdout:9/811: mknod d9/da/c119 0 2026-03-10T14:08:17.462 INFO:tasks.workunit.client.1.vm04.stdout:3/843: dwrite da/d30/f55 [0,4194304] 0 2026-03-10T14:08:17.465 INFO:tasks.workunit.client.0.vm03.stdout:3/298: read d1d/f32 [3047896,80861] 0 2026-03-10T14:08:17.467 INFO:tasks.workunit.client.0.vm03.stdout:4/370: rename d5/d9/db/f10 to d5/d9/db/d19/d38/d53/d55/f75 0 2026-03-10T14:08:17.480 INFO:tasks.workunit.client.1.vm04.stdout:2/836: dread d0/d14/d39/f44 [0,4194304] 0 2026-03-10T14:08:17.480 INFO:tasks.workunit.client.1.vm04.stdout:2/837: stat d0/l9d 0 2026-03-10T14:08:17.480 INFO:tasks.workunit.client.1.vm04.stdout:2/838: chown d0/d14/d91/d4a/d8c/dab 1 1 2026-03-10T14:08:17.480 INFO:tasks.workunit.client.0.vm03.stdout:1/345: rmdir d0/d42 39 2026-03-10T14:08:17.480 INFO:tasks.workunit.client.1.vm04.stdout:2/839: chown d0/d14/d91/d4a/d8c/dab/d46/c52 19290195 1 2026-03-10T14:08:17.481 INFO:tasks.workunit.client.1.vm04.stdout:2/840: readlink d0/d14/d91/d4a/d8c/dab/d46/lc1 0 2026-03-10T14:08:17.485 INFO:tasks.workunit.client.0.vm03.stdout:4/371: creat d5/d9/db/d19/d38/d53/d55/f76 x:0 0 0 2026-03-10T14:08:17.489 INFO:tasks.workunit.client.1.vm04.stdout:4/799: dwrite d4/df/db2/db4/f4b [4194304,4194304] 0 2026-03-10T14:08:17.489 INFO:tasks.workunit.client.1.vm04.stdout:3/844: creat da/dc/d35/d52/d53/d78/f120 x:0 0 0 2026-03-10T14:08:17.493 INFO:tasks.workunit.client.1.vm04.stdout:6/729: dwrite 
d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f87 [4194304,4194304] 0 2026-03-10T14:08:17.496 INFO:tasks.workunit.client.1.vm04.stdout:5/910: getdents d7/d12/d2b/d3e 0 2026-03-10T14:08:17.497 INFO:tasks.workunit.client.1.vm04.stdout:7/882: dwrite d2/dc/de/d2d/d60/d7c/f78 [0,4194304] 0 2026-03-10T14:08:17.511 INFO:tasks.workunit.client.0.vm03.stdout:1/346: symlink d0/d18/l6e 0 2026-03-10T14:08:17.514 INFO:tasks.workunit.client.1.vm04.stdout:9/812: mkdir d9/da/d8c/de5/df6/d11a 0 2026-03-10T14:08:17.515 INFO:tasks.workunit.client.1.vm04.stdout:9/813: chown d9/da/dd/d1c/f3b 816 1 2026-03-10T14:08:17.515 INFO:tasks.workunit.client.0.vm03.stdout:3/299: getdents d1d/d33/d47 0 2026-03-10T14:08:17.515 INFO:tasks.workunit.client.1.vm04.stdout:3/845: mkdir da/dc/d47/d9b/d106/dde/d121 0 2026-03-10T14:08:17.521 INFO:tasks.workunit.client.1.vm04.stdout:6/730: dread - d3/de/d35/d3f/d2d/d32/f97 zero size 2026-03-10T14:08:17.521 INFO:tasks.workunit.client.1.vm04.stdout:1/855: getdents d3/d22/d63/d35/dd9/d13/d1a 0 2026-03-10T14:08:17.522 INFO:tasks.workunit.client.1.vm04.stdout:6/731: dread - d3/de/d35/d3f/d2d/d32/f97 zero size 2026-03-10T14:08:17.524 INFO:tasks.workunit.client.1.vm04.stdout:7/883: mknod d2/d6b/c136 0 2026-03-10T14:08:17.528 INFO:tasks.workunit.client.1.vm04.stdout:9/814: fdatasync d9/da/d5d/d81/fcd 0 2026-03-10T14:08:17.531 INFO:tasks.workunit.client.1.vm04.stdout:1/856: rename d3/d22/d2f/d57/fac to d3/d22/d63/d35/dd9/d13/da0/de9/f124 0 2026-03-10T14:08:17.534 INFO:tasks.workunit.client.1.vm04.stdout:2/841: dread d0/d14/d91/d8/d17/d4e/d85/d86/ff2 [4194304,4194304] 0 2026-03-10T14:08:17.538 INFO:tasks.workunit.client.1.vm04.stdout:0/857: getdents d0/d2/d15/d22/d38/d56 0 2026-03-10T14:08:17.540 INFO:tasks.workunit.client.1.vm04.stdout:9/815: mknod d9/dd3/c11b 0 2026-03-10T14:08:17.543 INFO:tasks.workunit.client.1.vm04.stdout:1/857: rmdir d3/d5c 39 2026-03-10T14:08:17.546 INFO:tasks.workunit.client.1.vm04.stdout:2/842: rename d0/d14/d91/d4a/d8c/dab/db3/cc6 to 
d0/d14/d39/d47/d70/d8b/cff 0 2026-03-10T14:08:17.546 INFO:tasks.workunit.client.1.vm04.stdout:2/843: stat d0/d14/d39/d47/d70/dad/dfc 0 2026-03-10T14:08:17.548 INFO:tasks.workunit.client.0.vm03.stdout:3/300: dread d1d/f31 [0,4194304] 0 2026-03-10T14:08:17.549 INFO:tasks.workunit.client.1.vm04.stdout:0/858: read d0/d2/d15/f44 [2614933,130545] 0 2026-03-10T14:08:17.550 INFO:tasks.workunit.client.0.vm03.stdout:3/301: stat d1d/d33/l4b 0 2026-03-10T14:08:17.550 INFO:tasks.workunit.client.0.vm03.stdout:8/337: dwrite da/f33 [0,4194304] 0 2026-03-10T14:08:17.550 INFO:tasks.workunit.client.1.vm04.stdout:4/800: link d4/df/l1c d4/d14/d3c/d5e/d106/l107 0 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.1.vm04.stdout:9/816: mknod d9/d5c/dc2/c11c 0 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.1.vm04.stdout:6/732: creat d3/de/d35/d3f/d2d/fe3 x:0 0 0 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.1.vm04.stdout:0/859: dread d0/d2/d15/d49/d50/d5c/dd8/f105 [0,4194304] 0 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.1.vm04.stdout:2/844: mknod d0/d14/d91/d4a/d66/dda/df4/c100 0 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.0.vm03.stdout:3/302: chown d1d/d29/c4f 1913462713 1 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.0.vm03.stdout:3/303: dread fc [0,4194304] 0 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.0.vm03.stdout:8/338: chown da/d24/f52 10085424 1 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.0.vm03.stdout:3/304: dread d1d/d29/d41/f4e [0,4194304] 0 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.0.vm03.stdout:3/305: mkdir d1d/d59 0 2026-03-10T14:08:17.567 INFO:tasks.workunit.client.1.vm04.stdout:2/845: chown d0/d14/d39/d47/cf3 1 1 2026-03-10T14:08:17.569 INFO:tasks.workunit.client.0.vm03.stdout:3/306: creat d1d/d40/f5a x:0 0 0 2026-03-10T14:08:17.569 INFO:tasks.workunit.client.1.vm04.stdout:4/801: creat d4/df/db2/de1/f108 x:0 0 0 2026-03-10T14:08:17.569 INFO:tasks.workunit.client.1.vm04.stdout:4/802: stat d4/d14/d1b/f20 0 
2026-03-10T14:08:17.570 INFO:tasks.workunit.client.1.vm04.stdout:4/803: fdatasync d4/df/f2e 0 2026-03-10T14:08:17.571 INFO:tasks.workunit.client.0.vm03.stdout:3/307: fsync fc 0 2026-03-10T14:08:17.572 INFO:tasks.workunit.client.1.vm04.stdout:6/733: creat d3/de/d35/d3f/d2d/d32/d23/ddc/fe4 x:0 0 0 2026-03-10T14:08:17.574 INFO:tasks.workunit.client.0.vm03.stdout:3/308: mkdir d1d/d29/d41/d45/d5b 0 2026-03-10T14:08:17.574 INFO:tasks.workunit.client.0.vm03.stdout:3/309: write d1d/d29/f3f [25943,14336] 0 2026-03-10T14:08:17.577 INFO:tasks.workunit.client.1.vm04.stdout:9/817: mkdir d9/da/d8c/de5/d11d 0 2026-03-10T14:08:17.577 INFO:tasks.workunit.client.0.vm03.stdout:3/310: mknod d1d/d33/d47/c5c 0 2026-03-10T14:08:17.578 INFO:tasks.workunit.client.0.vm03.stdout:3/311: chown d1d/d59 292031945 1 2026-03-10T14:08:17.578 INFO:tasks.workunit.client.1.vm04.stdout:9/818: fdatasync d9/da/dd/de7/d96/d9d/f10f 0 2026-03-10T14:08:17.578 INFO:tasks.workunit.client.0.vm03.stdout:3/312: chown d1d/d33/l43 551 1 2026-03-10T14:08:17.579 INFO:tasks.workunit.client.1.vm04.stdout:1/858: link d3/d22/d2f/d57/l11b d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb/l125 0 2026-03-10T14:08:17.581 INFO:tasks.workunit.client.0.vm03.stdout:3/313: mkdir d1d/d40/d5d 0 2026-03-10T14:08:17.581 INFO:tasks.workunit.client.0.vm03.stdout:3/314: readlink d1d/d33/l3e 0 2026-03-10T14:08:17.582 INFO:tasks.workunit.client.1.vm04.stdout:8/947: dread d0/d3/dd/d89/fa5 [0,4194304] 0 2026-03-10T14:08:17.591 INFO:tasks.workunit.client.1.vm04.stdout:9/819: dwrite d9/d58/f62 [0,4194304] 0 2026-03-10T14:08:17.591 INFO:tasks.workunit.client.1.vm04.stdout:8/948: readlink d0/d3/d63/d12/d69/l88 0 2026-03-10T14:08:17.591 INFO:tasks.workunit.client.1.vm04.stdout:2/846: creat d0/d14/d1b/d45/f101 x:0 0 0 2026-03-10T14:08:17.591 INFO:tasks.workunit.client.1.vm04.stdout:2/847: write d0/d14/d39/d47/d70/f74 [3408361,102023] 0 2026-03-10T14:08:17.596 INFO:tasks.workunit.client.1.vm04.stdout:1/859: symlink 
d3/d22/d63/d35/dd9/d13/d38/d58/d113/l126 0 2026-03-10T14:08:17.597 INFO:tasks.workunit.client.1.vm04.stdout:1/860: readlink d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/dce/lcf 0 2026-03-10T14:08:17.603 INFO:tasks.workunit.client.1.vm04.stdout:4/804: mknod d4/d14/d3c/c109 0 2026-03-10T14:08:17.610 INFO:tasks.workunit.client.1.vm04.stdout:6/734: creat d3/de/d35/d3a/d43/d4c/fe5 x:0 0 0 2026-03-10T14:08:17.610 INFO:tasks.workunit.client.1.vm04.stdout:6/735: chown d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/fb7 874765 1 2026-03-10T14:08:17.610 INFO:tasks.workunit.client.1.vm04.stdout:1/861: fdatasync d3/d22/d63/f72 0 2026-03-10T14:08:17.610 INFO:tasks.workunit.client.1.vm04.stdout:1/862: rename d3/d22/d63/d35/dd9/d13/d38/db5/dff to d3/d22/d63/d35/dd9/d13/d38/db5/dff/d127 22 2026-03-10T14:08:17.616 INFO:tasks.workunit.client.1.vm04.stdout:6/736: truncate d3/de/d35/d3f/d2d/fc7 255314 0 2026-03-10T14:08:17.617 INFO:tasks.workunit.client.1.vm04.stdout:6/737: write d3/de/d35/d3f/d2d/f2e [3903694,111884] 0 2026-03-10T14:08:17.630 INFO:tasks.workunit.client.0.vm03.stdout:7/235: write d5/f6 [1331851,57113] 0 2026-03-10T14:08:17.630 INFO:tasks.workunit.client.0.vm03.stdout:2/311: write d5/d2a/f34 [1519484,48719] 0 2026-03-10T14:08:17.630 INFO:tasks.workunit.client.1.vm04.stdout:4/805: dread d4/d14/d3c/d5e/f7b [0,4194304] 0 2026-03-10T14:08:17.634 INFO:tasks.workunit.client.0.vm03.stdout:7/236: creat d5/f47 x:0 0 0 2026-03-10T14:08:17.635 INFO:tasks.workunit.client.1.vm04.stdout:3/846: sync 2026-03-10T14:08:17.635 INFO:tasks.workunit.client.1.vm04.stdout:7/884: sync 2026-03-10T14:08:17.636 INFO:tasks.workunit.client.1.vm04.stdout:9/820: sync 2026-03-10T14:08:17.636 INFO:tasks.workunit.client.1.vm04.stdout:3/847: write da/d3e/fba [1363791,59344] 0 2026-03-10T14:08:17.637 INFO:tasks.workunit.client.1.vm04.stdout:9/821: stat d9/da/dd/d1c/da3/dec/f104 0 2026-03-10T14:08:17.639 INFO:tasks.workunit.client.1.vm04.stdout:7/885: mkdir d2/dc/de/d2d/d60/d7c/d36/d8b/d137 0 2026-03-10T14:08:17.639 
INFO:tasks.workunit.client.1.vm04.stdout:4/806: dread d4/f77 [0,4194304] 0 2026-03-10T14:08:17.640 INFO:tasks.workunit.client.0.vm03.stdout:2/312: dread d5/d10/f22 [0,4194304] 0 2026-03-10T14:08:17.641 INFO:tasks.workunit.client.1.vm04.stdout:7/886: creat d2/dc/de/d2d/d38/f138 x:0 0 0 2026-03-10T14:08:17.643 INFO:tasks.workunit.client.1.vm04.stdout:4/807: symlink d4/db9/l10a 0 2026-03-10T14:08:17.649 INFO:tasks.workunit.client.0.vm03.stdout:5/418: getdents d4/d13 0 2026-03-10T14:08:17.649 INFO:tasks.workunit.client.0.vm03.stdout:7/237: stat d5/d9/d14/c13 0 2026-03-10T14:08:17.649 INFO:tasks.workunit.client.0.vm03.stdout:9/353: write d2/d29/d33/d41/f57 [145690,79230] 0 2026-03-10T14:08:17.649 INFO:tasks.workunit.client.1.vm04.stdout:9/822: link d9/da/dd/de7/d96/d9d/laf d9/da/l11e 0 2026-03-10T14:08:17.649 INFO:tasks.workunit.client.1.vm04.stdout:7/887: rename d2/dc/de/d2d/l4f to d2/dc/de/d2d/d38/d50/dc8/l139 0 2026-03-10T14:08:17.649 INFO:tasks.workunit.client.1.vm04.stdout:7/888: mknod d2/dc/de/d2d/d38/c13a 0 2026-03-10T14:08:17.649 INFO:tasks.workunit.client.1.vm04.stdout:7/889: chown d2/dc/ld8 11 1 2026-03-10T14:08:17.649 INFO:tasks.workunit.client.1.vm04.stdout:9/823: mkdir d9/da/d11f 0 2026-03-10T14:08:17.653 INFO:tasks.workunit.client.0.vm03.stdout:9/354: creat d2/d14/d2b/d43/f78 x:0 0 0 2026-03-10T14:08:17.653 INFO:tasks.workunit.client.0.vm03.stdout:5/419: mknod d4/c8a 0 2026-03-10T14:08:17.655 INFO:tasks.workunit.client.0.vm03.stdout:2/313: creat d5/d10/f60 x:0 0 0 2026-03-10T14:08:17.660 INFO:tasks.workunit.client.0.vm03.stdout:9/355: fdatasync d2/d29/d33/d60/f65 0 2026-03-10T14:08:17.660 INFO:tasks.workunit.client.1.vm04.stdout:3/848: sync 2026-03-10T14:08:17.660 INFO:tasks.workunit.client.1.vm04.stdout:7/890: sync 2026-03-10T14:08:17.661 INFO:tasks.workunit.client.1.vm04.stdout:3/849: chown da/dc/d3f/d61/d102/f105 41796279 1 2026-03-10T14:08:17.662 INFO:tasks.workunit.client.0.vm03.stdout:5/420: dread d4/d13/d43/f51 [0,4194304] 0 2026-03-10T14:08:17.662 
INFO:tasks.workunit.client.1.vm04.stdout:7/891: truncate d2/dc/de/d2d/d5c/da9/f123 302677 0 2026-03-10T14:08:17.662 INFO:tasks.workunit.client.1.vm04.stdout:7/892: stat d2/dc/de/d2d/d60/d81/db3/fb9 0 2026-03-10T14:08:17.664 INFO:tasks.workunit.client.1.vm04.stdout:7/893: truncate d2/df9/f12d 613583 0 2026-03-10T14:08:17.665 INFO:tasks.workunit.client.1.vm04.stdout:3/850: chown da/dc/d35/d37/d10a/d10f 1515836 1 2026-03-10T14:08:17.665 INFO:tasks.workunit.client.1.vm04.stdout:3/851: chown da/d8e/le6 93 1 2026-03-10T14:08:17.666 INFO:tasks.workunit.client.0.vm03.stdout:7/238: unlink d5/d9/d14/d26/f27 0 2026-03-10T14:08:17.667 INFO:tasks.workunit.client.1.vm04.stdout:3/852: creat da/dc/d47/d9b/f122 x:0 0 0 2026-03-10T14:08:17.668 INFO:tasks.workunit.client.1.vm04.stdout:3/853: read f8 [2410864,10335] 0 2026-03-10T14:08:17.670 INFO:tasks.workunit.client.1.vm04.stdout:3/854: link da/dc/fcc da/dc/d35/d52/d53/d78/f123 0 2026-03-10T14:08:17.673 INFO:tasks.workunit.client.0.vm03.stdout:7/239: creat d5/f48 x:0 0 0 2026-03-10T14:08:17.674 INFO:tasks.workunit.client.0.vm03.stdout:5/421: creat d4/d13/d43/f8b x:0 0 0 2026-03-10T14:08:17.676 INFO:tasks.workunit.client.0.vm03.stdout:9/356: mkdir d2/d14/d2b/d79 0 2026-03-10T14:08:17.676 INFO:tasks.workunit.client.0.vm03.stdout:9/357: readlink d2/d29/l6b 0 2026-03-10T14:08:17.711 INFO:tasks.workunit.client.1.vm04.stdout:5/911: write d7/d9/db5/f10b [825532,25312] 0 2026-03-10T14:08:17.712 INFO:tasks.workunit.client.1.vm04.stdout:5/912: truncate d7/d12/d2b/d93/fd9 1114696 0 2026-03-10T14:08:17.715 INFO:tasks.workunit.client.1.vm04.stdout:5/913: truncate d7/d2d/fbf 811856 0 2026-03-10T14:08:17.716 INFO:tasks.workunit.client.1.vm04.stdout:6/738: dread d3/de/d35/d3a/fb5 [0,4194304] 0 2026-03-10T14:08:17.718 INFO:tasks.workunit.client.0.vm03.stdout:7/240: mknod d5/d9/d14/d26/d36/c49 0 2026-03-10T14:08:17.722 INFO:tasks.workunit.client.1.vm04.stdout:5/914: creat d7/d2d/d32/f12b x:0 0 0 2026-03-10T14:08:17.723 
INFO:tasks.workunit.client.0.vm03.stdout:7/241: read d5/d9/d14/d21/f29 [3612028,100387] 0 2026-03-10T14:08:17.726 INFO:tasks.workunit.client.0.vm03.stdout:6/320: write d8/db/df/f37 [1022477,108870] 0 2026-03-10T14:08:17.726 INFO:tasks.workunit.client.0.vm03.stdout:6/321: stat d8/l9 0 2026-03-10T14:08:17.727 INFO:tasks.workunit.client.1.vm04.stdout:0/860: write d0/fb0 [1431555,105427] 0 2026-03-10T14:08:17.730 INFO:tasks.workunit.client.0.vm03.stdout:1/347: dwrite d0/d2/d34/f3e [0,4194304] 0 2026-03-10T14:08:17.732 INFO:tasks.workunit.client.0.vm03.stdout:4/372: dwrite d5/d9/db/d19/d34/f4c [0,4194304] 0 2026-03-10T14:08:17.733 INFO:tasks.workunit.client.0.vm03.stdout:8/339: truncate da/d24/f43 827114 0 2026-03-10T14:08:17.734 INFO:tasks.workunit.client.0.vm03.stdout:8/340: write da/fe [4199510,105124] 0 2026-03-10T14:08:17.734 INFO:tasks.workunit.client.0.vm03.stdout:1/348: truncate d0/d2/df/f6c 774 0 2026-03-10T14:08:17.738 INFO:tasks.workunit.client.0.vm03.stdout:8/341: dwrite f2 [4194304,4194304] 0 2026-03-10T14:08:17.740 INFO:tasks.workunit.client.1.vm04.stdout:9/824: fsync d9/d58/f62 0 2026-03-10T14:08:17.741 INFO:tasks.workunit.client.1.vm04.stdout:9/825: write d9/da/dd/d1c/f22 [1103206,32452] 0 2026-03-10T14:08:17.741 INFO:tasks.workunit.client.1.vm04.stdout:9/826: stat d9/d33/f76 0 2026-03-10T14:08:17.741 INFO:tasks.workunit.client.1.vm04.stdout:9/827: fsync d9/dd3/f108 0 2026-03-10T14:08:17.746 INFO:tasks.workunit.client.1.vm04.stdout:5/915: creat d7/d2d/d76/f12c x:0 0 0 2026-03-10T14:08:17.761 INFO:tasks.workunit.client.0.vm03.stdout:6/322: creat d8/d1b/f5f x:0 0 0 2026-03-10T14:08:17.762 INFO:tasks.workunit.client.1.vm04.stdout:0/861: mknod d0/da2/c10f 0 2026-03-10T14:08:17.762 INFO:tasks.workunit.client.1.vm04.stdout:5/916: chown d7/d59/d7d/c123 13390 1 2026-03-10T14:08:17.762 INFO:tasks.workunit.client.1.vm04.stdout:0/862: readlink d0/d2/d15/d22/d62/l8f 0 2026-03-10T14:08:17.762 INFO:tasks.workunit.client.1.vm04.stdout:0/863: unlink d0/f99 0 
2026-03-10T14:08:17.762 INFO:tasks.workunit.client.1.vm04.stdout:0/864: chown d0/d2/d15/d22/d38/d56/l9b 6717930 1 2026-03-10T14:08:17.762 INFO:tasks.workunit.client.1.vm04.stdout:0/865: fdatasync d0/d2/d15/d22/d38/d56/f67 0 2026-03-10T14:08:17.762 INFO:tasks.workunit.client.1.vm04.stdout:5/917: getdents d7/d2d/d32/d34 0 2026-03-10T14:08:17.763 INFO:tasks.workunit.client.1.vm04.stdout:8/949: dread d0/d3/d73/db8/d103/d11f/d8a/fa1 [0,4194304] 0 2026-03-10T14:08:17.764 INFO:tasks.workunit.client.1.vm04.stdout:8/950: chown d0/d3/d63/d12/d51/d67/d96/d105/fcc 1071954 1 2026-03-10T14:08:17.765 INFO:tasks.workunit.client.1.vm04.stdout:5/918: creat d7/d12/d2b/f12d x:0 0 0 2026-03-10T14:08:17.767 INFO:tasks.workunit.client.1.vm04.stdout:8/951: creat d0/d3/d73/db8/d103/d11f/d8a/f129 x:0 0 0 2026-03-10T14:08:17.771 INFO:tasks.workunit.client.0.vm03.stdout:4/373: creat d5/d9/db/d19/d38/f77 x:0 0 0 2026-03-10T14:08:17.778 INFO:tasks.workunit.client.0.vm03.stdout:0/302: chown d3/f19 0 1 2026-03-10T14:08:17.779 INFO:tasks.workunit.client.0.vm03.stdout:3/315: dwrite f17 [0,4194304] 0 2026-03-10T14:08:17.786 INFO:tasks.workunit.client.0.vm03.stdout:3/316: dwrite d1d/d33/f3a [0,4194304] 0 2026-03-10T14:08:17.790 INFO:tasks.workunit.client.0.vm03.stdout:3/317: write d1d/d40/f5a [232900,54409] 0 2026-03-10T14:08:17.791 INFO:tasks.workunit.client.0.vm03.stdout:8/342: symlink da/d3a/d44/l6f 0 2026-03-10T14:08:17.800 INFO:tasks.workunit.client.0.vm03.stdout:6/323: mkdir d8/db/d12/d51/d5c/d60 0 2026-03-10T14:08:17.804 INFO:tasks.workunit.client.0.vm03.stdout:4/374: creat d5/d47/d5b/d64/f78 x:0 0 0 2026-03-10T14:08:17.818 INFO:tasks.workunit.client.0.vm03.stdout:4/375: dread - d5/d47/d5b/d64/f78 zero size 2026-03-10T14:08:17.818 INFO:tasks.workunit.client.0.vm03.stdout:3/318: truncate d1d/d29/f2e 1152253 0 2026-03-10T14:08:17.818 INFO:tasks.workunit.client.0.vm03.stdout:3/319: stat d1d/f32 0 2026-03-10T14:08:17.818 INFO:tasks.workunit.client.0.vm03.stdout:3/320: chown d1d/d29/d41 2271 1 
2026-03-10T14:08:17.818 INFO:tasks.workunit.client.0.vm03.stdout:3/321: chown l5 3 1 2026-03-10T14:08:17.818 INFO:tasks.workunit.client.0.vm03.stdout:8/343: readlink da/l1d 0 2026-03-10T14:08:17.818 INFO:tasks.workunit.client.0.vm03.stdout:4/376: creat d5/d47/d5b/f79 x:0 0 0 2026-03-10T14:08:17.822 INFO:tasks.workunit.client.0.vm03.stdout:6/324: link d8/db/df/f27 d8/d1b/f61 0 2026-03-10T14:08:17.823 INFO:tasks.workunit.client.0.vm03.stdout:4/377: dread - d5/d9/db/f12 zero size 2026-03-10T14:08:17.826 INFO:tasks.workunit.client.0.vm03.stdout:4/378: symlink d5/d9/l7a 0 2026-03-10T14:08:17.829 INFO:tasks.workunit.client.0.vm03.stdout:4/379: mkdir d5/d9/db/d19/d38/d7b 0 2026-03-10T14:08:17.836 INFO:tasks.workunit.client.0.vm03.stdout:6/325: link d8/db/d2c/d2d/l4c d8/db/d12/d51/d5c/d60/l62 0 2026-03-10T14:08:17.873 INFO:tasks.workunit.client.0.vm03.stdout:6/326: creat d8/db/df/f63 x:0 0 0 2026-03-10T14:08:17.873 INFO:tasks.workunit.client.0.vm03.stdout:6/327: mkdir d8/db/d12/d64 0 2026-03-10T14:08:17.873 INFO:tasks.workunit.client.0.vm03.stdout:6/328: write d8/db/df/f37 [1006710,64246] 0 2026-03-10T14:08:17.873 INFO:tasks.workunit.client.0.vm03.stdout:6/329: rmdir d8/d1b/d55 0 2026-03-10T14:08:17.873 INFO:tasks.workunit.client.0.vm03.stdout:6/330: write d8/d1b/f5f [99533,76024] 0 2026-03-10T14:08:17.878 INFO:tasks.workunit.client.1.vm04.stdout:5/919: sync 2026-03-10T14:08:17.879 INFO:tasks.workunit.client.1.vm04.stdout:5/920: rmdir d7/d59/d7e/dc6 39 2026-03-10T14:08:17.881 INFO:tasks.workunit.client.1.vm04.stdout:5/921: symlink d7/d12/d2b/d3e/d57/d8a/dec/d11e/l12e 0 2026-03-10T14:08:17.885 INFO:tasks.workunit.client.0.vm03.stdout:2/314: dread d5/d10/d1c/f3c [0,4194304] 0 2026-03-10T14:08:17.895 INFO:tasks.workunit.client.0.vm03.stdout:2/315: fsync d5/d10/d31/f38 0 2026-03-10T14:08:17.895 INFO:tasks.workunit.client.0.vm03.stdout:2/316: fsync d5/d10/f13 0 2026-03-10T14:08:17.895 INFO:tasks.workunit.client.0.vm03.stdout:2/317: creat d5/f61 x:0 0 0 2026-03-10T14:08:17.901 
INFO:tasks.workunit.client.0.vm03.stdout:2/318: dread d5/f9 [0,4194304] 0 2026-03-10T14:08:17.917 INFO:tasks.workunit.client.1.vm04.stdout:2/848: write d0/d14/d91/d3a/d3e/f38 [4164,21055] 0 2026-03-10T14:08:17.918 INFO:tasks.workunit.client.0.vm03.stdout:2/319: dread f1 [0,4194304] 0 2026-03-10T14:08:17.918 INFO:tasks.workunit.client.1.vm04.stdout:2/849: rmdir d0/d14/d91/d8/d17/d4e/dea/dde 39 2026-03-10T14:08:17.921 INFO:tasks.workunit.client.0.vm03.stdout:2/320: truncate d5/d35/f49 645483 0 2026-03-10T14:08:17.922 INFO:tasks.workunit.client.1.vm04.stdout:1/863: truncate d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/f11a 2242148 0 2026-03-10T14:08:17.924 INFO:tasks.workunit.client.1.vm04.stdout:1/864: symlink d3/d22/d63/d35/dd9/d13/d38/df3/l128 0 2026-03-10T14:08:17.929 INFO:tasks.workunit.client.0.vm03.stdout:9/358: sync 2026-03-10T14:08:17.931 INFO:tasks.workunit.client.1.vm04.stdout:1/865: rename d3/d22/d63/d35/dd9/d13/d38/d58/d5b/fa8 to d3/d22/d63/d35/dd9/f129 0 2026-03-10T14:08:17.935 INFO:tasks.workunit.client.0.vm03.stdout:9/359: chown d2/f2f 140542 1 2026-03-10T14:08:17.935 INFO:tasks.workunit.client.1.vm04.stdout:4/808: dwrite d4/df/d31/fae [0,4194304] 0 2026-03-10T14:08:17.935 INFO:tasks.workunit.client.0.vm03.stdout:6/331: sync 2026-03-10T14:08:17.936 INFO:tasks.workunit.client.0.vm03.stdout:2/321: getdents d5/d10/d1c/d40 0 2026-03-10T14:08:17.937 INFO:tasks.workunit.client.1.vm04.stdout:7/894: write d2/dc/de/d2d/d60/d81/f113 [7943727,2456] 0 2026-03-10T14:08:17.941 INFO:tasks.workunit.client.1.vm04.stdout:7/895: mkdir d2/dc/de/d2d/d5c/da9/dee/d13b 0 2026-03-10T14:08:17.952 INFO:tasks.workunit.client.0.vm03.stdout:9/360: creat d2/d14/d2b/d34/f7a x:0 0 0 2026-03-10T14:08:17.953 INFO:tasks.workunit.client.0.vm03.stdout:9/361: write d2/d14/d2b/d43/f78 [280558,48768] 0 2026-03-10T14:08:17.959 INFO:tasks.workunit.client.0.vm03.stdout:7/242: stat d5/d9/f10 0 2026-03-10T14:08:17.963 INFO:tasks.workunit.client.0.vm03.stdout:5/422: write d4/d13/d43/f72 [49879,52533] 0 
2026-03-10T14:08:17.978 INFO:tasks.workunit.client.1.vm04.stdout:4/809: dread d4/df/d31/f3d [0,4194304] 0 2026-03-10T14:08:17.978 INFO:tasks.workunit.client.1.vm04.stdout:4/810: chown d4/d14/d64/ld8 86250 1 2026-03-10T14:08:17.979 INFO:tasks.workunit.client.1.vm04.stdout:4/811: read d4/d14/d64/fd6 [79091,61649] 0 2026-03-10T14:08:17.982 INFO:tasks.workunit.client.0.vm03.stdout:6/332: symlink d8/d11/l65 0 2026-03-10T14:08:17.983 INFO:tasks.workunit.client.1.vm04.stdout:4/812: dwrite d4/d14/d3c/d85/ffd [0,4194304] 0 2026-03-10T14:08:17.985 INFO:tasks.workunit.client.1.vm04.stdout:4/813: symlink d4/df7/l10b 0 2026-03-10T14:08:17.987 INFO:tasks.workunit.client.0.vm03.stdout:9/362: creat d2/d29/d33/d41/f7b x:0 0 0 2026-03-10T14:08:17.988 INFO:tasks.workunit.client.0.vm03.stdout:5/423: mkdir d4/d13/d1f/d8c 0 2026-03-10T14:08:17.995 INFO:tasks.workunit.client.0.vm03.stdout:6/333: truncate d8/fd 5184072 0 2026-03-10T14:08:17.995 INFO:tasks.workunit.client.0.vm03.stdout:7/243: read d5/d9/f19 [1185549,109918] 0 2026-03-10T14:08:18.008 INFO:tasks.workunit.client.0.vm03.stdout:5/424: dwrite d4/d16/d19/f79 [0,4194304] 0 2026-03-10T14:08:18.015 INFO:tasks.workunit.client.0.vm03.stdout:7/244: symlink d5/d9/d35/l4a 0 2026-03-10T14:08:18.017 INFO:tasks.workunit.client.0.vm03.stdout:5/425: rename d4/d6/c88 to d4/d13/d1f/d8c/c8d 0 2026-03-10T14:08:18.021 INFO:tasks.workunit.client.0.vm03.stdout:5/426: chown d4/d40/d4e/c66 17 1 2026-03-10T14:08:18.023 INFO:tasks.workunit.client.1.vm04.stdout:3/855: write da/dc/d35/f64 [142743,43494] 0 2026-03-10T14:08:18.025 INFO:tasks.workunit.client.1.vm04.stdout:5/922: truncate d7/f21 1302407 0 2026-03-10T14:08:18.031 INFO:tasks.workunit.client.1.vm04.stdout:3/856: creat da/dc/d35/d52/d6d/f124 x:0 0 0 2026-03-10T14:08:18.031 INFO:tasks.workunit.client.1.vm04.stdout:3/857: dread - da/d3e/fb9 zero size 2026-03-10T14:08:18.032 INFO:tasks.workunit.client.1.vm04.stdout:3/858: stat da/d30/fd0 0 2026-03-10T14:08:18.034 
INFO:tasks.workunit.client.1.vm04.stdout:5/923: rename d7/l65 to d7/d12/d2b/d93/d9e/l12f 0 2026-03-10T14:08:18.039 INFO:tasks.workunit.client.1.vm04.stdout:9/828: dwrite d9/da/f9e [0,4194304] 0 2026-03-10T14:08:18.047 INFO:tasks.workunit.client.1.vm04.stdout:3/859: fdatasync da/dc/fc2 0 2026-03-10T14:08:18.047 INFO:tasks.workunit.client.0.vm03.stdout:3/322: write d1d/f26 [225149,75929] 0 2026-03-10T14:08:18.047 INFO:tasks.workunit.client.0.vm03.stdout:4/380: write d5/d9/db/d19/d34/f5c [811924,53235] 0 2026-03-10T14:08:18.048 INFO:tasks.workunit.client.1.vm04.stdout:5/924: mknod d7/d59/c130 0 2026-03-10T14:08:18.049 INFO:tasks.workunit.client.0.vm03.stdout:4/381: write d5/d9/db/d19/d38/d53/f59 [2448404,16598] 0 2026-03-10T14:08:18.051 INFO:tasks.workunit.client.1.vm04.stdout:0/866: dwrite d0/d2/f2d [0,4194304] 0 2026-03-10T14:08:18.053 INFO:tasks.workunit.client.0.vm03.stdout:1/349: dwrite d0/d2/f9 [0,4194304] 0 2026-03-10T14:08:18.054 INFO:tasks.workunit.client.0.vm03.stdout:0/303: dwrite d3/d16/f2d [0,4194304] 0 2026-03-10T14:08:18.055 INFO:tasks.workunit.client.0.vm03.stdout:0/304: write d3/d16/d21/d3c/f5b [329947,121485] 0 2026-03-10T14:08:18.063 INFO:tasks.workunit.client.1.vm04.stdout:8/952: write d0/d3/dd/d78/fae [4838439,124568] 0 2026-03-10T14:08:18.069 INFO:tasks.workunit.client.1.vm04.stdout:5/925: rename d7/d2d/d32/c8b to d7/d12/d2b/d8c/c131 0 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.0.vm03.stdout:7/245: symlink d5/d9/d3e/l4b 0 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.0.vm03.stdout:3/323: creat d1d/d33/f5e x:0 0 0 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.0.vm03.stdout:3/324: readlink l14 0 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.0.vm03.stdout:3/325: write f17 [2471264,112620] 0 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.1.vm04.stdout:6/739: write d3/de/d35/d3f/d2d/fc7 [1270393,102128] 0 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.1.vm04.stdout:9/829: read - d9/d58/db5/da5/dab/fbb zero size 
2026-03-10T14:08:18.076 INFO:tasks.workunit.client.1.vm04.stdout:0/867: chown d0/d2/d15/d22/d38/d56/da7/lf3 14 1 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.1.vm04.stdout:8/953: creat d0/d82/f12a x:0 0 0 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.1.vm04.stdout:8/954: write d0/f127 [567930,68018] 0 2026-03-10T14:08:18.076 INFO:tasks.workunit.client.1.vm04.stdout:2/850: dwrite d0/d14/d91/d8/f30 [0,4194304] 0 2026-03-10T14:08:18.077 INFO:tasks.workunit.client.1.vm04.stdout:2/851: truncate d0/d14/d91/d3a/ffd 510665 0 2026-03-10T14:08:18.084 INFO:tasks.workunit.client.1.vm04.stdout:8/955: dwrite d0/f42 [0,4194304] 0 2026-03-10T14:08:18.084 INFO:tasks.workunit.client.0.vm03.stdout:4/382: rename d5/d9/db/d19/f45 to d5/d47/d5b/f7c 0 2026-03-10T14:08:18.085 INFO:tasks.workunit.client.1.vm04.stdout:1/866: write d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92/f93 [3981656,55848] 0 2026-03-10T14:08:18.089 INFO:tasks.workunit.client.1.vm04.stdout:7/896: dwrite d2/d2a/d42/d86/f104 [0,4194304] 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.1.vm04.stdout:4/814: truncate d4/df/db2/db4/d47/d4f/ff0 584994 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.1.vm04.stdout:0/868: creat d0/d2/dc9/f110 x:0 0 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.1.vm04.stdout:9/830: readlink d9/da/dd/d1c/da3/lac 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.1.vm04.stdout:8/956: fdatasync d0/d3/d63/d29/f45 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.0.vm03.stdout:7/246: mknod d5/d9/d3e/c4c 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.0.vm03.stdout:3/326: mknod d1d/d39/d51/c5f 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.0.vm03.stdout:0/305: mkdir d3/d17/d5c 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.0.vm03.stdout:3/327: chown d1d/l22 94265 1 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.0.vm03.stdout:0/306: dwrite d3/f1e [0,4194304] 0 2026-03-10T14:08:18.103 INFO:tasks.workunit.client.1.vm04.stdout:6/740: sync 
2026-03-10T14:08:18.103 INFO:tasks.workunit.client.1.vm04.stdout:5/926: sync 2026-03-10T14:08:18.111 INFO:tasks.workunit.client.1.vm04.stdout:1/867: dread d3/d5c/f71 [4194304,4194304] 0 2026-03-10T14:08:18.111 INFO:tasks.workunit.client.1.vm04.stdout:7/897: stat d2/dc/de/d11/la1 0 2026-03-10T14:08:18.116 INFO:tasks.workunit.client.1.vm04.stdout:4/815: rename d4/df/d34/ccc to d4/df/db2/db6/c10c 0 2026-03-10T14:08:18.121 INFO:tasks.workunit.client.0.vm03.stdout:5/427: creat d4/d35/f8e x:0 0 0 2026-03-10T14:08:18.121 INFO:tasks.workunit.client.1.vm04.stdout:0/869: creat d0/d6e/f111 x:0 0 0 2026-03-10T14:08:18.133 INFO:tasks.workunit.client.0.vm03.stdout:1/350: getdents d0/d18/d1d 0 2026-03-10T14:08:18.133 INFO:tasks.workunit.client.1.vm04.stdout:8/957: getdents d0/d3/d73/db8/dd5/d123 0 2026-03-10T14:08:18.133 INFO:tasks.workunit.client.0.vm03.stdout:1/351: dread - d0/d2/df/d16/f1e zero size 2026-03-10T14:08:18.135 INFO:tasks.workunit.client.1.vm04.stdout:5/927: dread - d7/d2d/d69/db8/f103 zero size 2026-03-10T14:08:18.139 INFO:tasks.workunit.client.0.vm03.stdout:1/352: dwrite d0/d18/d1d/f5e [0,4194304] 0 2026-03-10T14:08:18.148 INFO:tasks.workunit.client.1.vm04.stdout:6/741: dread d3/de/d35/d3f/d2d/d32/d23/f31 [4194304,4194304] 0 2026-03-10T14:08:18.149 INFO:tasks.workunit.client.1.vm04.stdout:6/742: chown d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/fb7 711203088 1 2026-03-10T14:08:18.151 INFO:tasks.workunit.client.0.vm03.stdout:5/428: truncate d4/d6/fa 2629738 0 2026-03-10T14:08:18.151 INFO:tasks.workunit.client.1.vm04.stdout:6/743: dwrite d3/de/d35/d3f/d2d/f2e [0,4194304] 0 2026-03-10T14:08:18.153 INFO:tasks.workunit.client.1.vm04.stdout:8/958: mknod d0/d3/d73/db8/dd5/d123/c12b 0 2026-03-10T14:08:18.161 INFO:tasks.workunit.client.0.vm03.stdout:0/307: mknod d3/d17/d5c/c5d 0 2026-03-10T14:08:18.162 INFO:tasks.workunit.client.1.vm04.stdout:6/744: symlink d3/le6 0 2026-03-10T14:08:18.170 INFO:tasks.workunit.client.0.vm03.stdout:1/353: rename d0/d18/f60 to d0/d18/d1d/f6f 0 
2026-03-10T14:08:18.173 INFO:tasks.workunit.client.0.vm03.stdout:5/429: truncate d4/d13/d1f/f21 2072791 0 2026-03-10T14:08:18.181 INFO:tasks.workunit.client.0.vm03.stdout:0/308: mkdir d3/d46/d5e 0 2026-03-10T14:08:18.191 INFO:tasks.workunit.client.0.vm03.stdout:0/309: creat d3/d4d/d47/f5f x:0 0 0 2026-03-10T14:08:18.195 INFO:tasks.workunit.client.0.vm03.stdout:5/430: mkdir d4/d13/d8f 0 2026-03-10T14:08:18.199 INFO:tasks.workunit.client.0.vm03.stdout:1/354: creat d0/f70 x:0 0 0 2026-03-10T14:08:18.200 INFO:tasks.workunit.client.0.vm03.stdout:5/431: mknod d4/d16/d19/d23/c90 0 2026-03-10T14:08:18.201 INFO:tasks.workunit.client.0.vm03.stdout:5/432: readlink d4/d16/d19/d6e/l59 0 2026-03-10T14:08:18.201 INFO:tasks.workunit.client.0.vm03.stdout:5/433: stat d4/d16/d19/l26 0 2026-03-10T14:08:18.210 INFO:tasks.workunit.client.0.vm03.stdout:4/383: read f3 [6423901,97987] 0 2026-03-10T14:08:18.213 INFO:tasks.workunit.client.1.vm04.stdout:9/831: write d9/da/dd/f47 [5053387,93356] 0 2026-03-10T14:08:18.213 INFO:tasks.workunit.client.1.vm04.stdout:9/832: mkdir d9/d44/d70/d120 0 2026-03-10T14:08:18.214 INFO:tasks.workunit.client.1.vm04.stdout:9/833: mkdir d9/da/d8c/d121 0 2026-03-10T14:08:18.215 INFO:tasks.workunit.client.1.vm04.stdout:9/834: dread - d9/d44/d70/f106 zero size 2026-03-10T14:08:18.215 INFO:tasks.workunit.client.1.vm04.stdout:9/835: chown d9/da/d5d/fde 7198602 1 2026-03-10T14:08:18.216 INFO:tasks.workunit.client.1.vm04.stdout:9/836: write d9/da/dd/d74/fe1 [139813,85569] 0 2026-03-10T14:08:18.221 INFO:tasks.workunit.client.1.vm04.stdout:2/852: dread d0/db8/fca [0,4194304] 0 2026-03-10T14:08:18.228 INFO:tasks.workunit.client.0.vm03.stdout:5/434: chown d4/d6/c45 17 1 2026-03-10T14:08:18.229 INFO:tasks.workunit.client.1.vm04.stdout:2/853: creat d0/d14/d91/d4a/d8c/d92/f102 x:0 0 0 2026-03-10T14:08:18.229 INFO:tasks.workunit.client.1.vm04.stdout:2/854: mkdir d0/d14/d91/d8/d17/d4e/d85/d86/d96/d103 0 2026-03-10T14:08:18.229 INFO:tasks.workunit.client.1.vm04.stdout:2/855: 
symlink d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/l104 0 2026-03-10T14:08:18.229 INFO:tasks.workunit.client.1.vm04.stdout:2/856: dwrite d0/d14/f6b [0,4194304] 0 2026-03-10T14:08:18.233 INFO:tasks.workunit.client.1.vm04.stdout:7/898: write d2/dc/de/d2d/d38/fa5 [870925,46225] 0 2026-03-10T14:08:18.234 INFO:tasks.workunit.client.1.vm04.stdout:7/899: chown d2/d94/f9a 14 1 2026-03-10T14:08:18.234 INFO:tasks.workunit.client.1.vm04.stdout:7/900: stat d2/dc/de/d2d/d60/d7c/df8 0 2026-03-10T14:08:18.235 INFO:tasks.workunit.client.1.vm04.stdout:7/901: readlink d2/dc/de/d2d/d60/d7c/d36/l9d 0 2026-03-10T14:08:18.239 INFO:tasks.workunit.client.0.vm03.stdout:4/384: mknod d5/d9/db/d19/d38/d53/c7d 0 2026-03-10T14:08:18.243 INFO:tasks.workunit.client.1.vm04.stdout:2/857: fsync d0/d14/d91/d8/d17/d35/f81 0 2026-03-10T14:08:18.248 INFO:tasks.workunit.client.1.vm04.stdout:7/902: creat d2/dc/de/d2d/d5c/da9/df6/dfe/f13c x:0 0 0 2026-03-10T14:08:18.248 INFO:tasks.workunit.client.1.vm04.stdout:4/816: dwrite d4/df/db2/db4/d47/d4f/f84 [0,4194304] 0 2026-03-10T14:08:18.255 INFO:tasks.workunit.client.0.vm03.stdout:5/435: link d4/d16/d19/d6e/f85 d4/d13/d8f/f91 0 2026-03-10T14:08:18.255 INFO:tasks.workunit.client.0.vm03.stdout:6/334: dread d8/d1b/d1c/f50 [0,4194304] 0 2026-03-10T14:08:18.255 INFO:tasks.workunit.client.1.vm04.stdout:2/858: rename d0/d14/d91/d4a/d8c/dab/f33 to d0/d14/d91/d4a/d66/dcd/f105 0 2026-03-10T14:08:18.256 INFO:tasks.workunit.client.0.vm03.stdout:4/385: mkdir d5/d47/d62/d7e 0 2026-03-10T14:08:18.256 INFO:tasks.workunit.client.1.vm04.stdout:7/903: mkdir d2/dc/de/d2d/d60/d7c/d36/d8b/d13d 0 2026-03-10T14:08:18.257 INFO:tasks.workunit.client.1.vm04.stdout:7/904: chown d2/dc/d4d/dcd/f100 13 1 2026-03-10T14:08:18.261 INFO:tasks.workunit.client.1.vm04.stdout:4/817: symlink d4/d14/d6d/df5/l10d 0 2026-03-10T14:08:18.264 INFO:tasks.workunit.client.1.vm04.stdout:7/905: creat d2/dc/de/d2d/d60/d81/db3/f13e x:0 0 0 2026-03-10T14:08:18.264 INFO:tasks.workunit.client.0.vm03.stdout:0/310: rename 
d3/d11/d2c/f3b to d3/d17/f60 0 2026-03-10T14:08:18.269 INFO:tasks.workunit.client.1.vm04.stdout:7/906: symlink d2/d2a/d42/d86/l13f 0 2026-03-10T14:08:18.275 INFO:tasks.workunit.client.1.vm04.stdout:7/907: mknod d2/dd5/c140 0 2026-03-10T14:08:18.278 INFO:tasks.workunit.client.0.vm03.stdout:6/335: creat d8/db/d2c/d2d/f66 x:0 0 0 2026-03-10T14:08:18.304 INFO:tasks.workunit.client.1.vm04.stdout:7/908: rmdir d2/d2a 39 2026-03-10T14:08:18.304 INFO:tasks.workunit.client.0.vm03.stdout:6/336: creat d8/db/d12/d51/d5c/d60/f67 x:0 0 0 2026-03-10T14:08:18.304 INFO:tasks.workunit.client.0.vm03.stdout:6/337: dwrite d8/d11/d18/d54/f5b [0,4194304] 0 2026-03-10T14:08:18.304 INFO:tasks.workunit.client.0.vm03.stdout:6/338: dread - d8/db/d12/d51/d5c/d60/f67 zero size 2026-03-10T14:08:18.311 INFO:tasks.workunit.client.1.vm04.stdout:1/868: write d3/d22/fe1 [1271331,11528] 0 2026-03-10T14:08:18.315 INFO:tasks.workunit.client.1.vm04.stdout:5/928: write d7/d2d/fbf [592840,51816] 0 2026-03-10T14:08:18.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:18 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:18.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:18 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:18.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:18 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:08:18.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:18 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:18.316 INFO:tasks.workunit.client.1.vm04.stdout:6/745: write d3/ff [76350,94786] 0 2026-03-10T14:08:18.316 INFO:tasks.workunit.client.1.vm04.stdout:8/959: write d0/d3/d73/db8/d103/d11f/d8a/fdf [132372,66458] 0 
2026-03-10T14:08:18.318 INFO:tasks.workunit.client.1.vm04.stdout:8/960: stat d0/d3/d63/d12/d51/f64 0 2026-03-10T14:08:18.319 INFO:tasks.workunit.client.1.vm04.stdout:0/870: dwrite d0/d2/d25/f2a [4194304,4194304] 0 2026-03-10T14:08:18.322 INFO:tasks.workunit.client.1.vm04.stdout:8/961: chown d0/d3/d63/d12/d51/d67/d96/d105/d117 929 1 2026-03-10T14:08:18.329 INFO:tasks.workunit.client.1.vm04.stdout:1/869: creat d3/d22/d63/d35/dd9/d13/d38/db5/dc4/f12a x:0 0 0 2026-03-10T14:08:18.333 INFO:tasks.workunit.client.1.vm04.stdout:0/871: creat d0/d2/d15/d49/d50/d5c/da4/f112 x:0 0 0 2026-03-10T14:08:18.337 INFO:tasks.workunit.client.0.vm03.stdout:5/436: dread d4/fd [0,4194304] 0 2026-03-10T14:08:18.339 INFO:tasks.workunit.client.0.vm03.stdout:0/311: sync 2026-03-10T14:08:18.339 INFO:tasks.workunit.client.1.vm04.stdout:6/746: getdents d3/de/d35/d3f/d2d/d32 0 2026-03-10T14:08:18.340 INFO:tasks.workunit.client.1.vm04.stdout:5/929: sync 2026-03-10T14:08:18.342 INFO:tasks.workunit.client.1.vm04.stdout:0/872: creat d0/d2/d15/d49/d50/d61/f113 x:0 0 0 2026-03-10T14:08:18.342 INFO:tasks.workunit.client.1.vm04.stdout:0/873: chown d0/d2/f13 497 1 2026-03-10T14:08:18.344 INFO:tasks.workunit.client.1.vm04.stdout:8/962: getdents d0/d3/dd/d89 0 2026-03-10T14:08:18.345 INFO:tasks.workunit.client.1.vm04.stdout:1/870: getdents d3/d22/d2f/d109 0 2026-03-10T14:08:18.346 INFO:tasks.workunit.client.1.vm04.stdout:1/871: chown d3/d22/d63/d35/dd9/d13/da0/dc5/dfa/c104 26901967 1 2026-03-10T14:08:18.347 INFO:tasks.workunit.client.1.vm04.stdout:1/872: write d3/d22/d2f/f34 [3649817,7896] 0 2026-03-10T14:08:18.353 INFO:tasks.workunit.client.1.vm04.stdout:5/930: symlink d7/d12/d2b/d3e/d57/l132 0 2026-03-10T14:08:18.357 INFO:tasks.workunit.client.1.vm04.stdout:5/931: creat d7/d12/d2b/d3e/f133 x:0 0 0 2026-03-10T14:08:18.359 INFO:tasks.workunit.client.1.vm04.stdout:1/873: getdents d3/d22/d63/ded 0 2026-03-10T14:08:18.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:18 vm03.local ceph-mon[49718]: 
from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:18.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:18 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:18.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:18 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:08:18.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:18 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:18.361 INFO:tasks.workunit.client.1.vm04.stdout:8/963: creat d0/d3/f12c x:0 0 0 2026-03-10T14:08:18.363 INFO:tasks.workunit.client.1.vm04.stdout:1/874: rename d3/d20 to d3/d22/d2f/d109/d12b 0 2026-03-10T14:08:18.364 INFO:tasks.workunit.client.1.vm04.stdout:9/837: write d9/da/dd/f48 [5282671,107193] 0 2026-03-10T14:08:18.368 INFO:tasks.workunit.client.1.vm04.stdout:1/875: mknod d3/d22/d2f/c12c 0 2026-03-10T14:08:18.368 INFO:tasks.workunit.client.1.vm04.stdout:1/876: stat d3/f18 0 2026-03-10T14:08:18.370 INFO:tasks.workunit.client.1.vm04.stdout:8/964: getdents d0/d3/dd/dec 0 2026-03-10T14:08:18.373 INFO:tasks.workunit.client.1.vm04.stdout:9/838: creat d9/d44/f122 x:0 0 0 2026-03-10T14:08:18.389 INFO:tasks.workunit.client.1.vm04.stdout:2/859: rmdir d0/d14/d91/d4a/d66/dcd 39 2026-03-10T14:08:18.393 INFO:tasks.workunit.client.1.vm04.stdout:2/860: mknod d0/d14/d39/d47/d70/dc3/c106 0 2026-03-10T14:08:18.395 INFO:tasks.workunit.client.1.vm04.stdout:2/861: mkdir d0/d14/d91/d4a/d8c/dab/d95/d107 0 2026-03-10T14:08:18.396 INFO:tasks.workunit.client.1.vm04.stdout:8/965: sync 2026-03-10T14:08:18.400 INFO:tasks.workunit.client.0.vm03.stdout:0/312: dread - d3/d16/d21/f50 zero size 2026-03-10T14:08:18.400 INFO:tasks.workunit.client.1.vm04.stdout:4/818: write d4/d14/d3c/f46 [896460,52154] 0 
2026-03-10T14:08:18.400 INFO:tasks.workunit.client.0.vm03.stdout:0/313: fdatasync d3/d4d/f49 0 2026-03-10T14:08:18.400 INFO:tasks.workunit.client.0.vm03.stdout:0/314: dread - d3/d4d/d47/f5f zero size 2026-03-10T14:08:18.402 INFO:tasks.workunit.client.1.vm04.stdout:2/862: creat d0/d14/d91/d3a/f108 x:0 0 0 2026-03-10T14:08:18.403 INFO:tasks.workunit.client.1.vm04.stdout:2/863: read d0/d14/d91/d3a/fb5 [932182,13903] 0 2026-03-10T14:08:18.404 INFO:tasks.workunit.client.1.vm04.stdout:8/966: dread - d0/d3/d63/d12/d51/d67/ff0 zero size 2026-03-10T14:08:18.404 INFO:tasks.workunit.client.1.vm04.stdout:2/864: chown d0/d14/d91/d4a/d8c/dab/d46/dc8/cdd 1779 1 2026-03-10T14:08:18.406 INFO:tasks.workunit.client.1.vm04.stdout:8/967: read d0/d3/d5/f66 [1422517,63219] 0 2026-03-10T14:08:18.407 INFO:tasks.workunit.client.1.vm04.stdout:8/968: truncate d0/d3/d5/f30 4586533 0 2026-03-10T14:08:18.408 INFO:tasks.workunit.client.1.vm04.stdout:8/969: read d0/d3/dd/d78/fae [4442929,29033] 0 2026-03-10T14:08:18.411 INFO:tasks.workunit.client.1.vm04.stdout:6/747: read d3/ff [277652,28022] 0 2026-03-10T14:08:18.412 INFO:tasks.workunit.client.1.vm04.stdout:8/970: dwrite d0/d3/d5/f30 [4194304,4194304] 0 2026-03-10T14:08:18.419 INFO:tasks.workunit.client.1.vm04.stdout:0/874: write d0/d2/d15/d22/f30 [77454,64241] 0 2026-03-10T14:08:18.423 INFO:tasks.workunit.client.1.vm04.stdout:2/865: truncate d0/d14/d91/d8/d17/f73 2264977 0 2026-03-10T14:08:18.424 INFO:tasks.workunit.client.0.vm03.stdout:0/315: creat d3/d46/d54/f61 x:0 0 0 2026-03-10T14:08:18.424 INFO:tasks.workunit.client.1.vm04.stdout:7/909: dwrite d2/dc/de/d2d/d5c/da9/fd6 [0,4194304] 0 2026-03-10T14:08:18.427 INFO:tasks.workunit.client.1.vm04.stdout:7/910: chown d2/dc/de/d11/c30 91252 1 2026-03-10T14:08:18.427 INFO:tasks.workunit.client.1.vm04.stdout:6/748: creat d3/de/d35/d3f/d2d/d32/d23/ddc/fe7 x:0 0 0 2026-03-10T14:08:18.427 INFO:tasks.workunit.client.0.vm03.stdout:0/316: truncate d3/d11/f13 1318079 0 2026-03-10T14:08:18.427 
INFO:tasks.workunit.client.1.vm04.stdout:6/749: dread - d3/d1d/f6c zero size 2026-03-10T14:08:18.428 INFO:tasks.workunit.client.0.vm03.stdout:0/317: write d3/d4d/d30/f32 [4356354,81216] 0 2026-03-10T14:08:18.428 INFO:tasks.workunit.client.0.vm03.stdout:0/318: stat d3/l4e 0 2026-03-10T14:08:18.439 INFO:tasks.workunit.client.1.vm04.stdout:9/839: dwrite d9/d44/f51 [0,4194304] 0 2026-03-10T14:08:18.442 INFO:tasks.workunit.client.1.vm04.stdout:5/932: chown d7/d12/d2b/d3e/d57/d8a/fd6 537 1 2026-03-10T14:08:18.452 INFO:tasks.workunit.client.1.vm04.stdout:6/750: dwrite d3/de/d35/d3a/fb5 [0,4194304] 0 2026-03-10T14:08:18.456 INFO:tasks.workunit.client.1.vm04.stdout:2/866: dread d0/d14/d39/d47/d70/f8d [0,4194304] 0 2026-03-10T14:08:18.464 INFO:tasks.workunit.client.1.vm04.stdout:3/860: dread da/dc/f1d [0,4194304] 0 2026-03-10T14:08:18.467 INFO:tasks.workunit.client.1.vm04.stdout:9/840: creat d9/d44/d4d/d7d/f123 x:0 0 0 2026-03-10T14:08:18.468 INFO:tasks.workunit.client.1.vm04.stdout:3/861: readlink da/dc/d3f/d61/dc1/ld3 0 2026-03-10T14:08:18.468 INFO:tasks.workunit.client.1.vm04.stdout:9/841: chown d9/d44/d4d/f99 653146 1 2026-03-10T14:08:18.469 INFO:tasks.workunit.client.0.vm03.stdout:2/322: dwrite d5/d10/d17/f20 [0,4194304] 0 2026-03-10T14:08:18.474 INFO:tasks.workunit.client.0.vm03.stdout:0/319: link d3/d46/d54/c58 d3/d4d/c62 0 2026-03-10T14:08:18.474 INFO:tasks.workunit.client.0.vm03.stdout:2/323: write d5/d10/f13 [4629389,104943] 0 2026-03-10T14:08:18.475 INFO:tasks.workunit.client.0.vm03.stdout:9/363: dwrite d2/d29/d38/f47 [0,4194304] 0 2026-03-10T14:08:18.477 INFO:tasks.workunit.client.1.vm04.stdout:3/862: sync 2026-03-10T14:08:18.482 INFO:tasks.workunit.client.1.vm04.stdout:7/911: dread d2/dc/de/d2d/d60/d7c/d36/d8b/fb8 [4194304,4194304] 0 2026-03-10T14:08:18.492 INFO:tasks.workunit.client.1.vm04.stdout:4/819: link d4/d14/d3c/d85/la5 d4/df/db2/l10e 0 2026-03-10T14:08:18.494 INFO:tasks.workunit.client.1.vm04.stdout:2/867: creat d0/d14/deb/f109 x:0 0 0 
2026-03-10T14:08:18.495 INFO:tasks.workunit.client.1.vm04.stdout:3/863: creat da/dc/d47/d9b/d106/dde/f125 x:0 0 0 2026-03-10T14:08:18.496 INFO:tasks.workunit.client.0.vm03.stdout:4/386: getdents d5/d9/db/d19 0 2026-03-10T14:08:18.497 INFO:tasks.workunit.client.1.vm04.stdout:3/864: dread da/dc/f1d [0,4194304] 0 2026-03-10T14:08:18.499 INFO:tasks.workunit.client.1.vm04.stdout:5/933: creat d7/d2d/d69/f134 x:0 0 0 2026-03-10T14:08:18.499 INFO:tasks.workunit.client.1.vm04.stdout:2/868: creat d0/d14/d54/f10a x:0 0 0 2026-03-10T14:08:18.500 INFO:tasks.workunit.client.1.vm04.stdout:7/912: mknod d2/dc/de/d2d/d38/d50/dc8/d10e/c141 0 2026-03-10T14:08:18.500 INFO:tasks.workunit.client.1.vm04.stdout:3/865: rename da/dc/d47/fbd to da/dc/d47/d9b/f126 0 2026-03-10T14:08:18.501 INFO:tasks.workunit.client.1.vm04.stdout:9/842: link d9/d5c/l60 d9/da/dd/d74/l124 0 2026-03-10T14:08:18.502 INFO:tasks.workunit.client.1.vm04.stdout:9/843: chown d9/d58/db5/c97 105 1 2026-03-10T14:08:18.505 INFO:tasks.workunit.client.1.vm04.stdout:2/869: symlink d0/d14/d91/d3a/d3e/l10b 0 2026-03-10T14:08:18.506 INFO:tasks.workunit.client.1.vm04.stdout:2/870: chown d0/d14/d91/d4a/d8c/dab/d95/la1 412987011 1 2026-03-10T14:08:18.508 INFO:tasks.workunit.client.1.vm04.stdout:7/913: mkdir d2/dc/de/d2d/d60/d81/db3/d142 0 2026-03-10T14:08:18.510 INFO:tasks.workunit.client.1.vm04.stdout:9/844: unlink d9/da/dd/d74/fa0 0 2026-03-10T14:08:18.510 INFO:tasks.workunit.client.1.vm04.stdout:9/845: chown d9/ff 11551572 1 2026-03-10T14:08:18.511 INFO:tasks.workunit.client.1.vm04.stdout:5/934: rename d7/d12/d2b/d3e/d3f/dc0/l111 to d7/d12/d2b/d3e/d3f/dc0/l135 0 2026-03-10T14:08:18.514 INFO:tasks.workunit.client.1.vm04.stdout:3/866: mkdir da/dc/d35/d37/d127 0 2026-03-10T14:08:18.518 INFO:tasks.workunit.client.1.vm04.stdout:9/846: rename d9/d5c/f77 to d9/d44/d70/f125 0 2026-03-10T14:08:18.518 INFO:tasks.workunit.client.1.vm04.stdout:9/847: chown d9/da/dd/de7/d96/f114 1699 1 2026-03-10T14:08:18.522 
INFO:tasks.workunit.client.1.vm04.stdout:5/935: symlink d7/d2d/l136 0 2026-03-10T14:08:18.525 INFO:tasks.workunit.client.1.vm04.stdout:2/871: link d0/d14/d91/d4a/d8c/dab/d95/lc4 d0/d14/d1b/d45/da5/l10c 0 2026-03-10T14:08:18.526 INFO:tasks.workunit.client.1.vm04.stdout:9/848: mkdir d9/da/dd/d1c/da3/dec/d126 0 2026-03-10T14:08:18.527 INFO:tasks.workunit.client.1.vm04.stdout:2/872: mkdir d0/d14/d91/d8/d10d 0 2026-03-10T14:08:18.534 INFO:tasks.workunit.client.0.vm03.stdout:9/364: dread d2/d14/d2b/d43/f78 [0,4194304] 0 2026-03-10T14:08:18.534 INFO:tasks.workunit.client.1.vm04.stdout:9/849: creat d9/da/d11f/f127 x:0 0 0 2026-03-10T14:08:18.534 INFO:tasks.workunit.client.1.vm04.stdout:5/936: link d7/d12/d2b/d3e/d57/l6f d7/d12/d2b/d93/l137 0 2026-03-10T14:08:18.534 INFO:tasks.workunit.client.1.vm04.stdout:9/850: dwrite d9/d44/d4d/d7d/f123 [0,4194304] 0 2026-03-10T14:08:18.546 INFO:tasks.workunit.client.1.vm04.stdout:5/937: write f4 [4532,62310] 0 2026-03-10T14:08:18.547 INFO:tasks.workunit.client.0.vm03.stdout:8/344: truncate da/d24/f43 614519 0 2026-03-10T14:08:18.550 INFO:tasks.workunit.client.1.vm04.stdout:9/851: creat d9/d5c/dc2/f128 x:0 0 0 2026-03-10T14:08:18.550 INFO:tasks.workunit.client.1.vm04.stdout:9/852: stat d9/d58/db5/c26 0 2026-03-10T14:08:18.552 INFO:tasks.workunit.client.1.vm04.stdout:9/853: readlink d9/d58/db5/la1 0 2026-03-10T14:08:18.552 INFO:tasks.workunit.client.1.vm04.stdout:8/971: dwrite d0/d3/d63/d12/d51/d67/d96/d105/d117/f11d [0,4194304] 0 2026-03-10T14:08:18.555 INFO:tasks.workunit.client.1.vm04.stdout:2/873: creat d0/d14/d91/d8/d17/d4e/d85/f10e x:0 0 0 2026-03-10T14:08:18.556 INFO:tasks.workunit.client.1.vm04.stdout:2/874: readlink d0/d14/d91/d4a/d8c/dab/l77 0 2026-03-10T14:08:18.556 INFO:tasks.workunit.client.1.vm04.stdout:0/875: write d0/d2/d15/d22/d38/f5b [599115,38750] 0 2026-03-10T14:08:18.557 INFO:tasks.workunit.client.1.vm04.stdout:0/876: write d0/d2/d15/f59 [1194102,95489] 0 2026-03-10T14:08:18.559 
INFO:tasks.workunit.client.1.vm04.stdout:5/938: mkdir d7/d12/d2b/d93/d138 0 2026-03-10T14:08:18.559 INFO:tasks.workunit.client.0.vm03.stdout:9/365: unlink d2/d29/f73 0 2026-03-10T14:08:18.565 INFO:tasks.workunit.client.0.vm03.stdout:8/345: mkdir da/d36/d40/d50/d70 0 2026-03-10T14:08:18.568 INFO:tasks.workunit.client.0.vm03.stdout:8/346: fsync da/d24/f25 0 2026-03-10T14:08:18.568 INFO:tasks.workunit.client.1.vm04.stdout:2/875: symlink d0/d14/d39/l10f 0 2026-03-10T14:08:18.569 INFO:tasks.workunit.client.1.vm04.stdout:9/854: dread d9/da/dd/f85 [0,4194304] 0 2026-03-10T14:08:18.573 INFO:tasks.workunit.client.0.vm03.stdout:8/347: dread f2 [8388608,4194304] 0 2026-03-10T14:08:18.576 INFO:tasks.workunit.client.1.vm04.stdout:0/877: chown d0/d2/d15/d49/d50/d61/la8 3085 1 2026-03-10T14:08:18.576 INFO:tasks.workunit.client.1.vm04.stdout:5/939: chown d7/d9/db5/fd1 31181 1 2026-03-10T14:08:18.581 INFO:tasks.workunit.client.0.vm03.stdout:9/366: dread d2/d14/d2b/d43/f5b [0,4194304] 0 2026-03-10T14:08:18.582 INFO:tasks.workunit.client.1.vm04.stdout:0/878: read d0/d2/d15/d49/d50/d61/d75/f80 [3663570,47189] 0 2026-03-10T14:08:18.584 INFO:tasks.workunit.client.1.vm04.stdout:0/879: dread d0/d2/d15/d49/d50/d5c/dd8/f105 [0,4194304] 0 2026-03-10T14:08:18.591 INFO:tasks.workunit.client.1.vm04.stdout:0/880: dread d0/d2/d15/f2f [0,4194304] 0 2026-03-10T14:08:18.600 INFO:tasks.workunit.client.1.vm04.stdout:2/876: fsync d0/d14/d91/f11 0 2026-03-10T14:08:18.605 INFO:tasks.workunit.client.1.vm04.stdout:9/855: creat d9/da/d8c/de5/d11d/f129 x:0 0 0 2026-03-10T14:08:18.605 INFO:tasks.workunit.client.1.vm04.stdout:5/940: link d7/d2d/d76/f8f d7/d12/d2b/d3e/d57/d8a/f139 0 2026-03-10T14:08:18.609 INFO:tasks.workunit.client.1.vm04.stdout:2/877: rename d0/d14/d39/l10f to d0/d14/d91/d8/d17/d4e/dea/l110 0 2026-03-10T14:08:18.613 INFO:tasks.workunit.client.1.vm04.stdout:2/878: rename d0/d14/d39/fa2 to d0/f111 0 2026-03-10T14:08:18.616 INFO:tasks.workunit.client.1.vm04.stdout:2/879: symlink 
d0/d14/d91/d8/d17/d4e/d8e/l112 0 2026-03-10T14:08:18.617 INFO:tasks.workunit.client.1.vm04.stdout:2/880: fsync d0/d14/d91/d8/d17/d4e/d85/f88 0 2026-03-10T14:08:18.617 INFO:tasks.workunit.client.1.vm04.stdout:2/881: chown d0/d14/deb 59334389 1 2026-03-10T14:08:18.618 INFO:tasks.workunit.client.1.vm04.stdout:2/882: readlink d0/d14/d91/d4a/d8c/d92/le1 0 2026-03-10T14:08:18.620 INFO:tasks.workunit.client.1.vm04.stdout:2/883: readlink d0/d14/d91/d8/d17/d4e/dea/le4 0 2026-03-10T14:08:18.621 INFO:tasks.workunit.client.1.vm04.stdout:2/884: mknod d0/d14/d91/d8/d17/c113 0 2026-03-10T14:08:18.621 INFO:tasks.workunit.client.0.vm03.stdout:8/348: sync 2026-03-10T14:08:18.622 INFO:tasks.workunit.client.0.vm03.stdout:8/349: chown da/d3c/f3f 32 1 2026-03-10T14:08:18.622 INFO:tasks.workunit.client.1.vm04.stdout:0/881: sync 2026-03-10T14:08:18.626 INFO:tasks.workunit.client.1.vm04.stdout:2/885: truncate d0/d14/d1b/f29 8547934 0 2026-03-10T14:08:18.628 INFO:tasks.workunit.client.1.vm04.stdout:0/882: rename d0/d2/d15/d49/d50/d5c/da4/cb9 to d0/d2/d15/d22/d38/d56/d66/c114 0 2026-03-10T14:08:18.634 INFO:tasks.workunit.client.0.vm03.stdout:8/350: rename da/d3c/f3f to da/d58/d5f/f71 0 2026-03-10T14:08:18.635 INFO:tasks.workunit.client.0.vm03.stdout:8/351: creat da/d58/f72 x:0 0 0 2026-03-10T14:08:18.641 INFO:tasks.workunit.client.0.vm03.stdout:7/247: dwrite d5/d9/d14/d21/d28/f37 [0,4194304] 0 2026-03-10T14:08:18.641 INFO:tasks.workunit.client.1.vm04.stdout:6/751: dwrite d3/d1d/d73/fc6 [0,4194304] 0 2026-03-10T14:08:18.650 INFO:tasks.workunit.client.1.vm04.stdout:6/752: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/f36 [0,4194304] 0 2026-03-10T14:08:18.652 INFO:tasks.workunit.client.1.vm04.stdout:6/753: stat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f87 0 2026-03-10T14:08:18.653 INFO:tasks.workunit.client.1.vm04.stdout:4/820: truncate d4/df/db2/db4/d47/d4f/d8c/dc8/dff/fd2 1455358 0 2026-03-10T14:08:18.670 INFO:tasks.workunit.client.1.vm04.stdout:7/914: dwrite d2/dc/de/d2d/d60/d7c/d36/f87 [0,4194304] 0 
2026-03-10T14:08:18.679 INFO:tasks.workunit.client.1.vm04.stdout:4/821: creat d4/df/db2/de1/f10f x:0 0 0 2026-03-10T14:08:18.681 INFO:tasks.workunit.client.0.vm03.stdout:7/248: rmdir d5/d9/d14/d21/d28/d3f 0 2026-03-10T14:08:18.681 INFO:tasks.workunit.client.1.vm04.stdout:3/867: dwrite da/dc/d35/f6a [0,4194304] 0 2026-03-10T14:08:18.684 INFO:tasks.workunit.client.0.vm03.stdout:7/249: creat d5/d9/d14/f4d x:0 0 0 2026-03-10T14:08:18.689 INFO:tasks.workunit.client.1.vm04.stdout:7/915: truncate d2/dc/de/dae/fb2 1003384 0 2026-03-10T14:08:18.690 INFO:tasks.workunit.client.0.vm03.stdout:7/250: read - d5/d9/d14/d26/d36/f3a zero size 2026-03-10T14:08:18.690 INFO:tasks.workunit.client.0.vm03.stdout:7/251: unlink d5/d9/d14/f2d 0 2026-03-10T14:08:18.690 INFO:tasks.workunit.client.0.vm03.stdout:7/252: readlink d5/d9/d3e/l4b 0 2026-03-10T14:08:18.691 INFO:tasks.workunit.client.0.vm03.stdout:7/253: creat d5/d9/d3e/f4e x:0 0 0 2026-03-10T14:08:18.692 INFO:tasks.workunit.client.0.vm03.stdout:7/254: readlink d5/l8 0 2026-03-10T14:08:18.692 INFO:tasks.workunit.client.0.vm03.stdout:7/255: stat d5/d9/d14/d21/d28/c3c 0 2026-03-10T14:08:18.705 INFO:tasks.workunit.client.1.vm04.stdout:6/754: dread d3/de/d35/d3f/d2d/f89 [0,4194304] 0 2026-03-10T14:08:18.705 INFO:tasks.workunit.client.1.vm04.stdout:6/755: chown d3/de/f6d 9 1 2026-03-10T14:08:18.710 INFO:tasks.workunit.client.1.vm04.stdout:3/868: mknod da/dc/d35/d37/d127/c128 0 2026-03-10T14:08:18.711 INFO:tasks.workunit.client.1.vm04.stdout:7/916: symlink d2/dc/de/d2d/d60/d7c/d36/d103/l143 0 2026-03-10T14:08:18.712 INFO:tasks.workunit.client.1.vm04.stdout:7/917: chown d2/dc/de/d2d/d60/d7c/c95 3535 1 2026-03-10T14:08:18.714 INFO:tasks.workunit.client.1.vm04.stdout:6/756: mkdir d3/de/de8 0 2026-03-10T14:08:18.715 INFO:tasks.workunit.client.1.vm04.stdout:3/869: rmdir da/dc/d35/d52/d53/d78 39 2026-03-10T14:08:18.718 INFO:tasks.workunit.client.1.vm04.stdout:7/918: readlink d2/dc/d4d/dcd/ld9 0 2026-03-10T14:08:18.718 
INFO:tasks.workunit.client.1.vm04.stdout:6/757: symlink d3/de/d35/d3f/d2d/d32/d23/d4e/le9 0 2026-03-10T14:08:18.720 INFO:tasks.workunit.client.1.vm04.stdout:3/870: fsync da/d3e/f4c 0 2026-03-10T14:08:18.725 INFO:tasks.workunit.client.1.vm04.stdout:7/919: truncate d2/f20 1036096 0 2026-03-10T14:08:18.730 INFO:tasks.workunit.client.0.vm03.stdout:8/352: rename da/d36/d40/d56 to da/d36/d40/d73 0 2026-03-10T14:08:18.730 INFO:tasks.workunit.client.0.vm03.stdout:8/353: chown da/d36/d40/d73 0 1 2026-03-10T14:08:18.730 INFO:tasks.workunit.client.1.vm04.stdout:3/871: stat da/d30/l3a 0 2026-03-10T14:08:18.730 INFO:tasks.workunit.client.1.vm04.stdout:8/972: write d0/d3/d63/d12/f50 [622189,88980] 0 2026-03-10T14:08:18.730 INFO:tasks.workunit.client.1.vm04.stdout:8/973: chown d0/d3/d63/d12/d51/d67/d96 6 1 2026-03-10T14:08:18.731 INFO:tasks.workunit.client.0.vm03.stdout:8/354: fsync da/f62 0 2026-03-10T14:08:18.731 INFO:tasks.workunit.client.0.vm03.stdout:8/355: chown da/d58/l5c 55005588 1 2026-03-10T14:08:18.732 INFO:tasks.workunit.client.0.vm03.stdout:8/356: fdatasync da/d36/d6a/f6b 0 2026-03-10T14:08:18.732 INFO:tasks.workunit.client.0.vm03.stdout:8/357: dread - da/d58/f72 zero size 2026-03-10T14:08:18.733 INFO:tasks.workunit.client.0.vm03.stdout:8/358: write da/d36/d6a/f6b [423789,67281] 0 2026-03-10T14:08:18.737 INFO:tasks.workunit.client.1.vm04.stdout:3/872: creat da/dc/d3f/d54/d66/f129 x:0 0 0 2026-03-10T14:08:18.744 INFO:tasks.workunit.client.1.vm04.stdout:3/873: creat da/dc/d3f/d61/f12a x:0 0 0 2026-03-10T14:08:18.744 INFO:tasks.workunit.client.1.vm04.stdout:7/920: dread d2/dc/de/d2d/d38/f5a [0,4194304] 0 2026-03-10T14:08:18.744 INFO:tasks.workunit.client.1.vm04.stdout:7/921: chown d2/dc/de/d2d/d5c/f8e 1 1 2026-03-10T14:08:18.744 INFO:tasks.workunit.client.1.vm04.stdout:8/974: unlink d0/d3/d63/d12/d51/d67/d96/d105/def/c119 0 2026-03-10T14:08:18.745 INFO:tasks.workunit.client.1.vm04.stdout:8/975: mknod d0/d3/d63/d12/d51/d8b/c12d 0 2026-03-10T14:08:18.745 
INFO:tasks.workunit.client.1.vm04.stdout:3/874: symlink da/dc/d35/d37/l12b 0 2026-03-10T14:08:18.748 INFO:tasks.workunit.client.1.vm04.stdout:6/758: sync 2026-03-10T14:08:18.753 INFO:tasks.workunit.client.0.vm03.stdout:8/359: dread da/d3a/d44/f46 [0,4194304] 0 2026-03-10T14:08:18.755 INFO:tasks.workunit.client.1.vm04.stdout:3/875: creat da/dc/d35/d37/d10a/f12c x:0 0 0 2026-03-10T14:08:18.755 INFO:tasks.workunit.client.0.vm03.stdout:8/360: rmdir da/d3c/d51 39 2026-03-10T14:08:18.755 INFO:tasks.workunit.client.1.vm04.stdout:3/876: write da/dc/d3f/f83 [1857621,33906] 0 2026-03-10T14:08:18.758 INFO:tasks.workunit.client.1.vm04.stdout:9/856: rmdir d9/da/d8c 39 2026-03-10T14:08:18.759 INFO:tasks.workunit.client.1.vm04.stdout:6/759: truncate d3/de/d35/d3f/d2d/d32/f97 592387 0 2026-03-10T14:08:18.762 INFO:tasks.workunit.client.1.vm04.stdout:5/941: dwrite d7/d26/d6b/d6e/f8e [0,4194304] 0 2026-03-10T14:08:18.772 INFO:tasks.workunit.client.0.vm03.stdout:8/361: getdents da/d24 0 2026-03-10T14:08:18.781 INFO:tasks.workunit.client.1.vm04.stdout:7/922: getdents d2/dc/de/d2d/d60/d7c/d44/dc0 0 2026-03-10T14:08:18.788 INFO:tasks.workunit.client.1.vm04.stdout:0/883: dwrite d0/d2/d15/d22/f88 [0,4194304] 0 2026-03-10T14:08:18.788 INFO:tasks.workunit.client.1.vm04.stdout:5/942: rmdir d7/d2d/d76 39 2026-03-10T14:08:18.789 INFO:tasks.workunit.client.1.vm04.stdout:5/943: chown d7/d12/d2b/d3e/d57/d77/da5 161 1 2026-03-10T14:08:18.801 INFO:tasks.workunit.client.1.vm04.stdout:2/886: dwrite d0/d14/d1b/d45/fb6 [4194304,4194304] 0 2026-03-10T14:08:18.805 INFO:tasks.workunit.client.1.vm04.stdout:7/923: mkdir d2/dc/de/d2d/d5c/da9/df6/d144 0 2026-03-10T14:08:18.817 INFO:tasks.workunit.client.1.vm04.stdout:3/877: rename da/dc/d47/d9b/dcb/lef to da/dc/d47/d9b/d106/l12d 0 2026-03-10T14:08:18.817 INFO:tasks.workunit.client.1.vm04.stdout:3/878: chown da/dc/d47/f10c 93581 1 2026-03-10T14:08:18.821 INFO:tasks.workunit.client.1.vm04.stdout:3/879: dwrite da/dc/d35/f64 [0,4194304] 0 2026-03-10T14:08:18.835 
INFO:tasks.workunit.client.1.vm04.stdout:7/924: mkdir d2/df9/d145 0 2026-03-10T14:08:18.838 INFO:tasks.workunit.client.1.vm04.stdout:4/822: truncate d4/f5f 4767515 0 2026-03-10T14:08:18.843 INFO:tasks.workunit.client.1.vm04.stdout:3/880: dread f8 [0,4194304] 0 2026-03-10T14:08:18.844 INFO:tasks.workunit.client.1.vm04.stdout:2/887: getdents d0/d14/d39/d47/d70/dad 0 2026-03-10T14:08:18.846 INFO:tasks.workunit.client.1.vm04.stdout:3/881: creat da/dc/d35/d52/d70/f12e x:0 0 0 2026-03-10T14:08:18.848 INFO:tasks.workunit.client.1.vm04.stdout:4/823: rename d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/f76 to d4/d14/f110 0 2026-03-10T14:08:18.848 INFO:tasks.workunit.client.1.vm04.stdout:3/882: dread - da/dc/d47/f10c zero size 2026-03-10T14:08:18.850 INFO:tasks.workunit.client.1.vm04.stdout:2/888: symlink d0/d14/d39/d47/l114 0 2026-03-10T14:08:18.850 INFO:tasks.workunit.client.1.vm04.stdout:2/889: chown d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/cae 3 1 2026-03-10T14:08:18.852 INFO:tasks.workunit.client.1.vm04.stdout:3/883: rename da/d8e/db5/le5 to da/dc/l12f 0 2026-03-10T14:08:18.853 INFO:tasks.workunit.client.1.vm04.stdout:2/890: mkdir d0/d14/d1b/d45/d115 0 2026-03-10T14:08:18.861 INFO:tasks.workunit.client.1.vm04.stdout:8/976: write d0/d3/f80 [4936565,64488] 0 2026-03-10T14:08:18.863 INFO:tasks.workunit.client.1.vm04.stdout:9/857: write d9/d44/d59/fcc [256857,130227] 0 2026-03-10T14:08:18.866 INFO:tasks.workunit.client.1.vm04.stdout:6/760: dwrite d3/d1d/f44 [0,4194304] 0 2026-03-10T14:08:18.880 INFO:tasks.workunit.client.1.vm04.stdout:3/884: mkdir da/dc/d47/d9b/d106/d130 0 2026-03-10T14:08:18.883 INFO:tasks.workunit.client.1.vm04.stdout:4/824: getdents d4/df/d34 0 2026-03-10T14:08:18.884 INFO:tasks.workunit.client.1.vm04.stdout:4/825: chown d4/d14/d1b/l5b 2835226 1 2026-03-10T14:08:18.885 INFO:tasks.workunit.client.1.vm04.stdout:2/891: mknod d0/d14/d91/d8/d17/d4e/d85/d86/d96/c116 0 2026-03-10T14:08:18.892 INFO:tasks.workunit.client.1.vm04.stdout:0/884: dwrite d0/d2/d25/f64 [0,4194304] 0 
2026-03-10T14:08:18.895 INFO:tasks.workunit.client.1.vm04.stdout:1/877: dread d3/f8 [0,4194304] 0 2026-03-10T14:08:18.897 INFO:tasks.workunit.client.1.vm04.stdout:6/761: fdatasync d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/fb7 0 2026-03-10T14:08:18.905 INFO:tasks.workunit.client.1.vm04.stdout:2/892: symlink d0/d14/d91/d4a/d8c/dab/d95/l117 0 2026-03-10T14:08:18.907 INFO:tasks.workunit.client.1.vm04.stdout:1/878: unlink d3/c1c 0 2026-03-10T14:08:18.909 INFO:tasks.workunit.client.1.vm04.stdout:0/885: dread d0/d2/d25/fc0 [0,4194304] 0 2026-03-10T14:08:18.912 INFO:tasks.workunit.client.1.vm04.stdout:8/977: creat d0/d3/d73/f12e x:0 0 0 2026-03-10T14:08:18.914 INFO:tasks.workunit.client.0.vm03.stdout:3/328: truncate d1d/d33/f3a 2394989 0 2026-03-10T14:08:18.915 INFO:tasks.workunit.client.1.vm04.stdout:0/886: dwrite d0/d2/f2d [4194304,4194304] 0 2026-03-10T14:08:18.916 INFO:tasks.workunit.client.1.vm04.stdout:0/887: write d0/d6e/f111 [629529,28822] 0 2026-03-10T14:08:18.917 INFO:tasks.workunit.client.1.vm04.stdout:0/888: readlink d0/d2/d15/d49/d50/d61/l86 0 2026-03-10T14:08:18.919 INFO:tasks.workunit.client.0.vm03.stdout:3/329: dread f1b [0,4194304] 0 2026-03-10T14:08:18.924 INFO:tasks.workunit.client.1.vm04.stdout:1/879: unlink d3/d5c/d79/lb1 0 2026-03-10T14:08:18.926 INFO:tasks.workunit.client.0.vm03.stdout:3/330: creat d1d/d39/f60 x:0 0 0 2026-03-10T14:08:18.932 INFO:tasks.workunit.client.1.vm04.stdout:5/944: truncate d7/d2d/f64 2057012 0 2026-03-10T14:08:18.933 INFO:tasks.workunit.client.0.vm03.stdout:3/331: unlink l12 0 2026-03-10T14:08:18.933 INFO:tasks.workunit.client.1.vm04.stdout:4/826: creat d4/df/db2/db4/d47/d4f/f111 x:0 0 0 2026-03-10T14:08:18.937 INFO:tasks.workunit.client.1.vm04.stdout:4/827: dwrite d4/d14/d3c/f46 [0,4194304] 0 2026-03-10T14:08:18.950 INFO:tasks.workunit.client.1.vm04.stdout:7/925: dwrite d2/d2a/f109 [0,4194304] 0 2026-03-10T14:08:18.950 INFO:tasks.workunit.client.1.vm04.stdout:7/926: fdatasync d2/d2a/d42/d86/f11e 0 2026-03-10T14:08:18.952 
INFO:tasks.workunit.client.1.vm04.stdout:7/927: chown d2/dc/de/d2d/d60/d7c/c95 181 1 2026-03-10T14:08:18.956 INFO:tasks.workunit.client.1.vm04.stdout:1/880: creat d3/d22/deb/f12d x:0 0 0 2026-03-10T14:08:18.959 INFO:tasks.workunit.client.0.vm03.stdout:3/332: getdents d1d/d29 0 2026-03-10T14:08:18.964 INFO:tasks.workunit.client.1.vm04.stdout:6/762: creat d3/de/d35/d3a/d43/d4c/d5e/fea x:0 0 0 2026-03-10T14:08:18.964 INFO:tasks.workunit.client.1.vm04.stdout:0/889: sync 2026-03-10T14:08:18.968 INFO:tasks.workunit.client.1.vm04.stdout:4/828: symlink d4/df7/l112 0 2026-03-10T14:08:18.968 INFO:tasks.workunit.client.1.vm04.stdout:9/858: write d9/da/dd/d1c/fdd [1102535,22691] 0 2026-03-10T14:08:18.974 INFO:tasks.workunit.client.1.vm04.stdout:8/978: getdents d0/d3/d63/d12/d51/d67/d96/dc8/d111/d115 0 2026-03-10T14:08:18.976 INFO:tasks.workunit.client.1.vm04.stdout:2/893: creat d0/d14/d91/d4a/d8c/dab/d46/f118 x:0 0 0 2026-03-10T14:08:18.976 INFO:tasks.workunit.client.1.vm04.stdout:3/885: dwrite da/dc/f2a [0,4194304] 0 2026-03-10T14:08:18.977 INFO:tasks.workunit.client.1.vm04.stdout:2/894: chown d0/db8/ce8 1 1 2026-03-10T14:08:18.985 INFO:tasks.workunit.client.0.vm03.stdout:3/333: getdents d1d/d40/d48 0 2026-03-10T14:08:18.985 INFO:tasks.workunit.client.1.vm04.stdout:7/928: truncate d2/dc/de/d2d/d60/d7c/d44/f66 414029 0 2026-03-10T14:08:18.988 INFO:tasks.workunit.client.0.vm03.stdout:3/334: creat d1d/d39/d51/f61 x:0 0 0 2026-03-10T14:08:18.991 INFO:tasks.workunit.client.1.vm04.stdout:0/890: unlink d0/d2/f2d 0 2026-03-10T14:08:18.991 INFO:tasks.workunit.client.1.vm04.stdout:0/891: readlink d0/d2/d15/d22/d38/d56/l6b 0 2026-03-10T14:08:18.992 INFO:tasks.workunit.client.1.vm04.stdout:4/829: rmdir d4/df/d31 39 2026-03-10T14:08:18.993 INFO:tasks.workunit.client.1.vm04.stdout:4/830: fdatasync d4/df/db2/db4/d47/d4f/f84 0 2026-03-10T14:08:18.993 INFO:tasks.workunit.client.0.vm03.stdout:3/335: creat d1d/f62 x:0 0 0 2026-03-10T14:08:18.994 INFO:tasks.workunit.client.0.vm03.stdout:3/336: 
truncate d1d/f31 3222776 0 2026-03-10T14:08:18.997 INFO:tasks.workunit.client.1.vm04.stdout:8/979: symlink d0/d3/d63/d12/d69/l12f 0 2026-03-10T14:08:19.000 INFO:tasks.workunit.client.1.vm04.stdout:2/895: unlink d0/d14/d1b/f55 0 2026-03-10T14:08:19.000 INFO:tasks.workunit.client.1.vm04.stdout:1/881: mknod d3/d22/d63/d35/dd9/d13/d38/d58/d5b/c12e 0 2026-03-10T14:08:19.001 INFO:tasks.workunit.client.1.vm04.stdout:5/945: creat d7/f13a x:0 0 0 2026-03-10T14:08:19.009 INFO:tasks.workunit.client.0.vm03.stdout:3/337: dread fe [0,4194304] 0 2026-03-10T14:08:19.011 INFO:tasks.workunit.client.0.vm03.stdout:3/338: creat d1d/d29/d41/d45/d55/f63 x:0 0 0 2026-03-10T14:08:19.012 INFO:tasks.workunit.client.0.vm03.stdout:3/339: write d1d/f4a [833369,53198] 0 2026-03-10T14:08:19.012 INFO:tasks.workunit.client.0.vm03.stdout:3/340: fsync d1d/f2b 0 2026-03-10T14:08:19.016 INFO:tasks.workunit.client.1.vm04.stdout:6/763: dwrite d3/de/d35/d3f/d2d/f9a [0,4194304] 0 2026-03-10T14:08:19.017 INFO:tasks.workunit.client.1.vm04.stdout:6/764: write d3/de/d35/d3f/d2d/f21 [3532826,109448] 0 2026-03-10T14:08:19.021 INFO:tasks.workunit.client.0.vm03.stdout:3/341: unlink d1d/d40/f5a 0 2026-03-10T14:08:19.029 INFO:tasks.workunit.client.1.vm04.stdout:9/859: write d9/da/dd/de7/d96/d118/f102 [1868118,27489] 0 2026-03-10T14:08:19.030 INFO:tasks.workunit.client.0.vm03.stdout:1/355: dwrite d0/f24 [0,4194304] 0 2026-03-10T14:08:19.032 INFO:tasks.workunit.client.1.vm04.stdout:7/929: dwrite d2/dc/de/d11/f19 [0,4194304] 0 2026-03-10T14:08:19.035 INFO:tasks.workunit.client.0.vm03.stdout:1/356: dwrite d0/d2/df/d16/f1e [0,4194304] 0 2026-03-10T14:08:19.035 INFO:tasks.workunit.client.1.vm04.stdout:4/831: rename d4/d14/d6d/df5/l105 to d4/df/db2/db6/dc9/dd0/l113 0 2026-03-10T14:08:19.054 INFO:tasks.workunit.client.0.vm03.stdout:1/357: mkdir d0/d2/d71 0 2026-03-10T14:08:19.055 INFO:tasks.workunit.client.0.vm03.stdout:1/358: chown d0/d2/df/d16/d20 180078202 1 2026-03-10T14:08:19.055 
INFO:tasks.workunit.client.0.vm03.stdout:1/359: chown d0/d2/df/d16 988621 1 2026-03-10T14:08:19.056 INFO:tasks.workunit.client.1.vm04.stdout:0/892: dwrite d0/d2/d15/d22/d38/d56/d66/f7a [0,4194304] 0 2026-03-10T14:08:19.057 INFO:tasks.workunit.client.0.vm03.stdout:1/360: creat d0/d2/df/d16/d20/f72 x:0 0 0 2026-03-10T14:08:19.058 INFO:tasks.workunit.client.1.vm04.stdout:3/886: creat da/dc/d47/d9b/d106/d130/f131 x:0 0 0 2026-03-10T14:08:19.059 INFO:tasks.workunit.client.0.vm03.stdout:1/361: creat d0/d2/d34/f73 x:0 0 0 2026-03-10T14:08:19.059 INFO:tasks.workunit.client.1.vm04.stdout:8/980: dread d0/d3/d73/fdd [0,4194304] 0 2026-03-10T14:08:19.060 INFO:tasks.workunit.client.1.vm04.stdout:1/882: unlink d3/d22/d63/d35/dd5/fd6 0 2026-03-10T14:08:19.072 INFO:tasks.workunit.client.1.vm04.stdout:7/930: fsync d2/dc/de/d2d/d60/d7c/f105 0 2026-03-10T14:08:19.076 INFO:tasks.workunit.client.1.vm04.stdout:4/832: creat d4/d14/d6d/f114 x:0 0 0 2026-03-10T14:08:19.084 INFO:tasks.workunit.client.1.vm04.stdout:7/931: dread d2/f4 [4194304,4194304] 0 2026-03-10T14:08:19.093 INFO:tasks.workunit.client.1.vm04.stdout:0/893: mknod d0/d2/d15/d49/d50/d5c/da4/c115 0 2026-03-10T14:08:19.093 INFO:tasks.workunit.client.1.vm04.stdout:3/887: mkdir da/dc/d115/d132 0 2026-03-10T14:08:19.106 INFO:tasks.workunit.client.1.vm04.stdout:2/896: creat d0/d14/d91/d4a/d8c/f119 x:0 0 0 2026-03-10T14:08:19.111 INFO:tasks.workunit.client.1.vm04.stdout:3/888: mknod da/dc/d115/c133 0 2026-03-10T14:08:19.111 INFO:tasks.workunit.client.1.vm04.stdout:3/889: stat da/dc/d3f/d54/c76 0 2026-03-10T14:08:19.113 INFO:tasks.workunit.client.1.vm04.stdout:8/981: creat d0/d3/d63/d12/d51/d67/d96/dc8/d111/d115/f130 x:0 0 0 2026-03-10T14:08:19.116 INFO:tasks.workunit.client.1.vm04.stdout:1/883: fsync d3/d22/d63/d35/fe5 0 2026-03-10T14:08:19.120 INFO:tasks.workunit.client.1.vm04.stdout:5/946: symlink d7/d2d/d76/d124/l13b 0 2026-03-10T14:08:19.124 INFO:tasks.workunit.client.1.vm04.stdout:2/897: symlink 
d0/d14/d91/d8/d17/d4e/d85/d86/d96/l11a 0 2026-03-10T14:08:19.125 INFO:tasks.workunit.client.1.vm04.stdout:3/890: rmdir da/dc/d35/d52/d53 39 2026-03-10T14:08:19.129 INFO:tasks.workunit.client.1.vm04.stdout:4/833: dread d4/d14/d1b/f28 [0,4194304] 0 2026-03-10T14:08:19.134 INFO:tasks.workunit.client.1.vm04.stdout:5/947: read d7/d2d/d76/f8f [821544,38998] 0 2026-03-10T14:08:19.135 INFO:tasks.workunit.client.1.vm04.stdout:5/948: write d7/d59/f11b [775486,57388] 0 2026-03-10T14:08:19.135 INFO:tasks.workunit.client.1.vm04.stdout:8/982: dread d0/d3/d63/f8f [0,4194304] 0 2026-03-10T14:08:19.136 INFO:tasks.workunit.client.1.vm04.stdout:8/983: write d0/d3/d63/d12/df5/f118 [216282,43425] 0 2026-03-10T14:08:19.139 INFO:tasks.workunit.client.1.vm04.stdout:6/765: rename d3/c9f to d3/ceb 0 2026-03-10T14:08:19.141 INFO:tasks.workunit.client.1.vm04.stdout:2/898: readlink d0/d14/d91/l2c 0 2026-03-10T14:08:19.143 INFO:tasks.workunit.client.1.vm04.stdout:3/891: fdatasync da/dc/d3f/d54/fa9 0 2026-03-10T14:08:19.144 INFO:tasks.workunit.client.1.vm04.stdout:3/892: fsync da/dc/d115/f11c 0 2026-03-10T14:08:19.144 INFO:tasks.workunit.client.0.vm03.stdout:2/324: truncate d5/d2a/f51 20275 0 2026-03-10T14:08:19.145 INFO:tasks.workunit.client.0.vm03.stdout:2/325: chown d5/d10/d31/f3d 1166 1 2026-03-10T14:08:19.148 INFO:tasks.workunit.client.1.vm04.stdout:9/860: write d9/da/dd/de7/d96/fce [2241607,128288] 0 2026-03-10T14:08:19.150 INFO:tasks.workunit.client.0.vm03.stdout:2/326: link d5/d10/d17/l42 d5/l62 0 2026-03-10T14:08:19.154 INFO:tasks.workunit.client.0.vm03.stdout:2/327: creat d5/d10/d1c/d40/d59/f63 x:0 0 0 2026-03-10T14:08:19.155 INFO:tasks.workunit.client.1.vm04.stdout:7/932: dwrite d2/d2a/f93 [4194304,4194304] 0 2026-03-10T14:08:19.155 INFO:tasks.workunit.client.1.vm04.stdout:0/894: dwrite d0/d2/d15/f2f [4194304,4194304] 0 2026-03-10T14:08:19.155 INFO:tasks.workunit.client.0.vm03.stdout:2/328: write d5/d10/f12 [873041,122380] 0 2026-03-10T14:08:19.160 
INFO:tasks.workunit.client.0.vm03.stdout:2/329: symlink d5/d10/d1c/d54/l64 0 2026-03-10T14:08:19.161 INFO:tasks.workunit.client.0.vm03.stdout:2/330: chown d5/d35/f49 1293 1 2026-03-10T14:08:19.163 INFO:tasks.workunit.client.0.vm03.stdout:2/331: fdatasync d5/d10/d31/f3d 0 2026-03-10T14:08:19.164 INFO:tasks.workunit.client.1.vm04.stdout:5/949: chown d7/d59/d7e/d87/lde 976093 1 2026-03-10T14:08:19.164 INFO:tasks.workunit.client.0.vm03.stdout:8/362: truncate da/f30 109181 0 2026-03-10T14:08:19.165 INFO:tasks.workunit.client.0.vm03.stdout:8/363: write da/d3a/d44/d64/f68 [548065,18318] 0 2026-03-10T14:08:19.168 INFO:tasks.workunit.client.1.vm04.stdout:6/766: creat d3/de/d35/d3f/d2d/d32/d9e/fec x:0 0 0 2026-03-10T14:08:19.169 INFO:tasks.workunit.client.1.vm04.stdout:9/861: mknod d9/da/dd/de7/db1/c12a 0 2026-03-10T14:08:19.170 INFO:tasks.workunit.client.1.vm04.stdout:9/862: readlink d9/da/dd/l6a 0 2026-03-10T14:08:19.171 INFO:tasks.workunit.client.0.vm03.stdout:2/332: chown d5/l7 0 1 2026-03-10T14:08:19.171 INFO:tasks.workunit.client.0.vm03.stdout:2/333: readlink d5/le 0 2026-03-10T14:08:19.172 INFO:tasks.workunit.client.0.vm03.stdout:2/334: readlink d5/d10/d1c/d54/l5a 0 2026-03-10T14:08:19.172 INFO:tasks.workunit.client.0.vm03.stdout:2/335: write d5/f9 [2940786,4386] 0 2026-03-10T14:08:19.173 INFO:tasks.workunit.client.1.vm04.stdout:6/767: dwrite d3/de/d35/d3f/d2d/d32/d9e/fe1 [0,4194304] 0 2026-03-10T14:08:19.174 INFO:tasks.workunit.client.0.vm03.stdout:8/364: write da/f16 [4194427,97737] 0 2026-03-10T14:08:19.175 INFO:tasks.workunit.client.1.vm04.stdout:7/933: unlink d2/dc/l33 0 2026-03-10T14:08:19.181 INFO:tasks.workunit.client.0.vm03.stdout:8/365: truncate f6 3089358 0 2026-03-10T14:08:19.182 INFO:tasks.workunit.client.1.vm04.stdout:1/884: link d3/d5c/l8d d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92/l12f 0 2026-03-10T14:08:19.183 INFO:tasks.workunit.client.0.vm03.stdout:8/366: dread - da/d3c/f48 zero size 2026-03-10T14:08:19.193 
INFO:tasks.workunit.client.1.vm04.stdout:2/899: mkdir d0/d14/d91/d8/d10d/d11b 0 2026-03-10T14:08:19.199 INFO:tasks.workunit.client.1.vm04.stdout:4/834: creat d4/df/f115 x:0 0 0 2026-03-10T14:08:19.204 INFO:tasks.workunit.client.1.vm04.stdout:7/934: rename d2/d94/f9a to d2/dc/de/d2d/d60/d81/db3/d142/f146 0 2026-03-10T14:08:19.215 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:08:19.215 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: pgmap v9: 65 pgs: 65 active+clean; 2.6 GiB data, 9.0 GiB used, 111 GiB / 120 GiB avail; 35 MiB/s rd, 100 MiB/s wr, 245 op/s 2026-03-10T14:08:19.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: Upgrade: Need to upgrade myself (mgr.vm04.ywwcto) 2026-03-10T14:08:19.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: Upgrade: Need to upgrade myself (mgr.vm04.ywwcto) 2026-03-10T14:08:19.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:19.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.rwbbep", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:08:19.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.rwbbep", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:08:19.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' 
entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T14:08:19.216 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:19 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:08:19.216 INFO:tasks.workunit.client.1.vm04.stdout:1/885: truncate d3/d22/d2f/f5d 1159637 0 2026-03-10T14:08:19.216 INFO:tasks.workunit.client.1.vm04.stdout:4/835: mknod d4/df/db2/db6/dc9/dd0/c116 0 2026-03-10T14:08:19.216 INFO:tasks.workunit.client.1.vm04.stdout:6/768: mkdir d3/ded 0 2026-03-10T14:08:19.216 INFO:tasks.workunit.client.1.vm04.stdout:6/769: write d3/de/d35/d3f/d2d/d38/f60 [653234,1884] 0 2026-03-10T14:08:19.217 INFO:tasks.workunit.client.1.vm04.stdout:7/935: mknod d2/dc/de/d2d/d60/d7c/d64/d108/d117/c147 0 2026-03-10T14:08:19.222 INFO:tasks.workunit.client.1.vm04.stdout:4/836: symlink d4/d14/d3c/d62/de6/l117 0 2026-03-10T14:08:19.226 INFO:tasks.workunit.client.1.vm04.stdout:6/770: creat d3/d1d/d73/fee x:0 0 0 2026-03-10T14:08:19.229 INFO:tasks.workunit.client.1.vm04.stdout:6/771: dwrite d3/de/d35/d3f/f96 [0,4194304] 0 2026-03-10T14:08:19.238 INFO:tasks.workunit.client.1.vm04.stdout:1/886: dread d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/f11a [0,4194304] 0 2026-03-10T14:08:19.249 INFO:tasks.workunit.client.1.vm04.stdout:3/893: dread da/dc/d35/f5b [0,4194304] 0 2026-03-10T14:08:19.250 INFO:tasks.workunit.client.1.vm04.stdout:4/837: rename d4/d14/d3c/d85/ff1 to d4/df/db2/f118 0 2026-03-10T14:08:19.251 INFO:tasks.workunit.client.0.vm03.stdout:6/339: dwrite d8/d11/d18/f21 [0,4194304] 0 2026-03-10T14:08:19.253 INFO:tasks.workunit.client.1.vm04.stdout:3/894: creat da/dc/d35/d37/d10a/f134 x:0 0 0 2026-03-10T14:08:19.253 INFO:tasks.workunit.client.0.vm03.stdout:6/340: write d8/d11/d18/f21 [3781094,32796] 0 2026-03-10T14:08:19.254 INFO:tasks.workunit.client.1.vm04.stdout:1/887: rename d3/d22/d63/d35/dd9/d13/d38/db5/dc4/fe7 to d3/d22/d2f/d11d/f130 0 
2026-03-10T14:08:19.258 INFO:tasks.workunit.client.1.vm04.stdout:1/888: fsync d3/d5c/fbf 0 2026-03-10T14:08:19.259 INFO:tasks.workunit.client.1.vm04.stdout:1/889: readlink d3/d22/d63/d35/dd9/d13/d38/df3/l128 0 2026-03-10T14:08:19.259 INFO:tasks.workunit.client.1.vm04.stdout:4/838: link d4/df/db2/db4/d47/d4f/d8c/fa4 d4/d14/d3c/d5e/f119 0 2026-03-10T14:08:19.262 INFO:tasks.workunit.client.1.vm04.stdout:4/839: unlink d4/d14/d3c/la3 0 2026-03-10T14:08:19.270 INFO:tasks.workunit.client.1.vm04.stdout:4/840: rename d4/d14/d6d/l94 to d4/d14/l11a 0 2026-03-10T14:08:19.270 INFO:tasks.workunit.client.1.vm04.stdout:4/841: fdatasync d4/df/db2/de1/f108 0 2026-03-10T14:08:19.310 INFO:tasks.workunit.client.1.vm04.stdout:5/950: dwrite d7/d59/d7e/d87/fe6 [0,4194304] 0 2026-03-10T14:08:19.311 INFO:tasks.workunit.client.1.vm04.stdout:8/984: truncate d0/d3/dd/d76/f9c 534125 0 2026-03-10T14:08:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:08:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: pgmap v9: 65 pgs: 65 active+clean; 2.6 GiB data, 9.0 GiB used, 111 GiB / 120 GiB avail; 35 MiB/s rd, 100 MiB/s wr, 245 op/s 2026-03-10T14:08:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: Upgrade: Need to upgrade myself (mgr.vm04.ywwcto) 2026-03-10T14:08:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: Upgrade: Need to upgrade myself (mgr.vm04.ywwcto) 2026-03-10T14:08:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.rwbbep", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:08:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.rwbbep", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:08:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T14:08:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:19 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:08:19.314 INFO:tasks.workunit.client.1.vm04.stdout:9/863: write d9/da/dd/de7/d96/d9d/fa2 [290244,113213] 0 
2026-03-10T14:08:19.317 INFO:tasks.workunit.client.1.vm04.stdout:0/895: write d0/d2/d15/d22/d38/d56/dcb/dce/ffb [231263,14302] 0 2026-03-10T14:08:19.320 INFO:tasks.workunit.client.1.vm04.stdout:2/900: truncate d0/d14/d91/d3a/ffd 283260 0 2026-03-10T14:08:19.323 INFO:tasks.workunit.client.1.vm04.stdout:5/951: rename d7/d26/ddf/l11d to d7/d2d/d32/l13c 0 2026-03-10T14:08:19.325 INFO:tasks.workunit.client.1.vm04.stdout:7/936: write d2/dc/de/d2d/d60/d81/fd3 [1646638,128695] 0 2026-03-10T14:08:19.329 INFO:tasks.workunit.client.1.vm04.stdout:6/772: dwrite d3/de/f92 [0,4194304] 0 2026-03-10T14:08:19.330 INFO:tasks.workunit.client.1.vm04.stdout:6/773: chown d3/de/d35/d3f/c65 238 1 2026-03-10T14:08:19.339 INFO:tasks.workunit.client.1.vm04.stdout:2/901: truncate d0/d14/d91/f1d 3618422 0 2026-03-10T14:08:19.341 INFO:tasks.workunit.client.1.vm04.stdout:3/895: write da/dc/d3f/d54/f82 [493398,26959] 0 2026-03-10T14:08:19.348 INFO:tasks.workunit.client.1.vm04.stdout:7/937: creat d2/dc/d4d/f148 x:0 0 0 2026-03-10T14:08:19.351 INFO:tasks.workunit.client.1.vm04.stdout:5/952: dread d7/d2d/d76/f84 [0,4194304] 0 2026-03-10T14:08:19.355 INFO:tasks.workunit.client.1.vm04.stdout:1/890: dwrite d3/d22/d2f/d109/d12b/d60/fb6 [0,4194304] 0 2026-03-10T14:08:19.356 INFO:tasks.workunit.client.1.vm04.stdout:4/842: write d4/d14/f3b [1802539,76026] 0 2026-03-10T14:08:19.362 INFO:tasks.workunit.client.1.vm04.stdout:6/774: dread d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/dad/fbb [0,4194304] 0 2026-03-10T14:08:19.363 INFO:tasks.workunit.client.1.vm04.stdout:7/938: symlink d2/dc/de/d2d/d60/d7c/d64/d108/d117/d120/l149 0 2026-03-10T14:08:19.364 INFO:tasks.workunit.client.1.vm04.stdout:0/896: dread d0/d2/d15/d49/d50/d5c/dd8/fed [0,4194304] 0 2026-03-10T14:08:19.388 INFO:tasks.workunit.client.1.vm04.stdout:5/953: truncate d7/d12/d2b/d93/d9e/faf 39077 0 2026-03-10T14:08:19.389 INFO:tasks.workunit.client.1.vm04.stdout:5/954: chown d7/d12/d2b/d3e/d3f/l78 40 1 2026-03-10T14:08:19.389 
INFO:tasks.workunit.client.1.vm04.stdout:5/955: fdatasync d7/d12/f126 0 2026-03-10T14:08:19.416 INFO:tasks.workunit.client.1.vm04.stdout:8/985: truncate d0/d3/d63/f5f 639395 0 2026-03-10T14:08:19.417 INFO:tasks.workunit.client.0.vm03.stdout:4/387: dwrite d5/d9/db/f24 [4194304,4194304] 0 2026-03-10T14:08:19.421 INFO:tasks.workunit.client.0.vm03.stdout:4/388: fsync d5/d9/db/f67 0 2026-03-10T14:08:19.428 INFO:tasks.workunit.client.1.vm04.stdout:9/864: write d9/da/d8c/fbf [434191,16789] 0 2026-03-10T14:08:19.428 INFO:tasks.workunit.client.1.vm04.stdout:7/939: read d2/dc/de/d2d/d60/ff2 [42349,58964] 0 2026-03-10T14:08:19.432 INFO:tasks.workunit.client.0.vm03.stdout:0/320: write d3/f19 [4170822,49357] 0 2026-03-10T14:08:19.433 INFO:tasks.workunit.client.1.vm04.stdout:1/891: mknod d3/d5c/d79/de6/c131 0 2026-03-10T14:08:19.434 INFO:tasks.workunit.client.1.vm04.stdout:2/902: link d0/d14/d91/d4a/d66/dda/df4/c100 d0/d14/d91/d4a/d66/dda/df4/c11c 0 2026-03-10T14:08:19.443 INFO:tasks.workunit.client.1.vm04.stdout:8/986: write d0/d3/dd/d78/f10c [969254,121922] 0 2026-03-10T14:08:19.443 INFO:tasks.workunit.client.1.vm04.stdout:6/775: rmdir d3/de/d35/d3a/d43/d9c 39 2026-03-10T14:08:19.444 INFO:tasks.workunit.client.1.vm04.stdout:0/897: symlink d0/d2/d15/d22/d38/d56/dcb/dce/df5/l116 0 2026-03-10T14:08:19.448 INFO:tasks.workunit.client.1.vm04.stdout:5/956: creat d7/d12/d2b/d93/d138/f13d x:0 0 0 2026-03-10T14:08:19.449 INFO:tasks.workunit.client.0.vm03.stdout:9/367: write d2/d29/d33/d55/f5f [540001,10318] 0 2026-03-10T14:08:19.456 INFO:tasks.workunit.client.1.vm04.stdout:1/892: creat d3/d22/d2f/d57/f132 x:0 0 0 2026-03-10T14:08:19.460 INFO:tasks.workunit.client.0.vm03.stdout:9/368: dread d2/d29/d33/d41/f53 [0,4194304] 0 2026-03-10T14:08:19.465 INFO:tasks.workunit.client.1.vm04.stdout:3/896: getdents da/dc/d47/d9b/d106/dde 0 2026-03-10T14:08:19.475 INFO:tasks.workunit.client.1.vm04.stdout:5/957: rename d7/d26/ddf to d7/d12/d2b/d3e/d57/d9f/d120/d13e 0 2026-03-10T14:08:19.486 
INFO:tasks.workunit.client.0.vm03.stdout:7/256: dwrite d5/d9/f22 [0,4194304] 0 2026-03-10T14:08:19.486 INFO:tasks.workunit.client.1.vm04.stdout:1/893: mknod d3/d22/c133 0 2026-03-10T14:08:19.490 INFO:tasks.workunit.client.1.vm04.stdout:1/894: dwrite d3/d22/d63/d35/dd9/d13/da0/dc5/dfa/f122 [0,4194304] 0 2026-03-10T14:08:19.490 INFO:tasks.workunit.client.0.vm03.stdout:7/257: mknod d5/d9/d3e/c4f 0 2026-03-10T14:08:19.498 INFO:tasks.workunit.client.1.vm04.stdout:8/987: symlink d0/d3/d63/d12/d51/l131 0 2026-03-10T14:08:19.504 INFO:tasks.workunit.client.1.vm04.stdout:9/865: rmdir d9/da/d8c/de5/df6/d11a 0 2026-03-10T14:08:19.505 INFO:tasks.workunit.client.1.vm04.stdout:4/843: getdents d4/df/d31 0 2026-03-10T14:08:19.508 INFO:tasks.workunit.client.1.vm04.stdout:4/844: read d4/d14/d64/fab [405929,128254] 0 2026-03-10T14:08:19.509 INFO:tasks.workunit.client.1.vm04.stdout:5/958: fsync d7/d12/d2b/f4d 0 2026-03-10T14:08:19.510 INFO:tasks.workunit.client.1.vm04.stdout:9/866: creat d9/d5c/f12b x:0 0 0 2026-03-10T14:08:19.517 INFO:tasks.workunit.client.0.vm03.stdout:7/258: rmdir d5/d9 39 2026-03-10T14:08:19.522 INFO:tasks.workunit.client.1.vm04.stdout:1/895: mknod d3/d22/d63/c134 0 2026-03-10T14:08:19.529 INFO:tasks.workunit.client.0.vm03.stdout:7/259: mknod d5/d9/d14/c50 0 2026-03-10T14:08:19.533 INFO:tasks.workunit.client.1.vm04.stdout:0/898: dread d0/d2/d15/d22/d38/d56/d66/f2b [0,4194304] 0 2026-03-10T14:08:19.535 INFO:tasks.workunit.client.1.vm04.stdout:0/899: mknod d0/d2/d15/d22/d38/d56/da7/c117 0 2026-03-10T14:08:19.536 INFO:tasks.workunit.client.0.vm03.stdout:7/260: mkdir d5/d9/d14/d26/d36/d51 0 2026-03-10T14:08:19.543 INFO:tasks.workunit.client.0.vm03.stdout:3/342: dread d1d/d33/f3a [0,4194304] 0 2026-03-10T14:08:19.544 INFO:tasks.workunit.client.0.vm03.stdout:3/343: truncate d1d/d33/f5e 165838 0 2026-03-10T14:08:19.545 INFO:tasks.workunit.client.1.vm04.stdout:4/845: mkdir d4/df/d11b 0 2026-03-10T14:08:19.546 INFO:tasks.workunit.client.1.vm04.stdout:4/846: mkdir 
d4/df/db2/db6/dc9/d11c 0 2026-03-10T14:08:19.547 INFO:tasks.workunit.client.0.vm03.stdout:7/261: truncate d5/d9/d14/d21/d28/f46 4192341 0 2026-03-10T14:08:19.547 INFO:tasks.workunit.client.0.vm03.stdout:5/437: dwrite d4/d6/fa [0,4194304] 0 2026-03-10T14:08:19.553 INFO:tasks.workunit.client.0.vm03.stdout:3/344: symlink d1d/l64 0 2026-03-10T14:08:19.556 INFO:tasks.workunit.client.0.vm03.stdout:5/438: fsync d4/d13/d43/f51 0 2026-03-10T14:08:19.557 INFO:tasks.workunit.client.0.vm03.stdout:3/345: fsync d1d/d39/f42 0 2026-03-10T14:08:19.559 INFO:tasks.workunit.client.0.vm03.stdout:7/262: dwrite d5/d9/f22 [0,4194304] 0 2026-03-10T14:08:19.575 INFO:tasks.workunit.client.1.vm04.stdout:3/897: dread da/dc/d3f/d54/d66/fa7 [0,4194304] 0 2026-03-10T14:08:19.575 INFO:tasks.workunit.client.1.vm04.stdout:3/898: chown da/d8e/l7a 0 1 2026-03-10T14:08:19.584 INFO:tasks.workunit.client.1.vm04.stdout:3/899: dread da/dc/d3f/f4d [0,4194304] 0 2026-03-10T14:08:19.585 INFO:tasks.workunit.client.1.vm04.stdout:3/900: read da/dc/d35/d52/d70/f8f [703110,130777] 0 2026-03-10T14:08:19.589 INFO:tasks.workunit.client.1.vm04.stdout:3/901: link da/dc/d3f/d54/ff3 da/d8e/f135 0 2026-03-10T14:08:19.589 INFO:tasks.workunit.client.1.vm04.stdout:3/902: readlink l1 0 2026-03-10T14:08:19.590 INFO:tasks.workunit.client.1.vm04.stdout:3/903: creat da/dc/d35/f136 x:0 0 0 2026-03-10T14:08:19.596 INFO:tasks.workunit.client.1.vm04.stdout:3/904: dread da/dc/d47/d9b/fd2 [0,4194304] 0 2026-03-10T14:08:19.598 INFO:tasks.workunit.client.1.vm04.stdout:3/905: mknod da/d8e/c137 0 2026-03-10T14:08:19.599 INFO:tasks.workunit.client.1.vm04.stdout:3/906: mknod da/dc/d3f/d61/dc1/c138 0 2026-03-10T14:08:19.600 INFO:tasks.workunit.client.1.vm04.stdout:3/907: creat da/dc/d47/d9b/d106/dde/f139 x:0 0 0 2026-03-10T14:08:19.611 INFO:tasks.workunit.client.0.vm03.stdout:3/346: getdents d1d/d29/d41 0 2026-03-10T14:08:19.614 INFO:tasks.workunit.client.0.vm03.stdout:3/347: dwrite fc [0,4194304] 0 2026-03-10T14:08:19.628 
INFO:tasks.workunit.client.0.vm03.stdout:5/439: dread d4/f17 [0,4194304] 0 2026-03-10T14:08:19.628 INFO:tasks.workunit.client.0.vm03.stdout:5/440: stat d4/d6/c45 0 2026-03-10T14:08:19.629 INFO:tasks.workunit.client.0.vm03.stdout:5/441: readlink d4/d6/l61 0 2026-03-10T14:08:19.667 INFO:tasks.workunit.client.0.vm03.stdout:3/348: rename d1d/d40 to d1d/d33/d65 0 2026-03-10T14:08:19.671 INFO:tasks.workunit.client.0.vm03.stdout:3/349: creat d1d/f66 x:0 0 0 2026-03-10T14:08:19.674 INFO:tasks.workunit.client.0.vm03.stdout:3/350: dwrite d1d/f66 [0,4194304] 0 2026-03-10T14:08:19.722 INFO:tasks.workunit.client.0.vm03.stdout:1/362: write d0/d2/d34/f3a [2186868,54205] 0 2026-03-10T14:08:19.730 INFO:tasks.workunit.client.0.vm03.stdout:1/363: read d0/f48 [309466,91561] 0 2026-03-10T14:08:19.733 INFO:tasks.workunit.client.0.vm03.stdout:1/364: dread d0/f24 [0,4194304] 0 2026-03-10T14:08:19.735 INFO:tasks.workunit.client.0.vm03.stdout:1/365: truncate d0/d2/df/d16/d20/f6b 281779 0 2026-03-10T14:08:19.735 INFO:tasks.workunit.client.0.vm03.stdout:1/366: stat d0/d18/d1d/l40 0 2026-03-10T14:08:19.735 INFO:tasks.workunit.client.0.vm03.stdout:1/367: stat d0/d2/df/d27/f58 0 2026-03-10T14:08:19.740 INFO:tasks.workunit.client.0.vm03.stdout:1/368: dread d0/d18/d1d/f5e [0,4194304] 0 2026-03-10T14:08:19.752 INFO:tasks.workunit.client.1.vm04.stdout:7/940: write d2/dc/de/f98 [929569,55182] 0 2026-03-10T14:08:19.756 INFO:tasks.workunit.client.1.vm04.stdout:7/941: mknod d2/dc/de/d2d/d60/d7c/d64/d108/d117/d120/d126/c14a 0 2026-03-10T14:08:19.758 INFO:tasks.workunit.client.1.vm04.stdout:7/942: symlink d2/dc/de/d2d/d60/d7c/d64/d108/d117/d120/l14b 0 2026-03-10T14:08:19.763 INFO:tasks.workunit.client.1.vm04.stdout:2/903: write d0/d14/d91/d4a/d8c/d92/fb1 [124047,16674] 0 2026-03-10T14:08:19.769 INFO:tasks.workunit.client.1.vm04.stdout:6/776: dwrite d3/de/f46 [0,4194304] 0 2026-03-10T14:08:19.770 INFO:tasks.workunit.client.1.vm04.stdout:6/777: write d3/de/d35/d3a/d43/d4c/d5e/fea [728751,40968] 0 
2026-03-10T14:08:19.778 INFO:tasks.workunit.client.1.vm04.stdout:6/778: dwrite d3/de/d35/d3f/d2d/d32/d23/ddc/fe7 [0,4194304] 0 2026-03-10T14:08:19.778 INFO:tasks.workunit.client.0.vm03.stdout:2/336: read d5/d2a/f51 [17604,65716] 0 2026-03-10T14:08:19.785 INFO:tasks.workunit.client.1.vm04.stdout:6/779: unlink d3/de/d35/d3f/d2d/c37 0 2026-03-10T14:08:19.801 INFO:tasks.workunit.client.0.vm03.stdout:2/337: unlink d5/ld 0 2026-03-10T14:08:19.802 INFO:tasks.workunit.client.1.vm04.stdout:5/959: dwrite d7/d9/db5/fd1 [0,4194304] 0 2026-03-10T14:08:19.810 INFO:tasks.workunit.client.1.vm04.stdout:9/867: dwrite d9/da/dd/d1c/da3/fd0 [0,4194304] 0 2026-03-10T14:08:19.817 INFO:tasks.workunit.client.1.vm04.stdout:1/896: write d3/d22/d63/d35/dd9/d13/d38/d58/faf [2380846,62693] 0 2026-03-10T14:08:19.826 INFO:tasks.workunit.client.1.vm04.stdout:1/897: rmdir d3/d22/d63/d35/dd9/d13/d38/d58/dcc 39 2026-03-10T14:08:19.826 INFO:tasks.workunit.client.0.vm03.stdout:2/338: rmdir d5/d10/d1c 39 2026-03-10T14:08:19.826 INFO:tasks.workunit.client.1.vm04.stdout:0/900: dwrite d0/d2/d15/d49/d50/d5c/f76 [0,4194304] 0 2026-03-10T14:08:19.828 INFO:tasks.workunit.client.0.vm03.stdout:2/339: mknod d5/d10/d1c/d40/c65 0 2026-03-10T14:08:19.829 INFO:tasks.workunit.client.0.vm03.stdout:2/340: unlink d5/d2a/f51 0 2026-03-10T14:08:19.832 INFO:tasks.workunit.client.1.vm04.stdout:1/898: mkdir d3/d22/d63/d35/dd9/d13/d38/d58/d135 0 2026-03-10T14:08:19.834 INFO:tasks.workunit.client.1.vm04.stdout:4/847: write d4/d14/d3c/d5e/f92 [1296044,82469] 0 2026-03-10T14:08:19.835 INFO:tasks.workunit.client.1.vm04.stdout:0/901: rename d0/d2/c37 to d0/d2/d15/d22/d38/d56/dc1/d107/c118 0 2026-03-10T14:08:19.841 INFO:tasks.workunit.client.1.vm04.stdout:4/848: fsync d4/d14/d1b/f28 0 2026-03-10T14:08:19.845 INFO:tasks.workunit.client.1.vm04.stdout:1/899: rename d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/d84/f121 to d3/d22/d63/d35/dd9/d13/f136 0 2026-03-10T14:08:19.855 INFO:tasks.workunit.client.1.vm04.stdout:1/900: mkdir 
d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/d137 0 2026-03-10T14:08:19.855 INFO:tasks.workunit.client.0.vm03.stdout:2/341: dread d5/d10/d17/f18 [4194304,4194304] 0 2026-03-10T14:08:19.855 INFO:tasks.workunit.client.0.vm03.stdout:2/342: unlink d5/lc 0 2026-03-10T14:08:19.855 INFO:tasks.workunit.client.0.vm03.stdout:2/343: getdents d5/d10/d1f 0 2026-03-10T14:08:19.860 INFO:tasks.workunit.client.1.vm04.stdout:1/901: fsync d3/d22/d2f/d57/f78 0 2026-03-10T14:08:19.863 INFO:tasks.workunit.client.1.vm04.stdout:3/908: truncate da/dc/d3f/d61/f94 3995811 0 2026-03-10T14:08:19.864 INFO:tasks.workunit.client.1.vm04.stdout:0/902: link d0/d2/d15/d22/d38/d56/dc1/ld3 d0/d2/d15/d22/d38/d56/dcb/dce/df5/l119 0 2026-03-10T14:08:19.869 INFO:tasks.workunit.client.1.vm04.stdout:4/849: dread d4/d14/d1b/f6c [0,4194304] 0 2026-03-10T14:08:19.874 INFO:tasks.workunit.client.1.vm04.stdout:1/902: truncate d3/d22/d63/d35/dd9/d13/d38/f3d 707938 0 2026-03-10T14:08:19.874 INFO:tasks.workunit.client.0.vm03.stdout:2/344: symlink d5/d10/d1f/d5d/l66 0 2026-03-10T14:08:19.874 INFO:tasks.workunit.client.0.vm03.stdout:2/345: symlink d5/d10/d1c/d40/d59/l67 0 2026-03-10T14:08:19.874 INFO:tasks.workunit.client.0.vm03.stdout:2/346: stat d5/d10/d1f/d5d/l66 0 2026-03-10T14:08:19.876 INFO:tasks.workunit.client.1.vm04.stdout:1/903: creat d3/d22/d63/d35/dd5/f138 x:0 0 0 2026-03-10T14:08:19.881 INFO:tasks.workunit.client.0.vm03.stdout:8/367: write da/d3a/d44/f46 [1067181,118634] 0 2026-03-10T14:08:19.881 INFO:tasks.workunit.client.0.vm03.stdout:8/368: stat da/f16 0 2026-03-10T14:08:19.885 INFO:tasks.workunit.client.1.vm04.stdout:3/909: dread da/dc/d47/d9b/d106/dde/dac/fc9 [0,4194304] 0 2026-03-10T14:08:19.888 INFO:tasks.workunit.client.1.vm04.stdout:0/903: getdents d0/d2/d15/d49/d50/d61/ddb 0 2026-03-10T14:08:19.890 INFO:tasks.workunit.client.1.vm04.stdout:8/988: dwrite d0/d3/dd/d76/f9c [0,4194304] 0 2026-03-10T14:08:19.891 INFO:tasks.workunit.client.1.vm04.stdout:0/904: dread - d0/d2/d15/d22/d38/d56/dc1/d107/f10a zero 
size 2026-03-10T14:08:19.897 INFO:tasks.workunit.client.1.vm04.stdout:7/943: dwrite d2/d2a/d42/d86/fd0 [0,4194304] 0 2026-03-10T14:08:19.900 INFO:tasks.workunit.client.1.vm04.stdout:2/904: dwrite d0/d14/d91/d8/d17/d35/f5f [4194304,4194304] 0 2026-03-10T14:08:19.901 INFO:tasks.workunit.client.0.vm03.stdout:6/341: write d8/db/d49/f4a [1422771,88411] 0 2026-03-10T14:08:19.902 INFO:tasks.workunit.client.1.vm04.stdout:1/904: unlink d3/d22/d63/d35/dd9/d13/d38/db5/ffd 0 2026-03-10T14:08:19.908 INFO:tasks.workunit.client.1.vm04.stdout:6/780: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f85 [0,4194304] 0 2026-03-10T14:08:19.910 INFO:tasks.workunit.client.0.vm03.stdout:6/342: dwrite d8/db/d2c/d2d/d32/d3a/f5e [0,4194304] 0 2026-03-10T14:08:19.915 INFO:tasks.workunit.client.1.vm04.stdout:3/910: creat da/dc/d3f/d61/dc1/f13a x:0 0 0 2026-03-10T14:08:19.915 INFO:tasks.workunit.client.1.vm04.stdout:5/960: write d7/d12/d45/f61 [1071195,56434] 0 2026-03-10T14:08:19.915 INFO:tasks.workunit.client.0.vm03.stdout:6/343: link d8/db/d2c/d2d/d32/f4b d8/db/d12/d51/d5c/f68 0 2026-03-10T14:08:19.917 INFO:tasks.workunit.client.0.vm03.stdout:6/344: unlink d8/d1b/l24 0 2026-03-10T14:08:19.918 INFO:tasks.workunit.client.0.vm03.stdout:6/345: creat d8/db/d12/d51/d5c/f69 x:0 0 0 2026-03-10T14:08:19.920 INFO:tasks.workunit.client.0.vm03.stdout:6/346: creat d8/db/d12/f6a x:0 0 0 2026-03-10T14:08:19.923 INFO:tasks.workunit.client.0.vm03.stdout:6/347: rename d8/db/d12/l3f to d8/db/d12/l6b 0 2026-03-10T14:08:19.923 INFO:tasks.workunit.client.0.vm03.stdout:6/348: read - d8/db/df/f63 zero size 2026-03-10T14:08:19.927 INFO:tasks.workunit.client.1.vm04.stdout:9/868: dwrite d9/da/dd/de7/d96/d9d/fe9 [0,4194304] 0 2026-03-10T14:08:19.934 INFO:tasks.workunit.client.1.vm04.stdout:7/944: rmdir d2/dc/de/d2d/d60/d7c/d36 39 2026-03-10T14:08:19.934 INFO:tasks.workunit.client.0.vm03.stdout:8/369: dread da/f31 [0,4194304] 0 2026-03-10T14:08:19.934 INFO:tasks.workunit.client.0.vm03.stdout:6/349: fsync 
d8/db/d2c/d2d/d32/f3e 0 2026-03-10T14:08:19.934 INFO:tasks.workunit.client.0.vm03.stdout:2/347: sync 2026-03-10T14:08:19.934 INFO:tasks.workunit.client.1.vm04.stdout:2/905: unlink d0/d14/lc5 0 2026-03-10T14:08:19.935 INFO:tasks.workunit.client.0.vm03.stdout:8/370: mknod da/d58/d5f/c74 0 2026-03-10T14:08:19.935 INFO:tasks.workunit.client.0.vm03.stdout:2/348: fdatasync d5/d10/f13 0 2026-03-10T14:08:19.939 INFO:tasks.workunit.client.1.vm04.stdout:1/905: symlink d3/d22/d63/d35/dd5/l139 0 2026-03-10T14:08:19.946 INFO:tasks.workunit.client.0.vm03.stdout:2/349: sync 2026-03-10T14:08:19.947 INFO:tasks.workunit.client.0.vm03.stdout:2/350: write d5/d10/d1c/f5b [114312,90141] 0 2026-03-10T14:08:19.947 INFO:tasks.workunit.client.0.vm03.stdout:8/371: mkdir da/d3c/d51/d75 0 2026-03-10T14:08:19.948 INFO:tasks.workunit.client.0.vm03.stdout:2/351: readlink d5/l62 0 2026-03-10T14:08:19.953 INFO:tasks.workunit.client.0.vm03.stdout:2/352: creat d5/d10/d1c/d40/f68 x:0 0 0 2026-03-10T14:08:19.955 INFO:tasks.workunit.client.0.vm03.stdout:2/353: creat d5/d10/d1c/d50/f69 x:0 0 0 2026-03-10T14:08:19.956 INFO:tasks.workunit.client.0.vm03.stdout:2/354: mknod d5/d35/c6a 0 2026-03-10T14:08:19.956 INFO:tasks.workunit.client.0.vm03.stdout:2/355: readlink d5/d2a/l2f 0 2026-03-10T14:08:19.956 INFO:tasks.workunit.client.1.vm04.stdout:0/905: mkdir d0/d2/d15/d49/d50/d61/d11a 0 2026-03-10T14:08:19.958 INFO:tasks.workunit.client.0.vm03.stdout:2/356: dread d5/d10/f13 [0,4194304] 0 2026-03-10T14:08:19.960 INFO:tasks.workunit.client.0.vm03.stdout:2/357: unlink d5/f61 0 2026-03-10T14:08:19.960 INFO:tasks.workunit.client.1.vm04.stdout:7/945: creat d2/dc/de/d2d/d60/d7c/d3b/f14c x:0 0 0 2026-03-10T14:08:19.961 INFO:tasks.workunit.client.0.vm03.stdout:2/358: creat d5/d10/f6b x:0 0 0 2026-03-10T14:08:19.967 INFO:tasks.workunit.client.1.vm04.stdout:2/906: rmdir d0/d14/d91/d4a/d66 39 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:9/869: dread d9/da/dd/d1c/f27 [0,4194304] 0 2026-03-10T14:08:20.005 
INFO:tasks.workunit.client.1.vm04.stdout:9/870: chown d9/da/dd/de7/d96/d9d 63044607 1 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:1/906: creat d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/dce/d117/f13a x:0 0 0 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:1/907: readlink d3/d22/d63/d35/d6c/l86 0 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:2/907: symlink d0/d14/d1b/d45/d115/l11d 0 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:9/871: truncate d9/d33/fbe 4847204 0 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:6/781: link d3/de/d35/d3f/d2d/d38/cd3 d3/de/cef 0 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:9/872: creat d9/da/dd/d1c/da3/dec/f12c x:0 0 0 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:3/911: link da/l23 da/dc/d47/d9b/d106/l13b 0 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:2/908: mknod d0/d14/d91/d4a/d8c/dab/d46/c11e 0 2026-03-10T14:08:20.005 INFO:tasks.workunit.client.1.vm04.stdout:3/912: stat da/dc/d35/d52/laf 0 2026-03-10T14:08:20.008 INFO:tasks.workunit.client.1.vm04.stdout:3/913: creat da/dc/d35/d37/d10a/f13c x:0 0 0 2026-03-10T14:08:20.014 INFO:tasks.workunit.client.1.vm04.stdout:2/909: mknod d0/d14/d91/d4a/d66/c11f 0 2026-03-10T14:08:20.017 INFO:tasks.workunit.client.1.vm04.stdout:3/914: mkdir da/dc/d47/d13d 0 2026-03-10T14:08:20.018 INFO:tasks.workunit.client.1.vm04.stdout:9/873: link d9/da/c2f d9/d44/d4d/c12d 0 2026-03-10T14:08:20.022 INFO:tasks.workunit.client.1.vm04.stdout:1/908: dread d3/d5c/fbf [0,4194304] 0 2026-03-10T14:08:20.022 INFO:tasks.workunit.client.1.vm04.stdout:3/915: chown da/d30/l33 70 1 2026-03-10T14:08:20.027 INFO:tasks.workunit.client.1.vm04.stdout:2/910: truncate d0/d14/d91/d3a/d3e/fbd 524388 0 2026-03-10T14:08:20.028 INFO:tasks.workunit.client.1.vm04.stdout:1/909: rename d3/d22/d2f/d11d to d3/d22/d6d/d13b 0 2026-03-10T14:08:20.032 INFO:tasks.workunit.client.1.vm04.stdout:6/782: 
dread d3/de/d35/d3a/d43/d4c/f4d [0,4194304] 0 2026-03-10T14:08:20.032 INFO:tasks.workunit.client.1.vm04.stdout:1/910: chown d3/d5c/d79/d98/c9f 68538 1 2026-03-10T14:08:20.033 INFO:tasks.workunit.client.1.vm04.stdout:2/911: rmdir d0/d14/d1b/d45/d115 39 2026-03-10T14:08:20.033 INFO:tasks.workunit.client.1.vm04.stdout:9/874: creat d9/da/d8c/de5/f12e x:0 0 0 2026-03-10T14:08:20.034 INFO:tasks.workunit.client.1.vm04.stdout:1/911: rmdir d3/d22 39 2026-03-10T14:08:20.038 INFO:tasks.workunit.client.1.vm04.stdout:1/912: rename d3/l5e to d3/d22/deb/l13c 0 2026-03-10T14:08:20.043 INFO:tasks.workunit.client.1.vm04.stdout:9/875: getdents d9/da/dd/de7/d96 0 2026-03-10T14:08:20.043 INFO:tasks.workunit.client.1.vm04.stdout:1/913: read d3/d22/d63/f69 [238635,81209] 0 2026-03-10T14:08:20.043 INFO:tasks.workunit.client.1.vm04.stdout:9/876: chown d9/da/dd/de7/d96/lff 59511405 1 2026-03-10T14:08:20.045 INFO:tasks.workunit.client.1.vm04.stdout:9/877: fsync d9/d58/db5/fc6 0 2026-03-10T14:08:20.047 INFO:tasks.workunit.client.1.vm04.stdout:9/878: link d9/d44/d59/c7b d9/d44/d70/c12f 0 2026-03-10T14:08:20.064 INFO:tasks.workunit.client.1.vm04.stdout:9/879: rename d9/dd3/df8/f10a to d9/da/d5d/d81/f130 0 2026-03-10T14:08:20.065 INFO:tasks.workunit.client.1.vm04.stdout:9/880: getdents d9/da/dd/d1c/da3 0 2026-03-10T14:08:20.065 INFO:tasks.workunit.client.1.vm04.stdout:9/881: creat d9/da/dd/d1c/f131 x:0 0 0 2026-03-10T14:08:20.065 INFO:tasks.workunit.client.1.vm04.stdout:9/882: dwrite d9/d44/d70/fee [0,4194304] 0 2026-03-10T14:08:20.065 INFO:tasks.workunit.client.1.vm04.stdout:9/883: fdatasync d9/dd3/f108 0 2026-03-10T14:08:20.065 INFO:tasks.workunit.client.1.vm04.stdout:9/884: truncate d9/da/dd/fd9 172293 0 2026-03-10T14:08:20.065 INFO:tasks.workunit.client.1.vm04.stdout:5/961: sync 2026-03-10T14:08:20.071 INFO:tasks.workunit.client.1.vm04.stdout:2/912: dread d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/f7b [0,4194304] 0 2026-03-10T14:08:20.076 INFO:tasks.workunit.client.1.vm04.stdout:5/962: unlink 
d7/d12/d2b/d93/d138/f13d 0 2026-03-10T14:08:20.094 INFO:tasks.workunit.client.1.vm04.stdout:5/963: rename d7/d12/d2b/d3e/d57/d9f to d7/d59/d7d/d9a/d13f 0 2026-03-10T14:08:20.103 INFO:tasks.workunit.client.1.vm04.stdout:2/913: creat d0/d14/d39/d47/f120 x:0 0 0 2026-03-10T14:08:20.103 INFO:tasks.workunit.client.1.vm04.stdout:5/964: dread - d7/d12/d2b/d3e/d3f/da6/fee zero size 2026-03-10T14:08:20.106 INFO:tasks.workunit.client.1.vm04.stdout:2/914: sync 2026-03-10T14:08:20.117 INFO:tasks.workunit.client.1.vm04.stdout:2/915: rename d0/d14/d1b/d45/c6f to d0/d14/d91/d4a/d8c/dab/d95/d107/c121 0 2026-03-10T14:08:20.120 INFO:tasks.workunit.client.1.vm04.stdout:5/965: dread d7/d2d/d69/fc9 [0,4194304] 0 2026-03-10T14:08:20.121 INFO:tasks.workunit.client.1.vm04.stdout:2/916: read d0/d14/d91/d8/f42 [2991674,87025] 0 2026-03-10T14:08:20.127 INFO:tasks.workunit.client.1.vm04.stdout:2/917: rename d0/d14/d91/d8/d17/d4e/d85/f89 to d0/d14/d1b/d45/da5/f122 0 2026-03-10T14:08:20.130 INFO:tasks.workunit.client.1.vm04.stdout:0/906: dread d0/d2/d15/d22/d38/d56/d66/f2e [0,4194304] 0 2026-03-10T14:08:20.134 INFO:tasks.workunit.client.1.vm04.stdout:2/918: dwrite d0/d14/deb/f109 [0,4194304] 0 2026-03-10T14:08:20.140 INFO:tasks.workunit.client.1.vm04.stdout:0/907: creat d0/d2/d15/d22/d38/d56/dc1/dd4/f11b x:0 0 0 2026-03-10T14:08:20.140 INFO:tasks.workunit.client.1.vm04.stdout:0/908: readlink d0/d2/d15/d22/le5 0 2026-03-10T14:08:20.145 INFO:tasks.workunit.client.1.vm04.stdout:0/909: dwrite d0/d2/d15/d22/d38/d56/dc1/d107/f10a [0,4194304] 0 2026-03-10T14:08:20.148 INFO:tasks.workunit.client.1.vm04.stdout:2/919: symlink d0/d14/d91/d4a/d8c/dab/d46/d4b/l123 0 2026-03-10T14:08:20.149 INFO:tasks.workunit.client.1.vm04.stdout:2/920: write d0/d14/d39/d47/f120 [1011805,129391] 0 2026-03-10T14:08:20.150 INFO:tasks.workunit.client.1.vm04.stdout:2/921: chown d0/d14/d91/d4a/fa6 1 1 2026-03-10T14:08:20.155 INFO:tasks.workunit.client.1.vm04.stdout:0/910: creat d0/d2/dc9/f11c x:0 0 0 2026-03-10T14:08:20.159 
INFO:tasks.workunit.client.1.vm04.stdout:2/922: dread d0/d14/d91/d8/d17/d35/f94 [0,4194304] 0 2026-03-10T14:08:20.161 INFO:tasks.workunit.client.1.vm04.stdout:0/911: creat d0/d2/f11d x:0 0 0 2026-03-10T14:08:20.161 INFO:tasks.workunit.client.1.vm04.stdout:0/912: chown d0/d2/d15/d22/d38/d56/d66/f7e 0 1 2026-03-10T14:08:20.162 INFO:tasks.workunit.client.1.vm04.stdout:0/913: write d0/d2/d15/d22/d38/d56/d66/f7e [263140,70004] 0 2026-03-10T14:08:20.167 INFO:tasks.workunit.client.1.vm04.stdout:0/914: dwrite d0/d2/d15/d49/d50/f8a [0,4194304] 0 2026-03-10T14:08:20.175 INFO:tasks.workunit.client.1.vm04.stdout:4/850: write d4/d14/d3c/d62/fe4 [785228,10635] 0 2026-03-10T14:08:20.177 INFO:tasks.workunit.client.1.vm04.stdout:0/915: mknod d0/d2/d90/c11e 0 2026-03-10T14:08:20.183 INFO:tasks.workunit.client.1.vm04.stdout:4/851: sync 2026-03-10T14:08:20.184 INFO:tasks.workunit.client.1.vm04.stdout:4/852: chown d4/d14/d3c/f41 1933 1 2026-03-10T14:08:20.185 INFO:tasks.workunit.client.1.vm04.stdout:0/916: mknod d0/d2/dc9/c11f 0 2026-03-10T14:08:20.191 INFO:tasks.workunit.client.1.vm04.stdout:4/853: unlink d4/d14/d3c/f41 0 2026-03-10T14:08:20.193 INFO:tasks.workunit.client.1.vm04.stdout:8/989: write d0/d3/fc3 [2572819,91571] 0 2026-03-10T14:08:20.210 INFO:tasks.workunit.client.1.vm04.stdout:7/946: dwrite d2/dc/de/d2d/d60/d7c/d3b/f49 [0,4194304] 0 2026-03-10T14:08:20.233 INFO:tasks.workunit.client.1.vm04.stdout:3/916: dwrite da/f25 [0,4194304] 0 2026-03-10T14:08:20.244 INFO:tasks.workunit.client.1.vm04.stdout:4/854: fsync d4/df/db2/db4/d47/d4f/d8c/dc8/dff/fbf 0 2026-03-10T14:08:20.249 INFO:tasks.workunit.client.1.vm04.stdout:2/923: dread d0/d14/d91/d4a/f57 [0,4194304] 0 2026-03-10T14:08:20.250 INFO:tasks.workunit.client.1.vm04.stdout:2/924: truncate d0/d14/d39/d47/d70/dc3/fe0 829193 0 2026-03-10T14:08:20.251 INFO:tasks.workunit.client.1.vm04.stdout:1/914: write d3/d22/d6d/fe3 [1697949,112873] 0 2026-03-10T14:08:20.255 INFO:tasks.workunit.client.1.vm04.stdout:6/783: stat 
d3/de/d35/d3f/d2d/d32 0 2026-03-10T14:08:20.263 INFO:tasks.workunit.client.1.vm04.stdout:9/885: write d9/d58/db5/f67 [713794,2015] 0 2026-03-10T14:08:20.265 INFO:tasks.workunit.client.0.vm03.stdout:4/389: dwrite d5/d9/db/d19/d38/d53/d55/f75 [0,4194304] 0 2026-03-10T14:08:20.266 INFO:tasks.workunit.client.0.vm03.stdout:4/390: dread - d5/d47/f65 zero size 2026-03-10T14:08:20.267 INFO:tasks.workunit.client.1.vm04.stdout:7/947: mkdir d2/dc/de/d2d/d60/d81/d14d 0 2026-03-10T14:08:20.267 INFO:tasks.workunit.client.0.vm03.stdout:4/391: fsync d5/d9/db/d19/d38/d53/d55/f61 0 2026-03-10T14:08:20.268 INFO:tasks.workunit.client.1.vm04.stdout:7/948: write d2/d2a/f93 [6626850,100519] 0 2026-03-10T14:08:20.269 INFO:tasks.workunit.client.0.vm03.stdout:4/392: fsync d5/d9/db/f67 0 2026-03-10T14:08:20.270 INFO:tasks.workunit.client.0.vm03.stdout:4/393: chown d5/d47/c73 752051 1 2026-03-10T14:08:20.270 INFO:tasks.workunit.client.1.vm04.stdout:3/917: rmdir da/dc/d115 39 2026-03-10T14:08:20.271 INFO:tasks.workunit.client.0.vm03.stdout:4/394: write d5/d9/d2b/f63 [876734,76720] 0 2026-03-10T14:08:20.277 INFO:tasks.workunit.client.1.vm04.stdout:2/925: rmdir d0/d14/d91/d4a/d8c/dab/d95/d107 39 2026-03-10T14:08:20.279 INFO:tasks.workunit.client.1.vm04.stdout:2/926: write d0/d14/d91/d8/d17/d4e/d85/d86/d96/fec [68343,115548] 0 2026-03-10T14:08:20.279 INFO:tasks.workunit.client.1.vm04.stdout:2/927: readlink d0/d14/d39/d47/l114 0 2026-03-10T14:08:20.282 INFO:tasks.workunit.client.1.vm04.stdout:6/784: dread d3/de/d35/d3f/f96 [0,4194304] 0 2026-03-10T14:08:20.286 INFO:tasks.workunit.client.1.vm04.stdout:8/990: creat d0/d3/d63/f132 x:0 0 0 2026-03-10T14:08:20.290 INFO:tasks.workunit.client.1.vm04.stdout:7/949: creat d2/dc/d4d/d125/f14e x:0 0 0 2026-03-10T14:08:20.296 INFO:tasks.workunit.client.1.vm04.stdout:2/928: mkdir d0/d14/d91/d3a/d124 0 2026-03-10T14:08:20.298 INFO:tasks.workunit.client.1.vm04.stdout:1/915: link d3/d22/d63/d35/dd9/d13/da0/dc5/fc6 d3/d22/d63/d35/dd9/d13/d38/d58/d135/f13d 0 
2026-03-10T14:08:20.299 INFO:tasks.workunit.client.1.vm04.stdout:6/785: read d3/de/d35/d3f/d2d/d32/f81 [1965505,3431] 0 2026-03-10T14:08:20.305 INFO:tasks.workunit.client.1.vm04.stdout:4/855: dread d4/df/d31/fae [0,4194304] 0 2026-03-10T14:08:20.311 INFO:tasks.workunit.client.1.vm04.stdout:8/991: rename d0/d3/d63/d29/l32 to d0/d3/dd/d76/ddb/l133 0 2026-03-10T14:08:20.314 INFO:tasks.workunit.client.1.vm04.stdout:2/929: dread d0/d14/d91/d8/d17/d4e/d85/d86/ff2 [8388608,4194304] 0 2026-03-10T14:08:20.315 INFO:tasks.workunit.client.1.vm04.stdout:2/930: chown d0/d14/d91/d4a/d8c/dab/d95/lfa 199 1 2026-03-10T14:08:20.329 INFO:tasks.workunit.client.0.vm03.stdout:0/321: dwrite d3/d17/f56 [4194304,4194304] 0 2026-03-10T14:08:20.329 INFO:tasks.workunit.client.1.vm04.stdout:9/886: dread d9/d44/d4d/fed [0,4194304] 0 2026-03-10T14:08:20.337 INFO:tasks.workunit.client.1.vm04.stdout:5/966: write d7/d12/d2b/d3e/d3f/da6/fee [59778,114132] 0 2026-03-10T14:08:20.338 INFO:tasks.workunit.client.1.vm04.stdout:3/918: link da/dc/d3f/d54/f5d da/ded/df8/f13e 0 2026-03-10T14:08:20.340 INFO:tasks.workunit.client.1.vm04.stdout:1/916: chown d3/d22/d6d/d13b/f130 337739561 1 2026-03-10T14:08:20.341 INFO:tasks.workunit.client.1.vm04.stdout:0/917: dwrite d0/d2/d15/d22/d38/f93 [0,4194304] 0 2026-03-10T14:08:20.342 INFO:tasks.workunit.client.0.vm03.stdout:0/322: dread d3/d4d/f2a [0,4194304] 0 2026-03-10T14:08:20.343 INFO:tasks.workunit.client.0.vm03.stdout:0/323: read d3/d4d/d30/f32 [402458,107236] 0 2026-03-10T14:08:20.344 INFO:tasks.workunit.client.1.vm04.stdout:6/786: symlink d3/de/d35/d3f/d2d/d38/lf0 0 2026-03-10T14:08:20.344 INFO:tasks.workunit.client.0.vm03.stdout:0/324: creat d3/d46/f63 x:0 0 0 2026-03-10T14:08:20.344 INFO:tasks.workunit.client.1.vm04.stdout:4/856: dread - d4/df/d34/d6f/f9a zero size 2026-03-10T14:08:20.345 INFO:tasks.workunit.client.1.vm04.stdout:8/992: mkdir d0/d3/d73/db8/d103/d11f/d134 0 2026-03-10T14:08:20.345 INFO:tasks.workunit.client.1.vm04.stdout:4/857: chown 
d4/d14/d1b/f20 77007 1 2026-03-10T14:08:20.346 INFO:tasks.workunit.client.1.vm04.stdout:2/931: creat d0/d14/d39/d47/d70/dc3/f125 x:0 0 0 2026-03-10T14:08:20.348 INFO:tasks.workunit.client.1.vm04.stdout:9/887: mknod d9/da/dd/de7/d96/c132 0 2026-03-10T14:08:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:20 vm03.local ceph-mon[49718]: Upgrade: Updating mgr.vm03.rwbbep 2026-03-10T14:08:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:20 vm03.local ceph-mon[49718]: Deploying daemon mgr.vm03.rwbbep on vm03 2026-03-10T14:08:20.365 INFO:tasks.workunit.client.1.vm04.stdout:7/950: symlink d2/dc/l14f 0 2026-03-10T14:08:20.381 INFO:tasks.workunit.client.1.vm04.stdout:0/918: symlink d0/d2/d15/d22/d62/l120 0 2026-03-10T14:08:20.388 INFO:tasks.workunit.client.1.vm04.stdout:4/858: fsync d4/d14/d64/fab 0 2026-03-10T14:08:20.395 INFO:tasks.workunit.client.1.vm04.stdout:3/919: dread da/dc/d35/f50 [0,4194304] 0 2026-03-10T14:08:20.397 INFO:tasks.workunit.client.1.vm04.stdout:1/917: dread d3/d5c/f71 [4194304,4194304] 0 2026-03-10T14:08:20.398 INFO:tasks.workunit.client.1.vm04.stdout:2/932: symlink d0/l126 0 2026-03-10T14:08:20.399 INFO:tasks.workunit.client.1.vm04.stdout:2/933: chown d0/d14/d91/d4a/d8c/dab/d95/la1 59386 1 2026-03-10T14:08:20.401 INFO:tasks.workunit.client.1.vm04.stdout:6/787: mknod d3/de/d35/d3f/d2d/d32/d23/d24/d8e/cf1 0 2026-03-10T14:08:20.416 INFO:tasks.workunit.client.1.vm04.stdout:8/993: mkdir d0/d3/d63/d12/d135 0 2026-03-10T14:08:20.423 INFO:tasks.workunit.client.1.vm04.stdout:1/918: fsync d3/d22/d63/f69 0 2026-03-10T14:08:20.426 INFO:tasks.workunit.client.1.vm04.stdout:2/934: mknod d0/d14/d91/d8/d17/d4e/d8e/c127 0 2026-03-10T14:08:20.430 INFO:tasks.workunit.client.1.vm04.stdout:7/951: dwrite d2/dc/de/fed [0,4194304] 0 2026-03-10T14:08:20.432 INFO:tasks.workunit.client.1.vm04.stdout:6/788: truncate d3/f2a 140871 0 2026-03-10T14:08:20.434 INFO:tasks.workunit.client.1.vm04.stdout:9/888: write d9/da/dd/de7/d96/d9d/fe9 [2456256,68370] 0 
2026-03-10T14:08:20.443 INFO:tasks.workunit.client.1.vm04.stdout:1/919: chown d3/d22/d63/cc9 836488 1 2026-03-10T14:08:20.443 INFO:tasks.workunit.client.1.vm04.stdout:7/952: chown d2/dc/de/d2d/d5c/da9/ldc 7 1 2026-03-10T14:08:20.443 INFO:tasks.workunit.client.1.vm04.stdout:9/889: dwrite d9/d44/f51 [4194304,4194304] 0 2026-03-10T14:08:20.444 INFO:tasks.workunit.client.1.vm04.stdout:9/890: truncate d9/d33/f4b 4635756 0 2026-03-10T14:08:20.450 INFO:tasks.workunit.client.1.vm04.stdout:6/789: symlink d3/de/d35/d3a/d43/d4c/lf2 0 2026-03-10T14:08:20.453 INFO:tasks.workunit.client.1.vm04.stdout:6/790: chown d3/d1d/d73/fc 43076 1 2026-03-10T14:08:20.453 INFO:tasks.workunit.client.1.vm04.stdout:7/953: dwrite d2/d2a/f9c [4194304,4194304] 0 2026-03-10T14:08:20.454 INFO:tasks.workunit.client.1.vm04.stdout:2/935: sync 2026-03-10T14:08:20.455 INFO:tasks.workunit.client.1.vm04.stdout:2/936: chown d0/d14/d1b/c21 362339 1 2026-03-10T14:08:20.461 INFO:tasks.workunit.client.0.vm03.stdout:9/369: dwrite d2/f1e [0,4194304] 0 2026-03-10T14:08:20.466 INFO:tasks.workunit.client.0.vm03.stdout:9/370: write d2/d29/d33/d55/f5f [4227,101449] 0 2026-03-10T14:08:20.468 INFO:tasks.workunit.client.1.vm04.stdout:5/967: getdents d7/d2d/d76 0 2026-03-10T14:08:20.469 INFO:tasks.workunit.client.1.vm04.stdout:4/859: rename d4/d14/d6d/df5/l10d to d4/d14/d3c/l11d 0 2026-03-10T14:08:20.483 INFO:tasks.workunit.client.1.vm04.stdout:8/994: dread d0/d3/d5/f70 [0,4194304] 0 2026-03-10T14:08:20.491 INFO:tasks.workunit.client.1.vm04.stdout:2/937: creat d0/db8/f128 x:0 0 0 2026-03-10T14:08:20.492 INFO:tasks.workunit.client.1.vm04.stdout:3/920: rename da/dc/d35/d37/d10a/f12c to da/dc/d47/d9b/d106/dde/df2/f13f 0 2026-03-10T14:08:20.495 INFO:tasks.workunit.client.1.vm04.stdout:8/995: fdatasync d0/d3/dd/d89/db5/fea 0 2026-03-10T14:08:20.501 INFO:tasks.workunit.client.1.vm04.stdout:6/791: rename d3/d1d/d73/f2b to d3/ded/ff3 0 2026-03-10T14:08:20.503 INFO:tasks.workunit.client.1.vm04.stdout:3/921: creat da/dc/d3f/d61/f140 
x:0 0 0 2026-03-10T14:08:20.504 INFO:tasks.workunit.client.1.vm04.stdout:3/922: fdatasync da/dc/d3f/d61/dc1/f13a 0 2026-03-10T14:08:20.506 INFO:tasks.workunit.client.1.vm04.stdout:1/920: dwrite d3/d22/d63/f72 [0,4194304] 0 2026-03-10T14:08:20.514 INFO:tasks.workunit.client.1.vm04.stdout:5/968: rename d7/d9 to d7/d12/d2b/d3e/d57/d77/da5/d140 0 2026-03-10T14:08:20.514 INFO:tasks.workunit.client.1.vm04.stdout:2/938: truncate d0/d14/d91/d8/d17/f73 3082226 0 2026-03-10T14:08:20.514 INFO:tasks.workunit.client.1.vm04.stdout:6/792: creat d3/de/d35/d3f/d2d/d32/d23/d4e/ff4 x:0 0 0 2026-03-10T14:08:20.518 INFO:tasks.workunit.client.1.vm04.stdout:0/919: dread d0/d2/fa [4194304,4194304] 0 2026-03-10T14:08:20.519 INFO:tasks.workunit.client.1.vm04.stdout:8/996: sync 2026-03-10T14:08:20.522 INFO:tasks.workunit.client.1.vm04.stdout:9/891: getdents d9/da/d8c/de5/df6 0 2026-03-10T14:08:20.532 INFO:tasks.workunit.client.1.vm04.stdout:6/793: fsync d3/de/d35/d3f/d2d/d38/d40/fa5 0 2026-03-10T14:08:20.534 INFO:tasks.workunit.client.1.vm04.stdout:8/997: mkdir d0/d3/d73/db8/dd5/d136 0 2026-03-10T14:08:20.538 INFO:tasks.workunit.client.1.vm04.stdout:3/923: rename da/dc/d35/d52/d6d/fab to da/dc/d35/d37/f141 0 2026-03-10T14:08:20.543 INFO:tasks.workunit.client.1.vm04.stdout:7/954: dwrite d2/dc/de/d2d/f39 [4194304,4194304] 0 2026-03-10T14:08:20.552 INFO:tasks.workunit.client.1.vm04.stdout:9/892: sync 2026-03-10T14:08:20.555 INFO:tasks.workunit.client.1.vm04.stdout:5/969: dread d7/d12/d2b/d8c/fe5 [0,4194304] 0 2026-03-10T14:08:20.555 INFO:tasks.workunit.client.1.vm04.stdout:5/970: fsync d7/f13a 0 2026-03-10T14:08:20.556 INFO:tasks.workunit.client.1.vm04.stdout:5/971: read d7/d59/f11b [182029,2055] 0 2026-03-10T14:08:20.556 INFO:tasks.workunit.client.1.vm04.stdout:5/972: stat d7/d12/d2b/d3e/d57/d77/df1 0 2026-03-10T14:08:20.558 INFO:tasks.workunit.client.1.vm04.stdout:6/794: mkdir d3/de/d35/d3a/d43/df5 0 2026-03-10T14:08:20.561 INFO:tasks.workunit.client.1.vm04.stdout:3/924: creat 
da/dc/d47/d9b/f142 x:0 0 0 2026-03-10T14:08:20.564 INFO:tasks.workunit.client.1.vm04.stdout:4/860: dwrite d4/df/d34/fc3 [0,4194304] 0 2026-03-10T14:08:20.568 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:20 vm04.local ceph-mon[55966]: Upgrade: Updating mgr.vm03.rwbbep 2026-03-10T14:08:20.568 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:20 vm04.local ceph-mon[55966]: Deploying daemon mgr.vm03.rwbbep on vm03 2026-03-10T14:08:20.568 INFO:tasks.workunit.client.1.vm04.stdout:9/893: symlink d9/da/dd/de7/d96/d9d/l133 0 2026-03-10T14:08:20.571 INFO:tasks.workunit.client.1.vm04.stdout:7/955: creat d2/d9f/f150 x:0 0 0 2026-03-10T14:08:20.572 INFO:tasks.workunit.client.1.vm04.stdout:9/894: write d9/d44/d4d/ff2 [3284345,6380] 0 2026-03-10T14:08:20.573 INFO:tasks.workunit.client.1.vm04.stdout:9/895: chown d9/d5c/f12b 5017 1 2026-03-10T14:08:20.575 INFO:tasks.workunit.client.1.vm04.stdout:6/795: sync 2026-03-10T14:08:20.576 INFO:tasks.workunit.client.1.vm04.stdout:4/861: sync 2026-03-10T14:08:20.576 INFO:tasks.workunit.client.1.vm04.stdout:4/862: stat d4/df/d34/f7c 0 2026-03-10T14:08:20.584 INFO:tasks.workunit.client.1.vm04.stdout:2/939: dread d0/d14/d1b/f29 [4194304,4194304] 0 2026-03-10T14:08:20.585 INFO:tasks.workunit.client.1.vm04.stdout:2/940: dread - d0/d14/d91/d4a/d8c/dab/d46/f118 zero size 2026-03-10T14:08:20.589 INFO:tasks.workunit.client.1.vm04.stdout:2/941: dwrite d0/d14/d91/d8/d17/d4e/d85/f10e [0,4194304] 0 2026-03-10T14:08:20.593 INFO:tasks.workunit.client.1.vm04.stdout:5/973: truncate d7/d2d/f64 414071 0 2026-03-10T14:08:20.596 INFO:tasks.workunit.client.1.vm04.stdout:3/925: creat da/dc/d47/d9b/d106/dde/d121/f143 x:0 0 0 2026-03-10T14:08:20.601 INFO:tasks.workunit.client.1.vm04.stdout:1/921: dwrite d3/d22/d2f/f3c [0,4194304] 0 2026-03-10T14:08:20.602 INFO:tasks.workunit.client.1.vm04.stdout:7/956: readlink d2/d6b/lb4 0 2026-03-10T14:08:20.606 INFO:tasks.workunit.client.1.vm04.stdout:9/896: mkdir d9/d33/d134 0 2026-03-10T14:08:20.608 
INFO:tasks.workunit.client.1.vm04.stdout:6/796: mknod d3/de/d35/d3f/d2d/d32/dd0/cf6 0 2026-03-10T14:08:20.609 INFO:tasks.workunit.client.1.vm04.stdout:0/920: dwrite d0/d2/d15/fef [0,4194304] 0 2026-03-10T14:08:20.609 INFO:tasks.workunit.client.1.vm04.stdout:9/897: dread - d9/da/dd/d1c/da3/ff5 zero size 2026-03-10T14:08:20.610 INFO:tasks.workunit.client.1.vm04.stdout:7/957: dwrite d2/d6b/f130 [0,4194304] 0 2026-03-10T14:08:20.612 INFO:tasks.workunit.client.1.vm04.stdout:6/797: write d3/de/d35/d3f/d2d/d32/d23/ddc/fe7 [510718,124023] 0 2026-03-10T14:08:20.618 INFO:tasks.workunit.client.1.vm04.stdout:6/798: read d3/de/d35/d3f/d2d/f89 [2522493,89509] 0 2026-03-10T14:08:20.619 INFO:tasks.workunit.client.1.vm04.stdout:6/799: rename d3/de/d35/d3f/d2d/d32/d23 to d3/de/d35/d3f/d2d/d32/d23/d24/df7 22 2026-03-10T14:08:20.620 INFO:tasks.workunit.client.1.vm04.stdout:4/863: readlink d4/df/db2/db6/dc9/dd0/l113 0 2026-03-10T14:08:20.623 INFO:tasks.workunit.client.1.vm04.stdout:5/974: mknod d7/d12/d2b/d3e/d57/d77/da5/c141 0 2026-03-10T14:08:20.624 INFO:tasks.workunit.client.1.vm04.stdout:3/926: dread - da/dc/d47/d9b/d106/dde/dac/f110 zero size 2026-03-10T14:08:20.626 INFO:tasks.workunit.client.1.vm04.stdout:1/922: fsync d3/d22/d63/d35/fe5 0 2026-03-10T14:08:20.626 INFO:tasks.workunit.client.1.vm04.stdout:1/923: dread - d3/d22/d2f/d109/d12b/d60/ff7 zero size 2026-03-10T14:08:20.632 INFO:tasks.workunit.client.1.vm04.stdout:8/998: dwrite d0/d3/d63/d12/d51/d67/f87 [0,4194304] 0 2026-03-10T14:08:20.637 INFO:tasks.workunit.client.1.vm04.stdout:5/975: sync 2026-03-10T14:08:20.640 INFO:tasks.workunit.client.1.vm04.stdout:3/927: creat da/dc/d47/f144 x:0 0 0 2026-03-10T14:08:20.645 INFO:tasks.workunit.client.1.vm04.stdout:0/921: symlink d0/d2/d15/l121 0 2026-03-10T14:08:20.648 INFO:tasks.workunit.client.1.vm04.stdout:4/864: rename d4/l7 to d4/d14/d3c/d62/de6/l11e 0 2026-03-10T14:08:20.651 INFO:tasks.workunit.client.1.vm04.stdout:8/999: symlink d0/d3/d73/db8/dd5/d136/l137 0 
2026-03-10T14:08:20.656 INFO:tasks.workunit.client.1.vm04.stdout:3/928: sync 2026-03-10T14:08:20.660 INFO:tasks.workunit.client.1.vm04.stdout:5/976: rename d7/d12/d2b/d93/d9e/fe4 to d7/d2d/d32/d34/f142 0 2026-03-10T14:08:20.662 INFO:tasks.workunit.client.1.vm04.stdout:2/942: link d0/d14/d91/d4a/d8c/dab/l77 d0/d14/d91/d8/d17/d4e/dea/dde/l129 0 2026-03-10T14:08:20.673 INFO:tasks.workunit.client.1.vm04.stdout:1/924: creat d3/d22/d63/d35/dd9/d13/d38/d58/d5b/f13e x:0 0 0 2026-03-10T14:08:20.674 INFO:tasks.workunit.client.1.vm04.stdout:2/943: creat d0/d14/d91/d4a/d8c/dab/d95/d107/f12a x:0 0 0 2026-03-10T14:08:20.674 INFO:tasks.workunit.client.1.vm04.stdout:2/944: chown d0/d14/d91/d4a/d8c/dab/d46 4732523 1 2026-03-10T14:08:20.674 INFO:tasks.workunit.client.1.vm04.stdout:4/865: rename d4/df/db2/db4/d47/d4f/df9 to d4/d11f 0 2026-03-10T14:08:20.678 INFO:tasks.workunit.client.1.vm04.stdout:7/958: dread d2/dc/de/d2d/d60/fbd [0,4194304] 0 2026-03-10T14:08:20.678 INFO:tasks.workunit.client.1.vm04.stdout:2/945: dread d0/d14/d91/d8/d17/d35/f81 [0,4194304] 0 2026-03-10T14:08:20.683 INFO:tasks.workunit.client.1.vm04.stdout:2/946: dwrite d0/d14/d91/d4a/d8c/dab/d95/d107/f12a [0,4194304] 0 2026-03-10T14:08:20.691 INFO:tasks.workunit.client.1.vm04.stdout:2/947: chown d0/d14/d91/d3a/c4c 246093 1 2026-03-10T14:08:20.691 INFO:tasks.workunit.client.1.vm04.stdout:2/948: symlink d0/d14/d91/d8/d17/d4e/d85/d86/l12b 0 2026-03-10T14:08:20.701 INFO:tasks.workunit.client.1.vm04.stdout:1/925: read d3/d22/d63/f89 [166100,44397] 0 2026-03-10T14:08:20.703 INFO:tasks.workunit.client.1.vm04.stdout:1/926: chown d3/d22/d63/d35/dd9/d13/d1a/f62 331642 1 2026-03-10T14:08:20.703 INFO:tasks.workunit.client.1.vm04.stdout:1/927: chown d3/d5c/f71 11430 1 2026-03-10T14:08:20.705 INFO:tasks.workunit.client.1.vm04.stdout:1/928: mkdir d3/d22/d63/d35/dd9/d13/da0/dc5/d13f 0 2026-03-10T14:08:20.705 INFO:tasks.workunit.client.1.vm04.stdout:1/929: stat d3/d5c/fb2 0 2026-03-10T14:08:20.708 
INFO:tasks.workunit.client.1.vm04.stdout:1/930: symlink d3/d22/d63/d35/dd9/d13/d38/d58/d135/l140 0 2026-03-10T14:08:20.709 INFO:tasks.workunit.client.1.vm04.stdout:2/949: sync 2026-03-10T14:08:20.710 INFO:tasks.workunit.client.1.vm04.stdout:2/950: write d0/d14/d91/d8/d17/f1f [2168956,95141] 0 2026-03-10T14:08:20.713 INFO:tasks.workunit.client.1.vm04.stdout:2/951: creat d0/d14/f12c x:0 0 0 2026-03-10T14:08:20.714 INFO:tasks.workunit.client.1.vm04.stdout:2/952: write d0/d14/d54/f10a [738695,29318] 0 2026-03-10T14:08:20.720 INFO:tasks.workunit.client.1.vm04.stdout:2/953: mkdir d0/d14/d39/da8/d12d 0 2026-03-10T14:08:20.723 INFO:tasks.workunit.client.1.vm04.stdout:1/931: sync 2026-03-10T14:08:20.732 INFO:tasks.workunit.client.1.vm04.stdout:1/932: rename d3/d22/d2f/l43 to d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/l141 0 2026-03-10T14:08:20.732 INFO:tasks.workunit.client.0.vm03.stdout:3/351: write d1d/d39/f42 [535983,45345] 0 2026-03-10T14:08:20.734 INFO:tasks.workunit.client.0.vm03.stdout:3/352: mknod d1d/d29/d41/d45/d55/c67 0 2026-03-10T14:08:20.734 INFO:tasks.workunit.client.1.vm04.stdout:2/954: dread d0/d14/d91/d4a/d8c/dab/db3/fd6 [4194304,4194304] 0 2026-03-10T14:08:20.735 INFO:tasks.workunit.client.0.vm03.stdout:3/353: write d1d/d29/f44 [467461,64125] 0 2026-03-10T14:08:20.739 INFO:tasks.workunit.client.0.vm03.stdout:3/354: unlink d1d/d39/f42 0 2026-03-10T14:08:20.743 INFO:tasks.workunit.client.0.vm03.stdout:3/355: dwrite d1d/d29/d41/d45/d55/f63 [0,4194304] 0 2026-03-10T14:08:20.746 INFO:tasks.workunit.client.1.vm04.stdout:4/866: dread d4/d14/d3c/d5e/f92 [0,4194304] 0 2026-03-10T14:08:20.746 INFO:tasks.workunit.client.1.vm04.stdout:1/933: creat d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/dce/f142 x:0 0 0 2026-03-10T14:08:20.751 INFO:tasks.workunit.client.0.vm03.stdout:7/263: dwrite d5/f1b [0,4194304] 0 2026-03-10T14:08:20.752 INFO:tasks.workunit.client.1.vm04.stdout:1/934: dwrite d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92/f93 [0,4194304] 0 2026-03-10T14:08:20.752 
INFO:tasks.workunit.client.1.vm04.stdout:2/955: symlink d0/d14/d39/d47/d70/dad/dfc/l12e 0 2026-03-10T14:08:20.758 INFO:tasks.workunit.client.0.vm03.stdout:7/264: dwrite d5/d9/d14/d21/d28/f37 [4194304,4194304] 0 2026-03-10T14:08:20.765 INFO:tasks.workunit.client.1.vm04.stdout:2/956: symlink d0/d14/d39/d47/d70/dc3/l12f 0 2026-03-10T14:08:20.776 INFO:tasks.workunit.client.1.vm04.stdout:2/957: rename d0/d14/d91/d4a/c80 to d0/d14/d91/d8/d17/d4e/c130 0 2026-03-10T14:08:20.777 INFO:tasks.workunit.client.0.vm03.stdout:7/265: link d5/d9/d14/d26/d39/f45 d5/d9/d35/f52 0 2026-03-10T14:08:20.779 INFO:tasks.workunit.client.0.vm03.stdout:7/266: truncate d5/d9/f42 144446 0 2026-03-10T14:08:20.782 INFO:tasks.workunit.client.0.vm03.stdout:7/267: creat d5/f53 x:0 0 0 2026-03-10T14:08:20.784 INFO:tasks.workunit.client.1.vm04.stdout:9/898: truncate d9/da/dd/d1c/da3/fd0 1667876 0 2026-03-10T14:08:20.786 INFO:tasks.workunit.client.1.vm04.stdout:9/899: mknod d9/da/d8c/de5/d11d/c135 0 2026-03-10T14:08:20.788 INFO:tasks.workunit.client.1.vm04.stdout:9/900: unlink d9/d58/db5/c97 0 2026-03-10T14:08:20.790 INFO:tasks.workunit.client.1.vm04.stdout:9/901: creat d9/d58/db5/da5/dab/f136 x:0 0 0 2026-03-10T14:08:20.793 INFO:tasks.workunit.client.0.vm03.stdout:7/268: mknod d5/d9/d14/d21/d28/c54 0 2026-03-10T14:08:20.793 INFO:tasks.workunit.client.0.vm03.stdout:7/269: stat d5/d9/d14/d26/d39 0 2026-03-10T14:08:20.794 INFO:tasks.workunit.client.0.vm03.stdout:3/356: dread d1d/f26 [0,4194304] 0 2026-03-10T14:08:20.795 INFO:tasks.workunit.client.1.vm04.stdout:9/902: dwrite d9/da/dd/d1c/da3/dec/f12c [0,4194304] 0 2026-03-10T14:08:20.799 INFO:tasks.workunit.client.1.vm04.stdout:9/903: fsync d9/fae 0 2026-03-10T14:08:20.800 INFO:tasks.workunit.client.1.vm04.stdout:9/904: mkdir d9/d58/db5/d137 0 2026-03-10T14:08:20.800 INFO:tasks.workunit.client.1.vm04.stdout:9/905: stat d9/da/dd/d1c/da3/cc5 0 2026-03-10T14:08:20.801 INFO:tasks.workunit.client.1.vm04.stdout:9/906: chown d9/fe6 7 1 2026-03-10T14:08:20.804 
INFO:tasks.workunit.client.0.vm03.stdout:7/270: mknod d5/d9/d14/d26/d39/c55 0 2026-03-10T14:08:20.805 INFO:tasks.workunit.client.0.vm03.stdout:7/271: write d5/f47 [766639,53618] 0 2026-03-10T14:08:20.808 INFO:tasks.workunit.client.0.vm03.stdout:7/272: dwrite d5/f6 [4194304,4194304] 0 2026-03-10T14:08:20.862 INFO:tasks.workunit.client.1.vm04.stdout:6/800: write d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fcb [2982364,60907] 0 2026-03-10T14:08:20.866 INFO:tasks.workunit.client.1.vm04.stdout:6/801: creat d3/de/d35/d3f/ff8 x:0 0 0 2026-03-10T14:08:20.872 INFO:tasks.workunit.client.1.vm04.stdout:6/802: read d3/de/d35/d3a/d43/d9c/fa8 [124858,36201] 0 2026-03-10T14:08:20.872 INFO:tasks.workunit.client.1.vm04.stdout:6/803: chown d3/d1d/d73/c9d 10889 1 2026-03-10T14:08:20.873 INFO:tasks.workunit.client.1.vm04.stdout:6/804: symlink d3/de/d35/d3f/d2d/d32/d23/d24/lf9 0 2026-03-10T14:08:20.877 INFO:tasks.workunit.client.0.vm03.stdout:5/442: dwrite d4/d6/de/f14 [0,4194304] 0 2026-03-10T14:08:20.880 INFO:tasks.workunit.client.0.vm03.stdout:5/443: rmdir d4/d13/d73 39 2026-03-10T14:08:20.893 INFO:tasks.workunit.client.1.vm04.stdout:0/922: write d0/d2/fff [864948,80417] 0 2026-03-10T14:08:20.896 INFO:tasks.workunit.client.1.vm04.stdout:0/923: rename d0/d2/dc9 to d0/d2/d15/d49/d50/d5c/dd8/deb/d122 0 2026-03-10T14:08:20.899 INFO:tasks.workunit.client.1.vm04.stdout:3/929: write da/fd [1815636,76422] 0 2026-03-10T14:08:20.901 INFO:tasks.workunit.client.1.vm04.stdout:9/907: read d9/da/dd/d74/fcb [1818814,5202] 0 2026-03-10T14:08:20.904 INFO:tasks.workunit.client.1.vm04.stdout:9/908: dwrite d9/d58/f62 [0,4194304] 0 2026-03-10T14:08:20.910 INFO:tasks.workunit.client.1.vm04.stdout:9/909: write d9/da/dd/d1c/fdd [364760,10098] 0 2026-03-10T14:08:20.910 INFO:tasks.workunit.client.1.vm04.stdout:5/977: dwrite d7/d12/d2b/d3e/d57/d77/da5/d140/fe0 [0,4194304] 0 2026-03-10T14:08:20.912 INFO:tasks.workunit.client.1.vm04.stdout:3/930: creat da/d30/f145 x:0 0 0 2026-03-10T14:08:20.922 
INFO:tasks.workunit.client.1.vm04.stdout:9/910: stat d9/da/dd/d74/l124 0 2026-03-10T14:08:20.927 INFO:tasks.workunit.client.1.vm04.stdout:5/978: creat d7/d59/d7e/d87/f143 x:0 0 0 2026-03-10T14:08:20.928 INFO:tasks.workunit.client.1.vm04.stdout:7/959: write d2/dc/de/d2d/d60/d7c/f105 [83533,92136] 0 2026-03-10T14:08:20.932 INFO:tasks.workunit.client.1.vm04.stdout:7/960: dread d2/dc/de/d2d/d38/f83 [0,4194304] 0 2026-03-10T14:08:20.934 INFO:tasks.workunit.client.1.vm04.stdout:5/979: mknod d7/d26/d6b/d6e/c144 0 2026-03-10T14:08:20.934 INFO:tasks.workunit.client.1.vm04.stdout:5/980: truncate d7/d59/d7e/d87/f143 394243 0 2026-03-10T14:08:20.935 INFO:tasks.workunit.client.1.vm04.stdout:5/981: readlink d7/d12/d2b/d93/d9e/lc5 0 2026-03-10T14:08:20.941 INFO:tasks.workunit.client.1.vm04.stdout:9/911: creat d9/d44/d4d/f138 x:0 0 0 2026-03-10T14:08:20.942 INFO:tasks.workunit.client.1.vm04.stdout:9/912: chown d9/d44/d59/f7c 419 1 2026-03-10T14:08:20.945 INFO:tasks.workunit.client.1.vm04.stdout:5/982: creat d7/d59/d7e/dc6/f145 x:0 0 0 2026-03-10T14:08:20.947 INFO:tasks.workunit.client.1.vm04.stdout:7/961: dread d2/d94/f3d [0,4194304] 0 2026-03-10T14:08:20.948 INFO:tasks.workunit.client.1.vm04.stdout:7/962: write d2/dc/de/d11/f19 [2555328,39484] 0 2026-03-10T14:08:20.967 INFO:tasks.workunit.client.1.vm04.stdout:5/983: rmdir d7/d2d/d117 0 2026-03-10T14:08:20.971 INFO:tasks.workunit.client.1.vm04.stdout:4/867: write d4/f77 [1321448,68404] 0 2026-03-10T14:08:20.974 INFO:tasks.workunit.client.1.vm04.stdout:1/935: truncate d3/d22/d2f/f3c 2213526 0 2026-03-10T14:08:20.974 INFO:tasks.workunit.client.1.vm04.stdout:1/936: readlink d3/d5c/lca 0 2026-03-10T14:08:20.977 INFO:tasks.workunit.client.1.vm04.stdout:1/937: dwrite d3/f2c [4194304,4194304] 0 2026-03-10T14:08:20.978 INFO:tasks.workunit.client.1.vm04.stdout:2/958: write d0/db8/fca [422870,93911] 0 2026-03-10T14:08:20.988 INFO:tasks.workunit.client.1.vm04.stdout:2/959: symlink d0/d14/d91/d4a/d8c/dab/d46/l131 0 2026-03-10T14:08:20.992 
INFO:tasks.workunit.client.1.vm04.stdout:1/938: rename d3/d22/deb/f100 to d3/d22/d63/d35/dd9/d13/d38/d58/f143 0 2026-03-10T14:08:20.994 INFO:tasks.workunit.client.1.vm04.stdout:1/939: creat d3/d22/d63/d35/dd9/d13/d38/db5/dff/f144 x:0 0 0 2026-03-10T14:08:20.998 INFO:tasks.workunit.client.1.vm04.stdout:6/805: stat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/c72 0 2026-03-10T14:08:21.001 INFO:tasks.workunit.client.1.vm04.stdout:0/924: write d0/d2/d15/d22/d38/d56/dc1/fc3 [1256783,32209] 0 2026-03-10T14:08:21.003 INFO:tasks.workunit.client.1.vm04.stdout:0/925: creat d0/d2/d15/d22/d38/d56/dc1/dd4/f123 x:0 0 0 2026-03-10T14:08:21.011 INFO:tasks.workunit.client.1.vm04.stdout:0/926: dread d0/d2/d15/d22/f30 [4194304,4194304] 0 2026-03-10T14:08:21.014 INFO:tasks.workunit.client.1.vm04.stdout:0/927: dwrite d0/d2/d25/f64 [0,4194304] 0 2026-03-10T14:08:21.015 INFO:tasks.workunit.client.0.vm03.stdout:1/369: write d0/d2/df/f31 [1114748,63699] 0 2026-03-10T14:08:21.016 INFO:tasks.workunit.client.1.vm04.stdout:0/928: stat d0/d2/d15/d22/d38/d56/da7 0 2026-03-10T14:08:21.029 INFO:tasks.workunit.client.1.vm04.stdout:3/931: dwrite da/d3e/f4c [4194304,4194304] 0 2026-03-10T14:08:21.032 INFO:tasks.workunit.client.1.vm04.stdout:3/932: creat da/dc/d47/d9b/d106/dde/df2/f146 x:0 0 0 2026-03-10T14:08:21.033 INFO:tasks.workunit.client.1.vm04.stdout:3/933: write da/dc/d47/d9b/d106/dde/f125 [1026840,99354] 0 2026-03-10T14:08:21.039 INFO:tasks.workunit.client.1.vm04.stdout:9/913: truncate d9/da/dd/de7/fd2 1117433 0 2026-03-10T14:08:21.046 INFO:tasks.workunit.client.1.vm04.stdout:9/914: dwrite d9/d5c/fe2 [0,4194304] 0 2026-03-10T14:08:21.047 INFO:tasks.workunit.client.1.vm04.stdout:7/963: dwrite d2/dc/de/d2d/d60/d7c/d36/d8b/fde [4194304,4194304] 0 2026-03-10T14:08:21.051 INFO:tasks.workunit.client.1.vm04.stdout:7/964: dread - d2/dc/de/d2d/d60/d7c/df8/f131 zero size 2026-03-10T14:08:21.053 INFO:tasks.workunit.client.1.vm04.stdout:3/934: rename da/dc/d3f/d54/d66/fa7 to da/dc/d3f/f147 0 
2026-03-10T14:08:21.060 INFO:tasks.workunit.client.1.vm04.stdout:0/929: sync 2026-03-10T14:08:21.071 INFO:tasks.workunit.client.1.vm04.stdout:5/984: dwrite d7/d12/d2b/f4d [0,4194304] 0 2026-03-10T14:08:21.076 INFO:tasks.workunit.client.1.vm04.stdout:4/868: dwrite d4/df/db2/db6/dc9/dd0/fed [0,4194304] 0 2026-03-10T14:08:21.081 INFO:tasks.workunit.client.1.vm04.stdout:4/869: dwrite d4/d14/fad [0,4194304] 0 2026-03-10T14:08:21.081 INFO:tasks.workunit.client.1.vm04.stdout:4/870: stat d4/df/db2/db4/d47/cfc 0 2026-03-10T14:08:21.083 INFO:tasks.workunit.client.1.vm04.stdout:7/965: truncate d2/dc/d4d/dcd/f45 978497 0 2026-03-10T14:08:21.083 INFO:tasks.workunit.client.1.vm04.stdout:0/930: rename d0/d6e/f8b to d0/d2/d25/f124 0 2026-03-10T14:08:21.091 INFO:tasks.workunit.client.1.vm04.stdout:5/985: readlink d7/d12/d2b/d3e/l60 0 2026-03-10T14:08:21.092 INFO:tasks.workunit.client.1.vm04.stdout:5/986: write d7/d59/d7e/d87/f143 [886524,35437] 0 2026-03-10T14:08:21.100 INFO:tasks.workunit.client.1.vm04.stdout:3/935: creat da/ded/df8/f148 x:0 0 0 2026-03-10T14:08:21.100 INFO:tasks.workunit.client.1.vm04.stdout:4/871: read d4/d14/d1b/f99 [569810,23489] 0 2026-03-10T14:08:21.111 INFO:tasks.workunit.client.1.vm04.stdout:3/936: fsync da/dc/d3f/d54/d66/f80 0 2026-03-10T14:08:21.112 INFO:tasks.workunit.client.1.vm04.stdout:3/937: rename da to da/dc/d3f/d61/d149 22 2026-03-10T14:08:21.113 INFO:tasks.workunit.client.1.vm04.stdout:5/987: dread d7/d12/d2b/d93/fca [0,4194304] 0 2026-03-10T14:08:21.114 INFO:tasks.workunit.client.1.vm04.stdout:5/988: chown d7/d59/d7d/c109 115 1 2026-03-10T14:08:21.123 INFO:tasks.workunit.client.1.vm04.stdout:7/966: link d2/dc/de/d2d/d60/d81/db3/f13e d2/dac/d115/f151 0 2026-03-10T14:08:21.123 INFO:tasks.workunit.client.1.vm04.stdout:2/960: write d0/d14/d91/d8/d17/d4e/d85/f88 [856978,4686] 0 2026-03-10T14:08:21.125 INFO:tasks.workunit.client.1.vm04.stdout:4/872: creat d4/df/db2/db6/dc9/d11c/f120 x:0 0 0 2026-03-10T14:08:21.130 
INFO:tasks.workunit.client.0.vm03.stdout:1/370: dread d0/d18/d3b/f3d [0,4194304] 0 2026-03-10T14:08:21.131 INFO:tasks.workunit.client.1.vm04.stdout:5/989: rename d7/d59/d7e/d87 to d7/d12/d2b/d3e/d57/d77/da5/d140/d146 0 2026-03-10T14:08:21.131 INFO:tasks.workunit.client.1.vm04.stdout:5/990: chown d7/d59/d7e/ffa 218321521 1 2026-03-10T14:08:21.131 INFO:tasks.workunit.client.0.vm03.stdout:1/371: chown d0/fa 10247 1 2026-03-10T14:08:21.139 INFO:tasks.workunit.client.1.vm04.stdout:7/967: creat d2/dc/de/d2d/d5c/da9/f152 x:0 0 0 2026-03-10T14:08:21.140 INFO:tasks.workunit.client.1.vm04.stdout:9/915: getdents d9/d44/d4d/d7d 0 2026-03-10T14:08:21.143 INFO:tasks.workunit.client.1.vm04.stdout:9/916: dread d9/d44/f51 [4194304,4194304] 0 2026-03-10T14:08:21.147 INFO:tasks.workunit.client.1.vm04.stdout:6/806: dread d3/de/d35/d3a/d43/fe0 [0,4194304] 0 2026-03-10T14:08:21.149 INFO:tasks.workunit.client.1.vm04.stdout:5/991: fsync d7/d12/d2b/d3e/d3f/dc0/fc7 0 2026-03-10T14:08:21.153 INFO:tasks.workunit.client.1.vm04.stdout:3/938: link da/d3e/fb9 da/dc/d35/f14a 0 2026-03-10T14:08:21.154 INFO:tasks.workunit.client.1.vm04.stdout:4/873: sync 2026-03-10T14:08:21.170 INFO:tasks.workunit.client.1.vm04.stdout:5/992: stat d7/d12/d2b/d3e/c49 0 2026-03-10T14:08:21.170 INFO:tasks.workunit.client.1.vm04.stdout:5/993: readlink d7/d12/d2b/l90 0 2026-03-10T14:08:21.178 INFO:tasks.workunit.client.1.vm04.stdout:9/917: unlink d9/d44/d4d/f4e 0 2026-03-10T14:08:21.181 INFO:tasks.workunit.client.1.vm04.stdout:4/874: mknod d4/df7/c121 0 2026-03-10T14:08:21.183 INFO:tasks.workunit.client.1.vm04.stdout:2/961: rename d0/d14/d1b/c21 to d0/d14/d91/d8/d17/d4e/dea/c132 0 2026-03-10T14:08:21.187 INFO:tasks.workunit.client.1.vm04.stdout:1/940: dwrite d3/d22/d63/f65 [0,4194304] 0 2026-03-10T14:08:21.197 INFO:tasks.workunit.client.1.vm04.stdout:5/994: rmdir d7/d2d/d69/db8 39 2026-03-10T14:08:21.204 INFO:tasks.workunit.client.1.vm04.stdout:7/968: link d2/dc/de/d2d/d38/d50/dc8/l61 d2/df9/d145/l153 0 
2026-03-10T14:08:21.209 INFO:tasks.workunit.client.1.vm04.stdout:9/918: mknod d9/da/dd/d1c/da3/dec/c139 0
2026-03-10T14:08:21.210 INFO:tasks.workunit.client.1.vm04.stdout:3/939: rename da/dc/d3f/d61/df7/lfa to da/dc/d47/d116/l14b 0
2026-03-10T14:08:21.220 INFO:tasks.workunit.client.1.vm04.stdout:1/941: dread - d3/d5c/d79/d98/f114 zero size
2026-03-10T14:08:21.223 INFO:tasks.workunit.client.1.vm04.stdout:5/995: rmdir d7/d59 39
2026-03-10T14:08:21.224 INFO:tasks.workunit.client.0.vm03.stdout:6/350: rename d8/db/d2c/d2d to d8/db/d49/d6c 0
2026-03-10T14:08:21.224 INFO:tasks.workunit.client.0.vm03.stdout:8/372: write da/f15 [4850084,69236] 0
2026-03-10T14:08:21.226 INFO:tasks.workunit.client.0.vm03.stdout:6/351: chown d8/db/d49/d6c/d32/f3e 120 1
2026-03-10T14:08:21.228 INFO:tasks.workunit.client.1.vm04.stdout:7/969: symlink d2/dc/de/d2d/d38/d50/dc8/d10e/l154 0
2026-03-10T14:08:21.234 INFO:tasks.workunit.client.1.vm04.stdout:9/919: creat d9/da/d5d/f13a x:0 0 0
2026-03-10T14:08:21.237 INFO:tasks.workunit.client.1.vm04.stdout:9/920: dwrite d9/da/d8c/de5/d11d/f129 [0,4194304] 0
2026-03-10T14:08:21.238 INFO:tasks.workunit.client.0.vm03.stdout:2/359: write d5/d10/d17/f19 [678691,66150] 0
2026-03-10T14:08:21.242 INFO:tasks.workunit.client.1.vm04.stdout:0/931: dwrite d0/d2/d15/d22/d62/fe9 [0,4194304] 0
2026-03-10T14:08:21.249 INFO:tasks.workunit.client.1.vm04.stdout:2/962: mknod d0/d14/d91/d4a/d66/dcd/c133 0
2026-03-10T14:08:21.250 INFO:tasks.workunit.client.1.vm04.stdout:6/807: getdents d3/de/d35/d3f/d2d/d32/d23/ddc 0
2026-03-10T14:08:21.253 INFO:tasks.workunit.client.0.vm03.stdout:1/372: rename d0/d2/df/d16/f1e to d0/d18/f74 0
2026-03-10T14:08:21.255 INFO:tasks.workunit.client.0.vm03.stdout:6/352: link d8/l36 d8/db/d49/l6d 0
2026-03-10T14:08:21.256 INFO:tasks.workunit.client.1.vm04.stdout:5/996: rmdir d7/d2d 39
2026-03-10T14:08:21.257 INFO:tasks.workunit.client.0.vm03.stdout:8/373: creat da/d58/d6c/f76 x:0 0 0
2026-03-10T14:08:21.262 INFO:tasks.workunit.client.1.vm04.stdout:4/875: rmdir d4/d11f 0
2026-03-10T14:08:21.271 INFO:tasks.workunit.client.1.vm04.stdout:9/921: read d9/d58/db5/da5/fc9 [393804,56361] 0
2026-03-10T14:08:21.274 INFO:tasks.workunit.client.0.vm03.stdout:8/374: creat da/d36/f77 x:0 0 0
2026-03-10T14:08:21.275 INFO:tasks.workunit.client.1.vm04.stdout:2/963: mknod d0/d14/d39/d47/d70/dad/dfc/c134 0
2026-03-10T14:08:21.276 INFO:tasks.workunit.client.1.vm04.stdout:2/964: readlink d0/d14/d91/d8/l18 0
2026-03-10T14:08:21.276 INFO:tasks.workunit.client.1.vm04.stdout:2/965: dread - d0/d14/d91/d8/dd/ff7 zero size
2026-03-10T14:08:21.277 INFO:tasks.workunit.client.1.vm04.stdout:2/966: read d0/d14/d91/d8/d17/d35/f81 [221247,28276] 0
2026-03-10T14:08:21.278 INFO:tasks.workunit.client.1.vm04.stdout:6/808: mkdir d3/de/d35/d3f/d2d/d38/d40/dfa 0
2026-03-10T14:08:21.281 INFO:tasks.workunit.client.1.vm04.stdout:2/967: dread d0/d14/d91/d4a/d8c/dab/db3/fd6 [4194304,4194304] 0
2026-03-10T14:08:21.283 INFO:tasks.workunit.client.0.vm03.stdout:1/373: rename d0/d2/df/d16/d20/f6b to d0/d18/f75 0
2026-03-10T14:08:21.283 INFO:tasks.workunit.client.1.vm04.stdout:9/922: stat d9/da/dd/de7/d96/f9c 0
2026-03-10T14:08:21.286 INFO:tasks.workunit.client.1.vm04.stdout:6/809: chown d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fd2 117 1
2026-03-10T14:08:21.288 INFO:tasks.workunit.client.1.vm04.stdout:9/923: dwrite d9/d44/f122 [0,4194304] 0
2026-03-10T14:08:21.295 INFO:tasks.workunit.client.1.vm04.stdout:9/924: fdatasync d9/d44/d59/f7c 0
2026-03-10T14:08:21.314 INFO:tasks.workunit.client.0.vm03.stdout:4/395: write d5/d9/db/f2a [1210305,27438] 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:2/968: unlink d0/d14/l51 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:5/997: truncate d7/d2d/f64 405436 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:4/876: link d4/d14/d64/c82 d4/db9/c122 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:2/969: creat d0/d14/d91/d8/d17/d4e/d8e/f135 x:0 0 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:2/970: stat d0/d14/d39/d47/d70/dc3/dd0/ff1 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:4/877: mkdir d4/df/db2/db6/d123 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:5/998: symlink d7/d26/d6b/l147 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:0/932: dread d0/d2/d15/d22/d38/d56/f84 [0,4194304] 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:5/999: write d7/d12/d2b/d3e/d3f/da6/fee [1074984,58564] 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:4/878: mkdir d4/d14/dac/db7/d124 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:0/933: stat d0/d2/d15/d49/d50/d5c/dd8/deb/d122/f110 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:6/810: dread d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fb2 [0,4194304] 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:6/811: chown d3/f4 51954102 1
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:0/934: write d0/d2/d15/d22/d38/d56/dc1/fc3 [1171250,75827] 0
2026-03-10T14:08:21.315 INFO:tasks.workunit.client.1.vm04.stdout:4/879: creat d4/df7/f125 x:0 0 0
2026-03-10T14:08:21.316 INFO:tasks.workunit.client.1.vm04.stdout:6/812: write d3/de/d35/d3a/d43/d4c/fe5 [640894,121157] 0
2026-03-10T14:08:21.317 INFO:tasks.workunit.client.1.vm04.stdout:0/935: mknod d0/d2/d15/d22/d62/c125 0
2026-03-10T14:08:21.320 INFO:tasks.workunit.client.1.vm04.stdout:4/880: mkdir d4/df/d34/d126 0
2026-03-10T14:08:21.321 INFO:tasks.workunit.client.1.vm04.stdout:0/936: stat d0/d2/d15/d49/d50/d5c/dd8/deb/d122/ld5 0
2026-03-10T14:08:21.322 INFO:tasks.workunit.client.1.vm04.stdout:9/925: dread d9/da/d8c/fea [0,4194304] 0
2026-03-10T14:08:21.328 INFO:tasks.workunit.client.0.vm03.stdout:6/353: sync
2026-03-10T14:08:21.329 INFO:tasks.workunit.client.1.vm04.stdout:6/813: symlink d3/de/d35/d3f/d2d/ddb/lfb 0
2026-03-10T14:08:21.329 INFO:tasks.workunit.client.1.vm04.stdout:4/881: dwrite d4/d14/f3b [4194304,4194304] 0
2026-03-10T14:08:21.330 INFO:tasks.workunit.client.1.vm04.stdout:0/937: fsync d0/d2/d15/d22/d38/d56/d66/f2b 0
2026-03-10T14:08:21.330 INFO:tasks.workunit.client.1.vm04.stdout:4/882: creat d4/d14/d3c/d62/f127 x:0 0 0
2026-03-10T14:08:21.337 INFO:tasks.workunit.client.0.vm03.stdout:8/375: rmdir da/d24/d49 39
2026-03-10T14:08:21.338 INFO:tasks.workunit.client.1.vm04.stdout:4/883: fsync d4/d14/d3c/d5e/f79 0
2026-03-10T14:08:21.342 INFO:tasks.workunit.client.1.vm04.stdout:0/938: dwrite d0/d2/d15/d22/d38/d56/dcb/dce/ffb [0,4194304] 0
2026-03-10T14:08:21.356 INFO:tasks.workunit.client.0.vm03.stdout:4/396: symlink d5/d9/l7f 0
2026-03-10T14:08:21.358 INFO:tasks.workunit.client.1.vm04.stdout:2/971: sync
2026-03-10T14:08:21.364 INFO:tasks.workunit.client.1.vm04.stdout:2/972: mknod d0/d14/d54/c136 0
2026-03-10T14:08:21.364 INFO:tasks.workunit.client.1.vm04.stdout:2/973: rename d0/d14/d91/d8/d17/d35/f5f to d0/d14/d91/d4a/d66/dda/f137 0
2026-03-10T14:08:21.365 INFO:tasks.workunit.client.1.vm04.stdout:2/974: write d0/d14/d91/d4a/d8c/dab/d95/d107/f12a [2181596,103819] 0
2026-03-10T14:08:21.370 INFO:tasks.workunit.client.1.vm04.stdout:2/975: symlink d0/d14/d39/d47/d70/dc3/dd0/l138 0
2026-03-10T14:08:21.374 INFO:tasks.workunit.client.1.vm04.stdout:2/976: chown d0/d14/d1b/d45/da5/f122 184 1
2026-03-10T14:08:21.374 INFO:tasks.workunit.client.1.vm04.stdout:2/977: readlink d0/d14/d91/d4a/d8c/dab/d46/d4b/d56/l63 0
2026-03-10T14:08:21.374 INFO:tasks.workunit.client.0.vm03.stdout:6/354: fsync d8/d1b/f3d 0
2026-03-10T14:08:21.380 INFO:tasks.workunit.client.1.vm04.stdout:3/940: write da/dc/d47/d9b/d106/dde/dac/fe8 [248300,39810] 0
2026-03-10T14:08:21.386 INFO:tasks.workunit.client.1.vm04.stdout:0/939: dread d0/d2/d25/fc0 [0,4194304] 0
2026-03-10T14:08:21.388 INFO:tasks.workunit.client.1.vm04.stdout:0/940: readlink d0/d2/d15/d22/d38/d56/da7/lf3 0
2026-03-10T14:08:21.390 INFO:tasks.workunit.client.1.vm04.stdout:7/970: dwrite d2/dc/de/f12 [0,4194304] 0
2026-03-10T14:08:21.391 INFO:tasks.workunit.client.1.vm04.stdout:9/926: dread d9/da/dd/f48 [0,4194304] 0
2026-03-10T14:08:21.392 INFO:tasks.workunit.client.0.vm03.stdout:1/374: creat d0/d2/df/f76 x:0 0 0
2026-03-10T14:08:21.401 INFO:tasks.workunit.client.1.vm04.stdout:3/941: symlink da/dc/d47/d9b/d106/dde/dac/l14c 0
2026-03-10T14:08:21.401 INFO:tasks.workunit.client.1.vm04.stdout:1/942: write d3/d22/f107 [710548,27322] 0
2026-03-10T14:08:21.402 INFO:tasks.workunit.client.0.vm03.stdout:0/325: truncate d3/d17/f56 4164361 0
2026-03-10T14:08:21.402 INFO:tasks.workunit.client.1.vm04.stdout:0/941: sync
2026-03-10T14:08:21.411 INFO:tasks.workunit.client.1.vm04.stdout:7/971: symlink d2/df9/d145/l155 0
2026-03-10T14:08:21.412 INFO:tasks.workunit.client.1.vm04.stdout:0/942: creat d0/d2/d15/d49/d50/d5c/dd8/deb/d122/f126 x:0 0 0
2026-03-10T14:08:21.414 INFO:tasks.workunit.client.1.vm04.stdout:9/927: symlink d9/da/d8c/de5/l13b 0
2026-03-10T14:08:21.415 INFO:tasks.workunit.client.0.vm03.stdout:4/397: creat d5/d47/d62/f80 x:0 0 0
2026-03-10T14:08:21.415 INFO:tasks.workunit.client.1.vm04.stdout:6/814: dwrite d3/de/d35/d3f/f22 [4194304,4194304] 0
2026-03-10T14:08:21.416 INFO:tasks.workunit.client.1.vm04.stdout:6/815: fsync d3/de/d35/d3f/ff8 0
2026-03-10T14:08:21.418 INFO:tasks.workunit.client.1.vm04.stdout:3/942: mknod da/dc/d3f/c14d 0
2026-03-10T14:08:21.420 INFO:tasks.workunit.client.1.vm04.stdout:1/943: link d3/d5c/f71 d3/d22/d2f/d109/d12b/d60/f145 0
2026-03-10T14:08:21.421 INFO:tasks.workunit.client.1.vm04.stdout:1/944: chown d3/d22/d63/d35/dd9/d13/d38/d58/l8a 5529004 1
2026-03-10T14:08:21.424 INFO:tasks.workunit.client.1.vm04.stdout:3/943: sync
2026-03-10T14:08:21.427 INFO:tasks.workunit.client.1.vm04.stdout:4/884: dwrite d4/d14/d3c/f3e [4194304,4194304] 0
2026-03-10T14:08:21.429 INFO:tasks.workunit.client.1.vm04.stdout:0/943: creat d0/d2/dbe/f127 x:0 0 0
2026-03-10T14:08:21.444 INFO:tasks.workunit.client.1.vm04.stdout:7/972: dread d2/dc/de/d2d/d38/f37 [0,4194304] 0
2026-03-10T14:08:21.447 INFO:tasks.workunit.client.1.vm04.stdout:2/978: dwrite d0/d14/d39/d47/f7e [0,4194304] 0
2026-03-10T14:08:21.449 INFO:tasks.workunit.client.1.vm04.stdout:6/816: mknod d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/cfc 0
2026-03-10T14:08:21.450 INFO:tasks.workunit.client.1.vm04.stdout:6/817: readlink d3/de/d35/d3f/d2d/l58 0
2026-03-10T14:08:21.453 INFO:tasks.workunit.client.0.vm03.stdout:6/355: creat d8/f6e x:0 0 0
2026-03-10T14:08:21.453 INFO:tasks.workunit.client.1.vm04.stdout:6/818: chown d3/de/d35/d3f/d2d/d38/d40/cdd 3957591 1
2026-03-10T14:08:21.459 INFO:tasks.workunit.client.1.vm04.stdout:6/819: dwrite d3/de/d35/d3a/d43/d4c/fe5 [0,4194304] 0
2026-03-10T14:08:21.464 INFO:tasks.workunit.client.0.vm03.stdout:8/376: mkdir da/d24/d78 0
2026-03-10T14:08:21.474 INFO:tasks.workunit.client.1.vm04.stdout:7/973: dread d2/dc/de/d2d/d60/d7c/d3b/f48 [0,4194304] 0
2026-03-10T14:08:21.475 INFO:tasks.workunit.client.1.vm04.stdout:3/944: mknod da/dc/d35/d52/d70/c14e 0
2026-03-10T14:08:21.480 INFO:tasks.workunit.client.1.vm04.stdout:9/928: creat d9/da/dd/d1c/da3/dec/d126/f13c x:0 0 0
2026-03-10T14:08:21.483 INFO:tasks.workunit.client.0.vm03.stdout:8/377: creat da/d3a/d44/f79 x:0 0 0
2026-03-10T14:08:21.487 INFO:tasks.workunit.client.0.vm03.stdout:9/371: dwrite d2/d14/d2b/d34/f6a [0,4194304] 0
2026-03-10T14:08:21.487 INFO:tasks.workunit.client.1.vm04.stdout:4/885: mknod d4/d14/c128 0
2026-03-10T14:08:21.487 INFO:tasks.workunit.client.1.vm04.stdout:3/945: fsync da/dc/d47/d9b/d106/dde/dac/fb4 0
2026-03-10T14:08:21.489 INFO:tasks.workunit.client.1.vm04.stdout:9/929: mknod d9/da/d8c/de5/d11d/c13d 0
2026-03-10T14:08:21.492 INFO:tasks.workunit.client.1.vm04.stdout:6/820: creat d3/de/de8/ffd x:0 0 0
2026-03-10T14:08:21.494 INFO:tasks.workunit.client.0.vm03.stdout:0/326: dread d3/d11/f13 [0,4194304] 0
2026-03-10T14:08:21.494 INFO:tasks.workunit.client.1.vm04.stdout:7/974: mknod d2/dc/de/d2d/d5c/da9/df6/d144/c156 0
2026-03-10T14:08:21.495 INFO:tasks.workunit.client.1.vm04.stdout:4/886: chown d4/d14/d6d/lf3 20982207 1
2026-03-10T14:08:21.495 INFO:tasks.workunit.client.0.vm03.stdout:6/356: getdents d8/db/d49/d58 0
2026-03-10T14:08:21.496 INFO:tasks.workunit.client.1.vm04.stdout:3/946: fdatasync da/dc/f39 0
2026-03-10T14:08:21.499 INFO:tasks.workunit.client.1.vm04.stdout:3/947: dwrite da/dc/fa4 [0,4194304] 0
2026-03-10T14:08:21.505 INFO:tasks.workunit.client.0.vm03.stdout:0/327: mkdir d3/d16/d64 0
2026-03-10T14:08:21.510 INFO:tasks.workunit.client.1.vm04.stdout:2/979: link d0/d14/d91/d4a/d8c/dab/d95/la1 d0/d14/d91/d8/d10d/d11b/l139 0
2026-03-10T14:08:21.511 INFO:tasks.workunit.client.1.vm04.stdout:2/980: write d0/db8/f128 [688641,129195] 0
2026-03-10T14:08:21.512 INFO:tasks.workunit.client.1.vm04.stdout:1/945: link d3/d22/d63/d35/dd9/d13/da0/de9/f124 d3/d5c/d79/de6/f146 0
2026-03-10T14:08:21.513 INFO:tasks.workunit.client.0.vm03.stdout:0/328: mknod d3/d46/c65 0
2026-03-10T14:08:21.516 INFO:tasks.workunit.client.1.vm04.stdout:6/821: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/ffe x:0 0 0
2026-03-10T14:08:21.517 INFO:tasks.workunit.client.1.vm04.stdout:7/975: creat d2/df9/d145/f157 x:0 0 0
2026-03-10T14:08:21.519 INFO:tasks.workunit.client.0.vm03.stdout:9/372: dread d2/f2f [0,4194304] 0
2026-03-10T14:08:21.519 INFO:tasks.workunit.client.1.vm04.stdout:4/887: symlink d4/d14/d3c/d85/l129 0
2026-03-10T14:08:21.521 INFO:tasks.workunit.client.0.vm03.stdout:6/357: link d8/l2b d8/db/l6f 0
2026-03-10T14:08:21.522 INFO:tasks.workunit.client.1.vm04.stdout:3/948: rmdir da/ded 39
2026-03-10T14:08:21.523 INFO:tasks.workunit.client.0.vm03.stdout:9/373: creat d2/d14/d2b/d43/f7c x:0 0 0
2026-03-10T14:08:21.524 INFO:tasks.workunit.client.1.vm04.stdout:9/930: mknod d9/da/dd/de7/d96/d118/c13e 0
2026-03-10T14:08:21.525 INFO:tasks.workunit.client.1.vm04.stdout:2/981: symlink d0/d14/d91/d4a/d8c/dab/d46/dc8/l13a 0
2026-03-10T14:08:21.525 INFO:tasks.workunit.client.1.vm04.stdout:9/931: write d9/d5c/fdc [818324,60756] 0
2026-03-10T14:08:21.526 INFO:tasks.workunit.client.1.vm04.stdout:1/946: truncate d3/d22/d63/d35/dd9/d13/da0/dc5/fd1 1016242 0
2026-03-10T14:08:21.527 INFO:tasks.workunit.client.1.vm04.stdout:7/976: symlink d2/dc/de/d2d/d60/d7c/d36/d8b/l158 0
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.0.vm03.stdout:9/374: symlink d2/d29/d33/d55/l7d 0
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.0.vm03.stdout:0/329: mkdir d3/d11/d66 0
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:3/949: fdatasync da/dc/d3f/d61/dc1/fda 0
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:6/822: rename d3/de/d35/d3f/d2d/d32/d23/d24/c67 to d3/d1d/d73/cff 0
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:6/823: chown d3/de/d35/d3f/d2d/lca 347 1
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:6/824: write d3/d1d/d73/fc6 [3606342,86029] 0
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:6/825: chown d3/de/d35/d3f/d2d/f98 85366 1
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:1/947: creat d3/d22/d63/d35/dd9/d13/da0/de9/f147 x:0 0 0
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:3/950: write da/dc/d35/d37/f5e [73598,51388] 0
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:3/951: chown da/d30 260064 1
2026-03-10T14:08:21.540 INFO:tasks.workunit.client.1.vm04.stdout:3/952: dread - da/dc/d3f/d54/d66/f129 zero size
2026-03-10T14:08:21.547 INFO:tasks.workunit.client.1.vm04.stdout:1/948: mknod d3/d22/d63/d35/dd9/c148 0
2026-03-10T14:08:21.553 INFO:tasks.workunit.client.1.vm04.stdout:4/888: dread d4/d14/d1b/f5c [0,4194304] 0
2026-03-10T14:08:21.555 INFO:tasks.workunit.client.1.vm04.stdout:3/953: symlink da/d8e/l14f 0
2026-03-10T14:08:21.555 INFO:tasks.workunit.client.1.vm04.stdout:1/949: creat d3/d22/f149 x:0 0 0
2026-03-10T14:08:21.563 INFO:tasks.workunit.client.0.vm03.stdout:9/375: dread d2/d14/f39 [0,4194304] 0
2026-03-10T14:08:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:21 vm04.local ceph-mon[55966]: pgmap v10: 65 pgs: 65 active+clean; 2.6 GiB data, 9.0 GiB used, 111 GiB / 120 GiB avail; 30 MiB/s rd, 87 MiB/s wr, 214 op/s
2026-03-10T14:08:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:21 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:21 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:21 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:08:21.570 INFO:tasks.workunit.client.1.vm04.stdout:0/944: write d0/d2/d15/d22/d38/d56/d66/f54 [919632,65885] 0
2026-03-10T14:08:21.571 INFO:tasks.workunit.client.0.vm03.stdout:9/376: mknod d2/d29/d33/d55/d72/c7e 0
2026-03-10T14:08:21.572 INFO:tasks.workunit.client.1.vm04.stdout:9/932: dread d9/d5c/fdc [0,4194304] 0
2026-03-10T14:08:21.573 INFO:tasks.workunit.client.0.vm03.stdout:9/377: write d2/d14/f39 [4863502,56404] 0
2026-03-10T14:08:21.576 INFO:tasks.workunit.client.1.vm04.stdout:9/933: dwrite d9/da/dd/d74/f92 [0,4194304] 0
2026-03-10T14:08:21.590 INFO:tasks.workunit.client.0.vm03.stdout:9/378: dread d2/f2c [0,4194304] 0
2026-03-10T14:08:21.590 INFO:tasks.workunit.client.1.vm04.stdout:4/889: symlink d4/df/db2/db4/d47/d4f/d8c/dc8/dff/l12a 0
2026-03-10T14:08:21.590 INFO:tasks.workunit.client.1.vm04.stdout:0/945: truncate d0/d2/d25/f3f 2326036 0
2026-03-10T14:08:21.593 INFO:tasks.workunit.client.0.vm03.stdout:9/379: mkdir d2/d29/d33/d60/d7f 0
2026-03-10T14:08:21.599 INFO:tasks.workunit.client.1.vm04.stdout:9/934: symlink d9/da/d11f/l13f 0
2026-03-10T14:08:21.599 INFO:tasks.workunit.client.1.vm04.stdout:9/935: chown d9/da/d5d/l69 529864 1
2026-03-10T14:08:21.600 INFO:tasks.workunit.client.1.vm04.stdout:9/936: dread - d9/da/d8c/de5/ffa zero size
2026-03-10T14:08:21.601 INFO:tasks.workunit.client.1.vm04.stdout:0/946: symlink d0/d2/dbe/l128 0
2026-03-10T14:08:21.607 INFO:tasks.workunit.client.1.vm04.stdout:9/937: mknod d9/da/dd/de7/db1/d116/c140 0
2026-03-10T14:08:21.608 INFO:tasks.workunit.client.0.vm03.stdout:9/380: dread d2/d29/d33/d41/f53 [0,4194304] 0
2026-03-10T14:08:21.610 INFO:tasks.workunit.client.1.vm04.stdout:3/954: dwrite da/dc/d3f/d61/dc1/fda [0,4194304] 0
2026-03-10T14:08:21.620 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:21 vm03.local ceph-mon[49718]: pgmap v10: 65 pgs: 65 active+clean; 2.6 GiB data, 9.0 GiB used, 111 GiB / 120 GiB avail; 30 MiB/s rd, 87 MiB/s wr, 214 op/s
2026-03-10T14:08:21.620 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:21 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:21.620 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:21 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:21.620 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:21 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:08:21.620 INFO:tasks.workunit.client.0.vm03.stdout:9/381: mknod d2/d29/d33/d55/c80 0
2026-03-10T14:08:21.620 INFO:tasks.workunit.client.1.vm04.stdout:9/938: creat d9/da/dd/d1c/da3/dec/d126/f141 x:0 0 0
2026-03-10T14:08:21.620 INFO:tasks.workunit.client.1.vm04.stdout:7/977: write d2/dc/de/d2d/d60/f118 [813396,46683] 0
2026-03-10T14:08:21.620 INFO:tasks.workunit.client.1.vm04.stdout:2/982: dwrite d0/d14/d91/d3a/fbf [0,4194304] 0
2026-03-10T14:08:21.620 INFO:tasks.workunit.client.0.vm03.stdout:9/382: fsync d2/d14/d2b/d43/f78 0
2026-03-10T14:08:21.623 INFO:tasks.workunit.client.1.vm04.stdout:9/939: creat d9/da/d8c/d121/f142 x:0 0 0
2026-03-10T14:08:21.624 INFO:tasks.workunit.client.1.vm04.stdout:0/947: sync
2026-03-10T14:08:21.624 INFO:tasks.workunit.client.1.vm04.stdout:7/978: sync
2026-03-10T14:08:21.625 INFO:tasks.workunit.client.0.vm03.stdout:9/383: write d2/f15 [4055595,92603] 0
2026-03-10T14:08:21.626 INFO:tasks.workunit.client.0.vm03.stdout:9/384: dread - d2/d29/d33/d41/f7b zero size
2026-03-10T14:08:21.630 INFO:tasks.workunit.client.0.vm03.stdout:9/385: mkdir d2/d14/d2b/d79/d81 0
2026-03-10T14:08:21.632 INFO:tasks.workunit.client.1.vm04.stdout:3/955: creat da/d8e/f150 x:0 0 0
2026-03-10T14:08:21.632 INFO:tasks.workunit.client.1.vm04.stdout:6/826: write d3/de/d35/d3a/d43/d4c/d5e/fc3 [114808,52540] 0
2026-03-10T14:08:21.636 INFO:tasks.workunit.client.1.vm04.stdout:1/950: dwrite d3/d22/d2f/d109/d12b/f27 [8388608,4194304] 0
2026-03-10T14:08:21.637 INFO:tasks.workunit.client.1.vm04.stdout:3/956: dwrite da/dc/d3f/f83 [0,4194304] 0
2026-03-10T14:08:21.638 INFO:tasks.workunit.client.1.vm04.stdout:3/957: dread - da/dc/d3f/d61/df7/f118 zero size
2026-03-10T14:08:21.645 INFO:tasks.workunit.client.1.vm04.stdout:9/940: rename d9/d44/f122 to d9/d44/f143 0
2026-03-10T14:08:21.653 INFO:tasks.workunit.client.1.vm04.stdout:0/948: read d0/d2/d15/d49/d50/d61/d75/f98 [89486,105060] 0
2026-03-10T14:08:21.653 INFO:tasks.workunit.client.1.vm04.stdout:0/949: readlink d0/d6e/l78 0
2026-03-10T14:08:21.654 INFO:tasks.workunit.client.1.vm04.stdout:0/950: truncate d0/d2/d15/d22/d38/f5b 5079302 0
2026-03-10T14:08:21.655 INFO:tasks.workunit.client.1.vm04.stdout:0/951: dread - d0/d2/d15/d49/d50/d5c/da4/f112 zero size
2026-03-10T14:08:21.658 INFO:tasks.workunit.client.1.vm04.stdout:4/890: truncate d4/d14/f3b 8048208 0
2026-03-10T14:08:21.663 INFO:tasks.workunit.client.1.vm04.stdout:9/941: mknod d9/da/dd/de7/db1/c144 0
2026-03-10T14:08:21.668 INFO:tasks.workunit.client.1.vm04.stdout:4/891: read d4/f5f [2107924,96219] 0
2026-03-10T14:08:21.673 INFO:tasks.workunit.client.1.vm04.stdout:0/952: fdatasync d0/d2/d15/d22/d38/d56/f67 0
2026-03-10T14:08:21.690 INFO:tasks.workunit.client.1.vm04.stdout:0/953: dwrite d0/d2/d15/d22/d38/fe6 [0,4194304] 0
2026-03-10T14:08:21.690 INFO:tasks.workunit.client.1.vm04.stdout:6/827: getdents d3/de/d35/d3f/d2d/ddb 0
2026-03-10T14:08:21.690 INFO:tasks.workunit.client.1.vm04.stdout:0/954: unlink d0/da2/caa 0
2026-03-10T14:08:21.690 INFO:tasks.workunit.client.1.vm04.stdout:6/828: dwrite d3/de/d35/d3a/d43/d4c/d5e/fea [0,4194304] 0
2026-03-10T14:08:21.690 INFO:tasks.workunit.client.1.vm04.stdout:4/892: rename d4/df/d34/f7c to d4/df7/f12b 0
2026-03-10T14:08:21.690 INFO:tasks.workunit.client.1.vm04.stdout:6/829: rmdir d3/de/d35/d3f/d2d/d32/d5c 39
2026-03-10T14:08:21.691 INFO:tasks.workunit.client.1.vm04.stdout:4/893: symlink d4/df/d34/l12c 0
2026-03-10T14:08:21.693 INFO:tasks.workunit.client.1.vm04.stdout:6/830: mkdir d3/de/d35/d3a/d43/d100 0
2026-03-10T14:08:21.694 INFO:tasks.workunit.client.1.vm04.stdout:4/894: fsync d4/d14/d1b/f9d 0
2026-03-10T14:08:21.694 INFO:tasks.workunit.client.1.vm04.stdout:4/895: stat d4/df/db2/de1/f10f 0
2026-03-10T14:08:21.697 INFO:tasks.workunit.client.1.vm04.stdout:4/896: chown d4/df/db2/db4/d47/d4f/d8c/dc8/dff/fd2 425863 1
2026-03-10T14:08:21.698 INFO:tasks.workunit.client.1.vm04.stdout:6/831: write d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f87 [5405092,59906] 0
2026-03-10T14:08:21.705 INFO:tasks.workunit.client.1.vm04.stdout:4/897: creat d4/d14/d3c/d62/f12d x:0 0 0
2026-03-10T14:08:21.705 INFO:tasks.workunit.client.1.vm04.stdout:4/898: rename d4/df/d34/d126 to d4/df/db2/db6/dc9/d104/d12e 0
2026-03-10T14:08:21.705 INFO:tasks.workunit.client.1.vm04.stdout:4/899: creat d4/d14/d6d/f12f x:0 0 0
2026-03-10T14:08:21.705 INFO:tasks.workunit.client.1.vm04.stdout:6/832: rename d3/de/d35/d3f/d2d/cda to d3/de/d35/d3f/d2d/c101 0
2026-03-10T14:08:21.711 INFO:tasks.workunit.client.1.vm04.stdout:9/942: rmdir d9/da/d8c/d121 39
2026-03-10T14:08:21.717 INFO:tasks.workunit.client.1.vm04.stdout:6/833: mknod d3/de/d35/d3f/d2d/d32/d5c/c102 0
2026-03-10T14:08:21.724 INFO:tasks.workunit.client.1.vm04.stdout:4/900: rename d4/d14/d64/ld8 to d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/l130 0
2026-03-10T14:08:21.724 INFO:tasks.workunit.client.1.vm04.stdout:4/901: rename d4/db9 to d4/df/db2/db6/d131 0
2026-03-10T14:08:21.724 INFO:tasks.workunit.client.1.vm04.stdout:6/834: mknod d3/de/d35/c103 0
2026-03-10T14:08:21.724 INFO:tasks.workunit.client.1.vm04.stdout:4/902: mkdir d4/df/db2/db4/d47/d4f/d8c/d132 0
2026-03-10T14:08:21.749 INFO:tasks.workunit.client.1.vm04.stdout:2/983: dwrite d0/d14/d91/d3a/d3e/f61 [8388608,4194304] 0
2026-03-10T14:08:21.753 INFO:tasks.workunit.client.1.vm04.stdout:2/984: dwrite d0/d14/d91/d8/dd/ff7 [0,4194304] 0
2026-03-10T14:08:21.786 INFO:tasks.workunit.client.0.vm03.stdout:7/273: write d5/d9/f17 [409848,70111] 0
2026-03-10T14:08:21.798 INFO:tasks.workunit.client.0.vm03.stdout:7/274: symlink d5/d9/d14/d26/l56 0
2026-03-10T14:08:21.801 INFO:tasks.workunit.client.1.vm04.stdout:7/979: truncate d2/dc/de/fed 2026880 0
2026-03-10T14:08:21.802 INFO:tasks.workunit.client.0.vm03.stdout:3/357: write d1d/d29/f2e [1407069,17804] 0
2026-03-10T14:08:21.803 INFO:tasks.workunit.client.1.vm04.stdout:1/951: truncate d3/f2c 6458707 0
2026-03-10T14:08:21.804 INFO:tasks.workunit.client.1.vm04.stdout:3/958: truncate da/f25 1888514 0
2026-03-10T14:08:21.805 INFO:tasks.workunit.client.0.vm03.stdout:3/358: chown d1d/d33/d65/d5d 21 1
2026-03-10T14:08:21.810 INFO:tasks.workunit.client.1.vm04.stdout:0/955: dwrite d0/d2/d15/d49/d50/f55 [0,4194304] 0
2026-03-10T14:08:21.811 INFO:tasks.workunit.client.0.vm03.stdout:3/359: write d1d/f4a [957139,40744] 0
2026-03-10T14:08:21.813 INFO:tasks.workunit.client.0.vm03.stdout:3/360: chown d1d/l3b 599131109 1
2026-03-10T14:08:21.818 INFO:tasks.workunit.client.1.vm04.stdout:9/943: dwrite d9/d58/db5/fc6 [0,4194304] 0
2026-03-10T14:08:21.819 INFO:tasks.workunit.client.1.vm04.stdout:9/944: truncate d9/d58/db5/f67 780381 0
2026-03-10T14:08:21.821 INFO:tasks.workunit.client.0.vm03.stdout:3/361: dread fc [0,4194304] 0
2026-03-10T14:08:21.824 INFO:tasks.workunit.client.0.vm03.stdout:3/362: chown d1d/f36 8129825 1
2026-03-10T14:08:21.824 INFO:tasks.workunit.client.1.vm04.stdout:3/959: creat da/dc/d3f/d61/f151 x:0 0 0
2026-03-10T14:08:21.826 INFO:tasks.workunit.client.0.vm03.stdout:5/444: write d4/d16/f2d [892880,48905] 0
2026-03-10T14:08:21.831 INFO:tasks.workunit.client.1.vm04.stdout:1/952: rename d3/d22/d63/d35/dd9/d13/d1a/c64 to d3/d22/d63/d35/dd9/d13/c14a 0
2026-03-10T14:08:21.832 INFO:tasks.workunit.client.0.vm03.stdout:3/363: dwrite d1d/d33/f3a [0,4194304] 0
2026-03-10T14:08:21.838 INFO:tasks.workunit.client.0.vm03.stdout:3/364: mkdir d1d/d33/d47/d53/d68 0
2026-03-10T14:08:21.839 INFO:tasks.workunit.client.0.vm03.stdout:3/365: rename d1d/d39/f60 to d1d/d59/f69 0
2026-03-10T14:08:21.842 INFO:tasks.workunit.client.0.vm03.stdout:3/366: unlink d1d/l27 0
2026-03-10T14:08:21.843 INFO:tasks.workunit.client.1.vm04.stdout:6/835: write d3/de/d35/d3f/d2d/d32/d23/d47/f62 [336476,67405] 0
2026-03-10T14:08:21.844 INFO:tasks.workunit.client.1.vm04.stdout:3/960: unlink da/d30/ca5 0
2026-03-10T14:08:21.847 INFO:tasks.workunit.client.0.vm03.stdout:7/275: rmdir d5/d9/d14/d26/d36 39
2026-03-10T14:08:21.847 INFO:tasks.workunit.client.1.vm04.stdout:4/903: dwrite d4/df/db2/db4/d47/d4f/d8c/dc8/dff/f87 [0,4194304] 0
2026-03-10T14:08:21.853 INFO:tasks.workunit.client.1.vm04.stdout:1/953: write d3/d22/d63/d35/dd9/d13/d38/d58/d5b/f119 [685232,17149] 0
2026-03-10T14:08:21.866 INFO:tasks.workunit.client.0.vm03.stdout:3/367: dread d1d/d33/f5e [0,4194304] 0
2026-03-10T14:08:21.866 INFO:tasks.workunit.client.1.vm04.stdout:4/904: fsync d4/df/db2/db4/d47/d4f/f6a 0
2026-03-10T14:08:21.867 INFO:tasks.workunit.client.1.vm04.stdout:4/905: write d4/d14/d3c/d62/f12d [700356,4885] 0
2026-03-10T14:08:21.868 INFO:tasks.workunit.client.1.vm04.stdout:7/980: link d2/dc/de/d2d/d60/d7c/d44/f66 d2/dc/de/d2d/d60/d81/db3/f159 0
2026-03-10T14:08:21.869 INFO:tasks.workunit.client.1.vm04.stdout:9/945: link d9/d33/cc3 d9/da/d8c/de5/d11d/c145 0
2026-03-10T14:08:21.871 INFO:tasks.workunit.client.1.vm04.stdout:6/836: mknod d3/de/d35/d3f/d2d/d38/d40/dfa/c104 0
2026-03-10T14:08:21.879 INFO:tasks.workunit.client.1.vm04.stdout:0/956: link d0/d2/d15/d22/d38/d56/lec d0/d2/d15/d49/d50/d5c/l129 0
2026-03-10T14:08:21.883 INFO:tasks.workunit.client.1.vm04.stdout:9/946: dread - d9/da/dd/d1c/da3/dec/f104 zero size
2026-03-10T14:08:21.883 INFO:tasks.workunit.client.1.vm04.stdout:9/947: chown d9/d33/f8f 0 1
2026-03-10T14:08:21.886 INFO:tasks.workunit.client.1.vm04.stdout:0/957: dread d0/d2/d15/d22/d38/d56/d66/f2e [4194304,4194304] 0
2026-03-10T14:08:21.887 INFO:tasks.workunit.client.1.vm04.stdout:0/958: chown d0/d2/d15/d22/d38/d56/dc1/dd4/f123 916580146 1
2026-03-10T14:08:21.893 INFO:tasks.workunit.client.1.vm04.stdout:0/959: unlink d0/d2/fd 0
2026-03-10T14:08:21.897 INFO:tasks.workunit.client.0.vm03.stdout:7/276: dread d5/d9/d14/f41 [0,4194304] 0
2026-03-10T14:08:21.897 INFO:tasks.workunit.client.1.vm04.stdout:2/985: dwrite d0/d14/d1b/f32 [0,4194304] 0
2026-03-10T14:08:21.908 INFO:tasks.workunit.client.1.vm04.stdout:0/960: dread - d0/d2/d15/d49/d50/d5c/dd8/deb/ff7 zero size
2026-03-10T14:08:21.909 INFO:tasks.workunit.client.1.vm04.stdout:7/981: getdents d2/df9 0
2026-03-10T14:08:21.910 INFO:tasks.workunit.client.1.vm04.stdout:2/986: rmdir d0/d14/d39 39
2026-03-10T14:08:21.916 INFO:tasks.workunit.client.1.vm04.stdout:7/982: creat d2/dc/de/d2d/d60/d7c/d64/d108/d117/f15a x:0 0 0
2026-03-10T14:08:21.918 INFO:tasks.workunit.client.1.vm04.stdout:3/961: write f4 [1563807,47217] 0
2026-03-10T14:08:21.923 INFO:tasks.workunit.client.1.vm04.stdout:2/987: fdatasync d0/d14/d91/d4a/d66/f7d 0
2026-03-10T14:08:21.932 INFO:tasks.workunit.client.1.vm04.stdout:7/983: rename d2/dc/de/d2d/d60/d7c/d3b/f49 to d2/dc/de/d11/de7/f15b 0
2026-03-10T14:08:21.935 INFO:tasks.workunit.client.1.vm04.stdout:3/962: dread - da/dc/d35/f107 zero size
2026-03-10T14:08:21.941 INFO:tasks.workunit.client.1.vm04.stdout:1/954: dwrite d3/d22/d63/f2d [0,4194304] 0
2026-03-10T14:08:21.943 INFO:tasks.workunit.client.0.vm03.stdout:2/360: dread d5/d2a/f34 [0,4194304] 0
2026-03-10T14:08:21.943 INFO:tasks.workunit.client.1.vm04.stdout:9/948: dread d9/d44/d70/fdb [0,4194304] 0
2026-03-10T14:08:21.945 INFO:tasks.workunit.client.1.vm04.stdout:4/906: write d4/d14/d1b/f99 [1498738,76152] 0
2026-03-10T14:08:21.948 INFO:tasks.workunit.client.1.vm04.stdout:2/988: write d0/d14/d91/d8/d17/d4e/dea/dde/ffe [5601371,28868] 0
2026-03-10T14:08:21.948 INFO:tasks.workunit.client.1.vm04.stdout:9/949: dwrite d9/d44/d4d/d7d/f123 [0,4194304] 0
2026-03-10T14:08:21.952 INFO:tasks.workunit.client.0.vm03.stdout:2/361: write d5/d10/f13 [5545459,51200] 0
2026-03-10T14:08:21.958 INFO:tasks.workunit.client.1.vm04.stdout:2/989: mknod d0/d14/d91/d8/d10d/d11b/c13b 0
2026-03-10T14:08:21.958 INFO:tasks.workunit.client.0.vm03.stdout:2/362: chown d5/f2d 642633 1
2026-03-10T14:08:21.959 INFO:tasks.workunit.client.1.vm04.stdout:2/990: chown d0/d14/d91/d3a/d3e/f38 15 1
2026-03-10T14:08:21.971 INFO:tasks.workunit.client.1.vm04.stdout:2/991: dread d0/d14/deb/f109 [0,4194304] 0
2026-03-10T14:08:21.973 INFO:tasks.workunit.client.1.vm04.stdout:6/837: write d3/de/d35/d3a/d43/d4c/fe5 [4891028,113819] 0
2026-03-10T14:08:21.976 INFO:tasks.workunit.client.1.vm04.stdout:1/955: creat d3/d22/d63/d35/dd9/d13/d1a/f14b x:0 0 0
2026-03-10T14:08:21.987 INFO:tasks.workunit.client.0.vm03.stdout:2/363: creat d5/d10/d1c/d54/d5f/f6c x:0 0 0
2026-03-10T14:08:21.987 INFO:tasks.workunit.client.0.vm03.stdout:2/364: write d5/d10/d1c/d40/f68 [866301,92777] 0
2026-03-10T14:08:21.987 INFO:tasks.workunit.client.0.vm03.stdout:2/365: rename d5/d10/f12 to d5/d2a/f6d 0
2026-03-10T14:08:21.987 INFO:tasks.workunit.client.1.vm04.stdout:2/992: creat d0/d14/d91/d4a/d66/dda/f13c x:0 0 0
2026-03-10T14:08:21.987 INFO:tasks.workunit.client.1.vm04.stdout:6/838: chown d3/de/d35/d3a/d43/f4b 858815248 1
2026-03-10T14:08:21.987 INFO:tasks.workunit.client.1.vm04.stdout:1/956: symlink d3/d22/d63/d35/dd9/d13/d38/d58/d5b/l14c 0
2026-03-10T14:08:21.987 INFO:tasks.workunit.client.1.vm04.stdout:6/839: symlink d3/de/l105 0
2026-03-10T14:08:21.989 INFO:tasks.workunit.client.1.vm04.stdout:6/840: rmdir d3/de/d35/d3f/d2d/d32/dd0 39
2026-03-10T14:08:21.991 INFO:tasks.workunit.client.1.vm04.stdout:9/950: dread d9/da/dd/d1c/f2e [4194304,4194304] 0
2026-03-10T14:08:21.997 INFO:tasks.workunit.client.1.vm04.stdout:9/951: dread d9/da/dd/de7/d96/d118/f102 [0,4194304] 0
2026-03-10T14:08:21.997 INFO:tasks.workunit.client.1.vm04.stdout:9/952: dread - d9/da/dd/d1c/f131 zero size
2026-03-10T14:08:22.000 INFO:tasks.workunit.client.1.vm04.stdout:1/957: creat d3/f14d x:0 0 0
2026-03-10T14:08:22.000 INFO:tasks.workunit.client.1.vm04.stdout:0/961: write d0/dee/ff0 [1002811,55579] 0
2026-03-10T14:08:22.000 INFO:tasks.workunit.client.1.vm04.stdout:1/958: chown d3/d5c/fb2 6014596 1
2026-03-10T14:08:22.004 INFO:tasks.workunit.client.1.vm04.stdout:1/959: dread d3/d22/d63/f65 [0,4194304] 0
2026-03-10T14:08:22.006 INFO:tasks.workunit.client.1.vm04.stdout:9/953: mknod d9/da/dd/d74/c146 0
2026-03-10T14:08:22.008 INFO:tasks.workunit.client.1.vm04.stdout:7/984: truncate d2/dc/d4d/dcd/f77 862661 0
2026-03-10T14:08:22.010 INFO:tasks.workunit.client.1.vm04.stdout:7/985: write d2/dc/de/d2d/d38/fa5 [1093751,1321] 0
2026-03-10T14:08:22.011 INFO:tasks.workunit.client.1.vm04.stdout:4/907: dwrite d4/f5f [0,4194304] 0
2026-03-10T14:08:22.013 INFO:tasks.workunit.client.1.vm04.stdout:4/908: chown d4/d14/d3c/d5e/l61 37498 1
2026-03-10T14:08:22.013 INFO:tasks.workunit.client.1.vm04.stdout:3/963: dwrite da/d30/fd0 [0,4194304] 0
2026-03-10T14:08:22.020 INFO:tasks.workunit.client.1.vm04.stdout:9/954: dread d9/d58/db5/da5/fc4 [0,4194304] 0
2026-03-10T14:08:22.024 INFO:tasks.workunit.client.1.vm04.stdout:9/955: dwrite d9/d44/d4d/f138 [0,4194304] 0
2026-03-10T14:08:22.034 INFO:tasks.workunit.client.1.vm04.stdout:2/993: write d0/db8/fd3 [448993,39638] 0
2026-03-10T14:08:22.034 INFO:tasks.workunit.client.1.vm04.stdout:2/994: fdatasync d0/d14/d91/d8/d17/d4e/d85/f88 0
2026-03-10T14:08:22.036 INFO:tasks.workunit.client.1.vm04.stdout:6/841: write d3/de/d35/d3f/d2d/d32/d5c/f7f [608110,67095] 0
2026-03-10T14:08:22.041 INFO:tasks.workunit.client.1.vm04.stdout:2/995: dwrite d0/d14/d91/d8/f30 [0,4194304] 0
2026-03-10T14:08:22.043 INFO:tasks.workunit.client.0.vm03.stdout:1/375: write d0/d2/df/f1f [1481469,96460] 0
2026-03-10T14:08:22.053 INFO:tasks.workunit.client.1.vm04.stdout:3/964: stat da/dc/d35/d37/c3b 0
2026-03-10T14:08:22.054 INFO:tasks.workunit.client.0.vm03.stdout:1/376: mknod d0/d2/df/d27/c77 0
2026-03-10T14:08:22.054 INFO:tasks.workunit.client.0.vm03.stdout:8/378: write da/d24/f32 [48243,88570] 0
2026-03-10T14:08:22.054 INFO:tasks.workunit.client.0.vm03.stdout:4/398: dwrite d5/d9/db/d19/f51 [0,4194304] 0
2026-03-10T14:08:22.057 INFO:tasks.workunit.client.0.vm03.stdout:4/399: dwrite d5/d9/db/d19/f51 [4194304,4194304] 0
2026-03-10T14:08:22.058 INFO:tasks.workunit.client.1.vm04.stdout:0/962: symlink d0/d2/d15/d49/d50/d61/ddb/l12a 0
2026-03-10T14:08:22.060 INFO:tasks.workunit.client.1.vm04.stdout:6/842: truncate d3/de/d35/d3f/d2d/d32/d23/f31 3911004 0
2026-03-10T14:08:22.064 INFO:tasks.workunit.client.1.vm04.stdout:2/996: rename d0/d14/l69 to d0/d14/d91/d4a/d66/dda/df4/l13d 0
2026-03-10T14:08:22.066 INFO:tasks.workunit.client.1.vm04.stdout:4/909: mkdir d4/df/db2/db4/d133 0
2026-03-10T14:08:22.067 INFO:tasks.workunit.client.0.vm03.stdout:1/377: truncate d0/d18/d3b/f53 128993 0
2026-03-10T14:08:22.068 INFO:tasks.workunit.client.1.vm04.stdout:3/965: dread - da/dc/d35/d37/f114 zero size
2026-03-10T14:08:22.069 INFO:tasks.workunit.client.0.vm03.stdout:1/378: truncate d0/f48 2469702 0
2026-03-10T14:08:22.070 INFO:tasks.workunit.client.1.vm04.stdout:2/997: write d0/d14/d91/d4a/fed [1029275,42949] 0
2026-03-10T14:08:22.070 INFO:tasks.workunit.client.0.vm03.stdout:4/400: link d5/l2d d5/d9/db/d19/d38/d53/l81 0
2026-03-10T14:08:22.071 INFO:tasks.workunit.client.1.vm04.stdout:3/966: mkdir da/dc/d35/d52/d6d/d152 0
2026-03-10T14:08:22.073 INFO:tasks.workunit.client.1.vm04.stdout:6/843: creat d3/de/d35/d3f/d2d/d32/d23/d24/d8e/f106 x:0 0 0
2026-03-10T14:08:22.074 INFO:tasks.workunit.client.1.vm04.stdout:6/844: write d3/d1d/f44 [4542992,55609] 0
2026-03-10T14:08:22.075 INFO:tasks.workunit.client.1.vm04.stdout:6/845: chown d3/de/d35/d3f/d2d/d32/d23/d4e 5 1
2026-03-10T14:08:22.079 INFO:tasks.workunit.client.1.vm04.stdout:2/998: mknod d0/d14/deb/c13e 0
2026-03-10T14:08:22.081 INFO:tasks.workunit.client.1.vm04.stdout:0/963: readlink d0/d2/d15/d22/d38/d56/dcb/dce/df5/l119 0
2026-03-10T14:08:22.086 INFO:tasks.workunit.client.1.vm04.stdout:6/846: mkdir d3/de/d35/d3f/d2d/d38/d40/d107 0
2026-03-10T14:08:22.087 INFO:tasks.workunit.client.1.vm04.stdout:0/964: symlink d0/dee/l12b 0
2026-03-10T14:08:22.088 INFO:tasks.workunit.client.1.vm04.stdout:4/910: getdents d4/d14/d6d 0
2026-03-10T14:08:22.089 INFO:tasks.workunit.client.1.vm04.stdout:6/847: mknod d3/de/d35/d3f/d2d/d32/d23/d4e/c108 0
2026-03-10T14:08:22.090 INFO:tasks.workunit.client.1.vm04.stdout:0/965: creat d0/d2/d15/d49/d50/d5c/f12c x:0 0 0
2026-03-10T14:08:22.095 INFO:tasks.workunit.client.1.vm04.stdout:2/999: creat d0/d14/d91/d8/d17/d4e/dea/f13f x:0 0 0
2026-03-10T14:08:22.095 INFO:tasks.workunit.client.1.vm04.stdout:4/911: creat d4/df/db2/de1/f134 x:0 0 0
2026-03-10T14:08:22.096 INFO:tasks.workunit.client.1.vm04.stdout:4/912: dwrite d4/df/f115 [0,4194304] 0
2026-03-10T14:08:22.097 INFO:tasks.workunit.client.1.vm04.stdout:4/913: readlink d4/d14/d3c/d5e/lfa 0
2026-03-10T14:08:22.098 INFO:tasks.workunit.client.1.vm04.stdout:4/914: chown d4/df/db2/db4/f37 121553 1
2026-03-10T14:08:22.104 INFO:tasks.workunit.client.1.vm04.stdout:0/966: mkdir d0/d2/d15/d12d 0
2026-03-10T14:08:22.106 INFO:tasks.workunit.client.1.vm04.stdout:4/915: link d4/df/c16 d4/df/db2/de1/c135 0
2026-03-10T14:08:22.124 INFO:tasks.workunit.client.1.vm04.stdout:0/967: dwrite d0/d2/d15/d22/d38/d56/dc1/dd4/f123 [0,4194304] 0
2026-03-10T14:08:22.124 INFO:tasks.workunit.client.1.vm04.stdout:0/968: mknod d0/d2/d15/d22/d38/d56/dcb/dce/c12e 0
2026-03-10T14:08:22.124 INFO:tasks.workunit.client.0.vm03.stdout:6/358: write d8/db/d12/f5d [838796,103660] 0
2026-03-10T14:08:22.124 INFO:tasks.workunit.client.0.vm03.stdout:6/359: chown d8/db/d49/d6c/d32/c52 49427 1
2026-03-10T14:08:22.124 INFO:tasks.workunit.client.0.vm03.stdout:6/360: rename d8/db/d12/c17 to d8/db/d2c/c70 0
2026-03-10T14:08:22.124 INFO:tasks.workunit.client.0.vm03.stdout:6/361: mknod d8/db/d12/d51/d5c/d60/c71 0
2026-03-10T14:08:22.126 INFO:tasks.workunit.client.1.vm04.stdout:0/969: getdents d0/d2/d15/d49/d50/d5c/dd8/deb/d122 0
2026-03-10T14:08:22.129 INFO:tasks.workunit.client.1.vm04.stdout:0/970: creat d0/d2/d15/d22/d38/d56/dc1/f12f x:0 0 0
2026-03-10T14:08:22.130 INFO:tasks.workunit.client.1.vm04.stdout:0/971: mknod d0/d2/d15/d49/d50/d5c/dd8/deb/d122/c130 0
2026-03-10T14:08:22.131 INFO:tasks.workunit.client.1.vm04.stdout:0/972: creat d0/d2/d15/d49/f131 x:0 0 0
2026-03-10T14:08:22.132 INFO:tasks.workunit.client.1.vm04.stdout:0/973: fdatasync d0/d6e/f111 0
2026-03-10T14:08:22.134 INFO:tasks.workunit.client.1.vm04.stdout:0/974: link d0/d2/d25/f7f d0/d2/d15/d49/d50/d10e/f132 0
2026-03-10T14:08:22.139 INFO:tasks.workunit.client.0.vm03.stdout:6/362: creat d8/db/d12/f72 x:0 0 0
2026-03-10T14:08:22.146 INFO:tasks.workunit.client.0.vm03.stdout:6/363: read d8/db/d49/d6c/d32/f41 [4185485,35031] 0
2026-03-10T14:08:22.152 INFO:tasks.workunit.client.0.vm03.stdout:6/364: creat d8/d1b/d1c/f73 x:0 0 0
2026-03-10T14:08:22.171 INFO:tasks.workunit.client.0.vm03.stdout:8/379: dread da/d24/f3d [0,4194304] 0
2026-03-10T14:08:22.172 INFO:tasks.workunit.client.0.vm03.stdout:1/379: dread d0/d18/f25 [0,4194304] 0
2026-03-10T14:08:22.178 INFO:tasks.workunit.client.0.vm03.stdout:1/380: fdatasync d0/d2/df/d16/d41/f68 0
2026-03-10T14:08:22.181 INFO:tasks.workunit.client.0.vm03.stdout:1/381: read - d0/d2/d34/f69 zero size
2026-03-10T14:08:22.183 INFO:tasks.workunit.client.0.vm03.stdout:8/380: unlink da/d36/f3b 0
2026-03-10T14:08:22.187 INFO:tasks.workunit.client.0.vm03.stdout:1/382: mknod d0/d18/d3b/d50/c78 0
2026-03-10T14:08:22.196 INFO:tasks.workunit.client.1.vm04.stdout:3/967: dread da/d3e/ff4 [0,4194304] 0
2026-03-10T14:08:22.200 INFO:tasks.workunit.client.1.vm04.stdout:3/968: truncate da/dc/d35/d37/ff6 95228 0
2026-03-10T14:08:22.202 INFO:tasks.workunit.client.0.vm03.stdout:0/330: write d3/d17/f35 [2143901,118734] 0
2026-03-10T14:08:22.206 INFO:tasks.workunit.client.1.vm04.stdout:0/975: sync
2026-03-10T14:08:22.211 INFO:tasks.workunit.client.0.vm03.stdout:0/331: chown d3/d46/d54/c58 33070751 1
2026-03-10T14:08:22.212 INFO:tasks.workunit.client.0.vm03.stdout:8/381: dwrite da/d3c/d4b/f63 [0,4194304] 0
2026-03-10T14:08:22.217 INFO:tasks.workunit.client.0.vm03.stdout:1/383: getdents d0/d2/df/d16 0
2026-03-10T14:08:22.229 INFO:tasks.workunit.client.0.vm03.stdout:0/332: mknod d3/d46/d5e/c67 0
2026-03-10T14:08:22.232 INFO:tasks.workunit.client.0.vm03.stdout:1/384: mknod d0/d2/df/d16/c79 0
2026-03-10T14:08:22.233 INFO:tasks.workunit.client.0.vm03.stdout:9/386: write d2/fc [2237959,63872] 0
2026-03-10T14:08:22.233 INFO:tasks.workunit.client.1.vm04.stdout:9/956: write d9/fae [471391,86281] 0
2026-03-10T14:08:22.234 INFO:tasks.workunit.client.1.vm04.stdout:1/960: write d3/d22/d2f/f5d [1224009,31192] 0
2026-03-10T14:08:22.236 INFO:tasks.workunit.client.1.vm04.stdout:1/961: creat d3/d22/d6d/d13b/f14e x:0 0 0
2026-03-10T14:08:22.248 INFO:tasks.workunit.client.0.vm03.stdout:8/382: mknod da/c7a 0
2026-03-10T14:08:22.248 INFO:tasks.workunit.client.1.vm04.stdout:9/957: creat d9/da/dd/de7/d96/d9d/f147 x:0 0 0
2026-03-10T14:08:22.248 INFO:tasks.workunit.client.1.vm04.stdout:1/962: creat d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92/f14f x:0 0 0
2026-03-10T14:08:22.248 INFO:tasks.workunit.client.1.vm04.stdout:9/958: unlink
d9/d58/f62 0 2026-03-10T14:08:22.248 INFO:tasks.workunit.client.1.vm04.stdout:6/848: write d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f86 [1816791,25408] 0 2026-03-10T14:08:22.248 INFO:tasks.workunit.client.1.vm04.stdout:6/849: dwrite d3/de/d35/d3f/f22 [4194304,4194304] 0 2026-03-10T14:08:22.252 INFO:tasks.workunit.client.0.vm03.stdout:1/385: symlink d0/d2/df/d27/l7a 0 2026-03-10T14:08:22.253 INFO:tasks.workunit.client.1.vm04.stdout:9/959: mkdir d9/da/dd/de7/d96/d148 0 2026-03-10T14:08:22.254 INFO:tasks.workunit.client.1.vm04.stdout:9/960: readlink d9/da/dd/d1c/da3/lc8 0 2026-03-10T14:08:22.259 INFO:tasks.workunit.client.0.vm03.stdout:8/383: fsync da/d3c/d4b/f63 0 2026-03-10T14:08:22.262 INFO:tasks.workunit.client.1.vm04.stdout:4/916: dwrite d4/df/db2/db4/d47/d4f/d8c/dc8/dff/fd2 [0,4194304] 0 2026-03-10T14:08:22.268 INFO:tasks.workunit.client.0.vm03.stdout:0/333: unlink d3/d11/f18 0 2026-03-10T14:08:22.270 INFO:tasks.workunit.client.1.vm04.stdout:9/961: mkdir d9/da/d5d/d149 0 2026-03-10T14:08:22.272 INFO:tasks.workunit.client.1.vm04.stdout:0/976: rmdir d0/d2/d15/d22/d38/d56/dc1 39 2026-03-10T14:08:22.276 INFO:tasks.workunit.client.1.vm04.stdout:9/962: fsync d9/d33/f8f 0 2026-03-10T14:08:22.277 INFO:tasks.workunit.client.0.vm03.stdout:1/386: write d0/d2/df/d16/d20/f72 [417150,50665] 0 2026-03-10T14:08:22.279 INFO:tasks.workunit.client.0.vm03.stdout:1/387: read d0/d2/df/f6c [134,80373] 0 2026-03-10T14:08:22.280 INFO:tasks.workunit.client.1.vm04.stdout:7/986: write d2/d94/f3d [1683250,11075] 0 2026-03-10T14:08:22.284 INFO:tasks.workunit.client.0.vm03.stdout:5/445: write d4/d13/d1f/f74 [2053129,51050] 0 2026-03-10T14:08:22.287 INFO:tasks.workunit.client.1.vm04.stdout:6/850: getdents d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f 0 2026-03-10T14:08:22.288 INFO:tasks.workunit.client.1.vm04.stdout:7/987: mknod d2/dc/de/d2d/d60/d7c/d64/c15c 0 2026-03-10T14:08:22.288 INFO:tasks.workunit.client.0.vm03.stdout:1/388: rmdir d0/d18/d3b/d50 39 2026-03-10T14:08:22.288 
INFO:tasks.workunit.client.1.vm04.stdout:7/988: stat d2/d2a/f9c 0 2026-03-10T14:08:22.294 INFO:tasks.workunit.client.1.vm04.stdout:7/989: dread d2/dc/d4d/dcd/f45 [0,4194304] 0 2026-03-10T14:08:22.296 INFO:tasks.workunit.client.1.vm04.stdout:7/990: mknod d2/dc/de/d2d/d60/d7c/df8/c15d 0 2026-03-10T14:08:22.297 INFO:tasks.workunit.client.1.vm04.stdout:6/851: sync 2026-03-10T14:08:22.299 INFO:tasks.workunit.client.1.vm04.stdout:6/852: write d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f86 [1085897,119458] 0 2026-03-10T14:08:22.300 INFO:tasks.workunit.client.1.vm04.stdout:3/969: dwrite da/d3e/f44 [4194304,4194304] 0 2026-03-10T14:08:22.302 INFO:tasks.workunit.client.1.vm04.stdout:3/970: read - da/dc/d47/d9b/d106/dde/df2/f13f zero size 2026-03-10T14:08:22.302 INFO:tasks.workunit.client.1.vm04.stdout:3/971: dread - da/dc/d35/d37/d10a/f13c zero size 2026-03-10T14:08:22.311 INFO:tasks.workunit.client.1.vm04.stdout:3/972: creat da/dc/d47/d9b/d106/dde/f153 x:0 0 0 2026-03-10T14:08:22.313 INFO:tasks.workunit.client.1.vm04.stdout:3/973: mkdir da/dc/d35/d52/d6d/d152/d154 0 2026-03-10T14:08:22.314 INFO:tasks.workunit.client.0.vm03.stdout:8/384: rmdir da/d24/d49 39 2026-03-10T14:08:22.317 INFO:tasks.workunit.client.1.vm04.stdout:4/917: dread d4/f2c [0,4194304] 0 2026-03-10T14:08:22.317 INFO:tasks.workunit.client.1.vm04.stdout:4/918: chown d4/d14/d3c/d62/f12d 0 1 2026-03-10T14:08:22.318 INFO:tasks.workunit.client.1.vm04.stdout:3/974: stat da/dc/d35/d52/d53/d78/cff 0 2026-03-10T14:08:22.321 INFO:tasks.workunit.client.0.vm03.stdout:0/334: symlink d3/d11/d2c/d4a/d4b/l68 0 2026-03-10T14:08:22.321 INFO:tasks.workunit.client.1.vm04.stdout:3/975: symlink da/dc/d35/d52/d70/l155 0 2026-03-10T14:08:22.322 INFO:tasks.workunit.client.1.vm04.stdout:4/919: unlink d4/fa 0 2026-03-10T14:08:22.323 INFO:tasks.workunit.client.0.vm03.stdout:5/446: rename d4/d16/d19/f6f to d4/d35/f92 0 2026-03-10T14:08:22.325 INFO:tasks.workunit.client.1.vm04.stdout:4/920: dwrite d4/df/db2/db4/f4b [0,4194304] 0 
2026-03-10T14:08:22.326 INFO:tasks.workunit.client.0.vm03.stdout:0/335: read d3/d17/f56 [3925964,99151] 0 2026-03-10T14:08:22.326 INFO:tasks.workunit.client.0.vm03.stdout:0/336: readlink d3/d4d/l59 0 2026-03-10T14:08:22.330 INFO:tasks.workunit.client.1.vm04.stdout:4/921: read - d4/df/d34/d6f/ff8 zero size 2026-03-10T14:08:22.331 INFO:tasks.workunit.client.0.vm03.stdout:0/337: dwrite d3/d17/f35 [0,4194304] 0 2026-03-10T14:08:22.333 INFO:tasks.workunit.client.1.vm04.stdout:4/922: mkdir d4/df/d34/d136 0 2026-03-10T14:08:22.345 INFO:tasks.workunit.client.1.vm04.stdout:1/963: write d3/f2c [547373,69264] 0 2026-03-10T14:08:22.347 INFO:tasks.workunit.client.1.vm04.stdout:1/964: chown d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d92 319686211 1 2026-03-10T14:08:22.347 INFO:tasks.workunit.client.0.vm03.stdout:1/389: rename d0/d2/d34/f3a to d0/d42/f7b 0 2026-03-10T14:08:22.350 INFO:tasks.workunit.client.0.vm03.stdout:3/368: write d1d/f31 [2968667,42613] 0 2026-03-10T14:08:22.351 INFO:tasks.workunit.client.0.vm03.stdout:8/385: truncate da/d24/f2d 745627 0 2026-03-10T14:08:22.352 INFO:tasks.workunit.client.0.vm03.stdout:8/386: dread da/d24/f32 [0,4194304] 0 2026-03-10T14:08:22.353 INFO:tasks.workunit.client.1.vm04.stdout:0/977: write d0/d2/d15/d22/d38/d56/dcb/ff6 [545103,1726] 0 2026-03-10T14:08:22.354 INFO:tasks.workunit.client.1.vm04.stdout:9/963: write d9/da/d5d/d81/f98 [136452,40088] 0 2026-03-10T14:08:22.355 INFO:tasks.workunit.client.0.vm03.stdout:0/338: dread - d3/d4d/d47/f48 zero size 2026-03-10T14:08:22.357 INFO:tasks.workunit.client.1.vm04.stdout:6/853: write d3/de/d35/d3a/d43/f8a [990288,54593] 0 2026-03-10T14:08:22.357 INFO:tasks.workunit.client.1.vm04.stdout:6/854: readlink d3/de/l105 0 2026-03-10T14:08:22.361 INFO:tasks.workunit.client.1.vm04.stdout:1/965: stat d3/d22/d63/c55 0 2026-03-10T14:08:22.361 INFO:tasks.workunit.client.1.vm04.stdout:1/966: read d3/d22/f107 [318132,28889] 0 2026-03-10T14:08:22.362 INFO:tasks.workunit.client.1.vm04.stdout:1/967: read d3/d22/f107 
[518989,24235] 0 2026-03-10T14:08:22.363 INFO:tasks.workunit.client.1.vm04.stdout:0/978: mkdir d0/d2/d15/d49/d50/d5c/dd8/deb/d133 0 2026-03-10T14:08:22.365 INFO:tasks.workunit.client.1.vm04.stdout:3/976: write da/dc/d3f/d54/d66/f117 [443373,50315] 0 2026-03-10T14:08:22.369 INFO:tasks.workunit.client.1.vm04.stdout:4/923: dwrite d4/df/db2/db4/d47/d4f/fa6 [4194304,4194304] 0 2026-03-10T14:08:22.375 INFO:tasks.workunit.client.1.vm04.stdout:1/968: rmdir d3/d22/d2f 39 2026-03-10T14:08:22.377 INFO:tasks.workunit.client.0.vm03.stdout:3/369: creat d1d/d29/d41/d45/f6a x:0 0 0 2026-03-10T14:08:22.378 INFO:tasks.workunit.client.1.vm04.stdout:1/969: dwrite d3/f2c [4194304,4194304] 0 2026-03-10T14:08:22.379 INFO:tasks.workunit.client.1.vm04.stdout:0/979: readlink d0/d2/d15/d22/d38/d56/l6b 0 2026-03-10T14:08:22.379 INFO:tasks.workunit.client.1.vm04.stdout:4/924: mknod d4/d14/d64/c137 0 2026-03-10T14:08:22.384 INFO:tasks.workunit.client.0.vm03.stdout:7/277: dwrite d5/d9/d14/d26/d39/f45 [0,4194304] 0 2026-03-10T14:08:22.385 INFO:tasks.workunit.client.0.vm03.stdout:7/278: truncate d5/f6 8440564 0 2026-03-10T14:08:22.387 INFO:tasks.workunit.client.1.vm04.stdout:0/980: unlink d0/d2/d15/d22/d38/c5d 0 2026-03-10T14:08:22.388 INFO:tasks.workunit.client.0.vm03.stdout:8/387: rename f9 to da/d3a/d44/d64/f7b 0 2026-03-10T14:08:22.390 INFO:tasks.workunit.client.1.vm04.stdout:3/977: dread f8 [0,4194304] 0 2026-03-10T14:08:22.391 INFO:tasks.workunit.client.1.vm04.stdout:4/925: unlink d4/df/db2/db4/d47/cfc 0 2026-03-10T14:08:22.392 INFO:tasks.workunit.client.1.vm04.stdout:6/855: creat d3/de/d35/d3f/d2d/f109 x:0 0 0 2026-03-10T14:08:22.396 INFO:tasks.workunit.client.1.vm04.stdout:4/926: mkdir d4/d14/d6d/d138 0 2026-03-10T14:08:22.396 INFO:tasks.workunit.client.0.vm03.stdout:5/447: creat d4/f93 x:0 0 0 2026-03-10T14:08:22.401 INFO:tasks.workunit.client.1.vm04.stdout:1/970: creat d3/d22/d63/f150 x:0 0 0 2026-03-10T14:08:22.402 INFO:tasks.workunit.client.0.vm03.stdout:0/339: creat d3/d4d/d30/f69 x:0 
0 0 2026-03-10T14:08:22.403 INFO:tasks.workunit.client.0.vm03.stdout:0/340: stat d3/d4d/d47/f48 0 2026-03-10T14:08:22.405 INFO:tasks.workunit.client.1.vm04.stdout:4/927: readlink d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/lbd 0 2026-03-10T14:08:22.405 INFO:tasks.workunit.client.1.vm04.stdout:4/928: chown d4/d14/d64/fab 2 1 2026-03-10T14:08:22.406 INFO:tasks.workunit.client.1.vm04.stdout:4/929: read - d4/d14/d3c/d62/f127 zero size 2026-03-10T14:08:22.416 INFO:tasks.workunit.client.1.vm04.stdout:1/971: readlink d3/d22/d63/d35/d6c/lbb 0 2026-03-10T14:08:22.416 INFO:tasks.workunit.client.1.vm04.stdout:9/964: rename d9/dd3 to d9/da/d5d/d81/d14a 0 2026-03-10T14:08:22.416 INFO:tasks.workunit.client.1.vm04.stdout:6/856: creat d3/de/d35/d3a/d43/f10a x:0 0 0 2026-03-10T14:08:22.416 INFO:tasks.workunit.client.1.vm04.stdout:4/930: readlink d4/lc 0 2026-03-10T14:08:22.418 INFO:tasks.workunit.client.1.vm04.stdout:0/981: rename d0/d2/d15/d22/d62 to d0/d2/d15/d49/d50/d5c/dd8/d134 0 2026-03-10T14:08:22.418 INFO:tasks.workunit.client.1.vm04.stdout:3/978: rename da/dc/d3f to da/dc/d3f/d54/d66/d156 22 2026-03-10T14:08:22.423 INFO:tasks.workunit.client.1.vm04.stdout:6/857: dwrite d3/de/d35/d3f/d2d/f89 [0,4194304] 0 2026-03-10T14:08:22.423 INFO:tasks.workunit.client.1.vm04.stdout:4/931: creat d4/d14/d64/f139 x:0 0 0 2026-03-10T14:08:22.425 INFO:tasks.workunit.client.0.vm03.stdout:1/390: rename d0/d2/df/d27/l4d to d0/d18/d3b/d50/l7c 0 2026-03-10T14:08:22.426 INFO:tasks.workunit.client.1.vm04.stdout:6/858: chown d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/dad/lc5 55 1 2026-03-10T14:08:22.426 INFO:tasks.workunit.client.1.vm04.stdout:3/979: stat da/dc/d35/d37/fec 0 2026-03-10T14:08:22.429 INFO:tasks.workunit.client.1.vm04.stdout:3/980: dread da/dc/d3f/d61/dc1/fda [0,4194304] 0 2026-03-10T14:08:22.430 INFO:tasks.workunit.client.1.vm04.stdout:0/982: creat d0/d2/d15/d49/d50/d61/d75/f135 x:0 0 0 2026-03-10T14:08:22.431 INFO:tasks.workunit.client.0.vm03.stdout:5/448: fdatasync d4/d16/d19/d4a/f81 0 
2026-03-10T14:08:22.436 INFO:tasks.workunit.client.1.vm04.stdout:4/932: creat d4/d14/d3c/d62/de6/f13a x:0 0 0 2026-03-10T14:08:22.440 INFO:tasks.workunit.client.1.vm04.stdout:1/972: creat d3/d22/d63/d35/dd9/d13/d38/d58/f151 x:0 0 0 2026-03-10T14:08:22.440 INFO:tasks.workunit.client.0.vm03.stdout:5/449: write d4/d35/f8e [1017985,3203] 0 2026-03-10T14:08:22.445 INFO:tasks.workunit.client.1.vm04.stdout:3/981: read da/fe [726954,66502] 0 2026-03-10T14:08:22.448 INFO:tasks.workunit.client.1.vm04.stdout:4/933: truncate d4/d14/d3c/d62/de6/f102 130734 0 2026-03-10T14:08:22.451 INFO:tasks.workunit.client.1.vm04.stdout:1/973: fsync d3/d22/d63/d35/dd9/d13/d38/d58/d5b/f7c 0 2026-03-10T14:08:22.451 INFO:tasks.workunit.client.1.vm04.stdout:3/982: creat da/dc/d47/d9b/d106/dde/dac/f157 x:0 0 0 2026-03-10T14:08:22.452 INFO:tasks.workunit.client.1.vm04.stdout:3/983: chown da/dc/d3f/d61/cbf 86051239 1 2026-03-10T14:08:22.452 INFO:tasks.workunit.client.1.vm04.stdout:0/983: sync 2026-03-10T14:08:22.454 INFO:tasks.workunit.client.1.vm04.stdout:3/984: read - da/dc/d47/d9b/d106/dde/d121/f143 zero size 2026-03-10T14:08:22.457 INFO:tasks.workunit.client.1.vm04.stdout:1/974: mknod d3/d22/d6d/d13b/c152 0 2026-03-10T14:08:22.460 INFO:tasks.workunit.client.1.vm04.stdout:0/984: rename d0/d2/d15/d22/le5 to d0/d2/d15/d49/d50/d5c/l136 0 2026-03-10T14:08:22.462 INFO:tasks.workunit.client.1.vm04.stdout:3/985: mkdir da/dc/d47/d9b/dcb/d158 0 2026-03-10T14:08:22.464 INFO:tasks.workunit.client.1.vm04.stdout:4/934: fsync d4/df/d34/f95 0 2026-03-10T14:08:22.468 INFO:tasks.workunit.client.1.vm04.stdout:3/986: mkdir da/d8e/db5/d159 0 2026-03-10T14:08:22.471 INFO:tasks.workunit.client.1.vm04.stdout:1/975: mkdir d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/d137/d153 0 2026-03-10T14:08:22.472 INFO:tasks.workunit.client.1.vm04.stdout:1/976: write d3/d22/d63/d35/dd9/d13/d38/d58/d5b/f13e [890357,90302] 0 2026-03-10T14:08:22.478 INFO:tasks.workunit.client.1.vm04.stdout:1/977: dread 
d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/f11a [0,4194304] 0 2026-03-10T14:08:22.480 INFO:tasks.workunit.client.1.vm04.stdout:0/985: unlink d0/d2/d15/d22/d38/d56/da7/lf3 0 2026-03-10T14:08:22.482 INFO:tasks.workunit.client.1.vm04.stdout:3/987: fdatasync da/d30/f42 0 2026-03-10T14:08:22.484 INFO:tasks.workunit.client.1.vm04.stdout:9/965: truncate d9/da/dd/d74/fcb 1625586 0 2026-03-10T14:08:22.488 INFO:tasks.workunit.client.1.vm04.stdout:3/988: symlink da/dc/d47/d9b/d106/dde/l15a 0 2026-03-10T14:08:22.490 INFO:tasks.workunit.client.0.vm03.stdout:3/370: rename f1b to d1d/d29/f6b 0 2026-03-10T14:08:22.490 INFO:tasks.workunit.client.0.vm03.stdout:3/371: fdatasync d1d/d33/f3a 0 2026-03-10T14:08:22.492 INFO:tasks.workunit.client.1.vm04.stdout:6/859: dwrite d3/de/d35/d3f/d2d/d32/dd0/fde [0,4194304] 0 2026-03-10T14:08:22.494 INFO:tasks.workunit.client.1.vm04.stdout:6/860: chown d3/de/d35/d3f/d2d/d32/d23/d24/d8e/f106 123937195 1 2026-03-10T14:08:22.495 INFO:tasks.workunit.client.0.vm03.stdout:8/388: getdents da/d36/d40/d50/d70 0 2026-03-10T14:08:22.502 INFO:tasks.workunit.client.1.vm04.stdout:9/966: rename d9/da/d8c/fbf to d9/da/dd/de7/f14b 0 2026-03-10T14:08:22.508 INFO:tasks.workunit.client.1.vm04.stdout:1/978: mknod d3/d22/d63/d35/dd9/d13/da7/c154 0 2026-03-10T14:08:22.510 INFO:tasks.workunit.client.1.vm04.stdout:9/967: dwrite d9/da/dd/d1c/f22 [0,4194304] 0 2026-03-10T14:08:22.511 INFO:tasks.workunit.client.0.vm03.stdout:5/450: truncate d4/f82 3375835 0 2026-03-10T14:08:22.511 INFO:tasks.workunit.client.1.vm04.stdout:0/986: mknod d0/d2/d15/c137 0 2026-03-10T14:08:22.512 INFO:tasks.workunit.client.1.vm04.stdout:0/987: readlink d0/d2/d15/d22/d38/l4f 0 2026-03-10T14:08:22.514 INFO:tasks.workunit.client.1.vm04.stdout:3/989: creat da/dc/d35/d37/d10a/f15b x:0 0 0 2026-03-10T14:08:22.519 INFO:tasks.workunit.client.0.vm03.stdout:1/391: symlink d0/d18/l7d 0 2026-03-10T14:08:22.522 INFO:tasks.workunit.client.1.vm04.stdout:4/935: link d4/df7/cee d4/df/db2/db4/d47/d4f/c13b 0 
2026-03-10T14:08:22.522 INFO:tasks.workunit.client.0.vm03.stdout:2/366: getdents d5/d10/d1c/d54/d5f 0 2026-03-10T14:08:22.525 INFO:tasks.workunit.client.1.vm04.stdout:1/979: fsync d3/d22/d63/d35/dd9/d13/d38/db5/fb8 0 2026-03-10T14:08:22.529 INFO:tasks.workunit.client.0.vm03.stdout:8/389: mkdir da/d36/d7c 0 2026-03-10T14:08:22.529 INFO:tasks.workunit.client.1.vm04.stdout:0/988: mknod d0/dee/c138 0 2026-03-10T14:08:22.532 INFO:tasks.workunit.client.1.vm04.stdout:0/989: dwrite d0/d2/f17 [4194304,4194304] 0 2026-03-10T14:08:22.534 INFO:tasks.workunit.client.1.vm04.stdout:6/861: mkdir d3/de/d35/d3f/d2d/d32/d23/d24/d6f/de2/d10b 0 2026-03-10T14:08:22.534 INFO:tasks.workunit.client.0.vm03.stdout:5/451: creat d4/d13/d43/f94 x:0 0 0 2026-03-10T14:08:22.534 INFO:tasks.workunit.client.1.vm04.stdout:4/936: chown d4/df/d34/f69 6 1 2026-03-10T14:08:22.534 INFO:tasks.workunit.client.0.vm03.stdout:5/452: readlink d4/d6/l61 0 2026-03-10T14:08:22.538 INFO:tasks.workunit.client.0.vm03.stdout:1/392: mkdir d0/d2/df/d27/d7e 0 2026-03-10T14:08:22.538 INFO:tasks.workunit.client.0.vm03.stdout:1/393: readlink d0/d18/d1d/l40 0 2026-03-10T14:08:22.540 INFO:tasks.workunit.client.1.vm04.stdout:4/937: dread - d4/d14/d64/fca zero size 2026-03-10T14:08:22.541 INFO:tasks.workunit.client.1.vm04.stdout:6/862: creat d3/de/d35/d3f/d2d/d38/d40/f10c x:0 0 0 2026-03-10T14:08:22.542 INFO:tasks.workunit.client.0.vm03.stdout:2/367: rmdir d5/d10/d1c/d50 39 2026-03-10T14:08:22.543 INFO:tasks.workunit.client.1.vm04.stdout:1/980: creat d3/d22/d63/f155 x:0 0 0 2026-03-10T14:08:22.544 INFO:tasks.workunit.client.0.vm03.stdout:0/341: rename d3/d17/f60 to d3/d16/d21/f6a 0 2026-03-10T14:08:22.544 INFO:tasks.workunit.client.1.vm04.stdout:6/863: rename d3/f4 to d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f10d 0 2026-03-10T14:08:22.545 INFO:tasks.workunit.client.1.vm04.stdout:4/938: unlink d4/d14/d1b/f20 0 2026-03-10T14:08:22.546 INFO:tasks.workunit.client.1.vm04.stdout:1/981: stat d3/d22/d2f/d57 0 2026-03-10T14:08:22.547 
INFO:tasks.workunit.client.1.vm04.stdout:1/982: chown d3/d5c/d79/de6 427425641 1 2026-03-10T14:08:22.548 INFO:tasks.workunit.client.0.vm03.stdout:0/342: dwrite d3/d17/f35 [0,4194304] 0 2026-03-10T14:08:22.552 INFO:tasks.workunit.client.0.vm03.stdout:7/279: getdents d5/d9/d14/d26/d36 0 2026-03-10T14:08:22.556 INFO:tasks.workunit.client.0.vm03.stdout:3/372: mknod d1d/c6c 0 2026-03-10T14:08:22.559 INFO:tasks.workunit.client.0.vm03.stdout:5/453: unlink d4/d13/d43/f51 0 2026-03-10T14:08:22.559 INFO:tasks.workunit.client.0.vm03.stdout:5/454: chown d4/d13/d43/f8b 1668939 1 2026-03-10T14:08:22.562 INFO:tasks.workunit.client.0.vm03.stdout:2/368: creat d5/d2a/f6e x:0 0 0 2026-03-10T14:08:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:22 vm04.local ceph-mon[55966]: pgmap v11: 65 pgs: 65 active+clean; 2.6 GiB data, 9.0 GiB used, 111 GiB / 120 GiB avail; 30 MiB/s rd, 87 MiB/s wr, 214 op/s 2026-03-10T14:08:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:22 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:22 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:08:22.565 INFO:tasks.workunit.client.0.vm03.stdout:7/280: symlink d5/d9/d35/l57 0 2026-03-10T14:08:22.568 INFO:tasks.workunit.client.0.vm03.stdout:5/455: creat d4/d16/d19/d6e/f95 x:0 0 0 2026-03-10T14:08:22.569 INFO:tasks.workunit.client.0.vm03.stdout:2/369: readlink d5/d10/l1b 0 2026-03-10T14:08:22.572 INFO:tasks.workunit.client.0.vm03.stdout:7/281: mknod d5/d9/d3e/c58 0 2026-03-10T14:08:22.575 INFO:tasks.workunit.client.0.vm03.stdout:5/456: dread d4/d13/d1f/f21 [0,4194304] 0 2026-03-10T14:08:22.577 INFO:tasks.workunit.client.0.vm03.stdout:2/370: dread - d5/d35/f58 zero size 2026-03-10T14:08:22.579 INFO:tasks.workunit.client.0.vm03.stdout:7/282: mkdir d5/d9/d14/d59 0 
2026-03-10T14:08:22.580 INFO:tasks.workunit.client.0.vm03.stdout:2/371: symlink d5/d35/l6f 0 2026-03-10T14:08:22.582 INFO:tasks.workunit.client.0.vm03.stdout:2/372: fdatasync d5/d10/d1f/d4f/f55 0 2026-03-10T14:08:22.584 INFO:tasks.workunit.client.0.vm03.stdout:7/283: symlink d5/d9/d14/d21/d28/l5a 0 2026-03-10T14:08:22.585 INFO:tasks.workunit.client.0.vm03.stdout:7/284: dread - d5/d9/d14/d26/f38 zero size 2026-03-10T14:08:22.587 INFO:tasks.workunit.client.0.vm03.stdout:2/373: stat d5/d10/l1a 0 2026-03-10T14:08:22.589 INFO:tasks.workunit.client.0.vm03.stdout:2/374: dread d5/d10/d17/f18 [4194304,4194304] 0 2026-03-10T14:08:22.590 INFO:tasks.workunit.client.1.vm04.stdout:3/990: sync 2026-03-10T14:08:22.601 INFO:tasks.workunit.client.1.vm04.stdout:3/991: mkdir da/d8e/db5/d159/d15c 0 2026-03-10T14:08:22.601 INFO:tasks.workunit.client.1.vm04.stdout:3/992: symlink da/ded/df8/l15d 0 2026-03-10T14:08:22.601 INFO:tasks.workunit.client.0.vm03.stdout:2/375: creat d5/d10/d1c/d40/d59/f70 x:0 0 0 2026-03-10T14:08:22.601 INFO:tasks.workunit.client.0.vm03.stdout:7/285: creat d5/d9/d14/d21/d28/f5b x:0 0 0 2026-03-10T14:08:22.601 INFO:tasks.workunit.client.0.vm03.stdout:7/286: readlink d5/l1a 0 2026-03-10T14:08:22.601 INFO:tasks.workunit.client.0.vm03.stdout:2/376: truncate d5/d10/d1f/f5e 916351 0 2026-03-10T14:08:22.601 INFO:tasks.workunit.client.0.vm03.stdout:7/287: creat d5/d9/d14/d26/f5c x:0 0 0 2026-03-10T14:08:22.601 INFO:tasks.workunit.client.0.vm03.stdout:7/288: dread - d5/d9/d14/d21/d28/f5b zero size 2026-03-10T14:08:22.602 INFO:tasks.workunit.client.0.vm03.stdout:7/289: fdatasync d5/d9/f22 0 2026-03-10T14:08:22.603 INFO:tasks.workunit.client.0.vm03.stdout:7/290: chown d5/d9/d3e/l4b 0 1 2026-03-10T14:08:22.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:22 vm03.local ceph-mon[49718]: pgmap v11: 65 pgs: 65 active+clean; 2.6 GiB data, 9.0 GiB used, 111 GiB / 120 GiB avail; 30 MiB/s rd, 87 MiB/s wr, 214 op/s 2026-03-10T14:08:22.608 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:22 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:22.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:22 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:08:22.609 INFO:tasks.workunit.client.0.vm03.stdout:7/291: truncate d5/d9/d14/d21/f29 4009920 0 2026-03-10T14:08:22.610 INFO:tasks.workunit.client.0.vm03.stdout:7/292: truncate d5/d9/d14/f4d 725842 0 2026-03-10T14:08:22.610 INFO:tasks.workunit.client.0.vm03.stdout:7/293: dread - d5/f53 zero size 2026-03-10T14:08:22.615 INFO:tasks.workunit.client.1.vm04.stdout:9/968: dwrite d9/d58/fba [0,4194304] 0 2026-03-10T14:08:22.616 INFO:tasks.workunit.client.0.vm03.stdout:2/377: dread d5/fa [0,4194304] 0 2026-03-10T14:08:22.617 INFO:tasks.workunit.client.1.vm04.stdout:9/969: dread d9/da/d8c/fea [0,4194304] 0 2026-03-10T14:08:22.618 INFO:tasks.workunit.client.0.vm03.stdout:8/390: sync 2026-03-10T14:08:22.618 INFO:tasks.workunit.client.0.vm03.stdout:3/373: sync 2026-03-10T14:08:22.618 INFO:tasks.workunit.client.0.vm03.stdout:0/343: sync 2026-03-10T14:08:22.618 INFO:tasks.workunit.client.0.vm03.stdout:8/391: chown da 2383 1 2026-03-10T14:08:22.619 INFO:tasks.workunit.client.0.vm03.stdout:5/457: sync 2026-03-10T14:08:22.619 INFO:tasks.workunit.client.0.vm03.stdout:8/392: fsync da/d3a/d44/d64/f68 0 2026-03-10T14:08:22.622 INFO:tasks.workunit.client.0.vm03.stdout:7/294: mknod d5/d9/d14/d26/c5d 0 2026-03-10T14:08:22.625 INFO:tasks.workunit.client.1.vm04.stdout:9/970: symlink d9/da/dd/de7/d96/l14c 0 2026-03-10T14:08:22.626 INFO:tasks.workunit.client.1.vm04.stdout:4/939: write d4/d14/d64/fd6 [3243947,99428] 0 2026-03-10T14:08:22.627 INFO:tasks.workunit.client.0.vm03.stdout:2/378: dwrite d5/f9 [0,4194304] 0 2026-03-10T14:08:22.628 INFO:tasks.workunit.client.1.vm04.stdout:6/864: dwrite 
d3/de/d35/d3f/d2d/d38/f50 [0,4194304] 0 2026-03-10T14:08:22.630 INFO:tasks.workunit.client.0.vm03.stdout:5/458: creat d4/d40/f96 x:0 0 0 2026-03-10T14:08:22.637 INFO:tasks.workunit.client.0.vm03.stdout:8/393: readlink da/l22 0 2026-03-10T14:08:22.644 INFO:tasks.workunit.client.0.vm03.stdout:0/344: dread d3/d4d/f22 [0,4194304] 0 2026-03-10T14:08:22.644 INFO:tasks.workunit.client.0.vm03.stdout:5/459: dread d4/d6/de/f4f [0,4194304] 0 2026-03-10T14:08:22.646 INFO:tasks.workunit.client.0.vm03.stdout:4/401: write d5/d9/db/d19/d34/f6b [1324721,106793] 0 2026-03-10T14:08:22.647 INFO:tasks.workunit.client.1.vm04.stdout:6/865: unlink d3/d1d/d73/cbc 0 2026-03-10T14:08:22.649 INFO:tasks.workunit.client.0.vm03.stdout:4/402: dread d5/d9/d2b/f63 [0,4194304] 0 2026-03-10T14:08:22.650 INFO:tasks.workunit.client.0.vm03.stdout:4/403: read d5/d9/db/d19/f51 [6122236,46917] 0 2026-03-10T14:08:22.650 INFO:tasks.workunit.client.1.vm04.stdout:6/866: dwrite d3/de/d35/d3f/d2d/f18 [4194304,4194304] 0 2026-03-10T14:08:22.654 INFO:tasks.workunit.client.0.vm03.stdout:4/404: dwrite d5/d9/f31 [0,4194304] 0 2026-03-10T14:08:22.659 INFO:tasks.workunit.client.1.vm04.stdout:3/993: dwrite da/dc/d35/d52/f69 [0,4194304] 0 2026-03-10T14:08:22.659 INFO:tasks.workunit.client.0.vm03.stdout:5/460: dread - d4/d16/d19/d23/d3f/f7a zero size 2026-03-10T14:08:22.662 INFO:tasks.workunit.client.1.vm04.stdout:1/983: dwrite d3/d22/d63/faa [0,4194304] 0 2026-03-10T14:08:22.679 INFO:tasks.workunit.client.1.vm04.stdout:9/971: rename d9/d33/fbe to d9/d33/d134/f14d 0 2026-03-10T14:08:22.680 INFO:tasks.workunit.client.1.vm04.stdout:4/940: unlink d4/df/d34/c3f 0 2026-03-10T14:08:22.682 INFO:tasks.workunit.client.0.vm03.stdout:0/345: symlink d3/d11/d66/l6b 0 2026-03-10T14:08:22.684 INFO:tasks.workunit.client.1.vm04.stdout:6/867: unlink d3/de/d35/d3f/d2d/d32/f81 0 2026-03-10T14:08:22.686 INFO:tasks.workunit.client.1.vm04.stdout:3/994: rmdir da/dc/d47/d9b/d106/dde/dac 39 2026-03-10T14:08:22.691 
INFO:tasks.workunit.client.0.vm03.stdout:2/379: rmdir d5/d10/d1f/d5d 39 2026-03-10T14:08:22.691 INFO:tasks.workunit.client.1.vm04.stdout:6/868: unlink d3/de/d35/d3a/d43/fe0 0 2026-03-10T14:08:22.692 INFO:tasks.workunit.client.0.vm03.stdout:2/380: write d5/d2a/f6e [179578,71439] 0 2026-03-10T14:08:22.692 INFO:tasks.workunit.client.1.vm04.stdout:6/869: fdatasync d3/de/d35/d3f/d2d/d32/dd0/fde 0 2026-03-10T14:08:22.694 INFO:tasks.workunit.client.1.vm04.stdout:9/972: dwrite d9/da/dd/de7/f14b [0,4194304] 0 2026-03-10T14:08:22.696 INFO:tasks.workunit.client.1.vm04.stdout:3/995: fdatasync da/dc/d47/d9b/d106/dde/dac/f157 0 2026-03-10T14:08:22.700 INFO:tasks.workunit.client.0.vm03.stdout:0/346: creat d3/d46/f6c x:0 0 0 2026-03-10T14:08:22.705 INFO:tasks.workunit.client.1.vm04.stdout:3/996: dread f4 [4194304,4194304] 0 2026-03-10T14:08:22.710 INFO:tasks.workunit.client.1.vm04.stdout:4/941: sync 2026-03-10T14:08:22.710 INFO:tasks.workunit.client.0.vm03.stdout:5/461: rename d4/d13/d1f/f21 to d4/d6/f97 0 2026-03-10T14:08:22.713 INFO:tasks.workunit.client.1.vm04.stdout:9/973: chown d9/da/d8c/de5/d11d/c145 238357 1 2026-03-10T14:08:22.718 INFO:tasks.workunit.client.1.vm04.stdout:1/984: link d3/d22/d63/d35/dd9/d13/d38/db5/dc4/ffc d3/d22/d63/d35/dd9/d13/d38/db5/f156 0 2026-03-10T14:08:22.719 INFO:tasks.workunit.client.0.vm03.stdout:5/462: link c3 d4/d16/d19/c98 0 2026-03-10T14:08:22.719 INFO:tasks.workunit.client.1.vm04.stdout:0/990: creat d0/d2/d15/d49/d50/d10d/f139 x:0 0 0 2026-03-10T14:08:22.721 INFO:tasks.workunit.client.1.vm04.stdout:6/870: mkdir d3/de/d35/d10e 0 2026-03-10T14:08:22.722 INFO:tasks.workunit.client.0.vm03.stdout:4/405: getdents d5 0 2026-03-10T14:08:22.722 INFO:tasks.workunit.client.0.vm03.stdout:4/406: dread - d5/d47/f46 zero size 2026-03-10T14:08:22.722 INFO:tasks.workunit.client.1.vm04.stdout:6/871: rename d3/de/d35/d3f/d2d/d32 to d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d10f 22 2026-03-10T14:08:22.723 INFO:tasks.workunit.client.1.vm04.stdout:4/942: creat 
d4/d14/d1b/f13c x:0 0 0 2026-03-10T14:08:22.724 INFO:tasks.workunit.client.1.vm04.stdout:9/974: truncate d9/d58/db5/da5/fc9 1222631 0 2026-03-10T14:08:22.730 INFO:tasks.workunit.client.0.vm03.stdout:0/347: dread d3/d4d/f49 [0,4194304] 0 2026-03-10T14:08:22.732 INFO:tasks.workunit.client.1.vm04.stdout:6/872: unlink d3/de/d35/d3f/d2d/fe3 0 2026-03-10T14:08:22.735 INFO:tasks.workunit.client.0.vm03.stdout:4/407: getdents d5/d47/d5b 0 2026-03-10T14:08:22.738 INFO:tasks.workunit.client.0.vm03.stdout:0/348: creat d3/d17/f6d x:0 0 0 2026-03-10T14:08:22.741 INFO:tasks.workunit.client.0.vm03.stdout:0/349: dwrite d3/d16/f2d [0,4194304] 0 2026-03-10T14:08:22.741 INFO:tasks.workunit.client.1.vm04.stdout:4/943: truncate d4/df/db2/db4/fdb 140977 0 2026-03-10T14:08:22.750 INFO:tasks.workunit.client.1.vm04.stdout:0/991: dwrite d0/d2/d15/d22/d38/d56/d66/f54 [0,4194304] 0 2026-03-10T14:08:22.757 INFO:tasks.workunit.client.1.vm04.stdout:0/992: dwrite d0/d2/d15/d22/d38/d56/dcb/ff6 [0,4194304] 0 2026-03-10T14:08:22.779 INFO:tasks.workunit.client.0.vm03.stdout:4/408: dread d5/d9/f44 [0,4194304] 0 2026-03-10T14:08:22.784 INFO:tasks.workunit.client.0.vm03.stdout:4/409: creat d5/d47/d5b/d64/f82 x:0 0 0 2026-03-10T14:08:22.788 INFO:tasks.workunit.client.0.vm03.stdout:4/410: dwrite d5/d9/db/d19/d38/d53/d55/f61 [0,4194304] 0 2026-03-10T14:08:22.803 INFO:tasks.workunit.client.0.vm03.stdout:4/411: fdatasync d5/d9/db/d19/d34/f5d 0 2026-03-10T14:08:22.804 INFO:tasks.workunit.client.1.vm04.stdout:0/993: rename d0/dee/l12b to d0/d2/d15/d22/d38/d56/dcb/dde/l13a 0 2026-03-10T14:08:22.804 INFO:tasks.workunit.client.1.vm04.stdout:4/944: creat d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/f13d x:0 0 0 2026-03-10T14:08:22.804 INFO:tasks.workunit.client.1.vm04.stdout:4/945: link d4/df/d34/d6f/le0 d4/df/db2/db4/d47/d4f/d8c/d132/l13e 0 2026-03-10T14:08:22.804 INFO:tasks.workunit.client.1.vm04.stdout:4/946: write d4/df/fa1 [325045,123774] 0 2026-03-10T14:08:22.804 INFO:tasks.workunit.client.1.vm04.stdout:4/947: write 
d4/df/db2/db4/d47/d4f/d8c/dc8/dff/fd2 [3466930,57434] 0 2026-03-10T14:08:22.805 INFO:tasks.workunit.client.0.vm03.stdout:4/412: link d5/d9/db/d19/d38/d53/f59 d5/d9/db/d19/d38/d53/f83 0 2026-03-10T14:08:22.819 INFO:tasks.workunit.client.0.vm03.stdout:6/365: dwrite f5 [0,4194304] 0 2026-03-10T14:08:22.831 INFO:tasks.workunit.client.0.vm03.stdout:6/366: dread d8/d1b/f31 [0,4194304] 0 2026-03-10T14:08:22.840 INFO:tasks.workunit.client.0.vm03.stdout:6/367: getdents d8/db/df 0 2026-03-10T14:08:22.846 INFO:tasks.workunit.client.0.vm03.stdout:6/368: symlink d8/d11/l74 0 2026-03-10T14:08:22.934 INFO:tasks.workunit.client.0.vm03.stdout:8/394: write da/d3c/d4b/f63 [4887586,45924] 0 2026-03-10T14:08:22.934 INFO:tasks.workunit.client.0.vm03.stdout:9/387: write d2/f4c [745111,126029] 0 2026-03-10T14:08:22.934 INFO:tasks.workunit.client.0.vm03.stdout:8/395: readlink da/d36/l5e 0 2026-03-10T14:08:22.937 INFO:tasks.workunit.client.0.vm03.stdout:8/396: chown da/d3a/d44/d64 8 1 2026-03-10T14:08:22.940 INFO:tasks.workunit.client.1.vm04.stdout:7/991: dread d2/dc/de/fed [0,4194304] 0 2026-03-10T14:08:22.946 INFO:tasks.workunit.client.0.vm03.stdout:8/397: fsync da/d3c/f48 0 2026-03-10T14:08:22.947 INFO:tasks.workunit.client.1.vm04.stdout:3/997: write da/d30/fe9 [3044,103264] 0 2026-03-10T14:08:22.950 INFO:tasks.workunit.client.1.vm04.stdout:1/985: dwrite d3/d22/d63/d35/dd9/fea [0,4194304] 0 2026-03-10T14:08:22.956 INFO:tasks.workunit.client.1.vm04.stdout:9/975: dwrite d9/da/f6b [0,4194304] 0 2026-03-10T14:08:22.962 INFO:tasks.workunit.client.1.vm04.stdout:3/998: mknod da/dc/d35/d52/d6d/d152/d154/c15e 0 2026-03-10T14:08:22.963 INFO:tasks.workunit.client.0.vm03.stdout:9/388: rmdir d2/d67 0 2026-03-10T14:08:22.966 INFO:tasks.workunit.client.1.vm04.stdout:1/986: stat d3/d22/d63/d35/dd9/c9b 0 2026-03-10T14:08:22.969 INFO:tasks.workunit.client.1.vm04.stdout:3/999: creat da/dc/d35/d37/d127/f15f x:0 0 0 2026-03-10T14:08:22.970 INFO:tasks.workunit.client.1.vm04.stdout:1/987: mknod 
d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/d84/c157 0 2026-03-10T14:08:22.976 INFO:tasks.workunit.client.0.vm03.stdout:8/398: mkdir da/d36/d40/d50/d70/d7d 0 2026-03-10T14:08:22.978 INFO:tasks.workunit.client.1.vm04.stdout:1/988: truncate d3/d5c/fef 302161 0 2026-03-10T14:08:22.985 INFO:tasks.workunit.client.1.vm04.stdout:6/873: dwrite d3/de/d35/d3f/d2d/d32/f97 [0,4194304] 0 2026-03-10T14:08:22.987 INFO:tasks.workunit.client.1.vm04.stdout:1/989: rmdir d3/d22/d63/d35/dd9/d13/d38/db5/dff 39 2026-03-10T14:08:22.998 INFO:tasks.workunit.client.0.vm03.stdout:8/399: symlink da/d36/d40/d50/d70/l7e 0 2026-03-10T14:08:22.998 INFO:tasks.workunit.client.1.vm04.stdout:9/976: link d9/da/dd/d74/c146 d9/da/dd/de7/d96/c14e 0 2026-03-10T14:08:22.998 INFO:tasks.workunit.client.1.vm04.stdout:4/948: write d4/df/d34/d6f/ff8 [653105,66893] 0 2026-03-10T14:08:22.998 INFO:tasks.workunit.client.1.vm04.stdout:0/994: dwrite d0/d2/dbe/fd7 [0,4194304] 0 2026-03-10T14:08:22.998 INFO:tasks.workunit.client.1.vm04.stdout:0/995: dread - d0/d2/d15/d49/d50/d10d/f139 zero size 2026-03-10T14:08:22.998 INFO:tasks.workunit.client.1.vm04.stdout:7/992: write d2/dc/de/d2d/d60/d7c/d3b/f48 [2071635,86420] 0 2026-03-10T14:08:23.000 INFO:tasks.workunit.client.0.vm03.stdout:9/389: sync 2026-03-10T14:08:23.010 INFO:tasks.workunit.client.0.vm03.stdout:8/400: fsync da/d24/f43 0 2026-03-10T14:08:23.014 INFO:tasks.workunit.client.0.vm03.stdout:9/390: getdents d2/d29/d33/d55/d72 0 2026-03-10T14:08:23.014 INFO:tasks.workunit.client.1.vm04.stdout:6/874: symlink d3/de/d35/d3f/d2d/d32/d23/d24/d6f/de2/d10b/l110 0 2026-03-10T14:08:23.016 INFO:tasks.workunit.client.1.vm04.stdout:7/993: mknod d2/dc/de/d2d/d5c/da9/df6/d144/c15e 0 2026-03-10T14:08:23.017 INFO:tasks.workunit.client.0.vm03.stdout:9/391: fdatasync d2/d29/d38/f47 0 2026-03-10T14:08:23.017 INFO:tasks.workunit.client.1.vm04.stdout:1/990: mkdir d3/d22/d63/d35/d158 0 2026-03-10T14:08:23.026 INFO:tasks.workunit.client.1.vm04.stdout:7/994: mknod d2/dc/de/d2d/d60/d7c/d64/c15f 0 
2026-03-10T14:08:23.026 INFO:tasks.workunit.client.1.vm04.stdout:7/995: dread - d2/dac/d115/f151 zero size 2026-03-10T14:08:23.032 INFO:tasks.workunit.client.0.vm03.stdout:9/392: dread d2/d14/f25 [0,4194304] 0 2026-03-10T14:08:23.033 INFO:tasks.workunit.client.1.vm04.stdout:7/996: creat d2/dc/d4d/d7f/f160 x:0 0 0 2026-03-10T14:08:23.037 INFO:tasks.workunit.client.0.vm03.stdout:9/393: mknod d2/d29/d38/c82 0 2026-03-10T14:08:23.056 INFO:tasks.workunit.client.0.vm03.stdout:9/394: link d2/d29/f35 d2/f83 0 2026-03-10T14:08:23.059 INFO:tasks.workunit.client.1.vm04.stdout:4/949: dwrite d4/df/d34/d6f/f9a [0,4194304] 0 2026-03-10T14:08:23.073 INFO:tasks.workunit.client.0.vm03.stdout:9/395: symlink d2/d14/d2b/d76/l84 0 2026-03-10T14:08:23.078 INFO:tasks.workunit.client.1.vm04.stdout:9/977: dwrite d9/da/dd/fd9 [0,4194304] 0 2026-03-10T14:08:23.080 INFO:tasks.workunit.client.1.vm04.stdout:4/950: truncate d4/d14/d6d/f8a 832840 0 2026-03-10T14:08:23.081 INFO:tasks.workunit.client.1.vm04.stdout:0/996: write d0/d2/f12 [2829287,34107] 0 2026-03-10T14:08:23.084 INFO:tasks.workunit.client.0.vm03.stdout:1/394: dwrite d0/d2/df/d16/d20/f38 [0,4194304] 0 2026-03-10T14:08:23.088 INFO:tasks.workunit.client.1.vm04.stdout:4/951: dwrite d4/d14/d3c/f46 [0,4194304] 0 2026-03-10T14:08:23.089 INFO:tasks.workunit.client.1.vm04.stdout:4/952: dread - d4/d14/d64/f139 zero size 2026-03-10T14:08:23.101 INFO:tasks.workunit.client.1.vm04.stdout:1/991: write d3/d22/deb/f11c [1607663,45212] 0 2026-03-10T14:08:23.104 INFO:tasks.workunit.client.1.vm04.stdout:6/875: dwrite d3/de/d35/d3f/f17 [4194304,4194304] 0 2026-03-10T14:08:23.106 INFO:tasks.workunit.client.1.vm04.stdout:6/876: chown d3/de/d35/d3f/d2d/d32 4 1 2026-03-10T14:08:23.107 INFO:tasks.workunit.client.1.vm04.stdout:6/877: write d3/de/d35/d3a/d43/f10a [312350,121750] 0 2026-03-10T14:08:23.114 INFO:tasks.workunit.client.1.vm04.stdout:7/997: write d2/f20 [873963,61999] 0 2026-03-10T14:08:23.115 INFO:tasks.workunit.client.1.vm04.stdout:7/998: fsync 
d2/d2a/f9c 0 2026-03-10T14:08:23.118 INFO:tasks.workunit.client.1.vm04.stdout:1/992: chown d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb/l125 22843 1 2026-03-10T14:08:23.121 INFO:tasks.workunit.client.1.vm04.stdout:6/878: creat d3/d1d/d73/f111 x:0 0 0 2026-03-10T14:08:23.129 INFO:tasks.workunit.client.1.vm04.stdout:1/993: creat d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/d84/f159 x:0 0 0 2026-03-10T14:08:23.129 INFO:tasks.workunit.client.1.vm04.stdout:9/978: dread d9/da/d5d/f9b [0,4194304] 0 2026-03-10T14:08:23.132 INFO:tasks.workunit.client.1.vm04.stdout:1/994: creat d3/d22/d63/d35/dd9/d13/d38/d58/d113/db7/d137/d153/f15a x:0 0 0 2026-03-10T14:08:23.135 INFO:tasks.workunit.client.1.vm04.stdout:0/997: sync 2026-03-10T14:08:23.136 INFO:tasks.workunit.client.1.vm04.stdout:0/998: stat d0/d2/d15/d49/d50/d5c 0 2026-03-10T14:08:23.136 INFO:tasks.workunit.client.1.vm04.stdout:1/995: dwrite d3/d22/d63/d35/dd9/d13/d38/d58/d5b/f13e [0,4194304] 0 2026-03-10T14:08:23.137 INFO:tasks.workunit.client.1.vm04.stdout:0/999: write d0/dee/ff0 [511412,102776] 0 2026-03-10T14:08:23.139 INFO:tasks.workunit.client.0.vm03.stdout:3/374: dwrite d1d/f32 [0,4194304] 0 2026-03-10T14:08:23.146 INFO:tasks.workunit.client.1.vm04.stdout:7/999: dread d2/dc/de/d11/f19 [0,4194304] 0 2026-03-10T14:08:23.147 INFO:tasks.workunit.client.1.vm04.stdout:9/979: mkdir d9/d58/db5/da5/d10b/d14f 0 2026-03-10T14:08:23.150 INFO:tasks.workunit.client.1.vm04.stdout:9/980: dread d9/da/dd/d1c/f22 [0,4194304] 0 2026-03-10T14:08:23.158 INFO:tasks.workunit.client.1.vm04.stdout:9/981: chown d9/c11 283364332 1 2026-03-10T14:08:23.163 INFO:tasks.workunit.client.1.vm04.stdout:1/996: creat d3/d22/d63/d35/dd9/d13/d38/d58/f15b x:0 0 0 2026-03-10T14:08:23.166 INFO:tasks.workunit.client.1.vm04.stdout:1/997: mkdir d3/d22/d63/d35/dd9/d13/d38/d58/d5b/d7b/dc2/ddb/d108/d15c 0 2026-03-10T14:08:23.178 INFO:tasks.workunit.client.1.vm04.stdout:1/998: dread d3/d22/d63/d35/dd9/d13/fc3 [0,4194304] 0 2026-03-10T14:08:23.184 
INFO:tasks.workunit.client.1.vm04.stdout:1/999: symlink d3/d22/d63/d35/dd9/d13/d38/d58/dcc/l15d 0 2026-03-10T14:08:23.194 INFO:tasks.workunit.client.1.vm04.stdout:4/953: truncate d4/df/d34/f8f 2020514 0 2026-03-10T14:08:23.196 INFO:tasks.workunit.client.1.vm04.stdout:6/879: dwrite d3/d1d/f6c [0,4194304] 0 2026-03-10T14:08:23.196 INFO:tasks.workunit.client.1.vm04.stdout:4/954: chown d4/d14/d3c/d85/ffd 18436603 1 2026-03-10T14:08:23.213 INFO:tasks.workunit.client.1.vm04.stdout:9/982: write d9/da/d5d/d81/fcd [314456,122563] 0 2026-03-10T14:08:23.218 INFO:tasks.workunit.client.1.vm04.stdout:4/955: getdents d4/df/db2/de1 0 2026-03-10T14:08:23.218 INFO:tasks.workunit.client.1.vm04.stdout:4/956: readlink d4/lc 0 2026-03-10T14:08:23.225 INFO:tasks.workunit.client.0.vm03.stdout:7/295: dwrite d5/fb [0,4194304] 0 2026-03-10T14:08:23.239 INFO:tasks.workunit.client.1.vm04.stdout:4/957: dread d4/d14/f110 [0,4194304] 0 2026-03-10T14:08:23.245 INFO:tasks.workunit.client.1.vm04.stdout:4/958: creat d4/df/db2/db4/d47/d4f/d8c/dc8/f13f x:0 0 0 2026-03-10T14:08:23.247 INFO:tasks.workunit.client.0.vm03.stdout:7/296: symlink d5/l5e 0 2026-03-10T14:08:23.266 INFO:tasks.workunit.client.1.vm04.stdout:4/959: sync 2026-03-10T14:08:23.267 INFO:tasks.workunit.client.1.vm04.stdout:4/960: readlink d4/df/db2/db4/d47/d4f/l90 0 2026-03-10T14:08:23.270 INFO:tasks.workunit.client.1.vm04.stdout:4/961: chown d4/df/c16 5320330 1 2026-03-10T14:08:23.275 INFO:tasks.workunit.client.1.vm04.stdout:4/962: getdents d4/df 0 2026-03-10T14:08:23.280 INFO:tasks.workunit.client.1.vm04.stdout:6/880: dwrite d3/ded/ff3 [4194304,4194304] 0 2026-03-10T14:08:23.281 INFO:tasks.workunit.client.1.vm04.stdout:6/881: truncate d3/de/d35/d3f/d2d/d32/d23/d4e/ff4 147749 0 2026-03-10T14:08:23.288 INFO:tasks.workunit.client.1.vm04.stdout:9/983: write d9/da/d5d/d81/d14a/fd7 [683311,42661] 0 2026-03-10T14:08:23.292 INFO:tasks.workunit.client.0.vm03.stdout:2/381: dwrite d5/d10/d1f/f3f [0,4194304] 0 2026-03-10T14:08:23.294 
INFO:tasks.workunit.client.0.vm03.stdout:5/463: write d4/d40/d4e/f5c [125944,25902] 0 2026-03-10T14:08:23.295 INFO:tasks.workunit.client.0.vm03.stdout:2/382: write d5/d10/d31/f4e [4750987,22511] 0 2026-03-10T14:08:23.295 INFO:tasks.workunit.client.1.vm04.stdout:9/984: mknod d9/d58/db5/da5/d10b/d14f/c150 0 2026-03-10T14:08:23.297 INFO:tasks.workunit.client.1.vm04.stdout:6/882: creat d3/de/d35/d3a/d43/f112 x:0 0 0 2026-03-10T14:08:23.298 INFO:tasks.workunit.client.1.vm04.stdout:9/985: truncate d9/da/dd/d74/fcb 1562866 0 2026-03-10T14:08:23.301 INFO:tasks.workunit.client.1.vm04.stdout:6/883: truncate d3/de/d35/d3f/fd8 60788 0 2026-03-10T14:08:23.306 INFO:tasks.workunit.client.1.vm04.stdout:6/884: rename d3/de/d35/d3f/d2d/d32/d23/ddc/fe7 to d3/de/d35/d3a/d43/f113 0 2026-03-10T14:08:23.308 INFO:tasks.workunit.client.0.vm03.stdout:5/464: creat d4/d13/d1f/d84/f99 x:0 0 0 2026-03-10T14:08:23.311 INFO:tasks.workunit.client.1.vm04.stdout:6/885: creat d3/de/d35/d3f/d2d/d38/f114 x:0 0 0 2026-03-10T14:08:23.312 INFO:tasks.workunit.client.1.vm04.stdout:4/963: dread d4/f96 [4194304,4194304] 0 2026-03-10T14:08:23.313 INFO:tasks.workunit.client.0.vm03.stdout:5/465: chown d4/d16/f2d 32 1 2026-03-10T14:08:23.314 INFO:tasks.workunit.client.1.vm04.stdout:4/964: read d4/d14/d64/fd6 [2328190,72346] 0 2026-03-10T14:08:23.315 INFO:tasks.workunit.client.1.vm04.stdout:6/886: creat d3/de/d35/d3f/d2d/d32/d23/d24/f115 x:0 0 0 2026-03-10T14:08:23.336 INFO:tasks.workunit.client.0.vm03.stdout:0/350: rmdir d3/d17 39 2026-03-10T14:08:23.349 INFO:tasks.workunit.client.1.vm04.stdout:4/965: dread d4/d14/d1b/f9d [0,4194304] 0 2026-03-10T14:08:23.350 INFO:tasks.workunit.client.1.vm04.stdout:4/966: dread - d4/d14/d6d/f9c zero size 2026-03-10T14:08:23.357 INFO:tasks.workunit.client.1.vm04.stdout:9/986: rmdir d9/da/dd/d74 39 2026-03-10T14:08:23.359 INFO:tasks.workunit.client.0.vm03.stdout:4/413: dwrite d5/d9/db/d19/d38/d53/d55/f61 [4194304,4194304] 0 2026-03-10T14:08:23.374 
INFO:tasks.workunit.client.1.vm04.stdout:4/967: dread d4/df/db2/db4/d47/d4f/d8c/dc8/dff/f87 [0,4194304] 0 2026-03-10T14:08:23.374 INFO:tasks.workunit.client.1.vm04.stdout:4/968: stat d4/df/db2/db4/d47/d4f/d8c/dc8/dff/l12a 0 2026-03-10T14:08:23.375 INFO:tasks.workunit.client.1.vm04.stdout:6/887: mknod d3/de/d35/d3f/d2d/d38/d40/d107/c116 0 2026-03-10T14:08:23.379 INFO:tasks.workunit.client.0.vm03.stdout:5/466: write d4/d16/d19/f79 [1004454,27367] 0 2026-03-10T14:08:23.383 INFO:tasks.workunit.client.1.vm04.stdout:4/969: creat d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/f140 x:0 0 0 2026-03-10T14:08:23.391 INFO:tasks.workunit.client.1.vm04.stdout:4/970: rename d4/df/d34/f8f to d4/df/db2/db4/d47/d4f/d8c/dc8/dff/f141 0 2026-03-10T14:08:23.393 INFO:tasks.workunit.client.1.vm04.stdout:4/971: creat d4/df/db2/db4/d47/d4f/f142 x:0 0 0 2026-03-10T14:08:23.395 INFO:tasks.workunit.client.0.vm03.stdout:6/369: write d8/d1b/f29 [2059938,10493] 0 2026-03-10T14:08:23.395 INFO:tasks.workunit.client.0.vm03.stdout:9/396: getdents d2 0 2026-03-10T14:08:23.413 INFO:tasks.workunit.client.1.vm04.stdout:6/888: write d3/de/d35/d3f/d2d/d38/d40/fbf [160464,32057] 0 2026-03-10T14:08:23.414 INFO:tasks.workunit.client.0.vm03.stdout:4/414: creat d5/d47/d5b/f84 x:0 0 0 2026-03-10T14:08:23.415 INFO:tasks.workunit.client.0.vm03.stdout:4/415: stat d5/d9/db/d19/d38/d53/d55/l6f 0 2026-03-10T14:08:23.416 INFO:tasks.workunit.client.0.vm03.stdout:4/416: stat d5/d9/db/d19/d38/d53/d55/f75 0 2026-03-10T14:08:23.417 INFO:tasks.workunit.client.0.vm03.stdout:4/417: chown d5/d9/db/d19/d38/d53/l70 116165 1 2026-03-10T14:08:23.417 INFO:tasks.workunit.client.1.vm04.stdout:6/889: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f85 [4194304,4194304] 0 2026-03-10T14:08:23.421 INFO:tasks.workunit.client.1.vm04.stdout:9/987: dread d9/d33/f8f [0,4194304] 0 2026-03-10T14:08:23.422 INFO:tasks.workunit.client.1.vm04.stdout:6/890: dwrite d3/de/d35/d3f/d2d/f18 [4194304,4194304] 0 2026-03-10T14:08:23.422 
INFO:tasks.workunit.client.0.vm03.stdout:0/351: symlink d3/d11/d2c/l6e 0 2026-03-10T14:08:23.422 INFO:tasks.workunit.client.0.vm03.stdout:5/467: truncate d4/d6/de/f89 788663 0 2026-03-10T14:08:23.423 INFO:tasks.workunit.client.0.vm03.stdout:0/352: read d3/d4d/f22 [1376437,13694] 0 2026-03-10T14:08:23.435 INFO:tasks.workunit.client.1.vm04.stdout:6/891: chown d3/de/d35/d3f/l1e 0 1 2026-03-10T14:08:23.436 INFO:tasks.workunit.client.1.vm04.stdout:9/988: fsync d9/da/d5d/d81/d14a/f105 0 2026-03-10T14:08:23.437 INFO:tasks.workunit.client.1.vm04.stdout:6/892: fdatasync d3/de/d35/d3a/d43/f10a 0 2026-03-10T14:08:23.438 INFO:tasks.workunit.client.1.vm04.stdout:4/972: dwrite d4/d14/d64/f97 [0,4194304] 0 2026-03-10T14:08:23.442 INFO:tasks.workunit.client.1.vm04.stdout:9/989: dwrite d9/d58/db5/f67 [0,4194304] 0 2026-03-10T14:08:23.442 INFO:tasks.workunit.client.1.vm04.stdout:4/973: chown d4/df/db2/db4/d47/d4f/d8c/cdf 2365530 1 2026-03-10T14:08:23.444 INFO:tasks.workunit.client.0.vm03.stdout:9/397: read - d2/d29/d33/f70 zero size 2026-03-10T14:08:23.447 INFO:tasks.workunit.client.0.vm03.stdout:9/398: stat d2/d14/c1c 0 2026-03-10T14:08:23.450 INFO:tasks.workunit.client.1.vm04.stdout:6/893: write d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fb2 [3668094,80148] 0 2026-03-10T14:08:23.453 INFO:tasks.workunit.client.1.vm04.stdout:9/990: creat d9/d44/f151 x:0 0 0 2026-03-10T14:08:23.453 INFO:tasks.workunit.client.1.vm04.stdout:4/974: mkdir d4/df/d34/d6f/d143 0 2026-03-10T14:08:23.454 INFO:tasks.workunit.client.1.vm04.stdout:9/991: truncate d9/d5c/fb6 3489878 0 2026-03-10T14:08:23.455 INFO:tasks.workunit.client.0.vm03.stdout:6/370: dwrite d8/db/df/f37 [0,4194304] 0 2026-03-10T14:08:23.457 INFO:tasks.workunit.client.0.vm03.stdout:4/418: dwrite d5/d47/d5b/f79 [0,4194304] 0 2026-03-10T14:08:23.459 INFO:tasks.workunit.client.1.vm04.stdout:4/975: stat d4/df/db2/db4/fdb 0 2026-03-10T14:08:23.472 INFO:tasks.workunit.client.0.vm03.stdout:0/353: readlink d3/d17/l2f 0 2026-03-10T14:08:23.472 
INFO:tasks.workunit.client.0.vm03.stdout:4/419: fdatasync d5/d47/d5b/f84 0 2026-03-10T14:08:23.473 INFO:tasks.workunit.client.1.vm04.stdout:9/992: write d9/d5c/fdc [1825532,89425] 0 2026-03-10T14:08:23.473 INFO:tasks.workunit.client.1.vm04.stdout:9/993: write d9/da/d5d/d81/d14a/f105 [1001483,89881] 0 2026-03-10T14:08:23.475 INFO:tasks.workunit.client.0.vm03.stdout:9/399: sync 2026-03-10T14:08:23.478 INFO:tasks.workunit.client.1.vm04.stdout:9/994: getdents d9/da/dd 0 2026-03-10T14:08:23.479 INFO:tasks.workunit.client.1.vm04.stdout:9/995: chown d9/l78 64574 1 2026-03-10T14:08:23.480 INFO:tasks.workunit.client.0.vm03.stdout:6/371: link d8/db/d2c/c5a d8/db/d49/d58/c75 0 2026-03-10T14:08:23.481 INFO:tasks.workunit.client.1.vm04.stdout:9/996: fdatasync d9/da/d5d/f90 0 2026-03-10T14:08:23.482 INFO:tasks.workunit.client.0.vm03.stdout:9/400: symlink d2/d14/d2b/l85 0 2026-03-10T14:08:23.492 INFO:tasks.workunit.client.1.vm04.stdout:9/997: getdents d9/d58/db5/da5/dab 0 2026-03-10T14:08:23.492 INFO:tasks.workunit.client.0.vm03.stdout:6/372: mkdir d8/db/d49/d76 0 2026-03-10T14:08:23.493 INFO:tasks.workunit.client.0.vm03.stdout:6/373: write d8/db/d12/f40 [320107,110474] 0 2026-03-10T14:08:23.493 INFO:tasks.workunit.client.0.vm03.stdout:6/374: stat d8/db/d49/d6c/f66 0 2026-03-10T14:08:23.495 INFO:tasks.workunit.client.0.vm03.stdout:9/401: mknod d2/d29/d33/c86 0 2026-03-10T14:08:23.511 INFO:tasks.workunit.client.0.vm03.stdout:4/420: dread d5/d9/db/f28 [0,4194304] 0 2026-03-10T14:08:23.516 INFO:tasks.workunit.client.0.vm03.stdout:4/421: mkdir d5/d47/d5b/d64/d85 0 2026-03-10T14:08:23.545 INFO:tasks.workunit.client.0.vm03.stdout:8/401: rename da/d24/c4a to da/d36/d6a/c7f 0 2026-03-10T14:08:23.548 INFO:tasks.workunit.client.1.vm04.stdout:6/894: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/fb7 [0,4194304] 0 2026-03-10T14:08:23.549 INFO:tasks.workunit.client.0.vm03.stdout:1/395: rename d0/d2/df/f6a to d0/d42/f7f 0 2026-03-10T14:08:23.554 INFO:tasks.workunit.client.1.vm04.stdout:9/998: 
write d9/da/dd/d1c/da3/fb3 [1140117,113114] 0 2026-03-10T14:08:23.556 INFO:tasks.workunit.client.0.vm03.stdout:3/375: rename d1d/l22 to d1d/d29/l6d 0 2026-03-10T14:08:23.556 INFO:tasks.workunit.client.1.vm04.stdout:6/895: creat d3/de/d35/d3f/d2d/d32/d23/d4e/f117 x:0 0 0 2026-03-10T14:08:23.557 INFO:tasks.workunit.client.1.vm04.stdout:9/999: creat d9/d58/db5/f152 x:0 0 0 2026-03-10T14:08:23.582 INFO:tasks.workunit.client.0.vm03.stdout:7/297: truncate d5/d9/d14/d26/d39/f45 2665680 0 2026-03-10T14:08:23.583 INFO:tasks.workunit.client.0.vm03.stdout:7/298: chown d5/d9/d14/d26/d39/c55 572237406 1 2026-03-10T14:08:23.591 INFO:tasks.workunit.client.1.vm04.stdout:6/896: dread d3/de/d35/d3f/d2d/f2e [0,4194304] 0 2026-03-10T14:08:23.595 INFO:tasks.workunit.client.1.vm04.stdout:6/897: rename d3/de/d35/d3f/d2d/d38/f60 to d3/de/d35/d3f/d2d/d32/d23/d24/f118 0 2026-03-10T14:08:23.597 INFO:tasks.workunit.client.1.vm04.stdout:6/898: write d3/de/d35/d3a/d43/f113 [1200610,45231] 0 2026-03-10T14:08:23.598 INFO:tasks.workunit.client.1.vm04.stdout:6/899: stat d3/de/d35/d3f/d2d/l58 0 2026-03-10T14:08:23.601 INFO:tasks.workunit.client.0.vm03.stdout:8/402: rmdir da/d36/d7c 0 2026-03-10T14:08:23.608 INFO:tasks.workunit.client.0.vm03.stdout:2/383: rename d5/ff to d5/d10/d1c/d40/d59/f71 0 2026-03-10T14:08:23.610 INFO:tasks.workunit.client.1.vm04.stdout:6/900: mknod d3/de/d35/d3f/d2d/d38/d40/dfa/c119 0 2026-03-10T14:08:23.612 INFO:tasks.workunit.client.1.vm04.stdout:6/901: truncate d3/de/d35/d3a/d43/d4c/f4d 1255869 0 2026-03-10T14:08:23.613 INFO:tasks.workunit.client.1.vm04.stdout:6/902: symlink d3/d1d/d73/l11a 0 2026-03-10T14:08:23.615 INFO:tasks.workunit.client.1.vm04.stdout:6/903: chown d3/de/d35/d3a/d43/d9c/laa 14742 1 2026-03-10T14:08:23.616 INFO:tasks.workunit.client.1.vm04.stdout:6/904: creat d3/ded/f11b x:0 0 0 2026-03-10T14:08:23.621 INFO:tasks.workunit.client.1.vm04.stdout:6/905: mknod d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/c11c 0 2026-03-10T14:08:23.625 
INFO:tasks.workunit.client.1.vm04.stdout:6/906: rename d3/de/d35/d3a/d43/d4c/d5e/l91 to d3/de/d35/d3f/d2d/d32/d23/d24/d6f/de2/l11d 0 2026-03-10T14:08:23.626 INFO:tasks.workunit.client.0.vm03.stdout:8/403: dread - da/d3a/d44/f6d zero size 2026-03-10T14:08:23.628 INFO:tasks.workunit.client.0.vm03.stdout:5/468: dwrite d4/d16/f71 [0,4194304] 0 2026-03-10T14:08:23.639 INFO:tasks.workunit.client.0.vm03.stdout:7/299: rmdir d5/d9/d14/d59 0 2026-03-10T14:08:23.644 INFO:tasks.workunit.client.0.vm03.stdout:5/469: mknod d4/d40/d4e/c9a 0 2026-03-10T14:08:23.650 INFO:tasks.workunit.client.0.vm03.stdout:5/470: dread d4/d6/fa [0,4194304] 0 2026-03-10T14:08:23.675 INFO:tasks.workunit.client.0.vm03.stdout:0/354: truncate d3/f10 703448 0 2026-03-10T14:08:23.676 INFO:tasks.workunit.client.1.vm04.stdout:4/976: dread d4/df/db2/db4/d47/d4f/ff0 [0,4194304] 0 2026-03-10T14:08:23.679 INFO:tasks.workunit.client.1.vm04.stdout:6/907: dread d3/de/d35/d3f/d2d/d38/d40/f5d [0,4194304] 0 2026-03-10T14:08:23.680 INFO:tasks.workunit.client.1.vm04.stdout:4/977: rename d4/df/d34/d6f/ff8 to d4/df/db2/db6/d123/f144 0 2026-03-10T14:08:23.681 INFO:tasks.workunit.client.1.vm04.stdout:4/978: chown d4/df/db2/db4/fdb 14794809 1 2026-03-10T14:08:23.687 INFO:tasks.workunit.client.1.vm04.stdout:6/908: mkdir d3/de/d35/d3f/d2d/d32/d23/d47/d11e 0 2026-03-10T14:08:23.690 INFO:tasks.workunit.client.1.vm04.stdout:4/979: truncate d4/df/d34/fef 1262602 0 2026-03-10T14:08:23.692 INFO:tasks.workunit.client.0.vm03.stdout:0/355: chown d3/l5 22 1 2026-03-10T14:08:23.693 INFO:tasks.workunit.client.1.vm04.stdout:6/909: mknod d3/d1d/c11f 0 2026-03-10T14:08:23.695 INFO:tasks.workunit.client.1.vm04.stdout:4/980: dwrite d4/df/db2/db4/d47/d4f/d8c/dc8/dff/fd2 [0,4194304] 0 2026-03-10T14:08:23.697 INFO:tasks.workunit.client.1.vm04.stdout:4/981: write d4/d14/fad [4670195,96328] 0 2026-03-10T14:08:23.703 INFO:tasks.workunit.client.1.vm04.stdout:4/982: fdatasync d4/d14/d1b/f6c 0 2026-03-10T14:08:23.706 
INFO:tasks.workunit.client.1.vm04.stdout:4/983: truncate d4/d14/d64/fab 33110 0 2026-03-10T14:08:23.710 INFO:tasks.workunit.client.1.vm04.stdout:6/910: link d3/ded/f11b d3/de/d35/d3f/d2d/d32/d23/d24/f120 0 2026-03-10T14:08:23.715 INFO:tasks.workunit.client.1.vm04.stdout:4/984: dread d4/d14/d1b/fc0 [0,4194304] 0 2026-03-10T14:08:23.716 INFO:tasks.workunit.client.0.vm03.stdout:9/402: dwrite d2/d14/d2b/f2d [4194304,4194304] 0 2026-03-10T14:08:23.720 INFO:tasks.workunit.client.1.vm04.stdout:4/985: dwrite d4/df/db2/db4/d47/d4f/d8c/dc8/dff/d74/f140 [0,4194304] 0 2026-03-10T14:08:23.725 INFO:tasks.workunit.client.0.vm03.stdout:0/356: truncate d3/f28 3027288 0 2026-03-10T14:08:23.728 INFO:tasks.workunit.client.1.vm04.stdout:4/986: mknod d4/df/db2/db4/d47/d4f/d8c/dc8/dff/c145 0 2026-03-10T14:08:23.733 INFO:tasks.workunit.client.0.vm03.stdout:5/471: link d4/d6/de/f89 d4/d35/f9b 0 2026-03-10T14:08:23.734 INFO:tasks.workunit.client.1.vm04.stdout:4/987: creat d4/df/db2/db4/d133/f146 x:0 0 0 2026-03-10T14:08:23.738 INFO:tasks.workunit.client.0.vm03.stdout:4/422: dwrite d5/d9/f25 [0,4194304] 0 2026-03-10T14:08:23.761 INFO:tasks.workunit.client.0.vm03.stdout:0/357: symlink d3/d11/d2c/d4a/l6f 0 2026-03-10T14:08:23.761 INFO:tasks.workunit.client.0.vm03.stdout:7/300: dwrite d5/d9/d14/d26/d39/f45 [0,4194304] 0 2026-03-10T14:08:23.778 INFO:tasks.workunit.client.1.vm04.stdout:6/911: sync 2026-03-10T14:08:23.780 INFO:tasks.workunit.client.1.vm04.stdout:6/912: symlink d3/de/l121 0 2026-03-10T14:08:23.781 INFO:tasks.workunit.client.1.vm04.stdout:6/913: mkdir d3/de/d35/d3f/d2d/d38/d122 0 2026-03-10T14:08:23.782 INFO:tasks.workunit.client.1.vm04.stdout:6/914: creat d3/ded/f123 x:0 0 0 2026-03-10T14:08:23.785 INFO:tasks.workunit.client.1.vm04.stdout:6/915: dread d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/fb7 [0,4194304] 0 2026-03-10T14:08:23.785 INFO:tasks.workunit.client.0.vm03.stdout:5/472: fdatasync d4/d40/f5b 0 2026-03-10T14:08:23.788 INFO:tasks.workunit.client.1.vm04.stdout:6/916: fsync 
d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/dad/fbb 0 2026-03-10T14:08:23.790 INFO:tasks.workunit.client.1.vm04.stdout:6/917: write d3/de/d35/d3f/d2d/d38/d40/fbf [298295,21703] 0 2026-03-10T14:08:23.791 INFO:tasks.workunit.client.0.vm03.stdout:3/376: dwrite d1d/f3c [0,4194304] 0 2026-03-10T14:08:23.792 INFO:tasks.workunit.client.1.vm04.stdout:6/918: write d3/de/d35/d3f/d2d/d32/d23/d24/f115 [294245,27552] 0 2026-03-10T14:08:23.797 INFO:tasks.workunit.client.0.vm03.stdout:3/377: write d1d/f2b [4641125,64899] 0 2026-03-10T14:08:23.804 INFO:tasks.workunit.client.1.vm04.stdout:6/919: mknod d3/de/d35/d3f/d2d/ddb/c124 0 2026-03-10T14:08:23.809 INFO:tasks.workunit.client.1.vm04.stdout:6/920: link d3/de/d35/d3f/d2d/d38/d40/dfa/c119 d3/de/d35/d10e/c125 0 2026-03-10T14:08:23.818 INFO:tasks.workunit.client.0.vm03.stdout:5/473: creat d4/d13/d1f/d8c/f9c x:0 0 0 2026-03-10T14:08:23.825 INFO:tasks.workunit.client.0.vm03.stdout:3/378: mkdir d1d/d29/d41/d45/d55/d6e 0 2026-03-10T14:08:23.827 INFO:tasks.workunit.client.1.vm04.stdout:6/921: dread d3/de/d35/d3f/d2d/d32/d23/f31 [0,4194304] 0 2026-03-10T14:08:23.827 INFO:tasks.workunit.client.0.vm03.stdout:5/474: dread - d4/d13/d1f/f70 zero size 2026-03-10T14:08:23.837 INFO:tasks.workunit.client.1.vm04.stdout:4/988: dwrite d4/d14/d3c/d62/de6/f102 [0,4194304] 0 2026-03-10T14:08:23.837 INFO:tasks.workunit.client.0.vm03.stdout:3/379: write d1d/d29/f6b [1226148,67781] 0 2026-03-10T14:08:23.843 INFO:tasks.workunit.client.0.vm03.stdout:5/475: dread - d4/d13/d73/f77 zero size 2026-03-10T14:08:23.847 INFO:tasks.workunit.client.0.vm03.stdout:5/476: chown d4/d6/f6d 41109 1 2026-03-10T14:08:23.847 INFO:tasks.workunit.client.1.vm04.stdout:6/922: write d3/de/d35/d3f/d2d/f34 [1159722,118380] 0 2026-03-10T14:08:23.847 INFO:tasks.workunit.client.0.vm03.stdout:5/477: chown d4/d16/d19/d23/c90 2 1 2026-03-10T14:08:23.849 INFO:tasks.workunit.client.1.vm04.stdout:6/923: mknod d3/de/d35/d3a/d43/d4c/d5e/c126 0 2026-03-10T14:08:23.850 
INFO:tasks.workunit.client.0.vm03.stdout:3/380: symlink d1d/d33/d65/d48/l6f 0 2026-03-10T14:08:23.852 INFO:tasks.workunit.client.1.vm04.stdout:6/924: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f86 [0,4194304] 0 2026-03-10T14:08:23.853 INFO:tasks.workunit.client.1.vm04.stdout:6/925: dread - d3/de/d35/d3f/d2d/d32/d23/d24/f120 zero size 2026-03-10T14:08:23.874 INFO:tasks.workunit.client.0.vm03.stdout:9/403: dwrite d2/d14/d2b/d34/f59 [0,4194304] 0 2026-03-10T14:08:23.883 INFO:tasks.workunit.client.0.vm03.stdout:9/404: dread d2/f2c [0,4194304] 0 2026-03-10T14:08:23.889 INFO:tasks.workunit.client.0.vm03.stdout:4/423: creat d5/d9/db/d19/d38/f86 x:0 0 0 2026-03-10T14:08:23.897 INFO:tasks.workunit.client.1.vm04.stdout:6/926: fdatasync d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f86 0 2026-03-10T14:08:23.899 INFO:tasks.workunit.client.1.vm04.stdout:6/927: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/de2/f127 x:0 0 0 2026-03-10T14:08:23.904 INFO:tasks.workunit.client.0.vm03.stdout:9/405: mkdir d2/d29/d33/d6d/d87 0 2026-03-10T14:08:23.909 INFO:tasks.workunit.client.1.vm04.stdout:6/928: write d3/d1d/d73/fc [1157430,7781] 0 2026-03-10T14:08:23.911 INFO:tasks.workunit.client.0.vm03.stdout:9/406: dwrite d2/f1e [0,4194304] 0 2026-03-10T14:08:23.912 INFO:tasks.workunit.client.1.vm04.stdout:6/929: creat d3/d1d/d73/f128 x:0 0 0 2026-03-10T14:08:23.914 INFO:tasks.workunit.client.1.vm04.stdout:6/930: fdatasync d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fb2 0 2026-03-10T14:08:23.915 INFO:tasks.workunit.client.0.vm03.stdout:9/407: write d2/d14/d2b/f2d [7053266,119085] 0 2026-03-10T14:08:23.916 INFO:tasks.workunit.client.0.vm03.stdout:9/408: fdatasync d2/d29/d38/f47 0 2026-03-10T14:08:23.922 INFO:tasks.workunit.client.0.vm03.stdout:9/409: write d2/d29/d33/d55/f5f [1446954,80070] 0 2026-03-10T14:08:23.933 INFO:tasks.workunit.client.0.vm03.stdout:5/478: link d4/d6/l61 d4/d13/d43/l9d 0 2026-03-10T14:08:23.938 INFO:tasks.workunit.client.0.vm03.stdout:5/479: stat d4/d16/d19/d23/f50 0 2026-03-10T14:08:23.943 
INFO:tasks.workunit.client.1.vm04.stdout:6/931: write d3/de/d35/d3f/d2d/d38/fcd [3919301,93698] 0 2026-03-10T14:08:23.945 INFO:tasks.workunit.client.1.vm04.stdout:6/932: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/f129 x:0 0 0 2026-03-10T14:08:23.947 INFO:tasks.workunit.client.1.vm04.stdout:6/933: fsync d3/de/d35/d3f/d2d/d32/d23/d24/d6f/d71/fb6 0 2026-03-10T14:08:23.950 INFO:tasks.workunit.client.1.vm04.stdout:6/934: creat d3/de/d35/d3a/d43/d4c/d5e/d76/f12a x:0 0 0 2026-03-10T14:08:23.952 INFO:tasks.workunit.client.1.vm04.stdout:6/935: mkdir d3/de/d35/d3f/d2d/d38/d40/d107/d12b 0 2026-03-10T14:08:23.953 INFO:tasks.workunit.client.1.vm04.stdout:6/936: fsync d3/de/d35/d3f/fa3 0 2026-03-10T14:08:23.954 INFO:tasks.workunit.client.1.vm04.stdout:6/937: fdatasync d3/de/d35/d3f/d2d/d32/d23/d24/f115 0 2026-03-10T14:08:23.957 INFO:tasks.workunit.client.1.vm04.stdout:6/938: creat d3/de/d35/d3a/d43/df5/f12c x:0 0 0 2026-03-10T14:08:23.959 INFO:tasks.workunit.client.1.vm04.stdout:6/939: write d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f10d [2628500,118781] 0 2026-03-10T14:08:23.978 INFO:tasks.workunit.client.1.vm04.stdout:4/989: dread d4/d14/d3c/d85/ffd [0,4194304] 0 2026-03-10T14:08:23.978 INFO:tasks.workunit.client.1.vm04.stdout:4/990: stat d4/df/f60 0 2026-03-10T14:08:23.980 INFO:tasks.workunit.client.1.vm04.stdout:4/991: write d4/d14/d1b/f99 [2426693,43920] 0 2026-03-10T14:08:23.983 INFO:tasks.workunit.client.1.vm04.stdout:6/940: sync 2026-03-10T14:08:23.990 INFO:tasks.workunit.client.1.vm04.stdout:6/941: write d3/de/d35/d3a/d43/d4c/f53 [471125,122582] 0 2026-03-10T14:08:23.993 INFO:tasks.workunit.client.1.vm04.stdout:4/992: dwrite d4/df/d34/fef [0,4194304] 0 2026-03-10T14:08:24.005 INFO:tasks.workunit.client.1.vm04.stdout:6/942: creat d3/de/d35/d3a/d43/d100/f12d x:0 0 0 2026-03-10T14:08:24.006 INFO:tasks.workunit.client.1.vm04.stdout:6/943: write d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f87 [5316488,106586] 0 2026-03-10T14:08:24.008 INFO:tasks.workunit.client.1.vm04.stdout:4/993: mknod 
d4/d14/d3c/c147 0 2026-03-10T14:08:24.009 INFO:tasks.workunit.client.1.vm04.stdout:6/944: mknod d3/de/d35/d3f/d2d/d32/d5c/c12e 0 2026-03-10T14:08:24.010 INFO:tasks.workunit.client.1.vm04.stdout:6/945: dread - d3/de/d35/d3f/d2d/d32/d23/d24/f120 zero size 2026-03-10T14:08:24.014 INFO:tasks.workunit.client.1.vm04.stdout:6/946: rename d3/de/d35/d3f/d2d/ddb to d3/de/d35/d3a/d43/d4c/d12f 0 2026-03-10T14:08:24.017 INFO:tasks.workunit.client.1.vm04.stdout:6/947: dwrite d3/de/d35/d3a/d43/d4c/d5e/fc3 [0,4194304] 0 2026-03-10T14:08:24.027 INFO:tasks.workunit.client.0.vm03.stdout:5/480: creat d4/f9e x:0 0 0 2026-03-10T14:08:24.029 INFO:tasks.workunit.client.1.vm04.stdout:4/994: dwrite d4/df/db2/db4/f37 [4194304,4194304] 0 2026-03-10T14:08:24.037 INFO:tasks.workunit.client.1.vm04.stdout:6/948: write d3/de/d35/d3f/d2d/d32/d9e/fb3 [930862,11101] 0 2026-03-10T14:08:24.039 INFO:tasks.workunit.client.1.vm04.stdout:4/995: read d4/df/db2/db6/dc9/dd0/fd3 [394972,99000] 0 2026-03-10T14:08:24.040 INFO:tasks.workunit.client.1.vm04.stdout:6/949: creat d3/de/d35/d3f/d2d/d38/d40/f130 x:0 0 0 2026-03-10T14:08:24.043 INFO:tasks.workunit.client.1.vm04.stdout:4/996: symlink d4/df/db2/db4/d47/d4f/d8c/dc8/l148 0 2026-03-10T14:08:24.045 INFO:tasks.workunit.client.1.vm04.stdout:6/950: creat d3/de/d35/d3f/d2d/d38/d40/d107/f131 x:0 0 0 2026-03-10T14:08:24.048 INFO:tasks.workunit.client.1.vm04.stdout:6/951: dwrite d3/de/d35/d3f/d2d/d32/d23/d4e/f117 [0,4194304] 0 2026-03-10T14:08:24.050 INFO:tasks.workunit.client.1.vm04.stdout:6/952: dread - d3/ded/f123 zero size 2026-03-10T14:08:24.052 INFO:tasks.workunit.client.1.vm04.stdout:4/997: symlink d4/df/d34/l149 0 2026-03-10T14:08:24.053 INFO:tasks.workunit.client.1.vm04.stdout:6/953: readlink d3/de/d35/d3a/d43/l7d 0 2026-03-10T14:08:24.054 INFO:tasks.workunit.client.1.vm04.stdout:6/954: rmdir d3/de/d35/d3a 39 2026-03-10T14:08:24.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.056+0000 7f5a5359e700 1 -- 192.168.123.103:0/831644178 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54072440 msgr2=0x7f5a5410be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.056+0000 7f5a5359e700 1 --2- 192.168.123.103:0/831644178 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54072440 0x7f5a5410be90 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7f5a48009b00 tx=0x7f5a48009e10 comp rx=0 tx=0).stop 2026-03-10T14:08:24.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.058+0000 7f5a5359e700 1 -- 192.168.123.103:0/831644178 shutdown_connections 2026-03-10T14:08:24.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.058+0000 7f5a5359e700 1 --2- 192.168.123.103:0/831644178 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54072440 0x7f5a5410be90 unknown :-1 s=CLOSED pgs=329 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.058+0000 7f5a5359e700 1 --2- 192.168.123.103:0/831644178 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a54071a60 0x7f5a54071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.058+0000 7f5a5359e700 1 -- 192.168.123.103:0/831644178 >> 192.168.123.103:0/831644178 conn(0x7f5a5406d1a0 msgr2=0x7f5a5406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:24.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.059+0000 7f5a5359e700 1 -- 192.168.123.103:0/831644178 shutdown_connections 2026-03-10T14:08:24.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.059+0000 7f5a5359e700 1 -- 192.168.123.103:0/831644178 wait complete. 
2026-03-10T14:08:24.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.060+0000 7f5a5359e700 1 Processor -- start 2026-03-10T14:08:24.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.060+0000 7f5a5359e700 1 -- start start 2026-03-10T14:08:24.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.060+0000 7f5a5359e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54071a60 0x7f5a541a4a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.060+0000 7f5a5359e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a54072440 0x7f5a541a4f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.060+0000 7f5a5359e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a541a5590 con 0x7f5a54071a60 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.060+0000 7f5a5359e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a541a56d0 con 0x7f5a54072440 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.061+0000 7f5a5259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54071a60 0x7f5a541a4a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.061+0000 7f5a5259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54071a60 0x7f5a541a4a30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:59270/0 (socket says 192.168.123.103:59270) 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.061+0000 7f5a5259c700 1 -- 192.168.123.103:0/1150367556 learned_addr learned my addr 192.168.123.103:0/1150367556 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.061+0000 7f5a51d9b700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a54072440 0x7f5a541a4f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.061+0000 7f5a5259c700 1 -- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a54072440 msgr2=0x7f5a541a4f70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.061+0000 7f5a5259c700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a54072440 0x7f5a541a4f70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.061+0000 7f5a5259c700 1 -- 192.168.123.103:0/1150367556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a480097e0 con 0x7f5a54071a60 2026-03-10T14:08:24.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.061+0000 7f5a5259c700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54071a60 0x7f5a541a4a30 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7f5a4400b700 tx=0x7f5a4400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:08:24.060 INFO:tasks.workunit.client.1.vm04.stdout:4/998: rename d4/d14/c15 to d4/d14/dac/c14a 0 2026-03-10T14:08:24.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.066+0000 7f5a437fe700 1 -- 192.168.123.103:0/1150367556 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a44010820 con 0x7f5a54071a60 2026-03-10T14:08:24.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.066+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a541aa180 con 0x7f5a54071a60 2026-03-10T14:08:24.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.066+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5a541aa6d0 con 0x7f5a54071a60 2026-03-10T14:08:24.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.066+0000 7f5a437fe700 1 -- 192.168.123.103:0/1150367556 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5a44010e60 con 0x7f5a54071a60 2026-03-10T14:08:24.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.067+0000 7f5a437fe700 1 -- 192.168.123.103:0/1150367556 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a4400d360 con 0x7f5a54071a60 2026-03-10T14:08:24.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.067+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5a5419ec20 con 0x7f5a54071a60 2026-03-10T14:08:24.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.068+0000 7f5a437fe700 1 -- 192.168.123.103:0/1150367556 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7f5a4400f810 con 
0x7f5a54071a60 2026-03-10T14:08:24.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.068+0000 7f5a437fe700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f5a3c071ed0 0x7f5a3c074380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.068+0000 7f5a437fe700 1 -- 192.168.123.103:0/1150367556 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f5a44092620 con 0x7f5a54071a60 2026-03-10T14:08:24.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.070+0000 7f5a51d9b700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f5a3c071ed0 0x7f5a3c074380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.070+0000 7f5a51d9b700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f5a3c071ed0 0x7f5a3c074380 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5a4800b5c0 tx=0x7f5a48005fd0 comp rx=0 tx=0).ready entity=mgr.24413 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:08:24.086 INFO:tasks.workunit.client.1.vm04.stdout:6/955: truncate d3/de/d35/d3f/d2d/f18 4559768 0 2026-03-10T14:08:24.089 INFO:tasks.workunit.client.1.vm04.stdout:6/956: creat d3/de/d35/d3f/d2d/d32/d23/d47/f132 x:0 0 0 2026-03-10T14:08:24.090 INFO:tasks.workunit.client.1.vm04.stdout:4/999: dwrite d4/df/d34/f1f [0,4194304] 0 2026-03-10T14:08:24.100 INFO:tasks.workunit.client.1.vm04.stdout:6/957: sync 2026-03-10T14:08:24.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.090+0000 7f5a437fe700 1 -- 
192.168.123.103:0/1150367556 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f5a4405ac20 con 0x7f5a54071a60 2026-03-10T14:08:24.101 INFO:tasks.workunit.client.1.vm04.stdout:6/958: sync 2026-03-10T14:08:24.123 INFO:tasks.workunit.client.0.vm03.stdout:6/375: rename d8/l1e to d8/db/d49/d76/l77 0 2026-03-10T14:08:24.139 INFO:tasks.workunit.client.0.vm03.stdout:6/376: dwrite d8/f6e [0,4194304] 0 2026-03-10T14:08:24.141 INFO:tasks.workunit.client.1.vm04.stdout:6/959: dread d3/de/d35/d3f/f22 [0,4194304] 0 2026-03-10T14:08:24.143 INFO:tasks.workunit.client.1.vm04.stdout:6/960: chown d3/de/d35/d3a/d43/d4c/f4d 418210641 1 2026-03-10T14:08:24.150 INFO:tasks.workunit.client.1.vm04.stdout:6/961: write d3/de/d35/d3f/d2d/fc7 [348714,21082] 0 2026-03-10T14:08:24.162 INFO:tasks.workunit.client.1.vm04.stdout:6/962: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/f120 [0,4194304] 0 2026-03-10T14:08:24.165 INFO:tasks.workunit.client.1.vm04.stdout:6/963: mknod d3/de/d35/d10e/c133 0 2026-03-10T14:08:24.166 INFO:tasks.workunit.client.1.vm04.stdout:6/964: truncate d3/ff 1022999 0 2026-03-10T14:08:24.169 INFO:tasks.workunit.client.1.vm04.stdout:6/965: dwrite d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f10d [4194304,4194304] 0 2026-03-10T14:08:24.177 INFO:tasks.workunit.client.1.vm04.stdout:6/966: dread d3/d1d/d73/fc [0,4194304] 0 2026-03-10T14:08:24.184 INFO:tasks.workunit.client.0.vm03.stdout:1/396: getdents d0/d2 0 2026-03-10T14:08:24.188 INFO:tasks.workunit.client.1.vm04.stdout:6/967: getdents d3/de/d35/d3f/d2d/d32/dd0 0 2026-03-10T14:08:24.189 INFO:tasks.workunit.client.1.vm04.stdout:6/968: creat d3/de/d35/d3f/d2d/d32/d23/d24/d6f/dd4/f134 x:0 0 0 2026-03-10T14:08:24.190 INFO:tasks.workunit.client.1.vm04.stdout:6/969: rmdir d3/de/d35/d3f/d2d/d38 39 2026-03-10T14:08:24.194 INFO:tasks.workunit.client.1.vm04.stdout:6/970: chown d3/de/d35/d3f/d2d/d32/d9e 2112 1 2026-03-10T14:08:24.197 
INFO:tasks.workunit.client.1.vm04.stdout:6/971: creat d3/de/d35/d3f/d2d/d32/d23/d47/d11e/f135 x:0 0 0 2026-03-10T14:08:24.199 INFO:tasks.workunit.client.1.vm04.stdout:6/972: creat d3/de/d35/d3f/d2d/d32/d5c/f136 x:0 0 0 2026-03-10T14:08:24.200 INFO:tasks.workunit.client.1.vm04.stdout:6/973: readlink d3/de/d35/d3f/d2d/d32/d23/d24/lf9 0 2026-03-10T14:08:24.202 INFO:tasks.workunit.client.1.vm04.stdout:6/974: creat d3/de/f137 x:0 0 0 2026-03-10T14:08:24.203 INFO:tasks.workunit.client.1.vm04.stdout:6/975: mknod d3/de/d35/d3a/c138 0 2026-03-10T14:08:24.212 INFO:tasks.workunit.client.1.vm04.stdout:6/976: dread d3/de/d35/d3f/d2d/f21 [0,4194304] 0 2026-03-10T14:08:24.215 INFO:tasks.workunit.client.1.vm04.stdout:6/977: rename d3/de/d35/d3a/fb5 to d3/de/d35/d3f/d2d/d32/d23/d4e/f139 0 2026-03-10T14:08:24.220 INFO:tasks.workunit.client.1.vm04.stdout:6/978: creat d3/de/d35/d3f/d2d/d32/d23/d47/d11e/f13a x:0 0 0 2026-03-10T14:08:24.221 INFO:tasks.workunit.client.1.vm04.stdout:6/979: creat d3/de/d35/d3f/d2d/d32/d23/d47/f13b x:0 0 0 2026-03-10T14:08:24.222 INFO:tasks.workunit.client.1.vm04.stdout:6/980: read d3/de/d35/d3f/d2d/d32/d23/d24/d8e/fcb [5948028,22656] 0 2026-03-10T14:08:24.223 INFO:tasks.workunit.client.1.vm04.stdout:6/981: write d3/de/d35/d3f/d2d/d32/d23/d24/f118 [3673294,55459] 0 2026-03-10T14:08:24.226 INFO:tasks.workunit.client.1.vm04.stdout:6/982: dwrite d3/d1d/f44 [0,4194304] 0 2026-03-10T14:08:24.256 INFO:tasks.workunit.client.1.vm04.stdout:6/983: dwrite d3/f9 [0,4194304] 0 2026-03-10T14:08:24.259 INFO:tasks.workunit.client.0.vm03.stdout:8/404: rmdir da/d36/d40/d73 0 2026-03-10T14:08:24.259 INFO:tasks.workunit.client.1.vm04.stdout:6/984: mknod d3/de/d35/d3a/d43/df5/c13c 0 2026-03-10T14:08:24.261 INFO:tasks.workunit.client.1.vm04.stdout:6/985: truncate d3/de/d35/d3f/d2d/f18 5588884 0 2026-03-10T14:08:24.269 INFO:tasks.workunit.client.1.vm04.stdout:6/986: rename d3/de/d35/d3f/f17 to d3/de/d35/d3f/d2d/d38/d122/f13d 0 2026-03-10T14:08:24.272 
INFO:tasks.workunit.client.1.vm04.stdout:6/987: symlink d3/de/d35/d3f/d2d/d32/d23/ddc/l13e 0 2026-03-10T14:08:24.292 INFO:tasks.workunit.client.0.vm03.stdout:8/405: creat da/d36/d40/d50/d70/d7d/f80 x:0 0 0 2026-03-10T14:08:24.292 INFO:tasks.workunit.client.0.vm03.stdout:8/406: truncate da/d58/f72 954017 0 2026-03-10T14:08:24.298 INFO:tasks.workunit.client.1.vm04.stdout:6/988: rmdir d3/de/d35/d3f/d2d/d32 39 2026-03-10T14:08:24.309 INFO:tasks.workunit.client.1.vm04.stdout:6/989: dwrite d3/de/d35/d3f/d2d/d32/d23/f31 [0,4194304] 0 2026-03-10T14:08:24.315 INFO:tasks.workunit.client.1.vm04.stdout:6/990: creat d3/de/d35/d3a/f13f x:0 0 0 2026-03-10T14:08:24.316 INFO:tasks.workunit.client.1.vm04.stdout:6/991: truncate d3/de/d35/d3f/d2d/d38/f114 713797 0 2026-03-10T14:08:24.318 INFO:tasks.workunit.client.1.vm04.stdout:6/992: mknod d3/de/d35/d3f/d2d/d32/d9e/c140 0 2026-03-10T14:08:24.319 INFO:tasks.workunit.client.1.vm04.stdout:6/993: rename d3/d1d/d73/fa to d3/de/d35/d3f/d2d/d32/d5c/f141 0 2026-03-10T14:08:24.321 INFO:tasks.workunit.client.1.vm04.stdout:6/994: symlink d3/de/d35/d3f/d2d/d32/d23/d4e/l142 0 2026-03-10T14:08:24.333 INFO:tasks.workunit.client.1.vm04.stdout:6/995: dwrite d3/de/fb9 [0,4194304] 0 2026-03-10T14:08:24.338 INFO:tasks.workunit.client.1.vm04.stdout:6/996: link d3/de/d35/d3f/d2d/f89 d3/de/d35/d3f/dba/f143 0 2026-03-10T14:08:24.340 INFO:tasks.workunit.client.1.vm04.stdout:6/997: symlink d3/de/d35/d3f/d2d/d32/d9e/l144 0 2026-03-10T14:08:24.341 INFO:tasks.workunit.client.1.vm04.stdout:6/998: chown d3/de/d35/d3f/d2d/d32/d23/d24/d8e/d8f/la0 1 1 2026-03-10T14:08:24.354 INFO:tasks.workunit.client.1.vm04.stdout:6/999: truncate d3/de/d35/d3a/d43/f113 1034777 0 2026-03-10T14:08:24.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.347+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 --> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f5a54061190 con 0x7f5a3c071ed0 2026-03-10T14:08:24.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.353+0000 7f5a437fe700 1 -- 192.168.123.103:0/1150367556 <== mgr.24413 v2:192.168.123.104:6828/3678786563 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f5a54061190 con 0x7f5a3c071ed0 2026-03-10T14:08:24.357 INFO:tasks.workunit.client.1.vm04.stderr:+ rm -rf -- ./tmp.oa6XzvJ2NI 2026-03-10T14:08:24.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.360+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f5a3c071ed0 msgr2=0x7f5a3c074380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.360+0000 7f5a5359e700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f5a3c071ed0 0x7f5a3c074380 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f5a4800b5c0 tx=0x7f5a48005fd0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.360+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54071a60 msgr2=0x7f5a541a4a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.360+0000 7f5a5359e700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54071a60 0x7f5a541a4a30 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7f5a4400b700 tx=0x7f5a4400bac0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.363+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 shutdown_connections 2026-03-10T14:08:24.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.363+0000 
7f5a5359e700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a54071a60 0x7f5a541a4a30 unknown :-1 s=CLOSED pgs=330 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.363+0000 7f5a5359e700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f5a3c071ed0 0x7f5a3c074380 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.363+0000 7f5a5359e700 1 --2- 192.168.123.103:0/1150367556 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a54072440 0x7f5a541a4f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.363+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 >> 192.168.123.103:0/1150367556 conn(0x7f5a5406d1a0 msgr2=0x7f5a5410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:24.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.363+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 shutdown_connections 2026-03-10T14:08:24.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.363+0000 7f5a5359e700 1 -- 192.168.123.103:0/1150367556 wait complete. 
2026-03-10T14:08:24.375 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:08:24.462 INFO:tasks.workunit.client.0.vm03.stdout:0/358: truncate d3/d16/f2d 3002481 0 2026-03-10T14:08:24.462 INFO:tasks.workunit.client.0.vm03.stdout:7/301: write d5/f32 [363714,84418] 0 2026-03-10T14:08:24.462 INFO:tasks.workunit.client.0.vm03.stdout:0/359: fdatasync d3/d17/f35 0 2026-03-10T14:08:24.462 INFO:tasks.workunit.client.0.vm03.stdout:0/360: stat d3/d16/f34 0 2026-03-10T14:08:24.465 INFO:tasks.workunit.client.0.vm03.stdout:7/302: truncate d5/f48 778167 0 2026-03-10T14:08:24.468 INFO:tasks.workunit.client.0.vm03.stdout:0/361: mknod d3/d46/d54/c70 0 2026-03-10T14:08:24.499 INFO:tasks.workunit.client.0.vm03.stdout:7/303: mkdir d5/d9/d14/d26/d5f 0 2026-03-10T14:08:24.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.508+0000 7f7cef22c700 1 -- 192.168.123.103:0/3917965057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8071980 msgr2=0x7f7ce8071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.508+0000 7f7cef22c700 1 --2- 192.168.123.103:0/3917965057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8071980 0x7f7ce8071d90 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7f7cd80077e0 tx=0x7f7cd8007af0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.508+0000 7f7cef22c700 1 -- 192.168.123.103:0/3917965057 shutdown_connections 2026-03-10T14:08:24.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.508+0000 7f7cef22c700 1 --2- 192.168.123.103:0/3917965057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7ce8072360 0x7f7ce80770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.508+0000 7f7cef22c700 1 --2- 
192.168.123.103:0/3917965057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8071980 0x7f7ce8071d90 unknown :-1 s=CLOSED pgs=331 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.508+0000 7f7cef22c700 1 -- 192.168.123.103:0/3917965057 >> 192.168.123.103:0/3917965057 conn(0x7f7ce806d1a0 msgr2=0x7f7ce806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.508+0000 7f7cef22c700 1 -- 192.168.123.103:0/3917965057 shutdown_connections 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.508+0000 7f7cef22c700 1 -- 192.168.123.103:0/3917965057 wait complete. 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7cef22c700 1 Processor -- start 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7cef22c700 1 -- start start 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7cef22c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7ce8072360 0x7f7ce8082550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7cef22c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8082a90 0x7f7ce8082f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7cef22c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ce812dd80 con 0x7f7ce8082a90 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7cef22c700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ce812def0 con 0x7f7ce8072360 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7ce7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8082a90 0x7f7ce8082f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7ce7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8082a90 0x7f7ce8082f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59284/0 (socket says 192.168.123.103:59284) 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7ce7fff700 1 -- 192.168.123.103:0/2477512312 learned_addr learned my addr 192.168.123.103:0/2477512312 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.509+0000 7f7cecfc8700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7ce8072360 0x7f7ce8082550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7ce7fff700 1 -- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7ce8072360 msgr2=0x7f7ce8082550 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7ce7fff700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f7ce8072360 0x7f7ce8082550 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7ce7fff700 1 -- 192.168.123.103:0/2477512312 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7cd8007430 con 0x7f7ce8082a90 2026-03-10T14:08:24.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7cecfc8700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7ce8072360 0x7f7ce8082550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:08:24.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7ce7fff700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8082a90 0x7f7ce8082f00 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7f7ce0007f00 tx=0x7f7ce000d3b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:08:24.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7ce5ffb700 1 -- 192.168.123.103:0/2477512312 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ce000dc90 con 0x7f7ce8082a90 2026-03-10T14:08:24.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7cef22c700 1 -- 192.168.123.103:0/2477512312 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7ce812e170 con 0x7f7ce8082a90 2026-03-10T14:08:24.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7cef22c700 1 -- 192.168.123.103:0/2477512312 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7ce812e6c0 con 0x7f7ce8082a90 2026-03-10T14:08:24.510 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7ce5ffb700 1 -- 192.168.123.103:0/2477512312 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7ce000f040 con 0x7f7ce8082a90 2026-03-10T14:08:24.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.511+0000 7f7ce5ffb700 1 -- 192.168.123.103:0/2477512312 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ce0012660 con 0x7f7ce8082a90 2026-03-10T14:08:24.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.512+0000 7f7ce5ffb700 1 -- 192.168.123.103:0/2477512312 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7f7ce00127c0 con 0x7f7ce8082a90 2026-03-10T14:08:24.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.513+0000 7f7ce5ffb700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f7cd0072020 0x7f7cd00744d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.513+0000 7f7ce5ffb700 1 -- 192.168.123.103:0/2477512312 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f7ce0093530 con 0x7f7ce8082a90 2026-03-10T14:08:24.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.513+0000 7f7cecfc8700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f7cd0072020 0x7f7cd00744d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.514+0000 7f7cecfc8700 1 --2- 192.168.123.103:0/2477512312 >> 
[v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f7cd0072020 0x7f7cd00744d0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f7cd8007400 tx=0x7f7cd80058e0 comp rx=0 tx=0).ready entity=mgr.24413 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:08:24.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.514+0000 7f7cef22c700 1 -- 192.168.123.103:0/2477512312 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7cd4005320 con 0x7f7ce8082a90 2026-03-10T14:08:24.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.517+0000 7f7ce5ffb700 1 -- 192.168.123.103:0/2477512312 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f7ce005c310 con 0x7f7ce8082a90 2026-03-10T14:08:24.623 INFO:tasks.workunit.client.0.vm03.stdout:2/384: mknod d5/d10/c72 0 2026-03-10T14:08:24.673 INFO:tasks.workunit.client.0.vm03.stdout:2/385: dread d5/d10/d1f/f3e [0,4194304] 0 2026-03-10T14:08:24.703 INFO:tasks.workunit.client.0.vm03.stdout:3/381: creat d1d/d29/d41/f70 x:0 0 0 2026-03-10T14:08:24.711 INFO:tasks.workunit.client.0.vm03.stdout:3/382: creat d1d/d39/d51/f71 x:0 0 0 2026-03-10T14:08:24.725 INFO:tasks.workunit.client.0.vm03.stdout:3/383: dwrite d1d/d29/f2e [0,4194304] 0 2026-03-10T14:08:24.737 INFO:tasks.workunit.client.0.vm03.stdout:3/384: rmdir d1d/d33/d47/d53 39 2026-03-10T14:08:24.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.737+0000 7f7cef22c700 1 -- 192.168.123.103:0/2477512312 --> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7cd4000bf0 con 0x7f7cd0072020 2026-03-10T14:08:24.737 INFO:tasks.workunit.client.0.vm03.stdout:3/385: truncate d1d/d29/d41/d45/f6a 526838 0 2026-03-10T14:08:24.743 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.742+0000 7f7ce5ffb700 1 -- 192.168.123.103:0/2477512312 <== mgr.24413 v2:192.168.123.104:6828/3678786563 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f7cd4000bf0 con 0x7f7cd0072020 2026-03-10T14:08:24.747 INFO:tasks.workunit.client.0.vm03.stdout:3/386: mkdir d1d/d39/d51/d72 0 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.747+0000 7f7ccf7fe700 1 -- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f7cd0072020 msgr2=0x7f7cd00744d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.747+0000 7f7ccf7fe700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f7cd0072020 0x7f7cd00744d0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f7cd8007400 tx=0x7f7cd80058e0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.747+0000 7f7ccf7fe700 1 -- 192.168.123.103:0/2477512312 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8082a90 msgr2=0x7f7ce8082f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.747+0000 7f7ccf7fe700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8082a90 0x7f7ce8082f00 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7f7ce0007f00 tx=0x7f7ce000d3b0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.748+0000 7f7ccf7fe700 1 -- 192.168.123.103:0/2477512312 shutdown_connections 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.748+0000 7f7ccf7fe700 1 --2- 192.168.123.103:0/2477512312 >> 
[v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f7cd0072020 0x7f7cd00744d0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.748+0000 7f7ccf7fe700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7ce8072360 0x7f7ce8082550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.748+0000 7f7ccf7fe700 1 --2- 192.168.123.103:0/2477512312 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ce8082a90 0x7f7ce8082f00 unknown :-1 s=CLOSED pgs=332 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.748+0000 7f7ccf7fe700 1 -- 192.168.123.103:0/2477512312 >> 192.168.123.103:0/2477512312 conn(0x7f7ce806d1a0 msgr2=0x7f7ce806e090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:24.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.751+0000 7f7ccf7fe700 1 -- 192.168.123.103:0/2477512312 shutdown_connections 2026-03-10T14:08:24.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.751+0000 7f7ccf7fe700 1 -- 192.168.123.103:0/2477512312 wait complete. 
2026-03-10T14:08:24.751 INFO:tasks.workunit.client.0.vm03.stdout:4/424: write d5/d9/db/d19/d34/f5d [220710,110115] 0 2026-03-10T14:08:24.888 INFO:tasks.workunit.client.0.vm03.stdout:9/410: write d2/d14/f25 [567945,36865] 0 2026-03-10T14:08:24.890 INFO:tasks.workunit.client.0.vm03.stdout:9/411: dread d2/f54 [0,4194304] 0 2026-03-10T14:08:24.891 INFO:tasks.workunit.client.0.vm03.stdout:9/412: rename d2/d14 to d2/d14/d2b/d79/d81/d88 22 2026-03-10T14:08:24.891 INFO:tasks.workunit.client.0.vm03.stdout:9/413: readlink d2/l10 0 2026-03-10T14:08:24.893 INFO:tasks.workunit.client.0.vm03.stdout:9/414: rename d2/d14/d2b/d76/l84 to d2/d14/d2b/d34/l89 0 2026-03-10T14:08:24.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.925+0000 7f451f39c700 1 -- 192.168.123.103:0/986992801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 msgr2=0x7f451810be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.925+0000 7f451f39c700 1 --2- 192.168.123.103:0/986992801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 0x7f451810be90 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7f4508009b00 tx=0x7f4508009e10 comp rx=0 tx=0).stop 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.925+0000 7f451f39c700 1 -- 192.168.123.103:0/986992801 shutdown_connections 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.925+0000 7f451f39c700 1 --2- 192.168.123.103:0/986992801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 0x7f451810be90 unknown :-1 s=CLOSED pgs=333 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.925+0000 7f451f39c700 1 --2- 192.168.123.103:0/986992801 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4518071a60 
0x7f4518071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.925+0000 7f451f39c700 1 -- 192.168.123.103:0/986992801 >> 192.168.123.103:0/986992801 conn(0x7f451806d1a0 msgr2=0x7f451806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.925+0000 7f451f39c700 1 -- 192.168.123.103:0/986992801 shutdown_connections 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.925+0000 7f451f39c700 1 -- 192.168.123.103:0/986992801 wait complete. 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.926+0000 7f451f39c700 1 Processor -- start 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.926+0000 7f451f39c700 1 -- start start 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.926+0000 7f451f39c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4518071a60 0x7f45181a49f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451f39c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 0x7f45181a4f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451f39c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f45181a5550 con 0x7f4518072440 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451f39c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f45181a5690 con 0x7f4518071a60 2026-03-10T14:08:24.927 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451e39a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4518071a60 0x7f45181a49f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451e39a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4518071a60 0x7f45181a49f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:37956/0 (socket says 192.168.123.103:37956) 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451e39a700 1 -- 192.168.123.103:0/3169100829 learned_addr learned my addr 192.168.123.103:0/3169100829 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451db99700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 0x7f45181a4f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451db99700 1 -- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4518071a60 msgr2=0x7f45181a49f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451db99700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4518071a60 0x7f45181a49f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:24.927 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451db99700 1 -- 192.168.123.103:0/3169100829 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45080097e0 con 0x7f4518072440 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.927+0000 7f451db99700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 0x7f45181a4f30 secure :-1 s=READY pgs=334 cs=0 l=1 rev1=1 crypto rx=0x7f4508009ad0 tx=0x7f450800bb40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.928+0000 7f450f7fe700 1 -- 192.168.123.103:0/3169100829 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f450801d070 con 0x7f4518072440 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.928+0000 7f451f39c700 1 -- 192.168.123.103:0/3169100829 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f45181aa0e0 con 0x7f4518072440 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.928+0000 7f451f39c700 1 -- 192.168.123.103:0/3169100829 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f45181aa570 con 0x7f4518072440 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.928+0000 7f450f7fe700 1 -- 192.168.123.103:0/3169100829 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4508022470 con 0x7f4518072440 2026-03-10T14:08:24.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.928+0000 7f450f7fe700 1 -- 192.168.123.103:0/3169100829 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f450800f670 con 
0x7f4518072440 2026-03-10T14:08:24.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.929+0000 7f450d7fa700 1 -- 192.168.123.103:0/3169100829 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f451804ea50 con 0x7f4518072440 2026-03-10T14:08:24.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.932+0000 7f450f7fe700 1 -- 192.168.123.103:0/3169100829 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7f45080225e0 con 0x7f4518072440 2026-03-10T14:08:24.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.933+0000 7f450f7fe700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f4504072020 0x7f45040744d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:24.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.933+0000 7f450f7fe700 1 -- 192.168.123.103:0/3169100829 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f4508094150 con 0x7f4518072440 2026-03-10T14:08:24.946 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:24 vm03.local ceph-mon[49718]: pgmap v12: 65 pgs: 65 active+clean; 2.9 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 57 MiB/s rd, 138 MiB/s wr, 342 op/s 2026-03-10T14:08:24.946 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:24 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:24.946 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:24 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:24.946 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:24 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:24.946 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:24 vm03.local ceph-mon[49718]: 
from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:24.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.935+0000 7f451e39a700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f4504072020 0x7f45040744d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:24.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.935+0000 7f450f7fe700 1 -- 192.168.123.103:0/3169100829 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f450805cec0 con 0x7f4518072440 2026-03-10T14:08:24.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:24.935+0000 7f451e39a700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f4504072020 0x7f45040744d0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f4514005950 tx=0x7f4514009450 comp rx=0 tx=0).ready entity=mgr.24413 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:08:25.000 INFO:tasks.workunit.client.0.vm03.stdout:5/481: link d4/d16/d19/f25 d4/f9f 0 2026-03-10T14:08:25.015 INFO:tasks.workunit.client.0.vm03.stdout:5/482: mkdir d4/d16/da0 0 2026-03-10T14:08:25.031 INFO:tasks.workunit.client.0.vm03.stdout:5/483: getdents d4/d16/d19/d23/d3f 0 2026-03-10T14:08:25.032 INFO:tasks.workunit.client.0.vm03.stdout:5/484: chown d4/d16/d19/d23/d3f/l49 10 1 2026-03-10T14:08:25.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:24 vm04.local ceph-mon[55966]: pgmap v12: 65 pgs: 65 active+clean; 2.9 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 57 MiB/s rd, 138 MiB/s wr, 342 op/s 2026-03-10T14:08:25.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:24 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:25.064 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:24 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:25.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:24 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:25.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:24 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' 2026-03-10T14:08:25.146 INFO:tasks.workunit.client.0.vm03.stdout:3/387: mknod d1d/c73 0 2026-03-10T14:08:25.146 INFO:tasks.workunit.client.0.vm03.stdout:3/388: chown d1d/d33/d47/c5c 171640769 1 2026-03-10T14:08:25.152 INFO:tasks.workunit.client.0.vm03.stdout:3/389: dread f17 [0,4194304] 0 2026-03-10T14:08:25.156 INFO:tasks.workunit.client.0.vm03.stdout:3/390: dwrite d1d/d29/d41/d45/f6a [0,4194304] 0 2026-03-10T14:08:25.164 INFO:tasks.workunit.client.0.vm03.stdout:3/391: symlink d1d/l74 0 2026-03-10T14:08:25.171 INFO:tasks.workunit.client.0.vm03.stdout:3/392: rename d1d/f31 to d1d/d29/d41/f75 0 2026-03-10T14:08:25.180 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.181+0000 7f450d7fa700 1 -- 192.168.123.103:0/3169100829 --> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f4518061190 con 0x7f4504072020 2026-03-10T14:08:25.189 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (4m) 1s ago 5m 21.9M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (2m) 1s ago 5m 8544k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (4m) 14s ago 4m 11.1M - 18.2.0 dc2bc1663786 8b6b949a2f58 
2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 1s ago 5m 7419k - 18.2.0 dc2bc1663786 57962aef7443 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (4m) 14s ago 4m 7407k - 18.2.0 dc2bc1663786 0918365fa827 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (4m) 1s ago 4m 88.9M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (3m) 1s ago 3m 133M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (2m) 1s ago 2m 99.5M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (2m) 14s ago 2m 64.8M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (3m) 14s ago 3m 173M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (5s) 1s ago 5m 40.9M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (29s) 14s ago 4m 554M - 19.2.3-678-ge911bdeb 654f31e6858e b6f8e3ea0f04 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 1s ago 5m 53.2M 2048M 18.2.0 dc2bc1663786 f59cc7d5bdfd 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (4m) 14s ago 4m 37.6M 2048M 18.2.0 dc2bc1663786 4113774b34c7 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (5m) 1s ago 5m 14.5M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 
running (4m) 14s ago 4m 14.8M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 1s ago 4m 351M 4096M 18.2.0 dc2bc1663786 5a222b855ee3 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (4m) 1s ago 4m 336M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:08:25.190 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (3m) 1s ago 3m 299M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:08:25.191 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (3m) 14s ago 3m 359M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:08:25.191 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (3m) 14s ago 3m 332M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:08:25.191 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (3m) 14s ago 3m 335M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:08:25.191 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (8s) 1s ago 4m 39.1M - 2.43.0 a07b618ecd1d f37e7168fda4 2026-03-10T14:08:25.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.188+0000 7f450f7fe700 1 -- 192.168.123.103:0/3169100829 <== mgr.24413 v2:192.168.123.104:6828/3678786563 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f4518061190 con 0x7f4504072020 2026-03-10T14:08:25.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.191+0000 7f451f39c700 1 -- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f4504072020 msgr2=0x7f45040744d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:25.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.191+0000 7f451f39c700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f4504072020 0x7f45040744d0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto 
rx=0x7f4514005950 tx=0x7f4514009450 comp rx=0 tx=0).stop 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.191+0000 7f451f39c700 1 -- 192.168.123.103:0/3169100829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 msgr2=0x7f45181a4f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.191+0000 7f451f39c700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 0x7f45181a4f30 secure :-1 s=READY pgs=334 cs=0 l=1 rev1=1 crypto rx=0x7f4508009ad0 tx=0x7f450800bb40 comp rx=0 tx=0).stop 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.192+0000 7f451f39c700 1 -- 192.168.123.103:0/3169100829 shutdown_connections 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.192+0000 7f451f39c700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f4504072020 0x7f45040744d0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.192+0000 7f451f39c700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4518071a60 0x7f45181a49f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.192+0000 7f451f39c700 1 --2- 192.168.123.103:0/3169100829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4518072440 0x7f45181a4f30 unknown :-1 s=CLOSED pgs=334 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.192+0000 7f451f39c700 1 -- 192.168.123.103:0/3169100829 >> 
192.168.123.103:0/3169100829 conn(0x7f451806d1a0 msgr2=0x7f451810a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.192+0000 7f451f39c700 1 -- 192.168.123.103:0/3169100829 shutdown_connections 2026-03-10T14:08:25.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.192+0000 7f451f39c700 1 -- 192.168.123.103:0/3169100829 wait complete. 2026-03-10T14:08:25.198 INFO:tasks.workunit.client.0.vm03.stdout:3/393: link d1d/c23 d1d/d29/d41/d45/d55/d6e/c76 0 2026-03-10T14:08:25.200 INFO:tasks.workunit.client.0.vm03.stdout:3/394: chown d1d/c23 10 1 2026-03-10T14:08:25.200 INFO:tasks.workunit.client.0.vm03.stdout:3/395: chown lf 114851 1 2026-03-10T14:08:25.218 INFO:tasks.workunit.client.0.vm03.stdout:3/396: symlink d1d/d33/d47/d53/d68/l77 0 2026-03-10T14:08:25.220 INFO:tasks.workunit.client.0.vm03.stdout:3/397: dread d1d/f4a [0,4194304] 0 2026-03-10T14:08:25.302 INFO:tasks.workunit.client.0.vm03.stdout:0/362: symlink d3/d4d/l71 0 2026-03-10T14:08:25.305 INFO:tasks.workunit.client.0.vm03.stdout:6/377: write d8/d11/f35 [999240,6299] 0 2026-03-10T14:08:25.307 INFO:tasks.workunit.client.0.vm03.stdout:0/363: dwrite d3/d46/f6c [0,4194304] 0 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.317+0000 7f7403940700 1 -- 192.168.123.103:0/262148285 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc072360 msgr2=0x7f73fc0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.317+0000 7f7403940700 1 --2- 192.168.123.103:0/262148285 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc072360 0x7f73fc0770e0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f73f400d3f0 tx=0x7f73f400d700 comp rx=0 tx=0).stop 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 -- 
192.168.123.103:0/262148285 shutdown_connections 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 --2- 192.168.123.103:0/262148285 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc072360 0x7f73fc0770e0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 --2- 192.168.123.103:0/262148285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73fc071980 0x7f73fc071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 -- 192.168.123.103:0/262148285 >> 192.168.123.103:0/262148285 conn(0x7f73fc06d1a0 msgr2=0x7f73fc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 -- 192.168.123.103:0/262148285 shutdown_connections 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 -- 192.168.123.103:0/262148285 wait complete. 
2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 Processor -- start 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 -- start start 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc071980 0x7f73fc131300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73fc131840 0x7f73fc07f530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73fc131cb0 con 0x7f73fc131840 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.318+0000 7f7403940700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73fc131e20 con 0x7f73fc071980 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.319+0000 7f74016dc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc071980 0x7f73fc131300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.319+0000 7f74016dc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc071980 0x7f73fc131300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:37974/0 (socket says 192.168.123.103:37974) 2026-03-10T14:08:25.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.319+0000 7f74016dc700 1 -- 192.168.123.103:0/3853116475 learned_addr learned my addr 192.168.123.103:0/3853116475 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:08:25.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.319+0000 7f7400edb700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73fc131840 0x7f73fc07f530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:25.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.319+0000 7f74016dc700 1 -- 192.168.123.103:0/3853116475 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73fc131840 msgr2=0x7f73fc07f530 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:25.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.319+0000 7f74016dc700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73fc131840 0x7f73fc07f530 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:25.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.319+0000 7f74016dc700 1 -- 192.168.123.103:0/3853116475 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73f4007ed0 con 0x7f73fc071980 2026-03-10T14:08:25.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.322+0000 7f74016dc700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc071980 0x7f73fc131300 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f73f800b770 tx=0x7f73f800bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:08:25.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.324+0000 7f73f27fc700 1 -- 192.168.123.103:0/3853116475 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f73f800f820 con 0x7f73fc071980 2026-03-10T14:08:25.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.324+0000 7f73f27fc700 1 -- 192.168.123.103:0/3853116475 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f73f800fe60 con 0x7f73fc071980 2026-03-10T14:08:25.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.324+0000 7f73f27fc700 1 -- 192.168.123.103:0/3853116475 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f73f800d610 con 0x7f73fc071980 2026-03-10T14:08:25.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.324+0000 7f7403940700 1 -- 192.168.123.103:0/3853116475 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f73fc07fa70 con 0x7f73fc071980 2026-03-10T14:08:25.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.324+0000 7f7403940700 1 -- 192.168.123.103:0/3853116475 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f73fc07ff60 con 0x7f73fc071980 2026-03-10T14:08:25.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.324+0000 7f7403940700 1 -- 192.168.123.103:0/3853116475 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f73fc12b500 con 0x7f73fc071980 2026-03-10T14:08:25.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.326+0000 7f73f27fc700 1 -- 192.168.123.103:0/3853116475 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7f73f801e030 con 0x7f73fc071980 2026-03-10T14:08:25.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.326+0000 
7f73f27fc700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f73e8071f50 0x7f73e8074400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:25.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.326+0000 7f73f27fc700 1 -- 192.168.123.103:0/3853116475 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f73f8092670 con 0x7f73fc071980 2026-03-10T14:08:25.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.327+0000 7f7400edb700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f73e8071f50 0x7f73e8074400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:25.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.327+0000 7f7400edb700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f73e8071f50 0x7f73e8074400 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f73f4007590 tx=0x7f73f40074a0 comp rx=0 tx=0).ready entity=mgr.24413 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:08:25.333 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.329+0000 7f73f27fc700 1 -- 192.168.123.103:0/3853116475 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f73f805b440 con 0x7f73fc071980 2026-03-10T14:08:25.361 INFO:tasks.workunit.client.0.vm03.stdout:1/397: write d0/f10 [608039,19813] 0 2026-03-10T14:08:25.362 INFO:tasks.workunit.client.0.vm03.stdout:8/407: write da/f31 [3170823,117074] 0 2026-03-10T14:08:25.388 INFO:tasks.workunit.client.0.vm03.stdout:0/364: mknod d3/d46/c72 0 2026-03-10T14:08:25.388 
INFO:tasks.workunit.client.0.vm03.stdout:0/365: chown d3/d11/d2c/l6e 93837683 1 2026-03-10T14:08:25.393 INFO:tasks.workunit.client.0.vm03.stdout:1/398: creat d0/d42/f80 x:0 0 0 2026-03-10T14:08:25.394 INFO:tasks.workunit.client.0.vm03.stdout:1/399: chown d0 25 1 2026-03-10T14:08:25.397 INFO:tasks.workunit.client.0.vm03.stdout:8/408: mknod da/d3c/d51/c81 0 2026-03-10T14:08:25.405 INFO:tasks.workunit.client.0.vm03.stdout:6/378: dread d8/db/d12/d51/d5c/f68 [0,4194304] 0 2026-03-10T14:08:25.418 INFO:tasks.workunit.client.0.vm03.stdout:6/379: write d8/db/d12/f40 [762216,71289] 0 2026-03-10T14:08:25.422 INFO:tasks.workunit.client.0.vm03.stdout:7/304: dwrite d5/d9/f42 [0,4194304] 0 2026-03-10T14:08:25.439 INFO:tasks.workunit.client.0.vm03.stdout:0/366: truncate d3/f1e 5745762 0 2026-03-10T14:08:25.443 INFO:tasks.workunit.client.0.vm03.stdout:0/367: dwrite d3/f9 [0,4194304] 0 2026-03-10T14:08:25.476 INFO:tasks.workunit.client.0.vm03.stdout:2/386: dwrite d5/d10/d17/f28 [0,4194304] 0 2026-03-10T14:08:25.477 INFO:tasks.workunit.client.0.vm03.stdout:8/409: symlink da/d3c/d4b/l82 0 2026-03-10T14:08:25.502 INFO:tasks.workunit.client.0.vm03.stdout:6/380: read - d8/db/df/f63 zero size 2026-03-10T14:08:25.518 INFO:tasks.workunit.client.0.vm03.stdout:4/425: write d5/d9/db/f12 [238177,103170] 0 2026-03-10T14:08:25.527 INFO:tasks.workunit.client.0.vm03.stdout:1/400: fsync d0/d42/f7f 0 2026-03-10T14:08:25.539 INFO:tasks.workunit.client.0.vm03.stdout:4/426: dread d5/f3c [0,4194304] 0 2026-03-10T14:08:25.539 INFO:tasks.workunit.client.0.vm03.stdout:0/368: dread d3/d17/f56 [0,4194304] 0 2026-03-10T14:08:25.541 INFO:tasks.workunit.client.0.vm03.stdout:6/381: fdatasync d8/db/d49/d6c/d32/f4b 0 2026-03-10T14:08:25.542 INFO:tasks.workunit.client.0.vm03.stdout:6/382: chown d8/db/df/f37 9782 1 2026-03-10T14:08:25.542 INFO:tasks.workunit.client.0.vm03.stdout:9/415: write d2/d14/f30 [2987780,78748] 0 2026-03-10T14:08:25.561 INFO:tasks.workunit.client.0.vm03.stdout:8/410: sync 
2026-03-10T14:08:25.561 INFO:tasks.workunit.client.0.vm03.stdout:6/383: sync
2026-03-10T14:08:25.574 INFO:tasks.workunit.client.0.vm03.stdout:5/485: write d4/d16/f1c [5112175,69937] 0
2026-03-10T14:08:25.591 INFO:tasks.workunit.client.0.vm03.stdout:3/398: dwrite d1d/d29/d41/f56 [0,4194304] 0
2026-03-10T14:08:25.736 INFO:tasks.workunit.client.0.vm03.stdout:7/305: dwrite d5/d9/d14/d21/f29 [0,4194304] 0
2026-03-10T14:08:25.747 INFO:tasks.workunit.client.0.vm03.stdout:4/427: dwrite f3 [0,4194304] 0
2026-03-10T14:08:25.747 INFO:tasks.workunit.client.0.vm03.stdout:4/428: chown d5/l6a 57 1
2026-03-10T14:08:25.752 INFO:tasks.workunit.client.0.vm03.stdout:9/416: mkdir d2/d14/d2b/d79/d8a 0
2026-03-10T14:08:25.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.774+0000 7f7403940700 1 -- 192.168.123.103:0/3853116475 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f73fc02cc30 con 0x7f73fc071980
2026-03-10T14:08:25.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.780+0000 7f73f27fc700 1 -- 192.168.123.103:0/3853116475 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+770 (secure 0 0 0) 0x7f73f805ab90 con 0x7f73fc071980
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 1,
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 13,
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T14:08:25.780 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-10T14:08:25.781 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-10T14:08:25.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 -- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f73e8071f50 msgr2=0x7f73e8074400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:25.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f73e8071f50 0x7f73e8074400 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f73f4007590 tx=0x7f73f40074a0 comp rx=0 tx=0).stop
2026-03-10T14:08:25.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 -- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc071980 msgr2=0x7f73fc131300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:25.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc071980 0x7f73fc131300 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f73f800b770 tx=0x7f73f800bb30 comp rx=0 tx=0).stop
2026-03-10T14:08:25.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 -- 192.168.123.103:0/3853116475 shutdown_connections
2026-03-10T14:08:25.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f73e8071f50 0x7f73e8074400 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:25.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73fc071980 0x7f73fc131300 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:25.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 --2- 192.168.123.103:0/3853116475 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73fc131840 0x7f73fc07f530 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:25.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 -- 192.168.123.103:0/3853116475 >> 192.168.123.103:0/3853116475 conn(0x7f73fc06d1a0 msgr2=0x7f73fc0763d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:08:25.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 -- 192.168.123.103:0/3853116475 shutdown_connections
2026-03-10T14:08:25.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.784+0000 7f73e7fff700 1 -- 192.168.123.103:0/3853116475 wait complete.
2026-03-10T14:08:25.827 INFO:tasks.workunit.client.0.vm03.stdout:1/401: truncate d0/d2/f1a 2286597 0
2026-03-10T14:08:25.864 INFO:tasks.workunit.client.0.vm03.stdout:8/411: dwrite f2 [4194304,4194304] 0
2026-03-10T14:08:25.896 INFO:tasks.workunit.client.0.vm03.stdout:5/486: stat d4/d16/d19/l75 0
2026-03-10T14:08:25.939 INFO:tasks.workunit.client.0.vm03.stdout:3/399: symlink d1d/d39/l78 0
2026-03-10T14:08:25.961 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:25 vm03.local ceph-mon[49718]: from='client.14708 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:08:25.961 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:25 vm03.local ceph-mon[49718]: from='client.14712 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:08:25.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.959+0000 7f25d8489700 1 -- 192.168.123.103:0/3867032573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0103980 msgr2=0x7f25d0103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:25.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.959+0000 7f25d8489700 1 --2- 192.168.123.103:0/3867032573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0103980 0x7f25d0103dd0 secure :-1 s=READY pgs=335 cs=0 l=1 rev1=1 crypto rx=0x7f25c0009b00 tx=0x7f25c0009e10 comp rx=0 tx=0).stop
2026-03-10T14:08:25.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.964+0000 7f25d8489700 1 -- 192.168.123.103:0/3867032573 shutdown_connections
2026-03-10T14:08:25.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.964+0000 7f25d8489700 1 --2- 192.168.123.103:0/3867032573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0103980 0x7f25d0103dd0 unknown :-1 s=CLOSED pgs=335 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:25.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.964+0000 7f25d8489700 1 --2- 192.168.123.103:0/3867032573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25d0102780 0x7f25d0102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:25.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.964+0000 7f25d8489700 1 -- 192.168.123.103:0/3867032573 >> 192.168.123.103:0/3867032573 conn(0x7f25d00fdd10 msgr2=0x7f25d0100160 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:08:25.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.965+0000 7f25d8489700 1 -- 192.168.123.103:0/3867032573 shutdown_connections
2026-03-10T14:08:25.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.965+0000 7f25d8489700 1 -- 192.168.123.103:0/3867032573 wait complete.
2026-03-10T14:08:25.965 INFO:tasks.workunit.client.0.vm03.stdout:0/369: mknod d3/d11/c73 0
2026-03-10T14:08:25.965 INFO:tasks.workunit.client.0.vm03.stdout:0/370: chown d3/d46 3 1
2026-03-10T14:08:25.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.967+0000 7f25d8489700 1 Processor -- start
2026-03-10T14:08:25.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.969+0000 7f25d8489700 1 -- start start
2026-03-10T14:08:25.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.969+0000 7f25d8489700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0102780 0x7f25d0198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:08:25.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.969+0000 7f25d8489700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25d0103980 0x7f25d0198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:08:25.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.969+0000 7f25d8489700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25d0198b80 con 0x7f25d0102780
2026-03-10T14:08:25.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.969+0000 7f25d8489700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25d0198cc0 con 0x7f25d0103980
2026-03-10T14:08:25.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.970+0000 7f25d6225700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0102780 0x7f25d0198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:08:25.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.970+0000 7f25d6225700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0102780 0x7f25d0198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59332/0 (socket says 192.168.123.103:59332)
2026-03-10T14:08:25.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.970+0000 7f25d6225700 1 -- 192.168.123.103:0/4184790237 learned_addr learned my addr 192.168.123.103:0/4184790237 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:08:25.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.970+0000 7f25d5a24700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25d0103980 0x7f25d0198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:08:25.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.970+0000 7f25d6225700 1 -- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25d0103980 msgr2=0x7f25d0198560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:25.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.970+0000 7f25d6225700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25d0103980 0x7f25d0198560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:25.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.970+0000 7f25d6225700 1 -- 192.168.123.103:0/4184790237 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f25c00097e0 con 0x7f25d0102780
2026-03-10T14:08:25.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.970+0000 7f25d6225700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0102780 0x7f25d0198020 secure :-1 s=READY pgs=336 cs=0 l=1 rev1=1 crypto rx=0x7f25cc00d900 tx=0x7f25cc00dc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:08:25.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.973+0000 7f25c77fe700 1 -- 192.168.123.103:0/4184790237 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f25cc0041d0 con 0x7f25d0102780
2026-03-10T14:08:25.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.973+0000 7f25d8489700 1 -- 192.168.123.103:0/4184790237 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f25d019d770 con 0x7f25d0102780
2026-03-10T14:08:25.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.973+0000 7f25d8489700 1 -- 192.168.123.103:0/4184790237 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f25d019dcc0 con 0x7f25d0102780
2026-03-10T14:08:25.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.973+0000 7f25c77fe700 1 -- 192.168.123.103:0/4184790237 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f25cc004330 con 0x7f25d0102780
2026-03-10T14:08:25.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.974+0000 7f25c77fe700 1 -- 192.168.123.103:0/4184790237 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f25cc003d70 con 0x7f25d0102780
2026-03-10T14:08:25.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.975+0000 7f25d8489700 1 -- 192.168.123.103:0/4184790237 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f25b4005320 con 0x7f25d0102780
2026-03-10T14:08:25.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.977+0000 7f25c77fe700 1 -- 192.168.123.103:0/4184790237 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7f25cc009730 con 0x7f25d0102780
2026-03-10T14:08:25.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.977+0000 7f25c77fe700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f25bc071d50 0x7f25bc074200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:08:25.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.977+0000 7f25c77fe700 1 -- 192.168.123.103:0/4184790237 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f25cc0922e0 con 0x7f25d0102780
2026-03-10T14:08:25.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.980+0000 7f25d5a24700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f25bc071d50 0x7f25bc074200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:08:25.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.980+0000 7f25c77fe700 1 -- 192.168.123.103:0/4184790237 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f25cc05afa0 con 0x7f25d0102780
2026-03-10T14:08:25.982 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:25.984+0000 7f25d5a24700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f25bc071d50 0x7f25bc074200 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f25c000b5c0 tx=0x7f25c0005fb0 comp rx=0 tx=0).ready entity=mgr.24413 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:08:26.062 INFO:tasks.workunit.client.0.vm03.stdout:6/384: write d8/db/d49/f4a [1470322,119444] 0
2026-03-10T14:08:26.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:25 vm04.local ceph-mon[55966]: from='client.14708 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:08:26.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:25 vm04.local ceph-mon[55966]: from='client.14712 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:08:26.076 INFO:tasks.workunit.client.0.vm03.stdout:4/429: mkdir d5/d47/d62/d87 0
2026-03-10T14:08:26.079 INFO:tasks.workunit.client.0.vm03.stdout:9/417: write d2/d14/f64 [730363,56927] 0
2026-03-10T14:08:26.080 INFO:tasks.workunit.client.0.vm03.stdout:9/418: write d2/d14/d2b/d43/f7c [35310,39932] 0
2026-03-10T14:08:26.087 INFO:tasks.workunit.client.0.vm03.stdout:1/402: fsync d0/d2/f2a 0
2026-03-10T14:08:26.097 INFO:tasks.workunit.client.0.vm03.stdout:2/387: creat d5/d10/d1f/d5d/f73 x:0 0 0
2026-03-10T14:08:26.101 INFO:tasks.workunit.client.0.vm03.stdout:2/388: chown d5/d10/d1c/f5b 854980 1
2026-03-10T14:08:26.104 INFO:tasks.workunit.client.0.vm03.stdout:4/430: sync
2026-03-10T14:08:26.108 INFO:tasks.workunit.client.0.vm03.stdout:5/487: mknod d4/d13/d1f/d8c/ca1 0
2026-03-10T14:08:26.111 INFO:tasks.workunit.client.0.vm03.stdout:3/400: mknod d1d/d39/d51/c79 0
2026-03-10T14:08:26.112 INFO:tasks.workunit.client.0.vm03.stdout:3/401: stat d1d/f2b 0
2026-03-10T14:08:26.114 INFO:tasks.workunit.client.0.vm03.stdout:0/371: creat d3/d4d/d47/f74 x:0 0 0
2026-03-10T14:08:26.117 INFO:tasks.workunit.client.0.vm03.stdout:6/385: symlink d8/d11/d18/l78 0
2026-03-10T14:08:26.118 INFO:tasks.workunit.client.0.vm03.stdout:0/372: sync
2026-03-10T14:08:26.119 INFO:tasks.workunit.client.0.vm03.stdout:6/386: sync
2026-03-10T14:08:26.144 INFO:tasks.workunit.client.0.vm03.stdout:9/419: fsync d2/d29/d38/f4e 0
2026-03-10T14:08:26.147 INFO:tasks.workunit.client.0.vm03.stdout:9/420: dwrite d2/d29/d38/f47 [0,4194304] 0
2026-03-10T14:08:26.175 INFO:tasks.workunit.client.0.vm03.stdout:2/389: dwrite d5/d2a/f6e [0,4194304] 0
2026-03-10T14:08:26.189 INFO:tasks.workunit.client.0.vm03.stdout:5/488: dread - d4/d13/d43/f8b zero size
2026-03-10T14:08:26.192 INFO:tasks.workunit.client.0.vm03.stdout:3/402: dwrite d1d/d29/d41/f75 [0,4194304] 0
2026-03-10T14:08:26.202 INFO:tasks.workunit.client.0.vm03.stdout:0/373: creat d3/d46/d5e/f75 x:0 0 0
2026-03-10T14:08:26.202 INFO:tasks.workunit.client.0.vm03.stdout:0/374: stat d3/d11/d2c/d4a/d4b/l68 0
2026-03-10T14:08:26.212 INFO:tasks.workunit.client.0.vm03.stdout:9/421: mknod d2/d29/d33/d60/c8b 0
2026-03-10T14:08:26.215 INFO:tasks.workunit.client.0.vm03.stdout:9/422: dwrite d2/fc [0,4194304] 0
2026-03-10T14:08:26.262 INFO:tasks.workunit.client.0.vm03.stdout:8/412: dread f5 [0,4194304] 0
2026-03-10T14:08:26.263 INFO:tasks.workunit.client.0.vm03.stdout:8/413: write da/d58/d6c/f76 [133282,66894] 0
2026-03-10T14:08:26.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.387+0000 7f25d8489700 1 -- 192.168.123.103:0/4184790237 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f25b4005cc0 con 0x7f25d0102780
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.388+0000 7f25c77fe700 1 -- 192.168.123.103:0/4184790237 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1882 (secure 0 0 0) 0x7f25cc05a6f0 con 0x7f25d0102780
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:e14
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470}
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-10T14:08:26.387 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:08:26.388 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.402+0000 7f25c57fa700 1 -- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f25bc071d50 msgr2=0x7f25bc074200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.402+0000 7f25c57fa700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f25bc071d50 0x7f25bc074200 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f25c000b5c0 tx=0x7f25c0005fb0 comp rx=0 tx=0).stop
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.402+0000 7f25c57fa700 1 -- 192.168.123.103:0/4184790237 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0102780 msgr2=0x7f25d0198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.402+0000 7f25c57fa700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0102780 0x7f25d0198020 secure :-1 s=READY pgs=336 cs=0 l=1 rev1=1 crypto rx=0x7f25cc00d900 tx=0x7f25cc00dc10 comp rx=0 tx=0).stop
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.402+0000 7f25c57fa700 1 -- 192.168.123.103:0/4184790237 shutdown_connections
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.402+0000 7f25c57fa700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f25d0102780 0x7f25d0198020 unknown :-1 s=CLOSED pgs=336 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.402+0000 7f25c57fa700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f25bc071d50 0x7f25bc074200 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.402+0000 7f25c57fa700 1 --2- 192.168.123.103:0/4184790237 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f25d0103980 0x7f25d0198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.403+0000 7f25c57fa700 1 -- 192.168.123.103:0/4184790237 >> 192.168.123.103:0/4184790237 conn(0x7f25d00fdd10 msgr2=0x7f25d0106bb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.403+0000 7f25c57fa700 1 -- 192.168.123.103:0/4184790237 shutdown_connections
2026-03-10T14:08:26.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.403+0000 7f25c57fa700 1 -- 192.168.123.103:0/4184790237 wait complete.
2026-03-10T14:08:26.402 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14
2026-03-10T14:08:26.414 INFO:tasks.workunit.client.0.vm03.stdout:7/306: truncate d5/f1b 1335378 0
2026-03-10T14:08:26.415 INFO:tasks.workunit.client.0.vm03.stdout:1/403: write d0/f70 [213959,36015] 0
2026-03-10T14:08:26.502 INFO:tasks.workunit.client.0.vm03.stdout:4/431: getdents d5/d47/d62/d7e 0
2026-03-10T14:08:26.526 INFO:tasks.workunit.client.0.vm03.stdout:5/489: chown d4/c5 32817850 1
2026-03-10T14:08:26.564 INFO:tasks.workunit.client.0.vm03.stdout:6/387: rename d8/db/d2c to d8/d11/d18/d79 0
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.585+0000 7f8e0a7d2700 1 -- 192.168.123.103:0/2074960159 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04072360 msgr2=0x7f8e040770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.585+0000 7f8e0a7d2700 1 --2- 192.168.123.103:0/2074960159 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04072360 0x7f8e040770e0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f8dfc00b210 tx=0x7f8dfc00b520 comp rx=0 tx=0).stop
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.585+0000 7f8e0a7d2700 1 -- 192.168.123.103:0/2074960159 shutdown_connections
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.585+0000 7f8e0a7d2700 1 --2- 192.168.123.103:0/2074960159 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04072360 0x7f8e040770e0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.585+0000 7f8e0a7d2700 1 --2- 192.168.123.103:0/2074960159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e04071980 0x7f8e04071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.585+0000 7f8e0a7d2700 1 -- 192.168.123.103:0/2074960159 >> 192.168.123.103:0/2074960159 conn(0x7f8e0406d1a0 msgr2=0x7f8e0406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.585+0000 7f8e0a7d2700 1 -- 192.168.123.103:0/2074960159 shutdown_connections
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.585+0000 7f8e0a7d2700 1 -- 192.168.123.103:0/2074960159 wait complete.
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e0a7d2700 1 Processor -- start
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e0a7d2700 1 -- start start
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e0a7d2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04071980 0x7f8e04082580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e0a7d2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e04082ac0 0x7f8e04082f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e0a7d2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e041b2a90 con 0x7f8e04082ac0
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e0a7d2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e041b2bd0 con 0x7f8e04071980
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e03fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04071980 0x7f8e04082580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e03fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04071980 0x7f8e04082580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:38010/0 (socket says 192.168.123.103:38010)
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.586+0000 7f8e03fff700 1 -- 192.168.123.103:0/319916102 learned_addr learned my addr 192.168.123.103:0/319916102 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.587+0000 7f8e03fff700 1 -- 192.168.123.103:0/319916102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e04082ac0 msgr2=0x7f8e04082f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.587+0000 7f8e03fff700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e04082ac0 0x7f8e04082f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.587+0000 7f8e03fff700 1 -- 192.168.123.103:0/319916102 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8dfc009e30 con 0x7f8e04071980
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.587+0000 7f8e03fff700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04071980 0x7f8e04082580 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f8df4009d30 tx=0x7f8df400e3b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.587+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8df400ed00 con 0x7f8e04071980
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.587+0000 7f8e0a7d2700 1 -- 192.168.123.103:0/319916102 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8e041b2dd0 con 0x7f8e04071980
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.587+0000 7f8e0a7d2700 1 -- 192.168.123.103:0/319916102 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8e041b32a0 con 0x7f8e04071980
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.588+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8df4010040 con 0x7f8e04071980
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.588+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8df4013610 con 0x7f8e04071980
2026-03-10T14:08:26.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.588+0000 7f8e0a7d2700 1 -- 192.168.123.103:0/319916102 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8df0005320 con 0x7f8e04071980
2026-03-10T14:08:26.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.590+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 25) v1 ==== 95039+0+0 (secure 0 0 0) 0x7f8df40149f0 con 0x7f8e04071980
2026-03-10T14:08:26.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.590+0000 7f8e017fa700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f8dec074180 0x7f8dec076630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:08:26.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.590+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f8df4092f70 con 0x7f8e04071980
2026-03-10T14:08:26.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.591+0000 7f8e037fe700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f8dec074180 0x7f8dec076630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:08:26.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.593+0000 7f8e037fe700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f8dec074180 0x7f8dec076630 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f8e04072ff0 tx=0x7f8dfc000f40 comp rx=0 tx=0).ready entity=mgr.24413 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:08:26.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.596+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f8df405bba0 con 0x7f8e04071980
2026-03-10T14:08:26.627 INFO:tasks.workunit.client.0.vm03.stdout:9/423: fsync d2/d29/d38/f4b 0
2026-03-10T14:08:26.631 INFO:tasks.workunit.client.0.vm03.stdout:9/424: dread d2/d14/f64 [0,4194304] 0
2026-03-10T14:08:26.731 INFO:tasks.workunit.client.0.vm03.stdout:0/375: dwrite d3/d16/f39 [0,4194304] 0
2026-03-10T14:08:26.733 INFO:tasks.workunit.client.0.vm03.stdout:7/307: fdatasync d5/d9/f19 0
2026-03-10T14:08:26.734 INFO:tasks.workunit.client.0.vm03.stdout:2/390: mkdir d5/d10/d1c/d50/d74 0
2026-03-10T14:08:26.735 INFO:tasks.workunit.client.0.vm03.stdout:4/432: creat d5/d9/db/d19/d34/f88 x:0 0 0
2026-03-10T14:08:26.737 INFO:tasks.workunit.client.0.vm03.stdout:4/433: dread d5/d9/db/d19/d38/d53/d55/f75 [0,4194304] 0
2026-03-10T14:08:26.738 INFO:tasks.workunit.client.0.vm03.stdout:4/434: dread - d5/d9/db/d19/f3a zero size
2026-03-10T14:08:26.738 INFO:tasks.workunit.client.0.vm03.stdout:4/435: readlink d5/d9/db/d19/l1a 0
2026-03-10T14:08:26.743 INFO:tasks.workunit.client.0.vm03.stdout:1/404: unlink d0/d18/l22 0
2026-03-10T14:08:26.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.767+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mgrmap(e 26) v1 ==== 44873+0+0 (secure 0 0 0) 0x7f8df4057e10 con 0x7f8e04071980
2026-03-10T14:08:26.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.767+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f8dec074180 msgr2=0x7f8dec076630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:08:26.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:26.767+0000 7f8e017fa700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f8dec074180 0x7f8dec076630 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f8e04072ff0 tx=0x7f8dfc000f40 comp rx=0 tx=0).stop
2026-03-10T14:08:26.799 INFO:tasks.workunit.client.0.vm03.stdout:2/391: stat d5/d10/f16 0
2026-03-10T14:08:26.804 INFO:tasks.workunit.client.0.vm03.stdout:4/436: symlink d5/d9/db/d19/l89 0
2026-03-10T14:08:26.805 INFO:tasks.workunit.client.0.vm03.stdout:3/403: rename d1d/f36 to d1d/f7a 0
2026-03-10T14:08:26.806 INFO:tasks.workunit.client.0.vm03.stdout:8/414: link da/d3c/f4f da/d3c/d4b/d69/f83 0
2026-03-10T14:08:26.807 INFO:tasks.workunit.client.0.vm03.stdout:0/376: dread d3/d16/d21/d3c/f5b [0,4194304] 0
2026-03-10T14:08:26.811 INFO:tasks.workunit.client.0.vm03.stdout:1/405: dread d0/d2/d34/f5f [0,4194304] 0
2026-03-10T14:08:26.812 INFO:tasks.workunit.client.0.vm03.stdout:1/406: chown d0/f70 25281323 1
2026-03-10T14:08:26.813 INFO:tasks.workunit.client.0.vm03.stdout:2/392: read d5/d10/d1c/d40/d59/f71 [2905392,120164] 0
2026-03-10T14:08:26.849 INFO:tasks.workunit.client.0.vm03.stdout:4/437: dread f2 [0,4194304] 0
2026-03-10T14:08:26.850 INFO:tasks.workunit.client.0.vm03.stdout:4/438: chown d5/d47/d62/d87 218991 1
2026-03-10T14:08:26.855 INFO:tasks.workunit.client.0.vm03.stdout:8/415: fsync da/d24/f28 0
2026-03-10T14:08:26.862 INFO:tasks.workunit.client.0.vm03.stdout:1/407: readlink d0/d2/df/d16/d41/l4a 0
2026-03-10T14:08:26.862 INFO:tasks.workunit.client.0.vm03.stdout:1/408: truncate d0/f70 1271550 0
2026-03-10T14:08:26.863 INFO:tasks.workunit.client.0.vm03.stdout:2/393: creat d5/d10/d1f/d4f/f75 x:0 0 0
2026-03-10T14:08:26.865 INFO:tasks.workunit.client.0.vm03.stdout:7/308: link d5/l1a d5/d9/d14/d26/d36/d51/l60 0
2026-03-10T14:08:26.866 INFO:tasks.workunit.client.0.vm03.stdout:7/309: fdatasync d5/d9/f42 0
2026-03-10T14:08:26.866 INFO:tasks.workunit.client.0.vm03.stdout:7/310: chown d5/d9/d14/d21/d28/f5b 2 1
2026-03-10T14:08:26.867 INFO:tasks.workunit.client.0.vm03.stdout:7/311: chown d5/d9/d35/f52 32865563 1
2026-03-10T14:08:26.877 INFO:tasks.workunit.client.0.vm03.stdout:6/388: getdents d8/db/d12/d51 0
2026-03-10T14:08:26.880 INFO:tasks.workunit.client.0.vm03.stdout:5/490: write d4/d16/d19/d23/f50 [1014225,87590] 0
2026-03-10T14:08:26.891 INFO:tasks.workunit.client.0.vm03.stdout:4/439: mkdir d5/d47/d62/d8a 0
2026-03-10T14:08:26.892 INFO:tasks.workunit.client.0.vm03.stdout:0/377: mkdir d3/d11/d76 0
2026-03-10T14:08:26.902 INFO:tasks.workunit.client.0.vm03.stdout:1/409: dread d0/d2/df/f43 [0,4194304] 0
2026-03-10T14:08:26.903 INFO:tasks.workunit.client.0.vm03.stdout:8/416: creat da/d24/d49/f84 x:0 0 0
2026-03-10T14:08:26.910 INFO:tasks.workunit.client.0.vm03.stdout:6/389: dread d8/d11/d18/f21 [0,4194304] 0
2026-03-10T14:08:26.916 INFO:tasks.workunit.client.0.vm03.stdout:3/404: rmdir d1d 39
2026-03-10T14:08:26.918 INFO:tasks.workunit.client.0.vm03.stdout:7/312: mknod d5/d9/d14/d26/c61 0
2026-03-10T14:08:26.918 INFO:tasks.workunit.client.0.vm03.stdout:7/313: dread - d5/d9/d14/d26/f38 zero size
2026-03-10T14:08:26.919 INFO:tasks.workunit.client.0.vm03.stdout:9/425: truncate d2/f15 1521676 0
2026-03-10T14:08:26.920 INFO:tasks.workunit.client.0.vm03.stdout:3/405: dwrite fc [0,4194304] 0
2026-03-10T14:08:26.925 INFO:tasks.workunit.client.0.vm03.stdout:5/491: sync
2026-03-10T14:08:26.925 INFO:tasks.workunit.client.0.vm03.stdout:5/492: chown d4/d13/d43/f8b 53972834 1
2026-03-10T14:08:26.930 INFO:tasks.workunit.client.0.vm03.stdout:4/440: rename d5/d47/c4b to d5/d9/db/d19/d38/d7b/c8b 0
2026-03-10T14:08:26.934 INFO:tasks.workunit.client.0.vm03.stdout:6/390: dread - d8/db/d12/d51/d5c/f69 zero size
2026-03-10T14:08:26.936 INFO:tasks.workunit.client.0.vm03.stdout:1/410: mkdir d0/d2/df/d27/d7e/d81 0
2026-03-10T14:08:26.942 INFO:tasks.workunit.client.0.vm03.stdout:2/394: write f4 [4670692,130353] 0
2026-03-10T14:08:26.945 INFO:tasks.workunit.client.0.vm03.stdout:2/395: dread d5/d10/d17/f28 [0,4194304] 0
2026-03-10T14:08:26.952 INFO:tasks.workunit.client.0.vm03.stdout:5/493: mknod d4/d16/d19/d6e/d7f/ca2 0
2026-03-10T14:08:26.952 INFO:tasks.workunit.client.0.vm03.stdout:5/494: stat d4/d6/c7 0
2026-03-10T14:08:26.952 INFO:tasks.workunit.client.0.vm03.stdout:5/495: dread - d4/d6/f78 zero size
2026-03-10T14:08:26.952 INFO:tasks.workunit.client.0.vm03.stdout:5/496: readlink d4/d16/l41 0
2026-03-10T14:08:26.953 INFO:tasks.workunit.client.0.vm03.stdout:5/497: write d4/d35/f8e [462284,116118] 0
2026-03-10T14:08:26.962 INFO:tasks.workunit.client.0.vm03.stdout:0/378: rename d3/d4d/l27 to d3/d11/d66/l77 0
2026-03-10T14:08:26.965 INFO:tasks.workunit.client.0.vm03.stdout:9/426: dread d2/d14/f1a [0,4194304] 0
2026-03-10T14:08:26.970 INFO:tasks.workunit.client.0.vm03.stdout:2/396: dread d5/d10/f16 [0,4194304] 0
2026-03-10T14:08:26.973 INFO:tasks.workunit.client.0.vm03.stdout:8/417: dwrite f6 [0,4194304] 0
2026-03-10T14:08:26.976 INFO:tasks.workunit.client.0.vm03.stdout:7/314: symlink d5/l62 0
2026-03-10T14:08:26.976 INFO:tasks.workunit.client.0.vm03.stdout:3/406: mknod d1d/d39/c7b 0
2026-03-10T14:08:26.990 INFO:tasks.workunit.client.0.vm03.stdout:4/441: write d5/d9/db/d19/d34/f5c [493512,33600] 0
2026-03-10T14:08:26.992 INFO:tasks.workunit.client.0.vm03.stdout:5/498: mkdir d4/d16/d19/d6e/da3 0
2026-03-10T14:08:26.993 INFO:tasks.workunit.client.0.vm03.stdout:0/379: creat d3/d16/d21/f78 x:0 0 0
2026-03-10T14:08:26.994 INFO:tasks.workunit.client.0.vm03.stdout:2/397: readlink d5/d35/l3b 0
2026-03-10T14:08:27.003 INFO:tasks.workunit.client.0.vm03.stdout:8/418: fdatasync da/f33 0
2026-03-10T14:08:27.003 INFO:tasks.workunit.client.0.vm03.stdout:8/419: stat da/fd 0
2026-03-10T14:08:27.004 INFO:tasks.workunit.client.0.vm03.stdout:6/391: mkdir d8/d11/d7a 0
2026-03-10T14:08:27.004 INFO:tasks.workunit.client.0.vm03.stdout:1/411: symlink d0/d2/df/d27/d7e/d81/l82 0
2026-03-10T14:08:27.004 INFO:tasks.workunit.client.0.vm03.stdout:3/407: mknod d1d/d33/d47/d53/c7c 0
2026-03-10T14:08:27.004 INFO:tasks.workunit.client.0.vm03.stdout:7/315: sync
2026-03-10T14:08:27.011 INFO:tasks.workunit.client.0.vm03.stdout:4/442: creat d5/d6e/f8c x:0 0 0
2026-03-10T14:08:27.012 INFO:tasks.workunit.client.0.vm03.stdout:5/499: creat d4/d13/d1f/d8c/fa4 x:0 0 0
2026-03-10T14:08:27.014 INFO:tasks.workunit.client.0.vm03.stdout:1/412: dread d0/d2/df/d16/d20/f5a [0,4194304] 0
2026-03-10T14:08:27.019 INFO:tasks.workunit.client.0.vm03.stdout:9/427: dwrite d2/d14/f1b [0,4194304] 0
2026-03-10T14:08:27.020 INFO:tasks.workunit.client.0.vm03.stdout:9/428: readlink d2/l4d 0
2026-03-10T14:08:27.020 INFO:tasks.workunit.client.0.vm03.stdout:9/429: chown d2/d29/d33/d55/d72 92 1
2026-03-10T14:08:27.024 INFO:tasks.workunit.client.0.vm03.stdout:2/398: mkdir d5/d10/d1f/d4f/d76 0
2026-03-10T14:08:27.031 INFO:tasks.workunit.client.0.vm03.stdout:6/392: fdatasync d8/db/d12/d51/d5c/d60/f67 0
2026-03-10T14:08:27.033 INFO:tasks.workunit.client.0.vm03.stdout:3/408: dread d1d/d33/f5e [0,4194304] 0
2026-03-10T14:08:27.036 INFO:tasks.workunit.client.0.vm03.stdout:4/443: fdatasync d5/d47/f46 0
2026-03-10T14:08:27.037 INFO:tasks.workunit.client.0.vm03.stdout:5/500: fdatasync d4/d16/d19/f25 0
2026-03-10T14:08:27.039 INFO:tasks.workunit.client.0.vm03.stdout:0/380: write d3/d16/d21/f50 [460305,106858] 0
2026-03-10T14:08:27.042 INFO:tasks.workunit.client.0.vm03.stdout:0/381: stat d3/d46/d54 0
2026-03-10T14:08:27.043 INFO:tasks.workunit.client.0.vm03.stdout:0/382: chown d3/d4d/l59 1592158 1
2026-03-10T14:08:27.043 INFO:tasks.workunit.client.0.vm03.stdout:0/383: readlink d3/d11/d2c/l6e 0
2026-03-10T14:08:27.043 INFO:tasks.workunit.client.0.vm03.stdout:0/384: chown d3/d16/d21/l43 11863760 1
2026-03-10T14:08:27.048 INFO:tasks.workunit.client.0.vm03.stdout:6/393: mkdir d8/db/d12/d51/d5c/d7b 0
2026-03-10T14:08:27.049 INFO:tasks.workunit.client.0.vm03.stdout:6/394: chown d8/d11/d18 32418758 1
2026-03-10T14:08:27.052 INFO:tasks.workunit.client.0.vm03.stdout:7/316: unlink d5/d9/ce 0
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: pgmap v13: 65 pgs: 65 active+clean; 2.9 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 43 MiB/s rd, 96 MiB/s wr, 239 op/s
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: from='client.14716 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/3853116475' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/4184790237' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:08:27.057 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:26 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:27.057 INFO:tasks.workunit.client.0.vm03.stdout:9/430: dread d2/d29/d33/f3c [0,4194304] 0
2026-03-10T14:08:27.059 INFO:tasks.workunit.client.0.vm03.stdout:5/501: truncate d4/d40/d4e/f80 950135 0
2026-03-10T14:08:27.062 INFO:tasks.workunit.client.0.vm03.stdout:1/413: write d0/d2/df/f43 [3085223,116575] 0
2026-03-10T14:08:27.063 INFO:tasks.workunit.client.0.vm03.stdout:1/414: dread - d0/d2/d34/f69 zero size
2026-03-10T14:08:27.064 INFO:tasks.workunit.client.0.vm03.stdout:5/502: dread d4/d6/de/f4f [0,4194304] 0
2026-03-10T14:08:27.065 INFO:tasks.workunit.client.0.vm03.stdout:5/503: dread - d4/d16/d19/d6e/f95 zero size
2026-03-10T14:08:27.068 INFO:tasks.workunit.client.0.vm03.stdout:8/420: dwrite da/d24/f2d [0,4194304] 0
2026-03-10T14:08:27.069 INFO:tasks.workunit.client.0.vm03.stdout:6/395: truncate f0 3437239 0
2026-03-10T14:08:27.071 INFO:tasks.workunit.client.0.vm03.stdout:7/317: unlink d5/d9/d14/l18 0
2026-03-10T14:08:27.072 INFO:tasks.workunit.client.0.vm03.stdout:7/318: read d5/d9/d14/d21/f29 [816047,42569] 0
2026-03-10T14:08:27.082 INFO:tasks.workunit.client.0.vm03.stdout:9/431: mkdir d2/d29/d33/d60/d8c 0
2026-03-10T14:08:27.087 INFO:tasks.workunit.client.0.vm03.stdout:2/399: link d5/d10/c72 d5/d10/d31/c77 0
2026-03-10T14:08:27.087 INFO:tasks.workunit.client.0.vm03.stdout:7/319: creat d5/d9/d14/d26/d39/f63 x:0 0 0
2026-03-10T14:08:27.091 INFO:tasks.workunit.client.0.vm03.stdout:3/409: getdents d1d/d29/d41/d45 0
2026-03-10T14:08:27.093 INFO:tasks.workunit.client.0.vm03.stdout:4/444: link d5/d9/db/d19/c2f d5/d47/d5b/d64/d85/c8d 0
2026-03-10T14:08:27.094 INFO:tasks.workunit.client.0.vm03.stdout:8/421: sync
2026-03-10T14:08:27.095 INFO:tasks.workunit.client.0.vm03.stdout:9/432: creat d2/d14/d2b/d79/d8a/f8d x:0 0 0
2026-03-10T14:08:27.101 INFO:tasks.workunit.client.0.vm03.stdout:3/410: unlink d1d/f3c 0
2026-03-10T14:08:27.106 INFO:tasks.workunit.client.0.vm03.stdout:8/422: dread da/d3a/d44/f46 [0,4194304] 0
2026-03-10T14:08:27.109 INFO:tasks.workunit.client.0.vm03.stdout:0/385: dwrite d3/f19 [4194304,4194304] 0
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: pgmap v13: 65 pgs: 65 active+clean; 2.9 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 43 MiB/s rd, 96 MiB/s wr, 239 op/s
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: from='client.14716 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/3853116475' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/4184790237' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:08:27.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:26 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto'
2026-03-10T14:08:27.111 INFO:tasks.workunit.client.0.vm03.stdout:6/396: write d8/d11/d18/f21 [887888,84179] 0
2026-03-10T14:08:27.112 INFO:tasks.workunit.client.0.vm03.stdout:6/397: truncate d8/db/d12/f5d 1127116 0
2026-03-10T14:08:27.121 INFO:tasks.workunit.client.0.vm03.stdout:2/400: write d5/d35/f49 [230366,68580] 0
2026-03-10T14:08:27.130 INFO:tasks.workunit.client.0.vm03.stdout:1/415: link d0/d42/f7f d0/d2/df/f83 0
2026-03-10T14:08:27.130 INFO:tasks.workunit.client.0.vm03.stdout:5/504: getdents d4/d16/d19/d6e/d7f 0
2026-03-10T14:08:27.131 INFO:tasks.workunit.client.0.vm03.stdout:5/505: stat d4/d16/d19/d6e/l59 0
2026-03-10T14:08:27.132 INFO:tasks.workunit.client.0.vm03.stdout:3/411: dread - d1d/d33/d65/d48/f58 zero size
2026-03-10T14:08:27.132 INFO:tasks.workunit.client.0.vm03.stdout:3/412: read fc [4018652,58717] 0
2026-03-10T14:08:27.133 INFO:tasks.workunit.client.0.vm03.stdout:8/423: mkdir da/d3c/d51/d85 0
2026-03-10T14:08:27.136 INFO:tasks.workunit.client.0.vm03.stdout:0/386: mkdir d3/d46/d54/d79 0
2026-03-10T14:08:27.143 INFO:tasks.workunit.client.0.vm03.stdout:2/401: write d5/d10/d1c/d40/f4b [405241,59481] 0
2026-03-10T14:08:27.143 INFO:tasks.workunit.client.0.vm03.stdout:2/402: readlink d5/d2a/l2f 0
2026-03-10T14:08:27.144 INFO:tasks.workunit.client.0.vm03.stdout:9/433: write d2/d14/d2b/d43/f56 [826325,73911] 0
2026-03-10T14:08:27.151 INFO:tasks.workunit.client.0.vm03.stdout:4/445: truncate d5/d47/d5b/f79 2191174 0
2026-03-10T14:08:27.152 INFO:tasks.workunit.client.0.vm03.stdout:9/434: sync
2026-03-10T14:08:27.153 INFO:tasks.workunit.client.0.vm03.stdout:1/416: creat d0/d18/d1d/f84 x:0 0 0
2026-03-10T14:08:27.154 INFO:tasks.workunit.client.0.vm03.stdout:6/398: write d8/db/d49/d6c/f66 [440385,126520] 0
2026-03-10T14:08:27.158 INFO:tasks.workunit.client.0.vm03.stdout:7/320: creat d5/d9/d14/d26/f64 x:0 0 0
2026-03-10T14:08:27.162 INFO:tasks.workunit.client.0.vm03.stdout:5/506: write d4/d16/d19/d6e/f85 [423076,45224] 0
2026-03-10T14:08:27.171 INFO:tasks.workunit.client.0.vm03.stdout:5/507: dread d4/d13/d43/f58 [0,4194304] 0
2026-03-10T14:08:27.177 INFO:tasks.workunit.client.0.vm03.stdout:2/403: truncate d5/d10/d17/f20 4518637 0
2026-03-10T14:08:27.179 INFO:tasks.workunit.client.0.vm03.stdout:8/424: dwrite da/f15 [4194304,4194304] 0
2026-03-10T14:08:27.181 INFO:tasks.workunit.client.0.vm03.stdout:8/425: truncate da/d24/d49/f84 536246 0
2026-03-10T14:08:27.182 INFO:tasks.workunit.client.0.vm03.stdout:9/435: unlink d2/d29/d33/f3c 0
2026-03-10T14:08:27.182 INFO:tasks.workunit.client.0.vm03.stdout:9/436: dread - d2/d29/d33/d41/f7b zero size
2026-03-10T14:08:27.187 INFO:tasks.workunit.client.0.vm03.stdout:9/437: read d2/d14/f25 [218002,42645] 0
2026-03-10T14:08:27.200 INFO:tasks.workunit.client.0.vm03.stdout:0/387: getdents d3/d46/d54/d79 0
2026-03-10T14:08:27.200 INFO:tasks.workunit.client.0.vm03.stdout:5/508: write d4/d13/d1f/f83 [917973,22456] 0
2026-03-10T14:08:27.200 INFO:tasks.workunit.client.0.vm03.stdout:5/509: chown d4/d13/d43/l9d 65511560 1
2026-03-10T14:08:27.201 INFO:tasks.workunit.client.0.vm03.stdout:5/510: write d4/d40/d4e/f5c [707413,103527] 0
2026-03-10T14:08:27.203 INFO:tasks.workunit.client.0.vm03.stdout:4/446: mkdir d5/d47/d62/d87/d8e 0
2026-03-10T14:08:27.207 INFO:tasks.workunit.client.0.vm03.stdout:7/321: write d5/d9/d14/f2f [2578033,119938] 0
2026-03-10T14:08:27.208 INFO:tasks.workunit.client.0.vm03.stdout:7/322: write d5/f53 [512683,69877] 0
2026-03-10T14:08:27.209 INFO:tasks.workunit.client.0.vm03.stdout:3/413: truncate d1d/d29/d41/f56 2456942 0
2026-03-10T14:08:27.217 INFO:tasks.workunit.client.0.vm03.stdout:9/438: symlink d2/d29/d33/d41/d46/l8e 0
2026-03-10T14:08:27.220 INFO:tasks.workunit.client.0.vm03.stdout:2/404: mknod d5/d10/d1c/c78 0
2026-03-10T14:08:27.223 INFO:tasks.workunit.client.0.vm03.stdout:5/511: unlink d4/d16/d19/c39 0
2026-03-10T14:08:27.223 INFO:tasks.workunit.client.0.vm03.stdout:4/447: rmdir d5/d9 39
2026-03-10T14:08:27.224 INFO:tasks.workunit.client.0.vm03.stdout:5/512: stat d4/f55 0
2026-03-10T14:08:27.225 INFO:tasks.workunit.client.0.vm03.stdout:8/426: truncate da/d58/d5f/f71 800702 0
2026-03-10T14:08:27.226 INFO:tasks.workunit.client.0.vm03.stdout:8/427: write f6 [1204780,84097] 0
2026-03-10T14:08:27.226 INFO:tasks.workunit.client.0.vm03.stdout:8/428: dread - da/d3a/d44/f79 zero size
2026-03-10T14:08:27.227 INFO:tasks.workunit.client.0.vm03.stdout:8/429: read da/d24/f2d [2812345,79945] 0
2026-03-10T14:08:27.233 INFO:tasks.workunit.client.0.vm03.stdout:1/417: link d0/d2/l2f d0/d2/d71/l85 0
2026-03-10T14:08:27.236 INFO:tasks.workunit.client.0.vm03.stdout:3/414: write d1d/d39/d51/f61 [171885,115254] 0
2026-03-10T14:08:27.240 INFO:tasks.workunit.client.0.vm03.stdout:2/405: chown d5/d10/d17/f26 30745 1
2026-03-10T14:08:27.243 INFO:tasks.workunit.client.0.vm03.stdout:9/439: rename d2/l21 to d2/d29/d33/d41/d46/l8f 0
2026-03-10T14:08:27.248 INFO:tasks.workunit.client.0.vm03.stdout:7/323: symlink d5/d9/d14/d21/d28/l65 0
2026-03-10T14:08:27.249 INFO:tasks.workunit.client.0.vm03.stdout:2/406: sync
2026-03-10T14:08:27.251 INFO:tasks.workunit.client.0.vm03.stdout:0/388: creat d3/d4d/d30/f7a x:0 0 0
2026-03-10T14:08:27.252 INFO:tasks.workunit.client.0.vm03.stdout:7/324: dread d5/f32 [0,4194304] 0
2026-03-10T14:08:27.255 INFO:tasks.workunit.client.0.vm03.stdout:8/430: write da/d3a/d44/f46 [1935597,109369] 0
2026-03-10T14:08:27.256 INFO:tasks.workunit.client.0.vm03.stdout:5/513: dwrite d4/d16/d19/f25 [0,4194304] 0
2026-03-10T14:08:27.259 INFO:tasks.workunit.client.0.vm03.stdout:6/399: link d8/fd d8/db/d12/f7c 0
2026-03-10T14:08:27.264 INFO:tasks.workunit.client.0.vm03.stdout:4/448: dwrite d5/d9/db/d19/d34/f5d [0,4194304] 0
2026-03-10T14:08:27.268 INFO:tasks.workunit.client.0.vm03.stdout:1/418: fsync d0/d2/df/d16/f4f 0
2026-03-10T14:08:27.278 INFO:tasks.workunit.client.0.vm03.stdout:0/389: mkdir d3/d11/d2c/d4a/d7b 0
2026-03-10T14:08:27.284 INFO:tasks.workunit.client.0.vm03.stdout:5/514: mkdir d4/d40/da5 0
2026-03-10T14:08:27.297 INFO:tasks.workunit.client.0.vm03.stdout:3/415: link d1d/d33/l35 d1d/d33/l7d 0
2026-03-10T14:08:27.297 INFO:tasks.workunit.client.0.vm03.stdout:3/416: fsync d1d/d29/d41/d45/f6a 0
2026-03-10T14:08:27.298 INFO:tasks.workunit.client.0.vm03.stdout:8/431: mknod da/d3c/c86 0
2026-03-10T14:08:27.301 INFO:tasks.workunit.client.0.vm03.stdout:7/325: creat d5/f66 x:0 0 0
2026-03-10T14:08:27.305 INFO:tasks.workunit.client.0.vm03.stdout:4/449: mknod d5/d47/c8f 0
2026-03-10T14:08:27.307 INFO:tasks.workunit.client.0.vm03.stdout:4/450: chown d5/d9/db/d19/d38/f86 679461 1
2026-03-10T14:08:27.315 INFO:tasks.workunit.client.0.vm03.stdout:3/417: dread d1d/d29/d41/d45/d55/f63 [0,4194304] 0
2026-03-10T14:08:27.317 INFO:tasks.workunit.client.0.vm03.stdout:8/432: dread da/f2e [0,4194304] 0
2026-03-10T14:08:27.317 INFO:tasks.workunit.client.0.vm03.stdout:7/326: mknod d5/d9/d35/c67 0
2026-03-10T14:08:27.318 INFO:tasks.workunit.client.0.vm03.stdout:6/400: write d8/db/d12/f26 [3836063,44024] 0
2026-03-10T14:08:27.319 INFO:tasks.workunit.client.0.vm03.stdout:2/407: write d5/fa [1954550,72836] 0
2026-03-10T14:08:27.321 INFO:tasks.workunit.client.0.vm03.stdout:6/401: chown d8/d1b/f3d 450 1
2026-03-10T14:08:27.324 INFO:tasks.workunit.client.0.vm03.stdout:1/419: dwrite d0/d2/df/f83 [0,4194304] 0
2026-03-10T14:08:27.325 INFO:tasks.workunit.client.0.vm03.stdout:0/390: dwrite d3/d16/f34 [0,4194304] 0
2026-03-10T14:08:27.326 INFO:tasks.workunit.client.0.vm03.stdout:1/420: read - d0/d18/d1d/f84 zero size
2026-03-10T14:08:27.328 INFO:tasks.workunit.client.0.vm03.stdout:0/391: readlink d3/d17/l2f 0
2026-03-10T14:08:27.348 INFO:tasks.workunit.client.0.vm03.stdout:9/440: rename d2/d14/d2b/d34/c63 to d2/d14/c90 0
2026-03-10T14:08:27.349 INFO:tasks.workunit.client.0.vm03.stdout:9/441: write d2/d14/d2b/d34/f59 [1652058,76216] 0
2026-03-10T14:08:27.354 INFO:tasks.workunit.client.0.vm03.stdout:9/442: dwrite d2/d14/f30 [0,4194304] 0
2026-03-10T14:08:27.372 INFO:tasks.workunit.client.0.vm03.stdout:5/515: getdents d4/d13 0
2026-03-10T14:08:27.374 INFO:tasks.workunit.client.0.vm03.stdout:1/421: fdatasync d0/d18/f74 0
2026-03-10T14:08:27.375 INFO:tasks.workunit.client.0.vm03.stdout:0/392: creat d3/d46/d5e/f7c x:0 0 0
2026-03-10T14:08:27.378 INFO:tasks.workunit.client.0.vm03.stdout:6/402: rename d8/db/df/f27 to d8/d11/d18/f7d 0
2026-03-10T14:08:27.384 INFO:tasks.workunit.client.0.vm03.stdout:3/418: mkdir d1d/d29/d41/d45/d5b/d7e 0
2026-03-10T14:08:27.388 INFO:tasks.workunit.client.0.vm03.stdout:7/327: link d5/d9/d14/d26/f64 d5/d9/d14/d26/d5f/f68 0
2026-03-10T14:08:27.389 INFO:tasks.workunit.client.0.vm03.stdout:2/408: symlink d5/d10/d1f/d4f/d76/l79 0
2026-03-10T14:08:27.389 INFO:tasks.workunit.client.0.vm03.stdout:2/409: stat d5/d10/d1c/d50/f69 0
2026-03-10T14:08:27.390 INFO:tasks.workunit.client.0.vm03.stdout:2/410: truncate d5/d2a/f6d 999640 0
2026-03-10T14:08:27.391 INFO:tasks.workunit.client.0.vm03.stdout:5/516: creat d4/d13/d1f/d8c/fa6 x:0 0 0
2026-03-10T14:08:27.392 INFO:tasks.workunit.client.0.vm03.stdout:1/422: fsync d0/d18/d1d/f5e 0
2026-03-10T14:08:27.395 INFO:tasks.workunit.client.0.vm03.stdout:0/393: creat d3/d11/d2c/d4a/d4b/f7d x:0 0 0
2026-03-10T14:08:27.396 INFO:tasks.workunit.client.0.vm03.stdout:0/394: dread d3/f10 [0,4194304] 0
2026-03-10T14:08:27.398 INFO:tasks.workunit.client.0.vm03.stdout:9/443: symlink d2/d29/d33/d41/d5c/l91 0
2026-03-10T14:08:27.399 INFO:tasks.workunit.client.0.vm03.stdout:3/419: fdatasync d1d/d33/f5e 0
2026-03-10T14:08:27.402 INFO:tasks.workunit.client.0.vm03.stdout:5/517: fsync d4/f37 0
2026-03-10T14:08:27.404 INFO:tasks.workunit.client.0.vm03.stdout:1/423: creat d0/d2/df/d27/d7e/d81/f86 x:0 0 0
2026-03-10T14:08:27.407 INFO:tasks.workunit.client.0.vm03.stdout:8/433: write da/f33 [349524,3994] 0
2026-03-10T14:08:27.412 INFO:tasks.workunit.client.0.vm03.stdout:8/434: dwrite da/d3a/d44/d64/f68 [0,4194304] 0
2026-03-10T14:08:27.413 INFO:tasks.workunit.client.0.vm03.stdout:4/451: write d5/f7 [3074920,73847] 0
2026-03-10T14:08:27.421 INFO:tasks.workunit.client.0.vm03.stdout:0/395: creat d3/d16/d21/d3c/f7e x:0 0 0
2026-03-10T14:08:27.422 INFO:tasks.workunit.client.0.vm03.stdout:0/396: dread - d3/d4d/d47/f48 zero size
2026-03-10T14:08:27.422 INFO:tasks.workunit.client.0.vm03.stdout:3/420: stat l13 0
2026-03-10T14:08:27.424 INFO:tasks.workunit.client.0.vm03.stdout:7/328: link d5/d9/d35/f52 d5/d9/d35/f69 0
2026-03-10T14:08:27.426 INFO:tasks.workunit.client.0.vm03.stdout:7/329: dwrite d5/d9/d14/f4d [0,4194304] 0
2026-03-10T14:08:27.427 INFO:tasks.workunit.client.0.vm03.stdout:2/411: mknod d5/d10/d17/c7a 0
2026-03-10T14:08:27.427 INFO:tasks.workunit.client.0.vm03.stdout:5/518: mkdir d4/d13/d1f/d8c/da7 0
2026-03-10T14:08:27.431 INFO:tasks.workunit.client.0.vm03.stdout:6/403: getdents d8/db/d12/d51 0
2026-03-10T14:08:27.449 INFO:tasks.workunit.client.0.vm03.stdout:0/397: dread - d3/d4d/d47/f5f zero size
2026-03-10T14:08:27.450 INFO:tasks.workunit.client.0.vm03.stdout:3/421: fdatasync f17 0
2026-03-10T14:08:27.452 INFO:tasks.workunit.client.0.vm03.stdout:0/398: dwrite d3/d46/f6c [0,4194304] 0
2026-03-10T14:08:27.459 INFO:tasks.workunit.client.0.vm03.stdout:5/519: rmdir d4/d13/d73 39
2026-03-10T14:08:27.464 INFO:tasks.workunit.client.0.vm03.stdout:2/412: rename d5/d10/d1f/d5d to d5/d10/d1c/d40/d59/d7b 0
2026-03-10T14:08:27.466 INFO:tasks.workunit.client.0.vm03.stdout:7/330: truncate d5/f47 1078419 0
2026-03-10T14:08:27.467 INFO:tasks.workunit.client.0.vm03.stdout:6/404: dread - d8/d1b/d1c/f73 zero size
2026-03-10T14:08:27.469 INFO:tasks.workunit.client.0.vm03.stdout:8/435: symlink da/d3c/l87 0
2026-03-10T14:08:27.473 INFO:tasks.workunit.client.0.vm03.stdout:1/424: dwrite d0/d2/df/d27/f58 [0,4194304] 0
2026-03-10T14:08:27.474 INFO:tasks.workunit.client.0.vm03.stdout:3/422: unlink d1d/l64 0
2026-03-10T14:08:27.475 INFO:tasks.workunit.client.0.vm03.stdout:3/423: readlink d1d/d33/d65/d48/l6f 0
2026-03-10T14:08:27.502 INFO:tasks.workunit.client.0.vm03.stdout:9/444: dwrite d2/f54 [0,4194304] 0
2026-03-10T14:08:27.508 INFO:tasks.workunit.client.0.vm03.stdout:9/445: dwrite d2/d14/d2b/d43/f56 [0,4194304] 0
2026-03-10T14:08:27.510 INFO:tasks.workunit.client.0.vm03.stdout:9/446: readlink d2/d29/l6b 0
2026-03-10T14:08:27.512 INFO:tasks.workunit.client.0.vm03.stdout:2/413: dread - d5/d10/d1c/d40/f52 zero size
2026-03-10T14:08:27.513 INFO:tasks.workunit.client.0.vm03.stdout:2/414: read d5/d10/d17/f18 [2735641,61046] 0
2026-03-10T14:08:27.517 INFO:tasks.workunit.client.0.vm03.stdout:7/331: creat d5/d9/d14/d26/d39/f6a x:0 0 0
2026-03-10T14:08:27.519 INFO:tasks.workunit.client.0.vm03.stdout:6/405: truncate d8/db/f1f 4655173 0
2026-03-10T14:08:27.524 INFO:tasks.workunit.client.0.vm03.stdout:4/452: creat d5/f90 x:0 0 0
2026-03-10T14:08:27.544 INFO:tasks.workunit.client.0.vm03.stdout:5/520: dread d4/d13/f4b [0,4194304] 0
2026-03-10T14:08:27.549 INFO:tasks.workunit.client.0.vm03.stdout:2/415: unlink d5/d10/d1c/d40/d59/d7b/l66 0
2026-03-10T14:08:27.551 INFO:tasks.workunit.client.0.vm03.stdout:1/425: write d0/d2/f46 [2785265,78263] 0
2026-03-10T14:08:27.553 INFO:tasks.workunit.client.0.vm03.stdout:3/424: dwrite d1d/f62 [0,4194304] 0
2026-03-10T14:08:27.558 INFO:tasks.workunit.client.0.vm03.stdout:4/453: read d5/fe [4045468,20611] 0
2026-03-10T14:08:27.560 INFO:tasks.workunit.client.0.vm03.stdout:0/399: rmdir d3/d16/d64 0
2026-03-10T14:08:27.570 INFO:tasks.workunit.client.0.vm03.stdout:5/521: chown d4/d13/d43/f94 94116 1
2026-03-10T14:08:27.576 INFO:tasks.workunit.client.0.vm03.stdout:6/406: dread d8/db/df/f37 [0,4194304] 0
2026-03-10T14:08:27.576 INFO:tasks.workunit.client.0.vm03.stdout:1/426: symlink d0/d2/df/d27/l87 0
2026-03-10T14:08:27.578 INFO:tasks.workunit.client.0.vm03.stdout:7/332: creat d5/d9/d14/d26/d5f/f6b x:0 0 0
2026-03-10T14:08:27.581 INFO:tasks.workunit.client.0.vm03.stdout:3/425: rename d1d/d33/l43 to d1d/d33/d65/d48/l7f 0
2026-03-10T14:08:27.582 INFO:tasks.workunit.client.0.vm03.stdout:4/454: dread - d5/d9/db/f6d zero size
2026-03-10T14:08:27.584 INFO:tasks.workunit.client.0.vm03.stdout:0/400: truncate d3/d16/f3e 932772 0
2026-03-10T14:08:27.588 INFO:tasks.workunit.client.0.vm03.stdout:0/401: dwrite d3/d11/d2c/d4a/d4b/f7d [0,4194304] 0
2026-03-10T14:08:27.593 INFO:tasks.workunit.client.0.vm03.stdout:6/407: rmdir d8/db/d12/d51/d5c 39
2026-03-10T14:08:27.604 INFO:tasks.workunit.client.0.vm03.stdout:8/436: truncate da/d3c/d4b/f63 3177343 0
2026-03-10T14:08:27.606 INFO:tasks.workunit.client.0.vm03.stdout:5/522: dread d4/f37 [0,4194304] 0
2026-03-10T14:08:27.606 INFO:tasks.workunit.client.0.vm03.stdout:9/447: truncate d2/d29/d33/d55/f5f 305078 0
2026-03-10T14:08:27.606 INFO:tasks.workunit.client.0.vm03.stdout:9/448: read d2/d14/d2b/d43/f56 [1509698,37505] 0
2026-03-10T14:08:27.609 INFO:tasks.workunit.client.0.vm03.stdout:4/455: truncate d5/d9/db/f24 4982510 0
2026-03-10T14:08:27.609 INFO:tasks.workunit.client.0.vm03.stdout:4/456: stat d5/d9/f16 0
2026-03-10T14:08:27.615 INFO:tasks.workunit.client.0.vm03.stdout:8/437: creat da/d3a/d44/f88 x:0 0 0
2026-03-10T14:08:27.620 INFO:tasks.workunit.client.0.vm03.stdout:7/333: creat d5/d9/d14/d26/d36/d51/f6c x:0 0 0
2026-03-10T14:08:27.621 INFO:tasks.workunit.client.0.vm03.stdout:7/334: stat d5/d9/d14/d26/d36/c49 0
2026-03-10T14:08:27.621 INFO:tasks.workunit.client.0.vm03.stdout:7/335: chown d5/f53 173 1
2026-03-10T14:08:27.623 INFO:tasks.workunit.client.0.vm03.stdout:4/457: mknod d5/d9/db/d19/d38/c91 0
2026-03-10T14:08:27.625 INFO:tasks.workunit.client.0.vm03.stdout:1/427: write d0/d2/fe [4797286,34169] 0
2026-03-10T14:08:27.628 INFO:tasks.workunit.client.0.vm03.stdout:2/416: dwrite d5/d2a/f45 [0,4194304] 0
2026-03-10T14:08:27.632 INFO:tasks.workunit.client.0.vm03.stdout:8/438: creat da/d24/d49/f89 x:0 0 0
2026-03-10T14:08:27.635 INFO:tasks.workunit.client.0.vm03.stdout:5/523: symlink d4/d13/d1f/d84/la8 0
2026-03-10T14:08:27.645 INFO:tasks.workunit.client.0.vm03.stdout:7/336: unlink d5/d9/d14/d21/d28/c3c 0
2026-03-10T14:08:27.646 INFO:tasks.workunit.client.0.vm03.stdout:7/337: write d5/d9/d14/f41 [3441406,63939] 0
2026-03-10T14:08:27.646 INFO:tasks.workunit.client.0.vm03.stdout:7/338: read - d5/d9/d14/d26/d5f/f68 zero size
2026-03-10T14:08:27.647 INFO:tasks.workunit.client.0.vm03.stdout:7/339: read - d5/d9/d14/d26/d36/f3a zero size
2026-03-10T14:08:27.650 INFO:tasks.workunit.client.0.vm03.stdout:1/428: creat d0/d2/d34/f88 x:0 0 0
2026-03-10T14:08:27.661 INFO:tasks.workunit.client.0.vm03.stdout:8/439: rename da/d3c/l87 to da/d3c/d4b/d69/l8a 0
2026-03-10T14:08:27.664 INFO:tasks.workunit.client.0.vm03.stdout:1/429: dread d0/d2/d34/f56 [0,4194304] 0
2026-03-10T14:08:27.665 INFO:tasks.workunit.client.0.vm03.stdout:7/340: dread d5/fb [0,4194304] 0
2026-03-10T14:08:27.665 INFO:tasks.workunit.client.0.vm03.stdout:7/341: chown d5/d9 2770416 1
2026-03-10T14:08:27.665 INFO:tasks.workunit.client.0.vm03.stdout:1/430: chown d0/d18/d3b/d50 251528 1
2026-03-10T14:08:27.668 INFO:tasks.workunit.client.0.vm03.stdout:3/426: truncate d1d/f32 2010444 0
2026-03-10T14:08:27.670 INFO:tasks.workunit.client.0.vm03.stdout:0/402: write d3/d16/f2d [3653981,71036] 0
2026-03-10T14:08:27.670 INFO:tasks.workunit.client.0.vm03.stdout:6/408: write d8/db/d12/d51/d5c/d60/f67 [684671,38063] 0
2026-03-10T14:08:27.677 INFO:tasks.workunit.client.0.vm03.stdout:4/458: unlink d5/c5a 0
2026-03-10T14:08:27.681 INFO:tasks.workunit.client.0.vm03.stdout:2/417: mknod d5/d10/d1c/d40/d59/d7b/c7c 0
2026-03-10T14:08:27.685 INFO:tasks.workunit.client.0.vm03.stdout:7/342: rename d5 to d5/d6d 22
2026-03-10T14:08:27.688 INFO:tasks.workunit.client.0.vm03.stdout:5/524: symlink d4/d13/d73/la9 0
2026-03-10T14:08:27.689 INFO:tasks.workunit.client.0.vm03.stdout:5/525: truncate d4/d13/d43/f94 83941 0
2026-03-10T14:08:27.689 INFO:tasks.workunit.client.0.vm03.stdout:5/526: stat d4/d16/d19/d23/d3f/f7a 0
2026-03-10T14:08:27.690 INFO:tasks.workunit.client.0.vm03.stdout:5/527: chown d4/d13/d1f/f74 4124 1
2026-03-10T14:08:27.691 INFO:tasks.workunit.client.0.vm03.stdout:3/427: creat d1d/d33/f80 x:0 0 0
2026-03-10T14:08:27.692 INFO:tasks.workunit.client.0.vm03.stdout:8/440: write da/d24/d49/f66 [3036675,99096] 0
2026-03-10T14:08:27.700 INFO:tasks.workunit.client.0.vm03.stdout:9/449: getdents d2/d14/d2b/d79/d8a 0
2026-03-10T14:08:27.705 INFO:tasks.workunit.client.0.vm03.stdout:0/403: dread d3/d4d/d30/f32 [0,4194304] 0
2026-03-10T14:08:27.707 INFO:tasks.workunit.client.0.vm03.stdout:2/418:
rename d5/d10/d17/f19 to d5/d10/d31/f7d 0 2026-03-10T14:08:27.708 INFO:tasks.workunit.client.0.vm03.stdout:6/409: write d8/d1b/f31 [1027372,128638] 0 2026-03-10T14:08:27.711 INFO:tasks.workunit.client.0.vm03.stdout:4/459: dwrite d5/d9/db/f2a [0,4194304] 0 2026-03-10T14:08:27.728 INFO:tasks.workunit.client.0.vm03.stdout:1/431: dwrite d0/d2/df/d16/f61 [0,4194304] 0 2026-03-10T14:08:27.731 INFO:tasks.workunit.client.0.vm03.stdout:3/428: creat d1d/d29/d41/d45/d55/f81 x:0 0 0 2026-03-10T14:08:27.731 INFO:tasks.workunit.client.0.vm03.stdout:9/450: mknod d2/d29/d33/d55/c92 0 2026-03-10T14:08:27.732 INFO:tasks.workunit.client.0.vm03.stdout:3/429: dread - d1d/d29/d41/d45/d55/f81 zero size 2026-03-10T14:08:27.734 INFO:tasks.workunit.client.0.vm03.stdout:0/404: symlink d3/d4d/d47/l7f 0 2026-03-10T14:08:27.737 INFO:tasks.workunit.client.0.vm03.stdout:0/405: stat d3/d4d/f22 0 2026-03-10T14:08:27.744 INFO:tasks.workunit.client.0.vm03.stdout:6/410: symlink d8/d3b/l7e 0 2026-03-10T14:08:27.755 INFO:tasks.workunit.client.0.vm03.stdout:8/441: unlink da/c26 0 2026-03-10T14:08:27.756 INFO:tasks.workunit.client.0.vm03.stdout:8/442: dread da/d24/f28 [0,4194304] 0 2026-03-10T14:08:27.759 INFO:tasks.workunit.client.0.vm03.stdout:8/443: dread da/d24/f43 [0,4194304] 0 2026-03-10T14:08:27.769 INFO:tasks.workunit.client.0.vm03.stdout:6/411: fsync d8/db/d12/f6a 0 2026-03-10T14:08:27.781 INFO:tasks.workunit.client.0.vm03.stdout:7/343: link d5/d9/d14/f2f d5/d9/d14/d21/d28/f6e 0 2026-03-10T14:08:27.781 INFO:tasks.workunit.client.0.vm03.stdout:8/444: mkdir da/d3c/d51/d8b 0 2026-03-10T14:08:27.781 INFO:tasks.workunit.client.0.vm03.stdout:8/445: write da/f15 [4704937,95912] 0 2026-03-10T14:08:27.781 INFO:tasks.workunit.client.0.vm03.stdout:9/451: truncate d2/f2f 1979820 0 2026-03-10T14:08:27.781 INFO:tasks.workunit.client.0.vm03.stdout:9/452: truncate d2/d14/d2b/d79/d8a/f8d 576844 0 2026-03-10T14:08:27.781 INFO:tasks.workunit.client.0.vm03.stdout:9/453: chown d2/d29/d33/d60/d8c 401264 1 
2026-03-10T14:08:27.783 INFO:tasks.workunit.client.0.vm03.stdout:3/430: link d1d/d29/d41/f75 d1d/d39/d51/f82 0 2026-03-10T14:08:27.785 INFO:tasks.workunit.client.0.vm03.stdout:0/406: symlink d3/d11/l80 0 2026-03-10T14:08:27.785 INFO:tasks.workunit.client.0.vm03.stdout:0/407: dread - d3/d16/d21/d3c/f7e zero size 2026-03-10T14:08:27.786 INFO:tasks.workunit.client.0.vm03.stdout:0/408: dread - d3/d46/d5e/f7c zero size 2026-03-10T14:08:27.791 INFO:tasks.workunit.client.0.vm03.stdout:1/432: sync 2026-03-10T14:08:27.794 INFO:tasks.workunit.client.0.vm03.stdout:7/344: mkdir d5/d9/d14/d21/d6f 0 2026-03-10T14:08:27.797 INFO:tasks.workunit.client.0.vm03.stdout:4/460: write d5/d9/f44 [764130,110701] 0 2026-03-10T14:08:27.797 INFO:tasks.workunit.client.0.vm03.stdout:5/528: dwrite d4/d13/d1f/f70 [0,4194304] 0 2026-03-10T14:08:27.814 INFO:tasks.workunit.client.0.vm03.stdout:6/412: write d8/db/d49/d6c/d32/f3e [825071,30585] 0 2026-03-10T14:08:27.817 INFO:tasks.workunit.client.0.vm03.stdout:9/454: chown d2/ce 3 1 2026-03-10T14:08:27.818 INFO:tasks.workunit.client.0.vm03.stdout:3/431: truncate d1d/f66 2133121 0 2026-03-10T14:08:27.820 INFO:tasks.workunit.client.0.vm03.stdout:1/433: symlink d0/d2/df/d27/d7e/l89 0 2026-03-10T14:08:27.820 INFO:tasks.workunit.client.0.vm03.stdout:7/345: fsync d5/d9/d14/d26/f64 0 2026-03-10T14:08:27.822 INFO:tasks.workunit.client.0.vm03.stdout:5/529: creat d4/d16/d19/d4a/faa x:0 0 0 2026-03-10T14:08:27.822 INFO:tasks.workunit.client.0.vm03.stdout:5/530: readlink d4/d6/l2f 0 2026-03-10T14:08:27.824 INFO:tasks.workunit.client.0.vm03.stdout:8/446: mknod da/d24/c8c 0 2026-03-10T14:08:27.827 INFO:tasks.workunit.client.0.vm03.stdout:9/455: unlink d2/d14/d2b/d34/f6a 0 2026-03-10T14:08:27.827 INFO:tasks.workunit.client.0.vm03.stdout:9/456: stat d2/d29/d33/d60/c8b 0 2026-03-10T14:08:27.829 INFO:tasks.workunit.client.0.vm03.stdout:9/457: dread d2/d14/f1b [0,4194304] 0 2026-03-10T14:08:27.831 INFO:tasks.workunit.client.0.vm03.stdout:9/458: chown d2/d14/c17 2 1 
2026-03-10T14:08:27.836 INFO:tasks.workunit.client.0.vm03.stdout:2/419: link d5/d10/d17/f20 d5/d10/d1c/f7e 0 2026-03-10T14:08:27.836 INFO:tasks.workunit.client.0.vm03.stdout:2/420: chown d5/d10/d1c/d54/d5f/f6c 7666415 1 2026-03-10T14:08:27.837 INFO:tasks.workunit.client.0.vm03.stdout:1/434: write d0/d42/f7b [2977477,95201] 0 2026-03-10T14:08:27.839 INFO:tasks.workunit.client.0.vm03.stdout:1/435: dread d0/d18/f75 [0,4194304] 0 2026-03-10T14:08:27.843 INFO:tasks.workunit.client.0.vm03.stdout:3/432: dwrite d1d/d29/d41/f75 [4194304,4194304] 0 2026-03-10T14:08:27.844 INFO:tasks.workunit.client.0.vm03.stdout:7/346: symlink d5/d9/d14/d26/d5f/l70 0 2026-03-10T14:08:27.854 INFO:tasks.workunit.client.0.vm03.stdout:4/461: mknod d5/c92 0 2026-03-10T14:08:27.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:27 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:27.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:27 vm03.local ceph-mon[49718]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr fail", "who": "vm04.ywwcto"}]: dispatch 2026-03-10T14:08:27.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:27 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr fail", "who": "vm04.ywwcto"}]: dispatch 2026-03-10T14:08:27.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:27 vm03.local ceph-mon[49718]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T14:08:27.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:27 vm03.local ceph-mon[49718]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd='[{"prefix": "mgr fail", "who": "vm04.ywwcto"}]': finished 2026-03-10T14:08:27.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:27 vm03.local ceph-mon[49718]: mgrmap e26: vm03.rwbbep(active, starting, since 0.0142565s) 2026-03-10T14:08:27.862 
INFO:tasks.workunit.client.0.vm03.stdout:0/409: creat d3/d4d/d30/f81 x:0 0 0 2026-03-10T14:08:27.868 INFO:tasks.workunit.client.0.vm03.stdout:7/347: fsync d5/f32 0 2026-03-10T14:08:27.868 INFO:tasks.workunit.client.0.vm03.stdout:7/348: readlink d5/d9/d14/d26/d36/l3d 0 2026-03-10T14:08:27.872 INFO:tasks.workunit.client.0.vm03.stdout:5/531: creat d4/d13/d1f/d8c/da7/fab x:0 0 0 2026-03-10T14:08:27.874 INFO:tasks.workunit.client.0.vm03.stdout:6/413: truncate d8/db/d12/f40 473604 0 2026-03-10T14:08:27.876 INFO:tasks.workunit.client.0.vm03.stdout:8/447: dwrite da/d3c/f48 [0,4194304] 0 2026-03-10T14:08:27.885 INFO:tasks.workunit.client.0.vm03.stdout:1/436: mknod d0/d2/df/c8a 0 2026-03-10T14:08:27.888 INFO:tasks.workunit.client.0.vm03.stdout:3/433: mkdir d1d/d29/d41/d45/d55/d6e/d83 0 2026-03-10T14:08:27.890 INFO:tasks.workunit.client.0.vm03.stdout:7/349: rmdir d5/d9/d14/d26/d36/d51 39 2026-03-10T14:08:27.892 INFO:tasks.workunit.client.0.vm03.stdout:4/462: creat d5/d6e/f93 x:0 0 0 2026-03-10T14:08:27.898 INFO:tasks.workunit.client.0.vm03.stdout:8/448: symlink da/d36/d40/l8d 0 2026-03-10T14:08:27.902 INFO:tasks.workunit.client.0.vm03.stdout:2/421: rename d5/d10/d1c/c78 to d5/d10/d1c/d40/d59/d7b/c7f 0 2026-03-10T14:08:27.909 INFO:tasks.workunit.client.0.vm03.stdout:6/414: dwrite d8/db/d49/d6c/d32/d3a/f5e [4194304,4194304] 0 2026-03-10T14:08:27.923 INFO:tasks.workunit.client.0.vm03.stdout:1/437: truncate d0/d2/d34/f73 488237 0 2026-03-10T14:08:27.923 INFO:tasks.workunit.client.0.vm03.stdout:1/438: chown d0/d2/df/d27/l5c 206 1 2026-03-10T14:08:27.924 INFO:tasks.workunit.client.0.vm03.stdout:7/350: mknod d5/d9/d14/d26/d5f/c71 0 2026-03-10T14:08:27.925 INFO:tasks.workunit.client.0.vm03.stdout:4/463: creat d5/d9/d2b/f94 x:0 0 0 2026-03-10T14:08:27.931 INFO:tasks.workunit.client.0.vm03.stdout:9/459: getdents d2/d14/d2b/d43 0 2026-03-10T14:08:27.934 INFO:tasks.workunit.client.0.vm03.stdout:9/460: dwrite d2/d14/d2b/d79/d8a/f8d [0,4194304] 0 2026-03-10T14:08:27.936 
INFO:tasks.workunit.client.0.vm03.stdout:0/410: link d3/d46/f63 d3/d46/f82 0 2026-03-10T14:08:27.936 INFO:tasks.workunit.client.0.vm03.stdout:0/411: fsync d3/d46/d5e/f75 0 2026-03-10T14:08:27.942 INFO:tasks.workunit.client.0.vm03.stdout:2/422: mknod d5/d35/c80 0 2026-03-10T14:08:27.952 INFO:tasks.workunit.client.0.vm03.stdout:7/351: read d5/d9/d14/d21/d28/f37 [1284027,105906] 0 2026-03-10T14:08:27.952 INFO:tasks.workunit.client.0.vm03.stdout:7/352: stat d5/d9/d14/l44 0 2026-03-10T14:08:27.957 INFO:tasks.workunit.client.0.vm03.stdout:2/423: rmdir d5/d10/d1c 39 2026-03-10T14:08:27.960 INFO:tasks.workunit.client.0.vm03.stdout:3/434: link d1d/d33/l3e d1d/d29/d41/d45/d55/l84 0 2026-03-10T14:08:27.960 INFO:tasks.workunit.client.0.vm03.stdout:3/435: stat d1d/d59/f69 0 2026-03-10T14:08:27.962 INFO:tasks.workunit.client.0.vm03.stdout:7/353: stat d5/d9/d14/d21/d28/c2a 0 2026-03-10T14:08:27.965 INFO:tasks.workunit.client.0.vm03.stdout:0/412: mknod d3/d11/c83 0 2026-03-10T14:08:27.968 INFO:tasks.workunit.client.0.vm03.stdout:6/415: write d8/fe [1361435,30078] 0 2026-03-10T14:08:27.969 INFO:tasks.workunit.client.0.vm03.stdout:1/439: write d0/d2/df/d16/d20/f72 [686056,30701] 0 2026-03-10T14:08:27.969 INFO:tasks.workunit.client.0.vm03.stdout:4/464: write d5/d9/db/f28 [2415150,121649] 0 2026-03-10T14:08:27.970 INFO:tasks.workunit.client.0.vm03.stdout:9/461: write d2/d29/d33/f70 [674629,82344] 0 2026-03-10T14:08:27.971 INFO:tasks.workunit.client.0.vm03.stdout:6/416: dread - d8/d1b/d1c/f73 zero size 2026-03-10T14:08:27.971 INFO:tasks.workunit.client.0.vm03.stdout:4/465: chown d5/d9/f25 7287625 1 2026-03-10T14:08:27.973 INFO:tasks.workunit.client.0.vm03.stdout:5/532: dwrite d4/d16/d19/d6e/f57 [0,4194304] 0 2026-03-10T14:08:27.975 INFO:tasks.workunit.client.0.vm03.stdout:8/449: dwrite da/f62 [0,4194304] 0 2026-03-10T14:08:27.979 INFO:tasks.workunit.client.0.vm03.stdout:6/417: dread d8/d11/d18/f7d [0,4194304] 0 2026-03-10T14:08:27.979 INFO:tasks.workunit.client.0.vm03.stdout:6/418: 
stat d8/d1b/f61 0 2026-03-10T14:08:27.991 INFO:tasks.workunit.client.0.vm03.stdout:2/424: truncate d5/d10/d1c/d40/f52 332688 0 2026-03-10T14:08:27.992 INFO:tasks.workunit.client.0.vm03.stdout:2/425: chown d5/d10/d31/f38 18882 1 2026-03-10T14:08:27.998 INFO:tasks.workunit.client.0.vm03.stdout:9/462: fdatasync d2/fc 0 2026-03-10T14:08:28.001 INFO:tasks.workunit.client.0.vm03.stdout:1/440: mknod d0/d42/c8b 0 2026-03-10T14:08:28.003 INFO:tasks.workunit.client.0.vm03.stdout:8/450: dread da/d36/d40/f47 [0,4194304] 0 2026-03-10T14:08:28.023 INFO:tasks.workunit.client.0.vm03.stdout:3/436: link d1d/f2b d1d/d29/d41/d45/d55/f85 0 2026-03-10T14:08:28.028 INFO:tasks.workunit.client.0.vm03.stdout:6/419: dwrite d8/d1b/d1c/f50 [0,4194304] 0 2026-03-10T14:08:28.037 INFO:tasks.workunit.client.0.vm03.stdout:7/354: truncate d5/d9/f30 3056319 0 2026-03-10T14:08:28.049 INFO:tasks.workunit.client.0.vm03.stdout:4/466: read d5/d9/db/f3d [1279875,39120] 0 2026-03-10T14:08:28.049 INFO:tasks.workunit.client.0.vm03.stdout:4/467: chown d5/d9/db/f12 724089993 1 2026-03-10T14:08:28.052 INFO:tasks.workunit.client.0.vm03.stdout:4/468: dwrite d5/d9/f16 [4194304,4194304] 0 2026-03-10T14:08:28.053 INFO:tasks.workunit.client.0.vm03.stdout:1/441: rmdir d0/d18/d3b 39 2026-03-10T14:08:28.053 INFO:tasks.workunit.client.0.vm03.stdout:1/442: chown d0/d18/f75 408 1 2026-03-10T14:08:28.053 INFO:tasks.workunit.client.0.vm03.stdout:4/469: read - d5/d9/db/d19/f3a zero size 2026-03-10T14:08:28.055 INFO:tasks.workunit.client.0.vm03.stdout:8/451: stat da/c10 0 2026-03-10T14:08:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:27 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:27 vm04.local ceph-mon[55966]: from='mgr.24413 192.168.123.104:0/2489071061' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr fail", "who": 
"vm04.ywwcto"}]: dispatch 2026-03-10T14:08:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:27 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "mgr fail", "who": "vm04.ywwcto"}]: dispatch 2026-03-10T14:08:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:27 vm04.local ceph-mon[55966]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T14:08:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:27 vm04.local ceph-mon[55966]: from='mgr.24413 ' entity='mgr.vm04.ywwcto' cmd='[{"prefix": "mgr fail", "who": "vm04.ywwcto"}]': finished 2026-03-10T14:08:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:27 vm04.local ceph-mon[55966]: mgrmap e26: vm03.rwbbep(active, starting, since 0.0142565s) 2026-03-10T14:08:28.063 INFO:tasks.workunit.client.0.vm03.stdout:5/533: link d4/d16/f1c d4/d16/fac 0 2026-03-10T14:08:28.065 INFO:tasks.workunit.client.0.vm03.stdout:0/413: symlink d3/d11/l84 0 2026-03-10T14:08:28.068 INFO:tasks.workunit.client.0.vm03.stdout:0/414: dwrite d3/d17/f6d [0,4194304] 0 2026-03-10T14:08:28.069 INFO:tasks.workunit.client.0.vm03.stdout:3/437: creat d1d/d33/d47/d53/d68/f86 x:0 0 0 2026-03-10T14:08:28.069 INFO:tasks.workunit.client.0.vm03.stdout:3/438: chown d1d/d33 71 1 2026-03-10T14:08:28.084 INFO:tasks.workunit.client.0.vm03.stdout:6/420: dread d8/db/d12/f57 [0,4194304] 0 2026-03-10T14:08:28.086 INFO:tasks.workunit.client.0.vm03.stdout:6/421: dread d8/d11/d18/f34 [0,4194304] 0 2026-03-10T14:08:28.093 INFO:tasks.workunit.client.0.vm03.stdout:6/422: dwrite d8/d1b/f31 [0,4194304] 0 2026-03-10T14:08:28.101 INFO:tasks.workunit.client.0.vm03.stdout:7/355: mknod d5/d9/d14/d26/d39/c72 0 2026-03-10T14:08:28.107 INFO:tasks.workunit.client.0.vm03.stdout:2/426: truncate d5/d10/f4c 566492 0 2026-03-10T14:08:28.108 INFO:tasks.workunit.client.0.vm03.stdout:2/427: write d5/d10/f13 [2132969,96416] 0 2026-03-10T14:08:28.116 INFO:tasks.workunit.client.0.vm03.stdout:9/463: rename d2/d29/d33/d55/f5f to 
d2/d29/d33/d41/d5c/f93 0 2026-03-10T14:08:28.121 INFO:tasks.workunit.client.0.vm03.stdout:8/452: symlink da/d3c/d4b/l8e 0 2026-03-10T14:08:28.130 INFO:tasks.workunit.client.0.vm03.stdout:3/439: creat d1d/d29/f87 x:0 0 0 2026-03-10T14:08:28.131 INFO:tasks.workunit.client.0.vm03.stdout:4/470: dread d5/d9/f31 [0,4194304] 0 2026-03-10T14:08:28.154 INFO:tasks.workunit.client.0.vm03.stdout:6/423: rename d8/fe to d8/d1b/f7f 0 2026-03-10T14:08:28.160 INFO:tasks.workunit.client.0.vm03.stdout:9/464: mknod d2/d29/d38/c94 0 2026-03-10T14:08:28.168 INFO:tasks.workunit.client.0.vm03.stdout:0/415: creat d3/d11/d2c/d4a/d7b/f85 x:0 0 0 2026-03-10T14:08:28.172 INFO:tasks.workunit.client.0.vm03.stdout:8/453: write f5 [2812492,52729] 0 2026-03-10T14:08:28.173 INFO:tasks.workunit.client.0.vm03.stdout:3/440: write d1d/f4a [1293712,99210] 0 2026-03-10T14:08:28.173 INFO:tasks.workunit.client.0.vm03.stdout:3/441: stat d1d/d29/d41/d45/f6a 0 2026-03-10T14:08:28.174 INFO:tasks.workunit.client.0.vm03.stdout:4/471: readlink d5/l2d 0 2026-03-10T14:08:28.174 INFO:tasks.workunit.client.0.vm03.stdout:7/356: symlink d5/d9/d14/d26/l73 0 2026-03-10T14:08:28.175 INFO:tasks.workunit.client.0.vm03.stdout:7/357: dread - d5/d9/d14/d26/d39/f63 zero size 2026-03-10T14:08:28.182 INFO:tasks.workunit.client.0.vm03.stdout:9/465: mkdir d2/d29/d33/d41/d95 0 2026-03-10T14:08:28.187 INFO:tasks.workunit.client.0.vm03.stdout:5/534: creat d4/d13/fad x:0 0 0 2026-03-10T14:08:28.191 INFO:tasks.workunit.client.0.vm03.stdout:0/416: dwrite d3/f19 [4194304,4194304] 0 2026-03-10T14:08:28.193 INFO:tasks.workunit.client.0.vm03.stdout:8/454: truncate da/d24/f32 4272 0 2026-03-10T14:08:28.201 INFO:tasks.workunit.client.0.vm03.stdout:4/472: creat d5/d9/db/d19/d38/d53/f95 x:0 0 0 2026-03-10T14:08:28.207 INFO:tasks.workunit.client.0.vm03.stdout:7/358: rename d5/d9/d14/d21/d28/c54 to d5/d9/d14/d26/d5f/c74 0 2026-03-10T14:08:28.210 INFO:tasks.workunit.client.0.vm03.stdout:1/443: getdents d0/d2/d71 0 2026-03-10T14:08:28.215 
INFO:tasks.workunit.client.0.vm03.stdout:0/417: truncate d3/d46/f82 883825 0 2026-03-10T14:08:28.226 INFO:tasks.workunit.client.0.vm03.stdout:0/418: dread d3/d4d/f2a [0,4194304] 0 2026-03-10T14:08:28.226 INFO:tasks.workunit.client.0.vm03.stdout:0/419: write d3/d46/d5e/f75 [340714,94558] 0 2026-03-10T14:08:28.227 INFO:tasks.workunit.client.0.vm03.stdout:3/442: creat d1d/d33/d65/d5d/f88 x:0 0 0 2026-03-10T14:08:28.233 INFO:tasks.workunit.client.0.vm03.stdout:6/424: truncate d8/db/d49/d6c/d32/d3a/f5e 4155927 0 2026-03-10T14:08:28.247 INFO:tasks.workunit.client.0.vm03.stdout:8/455: dwrite da/d3a/d44/f6d [0,4194304] 0 2026-03-10T14:08:28.247 INFO:tasks.workunit.client.0.vm03.stdout:2/428: link d5/f4d d5/d35/f81 0 2026-03-10T14:08:28.247 INFO:tasks.workunit.client.0.vm03.stdout:2/429: dread - d5/d10/d1f/f3a zero size 2026-03-10T14:08:28.247 INFO:tasks.workunit.client.0.vm03.stdout:1/444: unlink d0/d42/l45 0 2026-03-10T14:08:28.255 INFO:tasks.workunit.client.0.vm03.stdout:5/535: rename d4/d6/de/l5d to d4/d13/d8f/lae 0 2026-03-10T14:08:28.258 INFO:tasks.workunit.client.0.vm03.stdout:5/536: dwrite d4/d13/d1f/d8c/fa4 [0,4194304] 0 2026-03-10T14:08:28.260 INFO:tasks.workunit.client.0.vm03.stdout:5/537: fdatasync d4/d16/d19/d4a/faa 0 2026-03-10T14:08:28.260 INFO:tasks.workunit.client.0.vm03.stdout:5/538: chown d4/d16/d19 10508085 1 2026-03-10T14:08:28.272 INFO:tasks.workunit.client.0.vm03.stdout:7/359: sync 2026-03-10T14:08:28.272 INFO:tasks.workunit.client.0.vm03.stdout:8/456: sync 2026-03-10T14:08:28.276 INFO:tasks.workunit.client.0.vm03.stdout:8/457: dwrite da/d3a/d44/f46 [0,4194304] 0 2026-03-10T14:08:28.278 INFO:tasks.workunit.client.0.vm03.stdout:4/473: mkdir d5/d96 0 2026-03-10T14:08:28.288 INFO:tasks.workunit.client.0.vm03.stdout:0/420: fsync d3/d46/d5e/f75 0 2026-03-10T14:08:28.293 INFO:tasks.workunit.client.0.vm03.stdout:9/466: rename d2/d14/f25 to d2/d14/f96 0 2026-03-10T14:08:28.303 INFO:tasks.workunit.client.0.vm03.stdout:3/443: rename d1d/d39/c7b to 
d1d/d39/d51/d72/c89 0 2026-03-10T14:08:28.337 INFO:tasks.workunit.client.0.vm03.stdout:0/421: creat d3/d11/d66/f86 x:0 0 0 2026-03-10T14:08:28.340 INFO:tasks.workunit.client.0.vm03.stdout:7/360: rename d5/d9/d14/d26/d39/c55 to d5/d9/d3e/c75 0 2026-03-10T14:08:28.342 INFO:tasks.workunit.client.0.vm03.stdout:6/425: dread f0 [0,4194304] 0 2026-03-10T14:08:28.358 INFO:tasks.workunit.client.0.vm03.stdout:3/444: mkdir d1d/d29/d8a 0 2026-03-10T14:08:28.360 INFO:tasks.workunit.client.0.vm03.stdout:2/430: write d5/d10/f16 [4787796,95955] 0 2026-03-10T14:08:28.362 INFO:tasks.workunit.client.0.vm03.stdout:1/445: dwrite d0/d2/df/d27/f32 [0,4194304] 0 2026-03-10T14:08:28.364 INFO:tasks.workunit.client.0.vm03.stdout:4/474: write d5/d9/db/d19/f51 [3415134,34510] 0 2026-03-10T14:08:28.364 INFO:tasks.workunit.client.0.vm03.stdout:1/446: stat d0/d2/df/d27/f32 0 2026-03-10T14:08:28.418 INFO:tasks.workunit.client.0.vm03.stdout:8/458: fdatasync da/d3c/d4b/d69/f83 0 2026-03-10T14:08:28.419 INFO:tasks.workunit.client.0.vm03.stdout:9/467: symlink d2/d14/d2b/l97 0 2026-03-10T14:08:28.420 INFO:tasks.workunit.client.0.vm03.stdout:9/468: truncate d2/f54 4534116 0 2026-03-10T14:08:28.420 INFO:tasks.workunit.client.0.vm03.stdout:9/469: chown d2/d29/d38/l3b 54786 1 2026-03-10T14:08:28.425 INFO:tasks.workunit.client.0.vm03.stdout:0/422: creat d3/d11/d2c/d4a/d7b/f87 x:0 0 0 2026-03-10T14:08:28.429 INFO:tasks.workunit.client.0.vm03.stdout:0/423: dwrite d3/d16/d21/d3c/f7e [0,4194304] 0 2026-03-10T14:08:28.436 INFO:tasks.workunit.client.0.vm03.stdout:7/361: mknod d5/d9/d14/d26/d5f/c76 0 2026-03-10T14:08:28.447 INFO:tasks.workunit.client.0.vm03.stdout:6/426: mkdir d8/d11/d18/d79/d80 0 2026-03-10T14:08:28.450 INFO:tasks.workunit.client.0.vm03.stdout:5/539: write d4/d35/f92 [1243948,113586] 0 2026-03-10T14:08:28.454 INFO:tasks.workunit.client.0.vm03.stdout:5/540: readlink d4/d16/d19/l75 0 2026-03-10T14:08:28.460 INFO:tasks.workunit.client.0.vm03.stdout:1/447: stat d0/f48 0 2026-03-10T14:08:28.461 
INFO:tasks.workunit.client.0.vm03.stdout:4/475: creat d5/d9/f97 x:0 0 0 2026-03-10T14:08:28.461 INFO:tasks.workunit.client.0.vm03.stdout:8/459: creat da/d36/d4d/f8f x:0 0 0 2026-03-10T14:08:28.463 INFO:tasks.workunit.client.0.vm03.stdout:0/424: dread - d3/d4d/d30/f69 zero size 2026-03-10T14:08:28.465 INFO:tasks.workunit.client.0.vm03.stdout:1/448: unlink d0/d2/df/d27/c77 0 2026-03-10T14:08:28.465 INFO:tasks.workunit.client.0.vm03.stdout:1/449: write d0/d18/d1d/f84 [387515,90714] 0 2026-03-10T14:08:28.467 INFO:tasks.workunit.client.0.vm03.stdout:9/470: creat d2/d29/d33/d60/d8c/f98 x:0 0 0 2026-03-10T14:08:28.468 INFO:tasks.workunit.client.0.vm03.stdout:8/460: fsync da/fe 0 2026-03-10T14:08:28.471 INFO:tasks.workunit.client.0.vm03.stdout:8/461: mknod da/d36/d40/d50/d70/d7d/c90 0 2026-03-10T14:08:28.472 INFO:tasks.workunit.client.0.vm03.stdout:0/425: creat d3/d46/d54/d79/f88 x:0 0 0 2026-03-10T14:08:28.472 INFO:tasks.workunit.client.0.vm03.stdout:6/427: rename d8/db/d12/f44 to d8/db/d12/d51/f81 0 2026-03-10T14:08:28.475 INFO:tasks.workunit.client.0.vm03.stdout:5/541: sync 2026-03-10T14:08:28.500 INFO:tasks.workunit.client.0.vm03.stdout:8/462: creat da/d3a/d44/d64/f91 x:0 0 0 2026-03-10T14:08:28.501 INFO:tasks.workunit.client.0.vm03.stdout:3/445: dwrite fc [0,4194304] 0 2026-03-10T14:08:28.503 INFO:tasks.workunit.client.0.vm03.stdout:3/446: stat d1d/c2a 0 2026-03-10T14:08:28.505 INFO:tasks.workunit.client.0.vm03.stdout:2/431: write d5/f4d [391223,37899] 0 2026-03-10T14:08:28.507 INFO:tasks.workunit.client.0.vm03.stdout:3/447: dwrite d1d/d29/f87 [0,4194304] 0 2026-03-10T14:08:28.514 INFO:tasks.workunit.client.0.vm03.stdout:6/428: dread d8/f6e [0,4194304] 0 2026-03-10T14:08:28.515 INFO:tasks.workunit.client.0.vm03.stdout:6/429: stat d8/db/d49/d6c/d32 0 2026-03-10T14:08:28.520 INFO:tasks.workunit.client.0.vm03.stdout:5/542: mknod d4/d13/d1f/d8c/da7/caf 0 2026-03-10T14:08:28.525 INFO:tasks.workunit.client.0.vm03.stdout:7/362: write d5/d9/d14/d26/d39/f45 [2119120,83135] 0 
2026-03-10T14:08:28.528 INFO:tasks.workunit.client.0.vm03.stdout:8/463: truncate da/d24/f3d 490128 0 2026-03-10T14:08:28.530 INFO:tasks.workunit.client.0.vm03.stdout:2/432: creat d5/d10/d1c/d54/d5f/f82 x:0 0 0 2026-03-10T14:08:28.538 INFO:tasks.workunit.client.0.vm03.stdout:4/476: dwrite d5/d9/db/d19/d38/d53/f83 [0,4194304] 0 2026-03-10T14:08:28.539 INFO:tasks.workunit.client.0.vm03.stdout:1/450: write d0/d2/df/d16/d20/f38 [3515189,10263] 0 2026-03-10T14:08:28.539 INFO:tasks.workunit.client.0.vm03.stdout:1/451: dwrite d0/d2/f46 [0,4194304] 0 2026-03-10T14:08:28.548 INFO:tasks.workunit.client.0.vm03.stdout:9/471: write d2/f15 [1324771,121768] 0 2026-03-10T14:08:28.549 INFO:tasks.workunit.client.0.vm03.stdout:5/543: unlink d4/f55 0 2026-03-10T14:08:28.550 INFO:tasks.workunit.client.0.vm03.stdout:7/363: dread - d5/d9/d14/d26/f5c zero size 2026-03-10T14:08:28.550 INFO:tasks.workunit.client.0.vm03.stdout:7/364: chown d5 26596578 1 2026-03-10T14:08:28.551 INFO:tasks.workunit.client.0.vm03.stdout:5/544: write d4/d13/d1f/d8c/fa4 [1572642,105964] 0 2026-03-10T14:08:28.556 INFO:tasks.workunit.client.0.vm03.stdout:8/464: mknod da/d3c/d4b/c92 0 2026-03-10T14:08:28.557 INFO:tasks.workunit.client.0.vm03.stdout:3/448: dread d1d/f2b [0,4194304] 0 2026-03-10T14:08:28.568 INFO:tasks.workunit.client.0.vm03.stdout:0/426: write d3/f10 [1075765,51655] 0 2026-03-10T14:08:28.570 INFO:tasks.workunit.client.0.vm03.stdout:0/427: write d3/d17/f6d [2145332,82919] 0 2026-03-10T14:08:28.572 INFO:tasks.workunit.client.0.vm03.stdout:5/545: unlink d4/d13/d1f/c46 0 2026-03-10T14:08:28.574 INFO:tasks.workunit.client.0.vm03.stdout:2/433: mkdir d5/d10/d1c/d50/d74/d83 0 2026-03-10T14:08:28.574 INFO:tasks.workunit.client.0.vm03.stdout:2/434: fdatasync d5/d35/f49 0 2026-03-10T14:08:28.575 INFO:tasks.workunit.client.0.vm03.stdout:3/449: fsync d1d/f2b 0 2026-03-10T14:08:28.576 INFO:tasks.workunit.client.0.vm03.stdout:2/435: write d5/d35/f81 [402541,118091] 0 2026-03-10T14:08:28.578 
INFO:tasks.workunit.client.0.vm03.stdout:4/477: creat d5/d96/f98 x:0 0 0 2026-03-10T14:08:28.587 INFO:tasks.workunit.client.0.vm03.stdout:0/428: fsync d3/d4d/d47/f48 0 2026-03-10T14:08:28.588 INFO:tasks.workunit.client.0.vm03.stdout:0/429: readlink d3/d11/d2c/d4a/l6f 0 2026-03-10T14:08:28.589 INFO:tasks.workunit.client.0.vm03.stdout:3/450: read d1d/d29/d41/d45/f6a [2400439,12516] 0 2026-03-10T14:08:28.593 INFO:tasks.workunit.client.0.vm03.stdout:3/451: dread d1d/f2b [0,4194304] 0 2026-03-10T14:08:28.595 INFO:tasks.workunit.client.0.vm03.stdout:5/546: creat d4/d13/d73/fb0 x:0 0 0 2026-03-10T14:08:28.603 INFO:tasks.workunit.client.0.vm03.stdout:9/472: dwrite d2/d29/d38/f51 [0,4194304] 0 2026-03-10T14:08:28.609 INFO:tasks.workunit.client.0.vm03.stdout:4/478: rename d5/d47/d62/d7e to d5/d9/db/d19/d99 0 2026-03-10T14:08:28.612 INFO:tasks.workunit.client.0.vm03.stdout:4/479: dwrite d5/d96/f98 [0,4194304] 0 2026-03-10T14:08:28.615 INFO:tasks.workunit.client.0.vm03.stdout:1/452: mknod d0/d18/d3b/c8c 0 2026-03-10T14:08:28.642 INFO:tasks.workunit.client.0.vm03.stdout:0/430: mkdir d3/d11/d2c/d4a/d4b/d89 0 2026-03-10T14:08:28.643 INFO:tasks.workunit.client.0.vm03.stdout:6/430: dread d8/db/d12/f26 [0,4194304] 0 2026-03-10T14:08:28.643 INFO:tasks.workunit.client.0.vm03.stdout:0/431: fdatasync d3/f9 0 2026-03-10T14:08:28.652 INFO:tasks.workunit.client.0.vm03.stdout:7/365: symlink d5/d9/d14/d26/d36/d51/l77 0 2026-03-10T14:08:28.652 INFO:tasks.workunit.client.0.vm03.stdout:7/366: chown d5/d9/f19 28712 1 2026-03-10T14:08:28.656 INFO:tasks.workunit.client.0.vm03.stdout:1/453: sync 2026-03-10T14:08:28.661 INFO:tasks.workunit.client.0.vm03.stdout:3/452: creat d1d/d33/d65/d48/f8b x:0 0 0 2026-03-10T14:08:28.683 INFO:tasks.workunit.client.0.vm03.stdout:9/473: read d2/d14/d2b/f68 [1979970,20629] 0 2026-03-10T14:08:28.698 INFO:tasks.workunit.client.0.vm03.stdout:0/432: symlink d3/d11/d2c/d4a/d4b/l8a 0 2026-03-10T14:08:28.704 INFO:tasks.workunit.client.0.vm03.stdout:5/547: creat 
d4/d13/d1f/fb1 x:0 0 0 2026-03-10T14:08:28.711 INFO:tasks.workunit.client.0.vm03.stdout:7/367: creat d5/d9/d14/d26/d5f/f78 x:0 0 0 2026-03-10T14:08:28.712 INFO:tasks.workunit.client.0.vm03.stdout:1/454: creat d0/d2/df/d27/d7e/d81/f8d x:0 0 0 2026-03-10T14:08:28.712 INFO:tasks.workunit.client.0.vm03.stdout:7/368: write d5/f66 [653116,104355] 0 2026-03-10T14:08:28.716 INFO:tasks.workunit.client.0.vm03.stdout:5/548: dwrite d4/d13/d1f/fb1 [0,4194304] 0 2026-03-10T14:08:28.719 INFO:tasks.workunit.client.0.vm03.stdout:5/549: truncate d4/d13/d1f/d8c/da7/fab 268905 0 2026-03-10T14:08:28.730 INFO:tasks.workunit.client.0.vm03.stdout:2/436: link d5/d35/c80 d5/d10/d1c/d50/d74/c84 0 2026-03-10T14:08:28.731 INFO:tasks.workunit.client.0.vm03.stdout:2/437: dread - d5/d10/d1f/f3a zero size 2026-03-10T14:08:28.733 INFO:tasks.workunit.client.0.vm03.stdout:8/465: creat da/d36/d6a/f93 x:0 0 0 2026-03-10T14:08:28.737 INFO:tasks.workunit.client.0.vm03.stdout:3/453: rmdir d1d/d33/d65 39 2026-03-10T14:08:28.739 INFO:tasks.workunit.client.0.vm03.stdout:9/474: mknod d2/d14/d2b/d79/c99 0 2026-03-10T14:08:28.741 INFO:tasks.workunit.client.0.vm03.stdout:4/480: mknod d5/c9a 0 2026-03-10T14:08:28.749 INFO:tasks.workunit.client.0.vm03.stdout:0/433: unlink d3/d16/d21/d3c/f7e 0 2026-03-10T14:08:28.749 INFO:tasks.workunit.client.0.vm03.stdout:0/434: chown d3/d4d/c62 30772 1 2026-03-10T14:08:28.752 INFO:tasks.workunit.client.0.vm03.stdout:6/431: write d8/db/d12/f40 [191704,63607] 0 2026-03-10T14:08:28.762 INFO:tasks.workunit.client.0.vm03.stdout:5/550: creat d4/d16/d19/d4a/fb2 x:0 0 0 2026-03-10T14:08:28.764 INFO:tasks.workunit.client.0.vm03.stdout:1/455: dread d0/d2/df/f1f [4194304,4194304] 0 2026-03-10T14:08:28.766 INFO:tasks.workunit.client.0.vm03.stdout:2/438: stat d5/d10/d1c/d40/d59/d7b/c7f 0 2026-03-10T14:08:28.768 INFO:tasks.workunit.client.0.vm03.stdout:8/466: symlink da/d3c/d4b/d69/l94 0 2026-03-10T14:08:28.769 INFO:tasks.workunit.client.0.vm03.stdout:8/467: stat da/d58/d6c/f76 0 
2026-03-10T14:08:28.769 INFO:tasks.workunit.client.0.vm03.stdout:8/468: readlink da/d58/l5c 0 2026-03-10T14:08:28.770 INFO:tasks.workunit.client.0.vm03.stdout:8/469: write da/d36/d4d/f8f [773040,57495] 0 2026-03-10T14:08:28.780 INFO:tasks.workunit.client.0.vm03.stdout:4/481: unlink d5/d9/db/c2c 0 2026-03-10T14:08:28.786 INFO:tasks.workunit.client.0.vm03.stdout:0/435: rmdir d3/d4d/d47 39 2026-03-10T14:08:28.793 INFO:tasks.workunit.client.0.vm03.stdout:6/432: write f5 [4897773,81649] 0 2026-03-10T14:08:28.794 INFO:tasks.workunit.client.0.vm03.stdout:6/433: write d8/d1b/f31 [2739439,54936] 0 2026-03-10T14:08:28.803 INFO:tasks.workunit.client.0.vm03.stdout:7/369: symlink d5/d9/d14/d26/d36/l79 0 2026-03-10T14:08:28.805 INFO:tasks.workunit.client.0.vm03.stdout:1/456: fdatasync d0/d2/d34/f56 0 2026-03-10T14:08:28.806 INFO:tasks.workunit.client.0.vm03.stdout:1/457: chown d0/d2/df/d16/d41/l4a 4091 1 2026-03-10T14:08:28.806 INFO:tasks.workunit.client.0.vm03.stdout:1/458: stat d0/d2/df/f1b 0 2026-03-10T14:08:28.807 INFO:tasks.workunit.client.0.vm03.stdout:1/459: stat d0/d18/d3b/c8c 0 2026-03-10T14:08:28.808 INFO:tasks.workunit.client.0.vm03.stdout:5/551: dread d4/d13/d1f/f20 [4194304,4194304] 0 2026-03-10T14:08:28.811 INFO:tasks.workunit.client.0.vm03.stdout:2/439: mkdir d5/d10/d1c/d40/d59/d85 0 2026-03-10T14:08:28.821 INFO:tasks.workunit.client.0.vm03.stdout:8/470: dwrite da/f1f [0,4194304] 0 2026-03-10T14:08:28.829 INFO:tasks.workunit.client.0.vm03.stdout:8/471: dwrite da/f15 [8388608,4194304] 0 2026-03-10T14:08:28.845 INFO:tasks.workunit.client.0.vm03.stdout:9/475: mkdir d2/d29/d9a 0 2026-03-10T14:08:28.846 INFO:tasks.workunit.client.0.vm03.stdout:9/476: write d2/d29/d33/d60/d8c/f98 [480777,7382] 0 2026-03-10T14:08:28.846 INFO:tasks.workunit.client.0.vm03.stdout:9/477: write d2/d14/d2b/d79/d8a/f8d [4058569,14186] 0 2026-03-10T14:08:28.855 INFO:tasks.workunit.client.0.vm03.stdout:4/482: creat d5/d9/db/d19/d34/f9b x:0 0 0 2026-03-10T14:08:28.868 
INFO:tasks.workunit.client.0.vm03.stdout:6/434: creat d8/db/d12/d51/d5c/d60/f82 x:0 0 0 2026-03-10T14:08:28.876 INFO:tasks.workunit.client.0.vm03.stdout:6/435: dread d8/d1b/f3d [0,4194304] 0 2026-03-10T14:08:28.883 INFO:tasks.workunit.client.0.vm03.stdout:5/552: truncate d4/d16/d19/d4a/f81 78321 0 2026-03-10T14:08:28.893 INFO:tasks.workunit.client.0.vm03.stdout:2/440: dwrite d5/f2d [4194304,4194304] 0 2026-03-10T14:08:28.894 INFO:tasks.workunit.client.0.vm03.stdout:2/441: chown d5/d35/f81 2728496 1 2026-03-10T14:08:28.920 INFO:tasks.workunit.client.0.vm03.stdout:9/478: symlink d2/d29/d33/d60/l9b 0 2026-03-10T14:08:28.922 INFO:tasks.workunit.client.0.vm03.stdout:4/483: rmdir d5/d47/d5b/d64/d85 39 2026-03-10T14:08:28.931 INFO:tasks.workunit.client.0.vm03.stdout:6/436: mkdir d8/db/d49/d6c/d83 0 2026-03-10T14:08:28.941 INFO:tasks.workunit.client.0.vm03.stdout:3/454: link d1d/d29/d41/d45/c57 d1d/d33/d65/d5d/c8c 0 2026-03-10T14:08:28.944 INFO:tasks.workunit.client.0.vm03.stdout:3/455: dread d1d/d29/f87 [0,4194304] 0 2026-03-10T14:08:28.950 INFO:tasks.workunit.client.0.vm03.stdout:2/442: creat d5/d10/d1c/d40/f86 x:0 0 0 2026-03-10T14:08:28.952 INFO:tasks.workunit.client.0.vm03.stdout:3/456: dread d1d/d29/d41/d45/f6a [0,4194304] 0 2026-03-10T14:08:28.954 INFO:tasks.workunit.client.0.vm03.stdout:5/553: write d4/d13/d43/f72 [946695,120840] 0 2026-03-10T14:08:28.958 INFO:tasks.workunit.client.0.vm03.stdout:5/554: dwrite d4/d13/d73/f77 [0,4194304] 0 2026-03-10T14:08:28.971 INFO:tasks.workunit.client.0.vm03.stdout:9/479: mkdir d2/d29/d33/d55/d9c 0 2026-03-10T14:08:28.972 INFO:tasks.workunit.client.0.vm03.stdout:9/480: fdatasync d2/d29/d33/d41/f7b 0 2026-03-10T14:08:28.976 INFO:tasks.workunit.client.0.vm03.stdout:0/436: creat d3/d4d/d30/f8b x:0 0 0 2026-03-10T14:08:28.977 INFO:tasks.workunit.client.0.vm03.stdout:0/437: dread - d3/d11/d66/f86 zero size 2026-03-10T14:08:28.978 INFO:tasks.workunit.client.0.vm03.stdout:0/438: dread d3/d4d/f2a [0,4194304] 0 2026-03-10T14:08:28.988 
INFO:tasks.workunit.client.0.vm03.stdout:7/370: rename d5/d9/d14/d21/f29 to d5/d9/f7a 0 2026-03-10T14:08:28.991 INFO:tasks.workunit.client.0.vm03.stdout:6/437: mkdir d8/d1b/d1c/d84 0 2026-03-10T14:08:28.992 INFO:tasks.workunit.client.0.vm03.stdout:6/438: chown d8/db/d49/d6c/d32/f41 10126 1 2026-03-10T14:08:29.016 INFO:tasks.workunit.client.0.vm03.stdout:2/443: symlink d5/d10/d1f/d4f/l87 0 2026-03-10T14:08:29.038 INFO:tasks.workunit.client.0.vm03.stdout:5/555: rmdir d4/d13/d73 39 2026-03-10T14:08:29.042 INFO:tasks.workunit.client.0.vm03.stdout:4/484: link d5/d6e/f93 d5/d6e/f9c 0 2026-03-10T14:08:29.052 INFO:tasks.workunit.client.0.vm03.stdout:1/460: rename d0/d2/d34/f69 to d0/d42/f8e 0 2026-03-10T14:08:29.052 INFO:tasks.workunit.client.0.vm03.stdout:1/461: write d0/d18/d1d/f84 [892988,101918] 0 2026-03-10T14:08:29.056 INFO:tasks.workunit.client.0.vm03.stdout:7/371: dread - d5/d9/d3e/f4e zero size 2026-03-10T14:08:29.057 INFO:tasks.workunit.client.0.vm03.stdout:6/439: truncate d8/db/d12/d51/d5c/f69 290409 0 2026-03-10T14:08:29.060 INFO:tasks.workunit.client.0.vm03.stdout:6/440: dread d8/db/d12/f57 [0,4194304] 0 2026-03-10T14:08:29.063 INFO:tasks.workunit.client.0.vm03.stdout:3/457: symlink d1d/d29/d41/d45/d5b/d7e/l8d 0 2026-03-10T14:08:29.064 INFO:tasks.workunit.client.0.vm03.stdout:5/556: symlink d4/d13/d1f/d8c/da7/lb3 0 2026-03-10T14:08:29.069 INFO:tasks.workunit.client.0.vm03.stdout:9/481: write d2/d29/d33/d41/f57 [1057098,107655] 0 2026-03-10T14:08:29.071 INFO:tasks.workunit.client.0.vm03.stdout:8/472: getdents da/d24 0 2026-03-10T14:08:29.076 INFO:tasks.workunit.client.0.vm03.stdout:1/462: mknod d0/d18/d1d/c8f 0 2026-03-10T14:08:29.081 INFO:tasks.workunit.client.0.vm03.stdout:0/439: rename d3/f1e to d3/d46/f8c 0 2026-03-10T14:08:29.082 INFO:tasks.workunit.client.0.vm03.stdout:1/463: chown d0/d2/df/d16/l19 852618 1 2026-03-10T14:08:29.082 INFO:tasks.workunit.client.0.vm03.stdout:1/464: chown d0/l57 26 1 2026-03-10T14:08:29.083 
INFO:tasks.workunit.client.0.vm03.stdout:1/465: chown d0/d2/df/f1b 5612936 1 2026-03-10T14:08:29.083 INFO:tasks.workunit.client.0.vm03.stdout:1/466: chown d0/d2/df/d16/c79 568560581 1 2026-03-10T14:08:29.084 INFO:tasks.workunit.client.0.vm03.stdout:1/467: dread - d0/d18/d1d/f6f zero size 2026-03-10T14:08:29.084 INFO:tasks.workunit.client.0.vm03.stdout:7/372: mkdir d5/d9/d14/d26/d36/d51/d7b 0 2026-03-10T14:08:29.085 INFO:tasks.workunit.client.0.vm03.stdout:7/373: dread - d5/d9/d14/d26/d5f/f78 zero size 2026-03-10T14:08:29.086 INFO:tasks.workunit.client.0.vm03.stdout:6/441: mknod d8/d11/d18/d79/d80/c85 0 2026-03-10T14:08:29.090 INFO:tasks.workunit.client.0.vm03.stdout:2/444: rename d5/d10/d1c/c53 to d5/d10/d1c/d50/d74/c88 0 2026-03-10T14:08:29.090 INFO:tasks.workunit.client.0.vm03.stdout:0/440: chown d3/d4d/c3d 26 1 2026-03-10T14:08:29.092 INFO:tasks.workunit.client.0.vm03.stdout:1/468: mkdir d0/d2/d71/d90 0 2026-03-10T14:08:29.093 INFO:tasks.workunit.client.0.vm03.stdout:7/374: fdatasync d5/d9/d14/d26/d36/f3a 0 2026-03-10T14:08:29.097 INFO:tasks.workunit.client.0.vm03.stdout:0/441: rmdir d3/d16/d21/d3c 39 2026-03-10T14:08:29.100 INFO:tasks.workunit.client.0.vm03.stdout:5/557: sync 2026-03-10T14:08:29.100 INFO:tasks.workunit.client.0.vm03.stdout:8/473: sync 2026-03-10T14:08:29.101 INFO:tasks.workunit.client.0.vm03.stdout:8/474: dread da/d24/f43 [0,4194304] 0 2026-03-10T14:08:29.106 INFO:tasks.workunit.client.0.vm03.stdout:7/375: creat d5/d9/d35/f7c x:0 0 0 2026-03-10T14:08:29.110 INFO:tasks.workunit.client.0.vm03.stdout:4/485: write d5/d9/db/d19/d38/d53/d55/f75 [3866192,21690] 0 2026-03-10T14:08:29.111 INFO:tasks.workunit.client.0.vm03.stdout:4/486: fdatasync d5/d9/db/d19/f51 0 2026-03-10T14:08:29.111 INFO:tasks.workunit.client.0.vm03.stdout:4/487: stat d5/d9/l7f 0 2026-03-10T14:08:29.117 INFO:tasks.workunit.client.0.vm03.stdout:3/458: write d1d/f66 [406500,118899] 0 2026-03-10T14:08:29.117 INFO:tasks.workunit.client.0.vm03.stdout:3/459: chown d1d/d29/d41/d45 
43326694 1 2026-03-10T14:08:29.124 INFO:tasks.workunit.client.0.vm03.stdout:9/482: truncate d2/d14/f30 3058184 0 2026-03-10T14:08:29.143 INFO:tasks.workunit.client.0.vm03.stdout:6/442: mkdir d8/db/d12/d51/d86 0 2026-03-10T14:08:29.154 INFO:tasks.workunit.client.0.vm03.stdout:5/558: symlink d4/d13/d1f/d8c/da7/lb4 0 2026-03-10T14:08:29.169 INFO:tasks.workunit.client.0.vm03.stdout:2/445: dwrite d5/f9 [0,4194304] 0 2026-03-10T14:08:29.174 INFO:tasks.workunit.client.0.vm03.stdout:8/475: creat da/d3c/d4b/d4c/f95 x:0 0 0 2026-03-10T14:08:29.178 INFO:tasks.workunit.client.0.vm03.stdout:1/469: mkdir d0/d2/df/d91 0 2026-03-10T14:08:29.183 INFO:tasks.workunit.client.0.vm03.stdout:4/488: creat d5/d47/d5b/d64/d85/f9d x:0 0 0 2026-03-10T14:08:29.184 INFO:tasks.workunit.client.0.vm03.stdout:3/460: creat d1d/d39/d51/d72/f8e x:0 0 0 2026-03-10T14:08:29.186 INFO:tasks.workunit.client.0.vm03.stdout:9/483: mkdir d2/d29/d33/d55/d72/d9d 0 2026-03-10T14:08:29.187 INFO:tasks.workunit.client.0.vm03.stdout:6/443: readlink d8/db/d12/l6b 0 2026-03-10T14:08:29.188 INFO:tasks.workunit.client.0.vm03.stdout:5/559: rename d4/d13/d43/f8b to d4/d16/d19/d4a/fb5 0 2026-03-10T14:08:29.189 INFO:tasks.workunit.client.0.vm03.stdout:5/560: truncate d4/f9e 20195 0 2026-03-10T14:08:29.189 INFO:tasks.workunit.client.0.vm03.stdout:5/561: dread - d4/d13/fad zero size 2026-03-10T14:08:29.191 INFO:tasks.workunit.client.0.vm03.stdout:7/376: mknod d5/d9/d14/d26/d36/d51/d7b/c7d 0 2026-03-10T14:08:29.199 INFO:tasks.workunit.client.0.vm03.stdout:1/470: creat d0/d2/df/d27/d7e/f92 x:0 0 0 2026-03-10T14:08:29.199 INFO:tasks.workunit.client.0.vm03.stdout:4/489: symlink d5/d47/d62/l9e 0 2026-03-10T14:08:29.199 INFO:tasks.workunit.client.0.vm03.stdout:3/461: unlink d1d/c73 0 2026-03-10T14:08:29.199 INFO:tasks.workunit.client.0.vm03.stdout:5/562: rename c3 to d4/d40/cb6 0 2026-03-10T14:08:29.199 INFO:tasks.workunit.client.0.vm03.stdout:5/563: chown d4/d13/d1f/f83 0 1 2026-03-10T14:08:29.199 
INFO:tasks.workunit.client.0.vm03.stdout:2/446: mknod d5/d10/d1c/d40/d59/d7b/c89 0 2026-03-10T14:08:29.201 INFO:tasks.workunit.client.0.vm03.stdout:4/490: mknod d5/d9/db/d19/d38/d53/d55/c9f 0 2026-03-10T14:08:29.204 INFO:tasks.workunit.client.0.vm03.stdout:9/484: mknod d2/d14/d2b/d79/d81/c9e 0 2026-03-10T14:08:29.205 INFO:tasks.workunit.client.0.vm03.stdout:0/442: getdents d3/d11/d2c/d4a 0 2026-03-10T14:08:29.206 INFO:tasks.workunit.client.0.vm03.stdout:2/447: mkdir d5/d2a/d8a 0 2026-03-10T14:08:29.207 INFO:tasks.workunit.client.0.vm03.stdout:8/476: link da/d58/l5c da/d36/d40/l96 0 2026-03-10T14:08:29.209 INFO:tasks.workunit.client.0.vm03.stdout:2/448: dread d5/d35/f81 [0,4194304] 0 2026-03-10T14:08:29.209 INFO:tasks.workunit.client.0.vm03.stdout:7/377: fsync d5/f1b 0 2026-03-10T14:08:29.210 INFO:tasks.workunit.client.0.vm03.stdout:2/449: readlink d5/d10/d1f/d4f/d76/l79 0 2026-03-10T14:08:29.213 INFO:tasks.workunit.client.0.vm03.stdout:7/378: dwrite d5/d9/d35/f52 [0,4194304] 0 2026-03-10T14:08:29.214 INFO:tasks.workunit.client.0.vm03.stdout:7/379: write d5/d9/d35/f69 [886912,64462] 0 2026-03-10T14:08:29.228 INFO:tasks.workunit.client.0.vm03.stdout:5/564: dread - d4/d16/d19/d4a/fb5 zero size 2026-03-10T14:08:29.231 INFO:tasks.workunit.client.0.vm03.stdout:1/471: symlink d0/d2/df/l93 0 2026-03-10T14:08:29.233 INFO:tasks.workunit.client.0.vm03.stdout:3/462: write d1d/d33/d65/d48/f58 [107448,84539] 0 2026-03-10T14:08:29.238 INFO:tasks.workunit.client.0.vm03.stdout:0/443: sync 2026-03-10T14:08:29.238 INFO:tasks.workunit.client.0.vm03.stdout:6/444: rmdir d8/db/d12/d51/d5c/d7b 0 2026-03-10T14:08:29.238 INFO:tasks.workunit.client.0.vm03.stdout:0/444: truncate d3/d4d/d30/f7a 925082 0 2026-03-10T14:08:29.240 INFO:tasks.workunit.client.0.vm03.stdout:8/477: creat da/d3a/d44/f97 x:0 0 0 2026-03-10T14:08:29.242 INFO:tasks.workunit.client.0.vm03.stdout:4/491: mkdir d5/d47/d62/d8a/da0 0 2026-03-10T14:08:29.243 INFO:tasks.workunit.client.0.vm03.stdout:4/492: write d5/f7 
[3200188,101564] 0 2026-03-10T14:08:29.244 INFO:tasks.workunit.client.0.vm03.stdout:3/463: symlink d1d/d29/l8f 0 2026-03-10T14:08:29.244 INFO:tasks.workunit.client.0.vm03.stdout:2/450: sync 2026-03-10T14:08:29.244 INFO:tasks.workunit.client.0.vm03.stdout:4/493: write d5/d47/d5b/d64/d85/f9d [953210,118637] 0 2026-03-10T14:08:29.245 INFO:tasks.workunit.client.0.vm03.stdout:4/494: truncate d5/d6e/f8c 386091 0 2026-03-10T14:08:29.249 INFO:tasks.workunit.client.0.vm03.stdout:4/495: dread d5/d9/db/d19/d34/f5c [0,4194304] 0 2026-03-10T14:08:29.252 INFO:tasks.workunit.client.0.vm03.stdout:1/472: rmdir d0/d2/d34 39 2026-03-10T14:08:29.253 INFO:tasks.workunit.client.0.vm03.stdout:9/485: write d2/d29/d38/f4b [1904160,122486] 0 2026-03-10T14:08:29.257 INFO:tasks.workunit.client.0.vm03.stdout:0/445: creat d3/d46/d54/d79/f8d x:0 0 0 2026-03-10T14:08:29.258 INFO:tasks.workunit.client.0.vm03.stdout:8/478: symlink da/d3a/l98 0 2026-03-10T14:08:29.259 INFO:tasks.workunit.client.0.vm03.stdout:0/446: read d3/d4d/d30/f7a [359364,10999] 0 2026-03-10T14:08:29.259 INFO:tasks.workunit.client.0.vm03.stdout:7/380: symlink d5/d9/d14/d26/d36/l7e 0 2026-03-10T14:08:29.263 INFO:tasks.workunit.client.0.vm03.stdout:3/464: creat d1d/d33/d47/d53/d68/f90 x:0 0 0 2026-03-10T14:08:29.269 INFO:tasks.workunit.client.0.vm03.stdout:5/565: dwrite d4/f37 [0,4194304] 0 2026-03-10T14:08:29.270 INFO:tasks.workunit.client.0.vm03.stdout:4/496: mknod d5/d9/db/d19/d99/ca1 0 2026-03-10T14:08:29.272 INFO:tasks.workunit.client.0.vm03.stdout:2/451: write d5/d10/d1f/f3e [284850,75702] 0 2026-03-10T14:08:29.277 INFO:tasks.workunit.client.0.vm03.stdout:6/445: creat d8/db/d49/d6c/d83/f87 x:0 0 0 2026-03-10T14:08:29.282 INFO:tasks.workunit.client.0.vm03.stdout:8/479: unlink da/d24/d49/f84 0 2026-03-10T14:08:29.282 INFO:tasks.workunit.client.0.vm03.stdout:8/480: fdatasync f5 0 2026-03-10T14:08:29.282 INFO:tasks.workunit.client.0.vm03.stdout:8/481: readlink da/d36/d40/l8d 0 2026-03-10T14:08:29.286 
INFO:tasks.workunit.client.0.vm03.stdout:4/497: rmdir d5/d9/db/d19 39 2026-03-10T14:08:29.287 INFO:tasks.workunit.client.0.vm03.stdout:2/452: unlink d5/d10/f13 0 2026-03-10T14:08:29.291 INFO:tasks.workunit.client.0.vm03.stdout:1/473: dwrite d0/d2/df/d27/d7e/f92 [0,4194304] 0 2026-03-10T14:08:29.299 INFO:tasks.workunit.client.0.vm03.stdout:8/482: mkdir da/d36/d40/d50/d70/d99 0 2026-03-10T14:08:29.300 INFO:tasks.workunit.client.0.vm03.stdout:3/465: mkdir d1d/d29/d41/d45/d55/d6e/d83/d91 0 2026-03-10T14:08:29.301 INFO:tasks.workunit.client.0.vm03.stdout:3/466: dread - d1d/d59/f69 zero size 2026-03-10T14:08:29.301 INFO:tasks.workunit.client.0.vm03.stdout:5/566: mknod d4/d6/cb7 0 2026-03-10T14:08:29.304 INFO:tasks.workunit.client.0.vm03.stdout:1/474: creat d0/d2/df/d27/f94 x:0 0 0 2026-03-10T14:08:29.316 INFO:tasks.workunit.client.0.vm03.stdout:0/447: creat d3/d11/f8e x:0 0 0 2026-03-10T14:08:29.317 INFO:tasks.workunit.client.0.vm03.stdout:8/483: truncate da/f2e 5499493 0 2026-03-10T14:08:29.317 INFO:tasks.workunit.client.0.vm03.stdout:4/498: rmdir d5/d9/db/d19/d38/d53 39 2026-03-10T14:08:29.317 INFO:tasks.workunit.client.0.vm03.stdout:8/484: dwrite da/d36/d40/d50/d70/d7d/f80 [0,4194304] 0 2026-03-10T14:08:29.317 INFO:tasks.workunit.client.0.vm03.stdout:1/475: creat d0/d2/df/d27/d7e/f95 x:0 0 0 2026-03-10T14:08:29.319 INFO:tasks.workunit.client.0.vm03.stdout:4/499: chown d5/d9/db/d19/d38/d53/d71 987916 1 2026-03-10T14:08:29.326 INFO:tasks.workunit.client.0.vm03.stdout:4/500: chown d5/d47/d5b/f84 50 1 2026-03-10T14:08:29.326 INFO:tasks.workunit.client.0.vm03.stdout:8/485: creat da/d3a/f9a x:0 0 0 2026-03-10T14:08:29.326 INFO:tasks.workunit.client.0.vm03.stdout:2/453: rename d5/d10/d1c/d50/d74/c84 to d5/d10/d1c/d54/d5f/c8b 0 2026-03-10T14:08:29.327 INFO:tasks.workunit.client.0.vm03.stdout:2/454: chown d5/d10/d1c/f5b 0 1 2026-03-10T14:08:29.327 INFO:tasks.workunit.client.0.vm03.stdout:6/446: getdents d8/db/d49/d58 0 2026-03-10T14:08:29.327 
INFO:tasks.workunit.client.0.vm03.stdout:1/476: fsync d0/d2/d34/f56 0 2026-03-10T14:08:29.327 INFO:tasks.workunit.client.0.vm03.stdout:6/447: read d8/d1b/f7f [1166299,57232] 0 2026-03-10T14:08:29.327 INFO:tasks.workunit.client.0.vm03.stdout:0/448: mkdir d3/d4d/d8f 0 2026-03-10T14:08:29.327 INFO:tasks.workunit.client.0.vm03.stdout:6/448: dread d8/d1b/f31 [0,4194304] 0 2026-03-10T14:08:29.327 INFO:tasks.workunit.client.0.vm03.stdout:3/467: creat d1d/d33/f92 x:0 0 0 2026-03-10T14:08:29.329 INFO:tasks.workunit.client.0.vm03.stdout:3/468: truncate d1d/d33/d65/d5d/f88 692796 0 2026-03-10T14:08:29.333 INFO:tasks.workunit.client.0.vm03.stdout:5/567: rename d4/d13/d73 to d4/d16/d19/d23/db8 0 2026-03-10T14:08:29.334 INFO:tasks.workunit.client.0.vm03.stdout:2/455: fdatasync d5/d10/d1c/d54/d5f/f6c 0 2026-03-10T14:08:29.335 INFO:tasks.workunit.client.0.vm03.stdout:0/449: symlink d3/d4d/d30/l90 0 2026-03-10T14:08:29.335 INFO:tasks.workunit.client.0.vm03.stdout:0/450: fsync d3/f9 0 2026-03-10T14:08:29.336 INFO:tasks.workunit.client.0.vm03.stdout:6/449: symlink d8/db/d12/d51/l88 0 2026-03-10T14:08:29.337 INFO:tasks.workunit.client.0.vm03.stdout:3/469: rename d1d/c34 to d1d/d29/d41/d45/d5b/d7e/c93 0 2026-03-10T14:08:29.338 INFO:tasks.workunit.client.0.vm03.stdout:5/568: creat d4/d16/d19/d23/d3f/fb9 x:0 0 0 2026-03-10T14:08:29.341 INFO:tasks.workunit.client.0.vm03.stdout:0/451: rmdir d3/d17 39 2026-03-10T14:08:29.341 INFO:tasks.workunit.client.0.vm03.stdout:3/470: symlink d1d/d59/l94 0 2026-03-10T14:08:29.341 INFO:tasks.workunit.client.0.vm03.stdout:0/452: stat d3 0 2026-03-10T14:08:29.343 INFO:tasks.workunit.client.0.vm03.stdout:5/569: rename d4/d13/d1f/f20 to d4/d16/d19/d23/db8/fba 0 2026-03-10T14:08:29.343 INFO:tasks.workunit.client.0.vm03.stdout:4/501: sync 2026-03-10T14:08:29.343 INFO:tasks.workunit.client.0.vm03.stdout:5/570: chown d4/d13/d1f/c42 0 1 2026-03-10T14:08:29.343 INFO:tasks.workunit.client.0.vm03.stdout:5/571: chown d4/d40 0 1 2026-03-10T14:08:29.346 
INFO:tasks.workunit.client.0.vm03.stdout:9/486: write d2/d14/d2b/d34/f59 [2351296,5042] 0 2026-03-10T14:08:29.346 INFO:tasks.workunit.client.0.vm03.stdout:6/450: symlink d8/d11/d7a/l89 0 2026-03-10T14:08:29.351 INFO:tasks.workunit.client.0.vm03.stdout:2/456: creat d5/d10/d1c/d40/d59/d7b/f8c x:0 0 0 2026-03-10T14:08:29.352 INFO:tasks.workunit.client.0.vm03.stdout:5/572: chown d4/f82 149801690 1 2026-03-10T14:08:29.353 INFO:tasks.workunit.client.0.vm03.stdout:5/573: readlink d4/d6/l61 0 2026-03-10T14:08:29.354 INFO:tasks.workunit.client.0.vm03.stdout:3/471: getdents d1d/d29/d8a 0 2026-03-10T14:08:29.357 INFO:tasks.workunit.client.0.vm03.stdout:0/453: dread d3/d11/d2c/d4a/d4b/f7d [0,4194304] 0 2026-03-10T14:08:29.358 INFO:tasks.workunit.client.0.vm03.stdout:0/454: fdatasync d3/d11/d2c/d4a/d7b/f87 0 2026-03-10T14:08:29.360 INFO:tasks.workunit.client.0.vm03.stdout:4/502: symlink d5/d47/la2 0 2026-03-10T14:08:29.361 INFO:tasks.workunit.client.0.vm03.stdout:6/451: creat d8/d1b/d1c/d84/f8a x:0 0 0 2026-03-10T14:08:29.362 INFO:tasks.workunit.client.0.vm03.stdout:6/452: chown d8/d1b/d1c 38254760 1 2026-03-10T14:08:29.362 INFO:tasks.workunit.client.0.vm03.stdout:6/453: chown d8/db/d49/d6c/d32 86937246 1 2026-03-10T14:08:29.365 INFO:tasks.workunit.client.0.vm03.stdout:7/381: truncate d5/d9/d35/f52 3994145 0 2026-03-10T14:08:29.381 INFO:tasks.workunit.client.0.vm03.stdout:8/486: dwrite da/d36/f42 [0,4194304] 0 2026-03-10T14:08:29.385 INFO:tasks.workunit.client.0.vm03.stdout:8/487: dwrite f5 [0,4194304] 0 2026-03-10T14:08:29.397 INFO:tasks.workunit.client.0.vm03.stdout:1/477: dwrite d0/fa [0,4194304] 0 2026-03-10T14:08:29.397 INFO:tasks.workunit.client.0.vm03.stdout:3/472: mkdir d1d/d29/d41/d45/d95 0 2026-03-10T14:08:29.417 INFO:tasks.workunit.client.0.vm03.stdout:9/487: rename d2/d14/f3d to d2/d14/d2b/d79/d81/f9f 0 2026-03-10T14:08:29.422 INFO:tasks.workunit.client.0.vm03.stdout:4/503: creat d5/d9/db/d19/d38/d53/d55/fa3 x:0 0 0 2026-03-10T14:08:29.430 
INFO:tasks.workunit.client.0.vm03.stdout:7/382: unlink d5/d9/d14/d26/l56 0 2026-03-10T14:08:29.435 INFO:tasks.workunit.client.0.vm03.stdout:8/488: creat da/d24/d49/f9b x:0 0 0 2026-03-10T14:08:29.436 INFO:tasks.workunit.client.0.vm03.stdout:1/478: mknod d0/d2/d34/c96 0 2026-03-10T14:08:29.439 INFO:tasks.workunit.client.0.vm03.stdout:0/455: rename d3/d46/c72 to d3/d11/d76/c91 0 2026-03-10T14:08:29.446 INFO:tasks.workunit.client.0.vm03.stdout:2/457: creat d5/d10/d17/f8d x:0 0 0 2026-03-10T14:08:29.449 INFO:tasks.workunit.client.0.vm03.stdout:0/456: dread d3/f10 [0,4194304] 0 2026-03-10T14:08:29.453 INFO:tasks.workunit.client.0.vm03.stdout:7/383: symlink d5/d9/d3e/l7f 0 2026-03-10T14:08:29.469 INFO:tasks.workunit.client.0.vm03.stdout:4/504: write d5/d9/d2b/f63 [291509,99299] 0 2026-03-10T14:08:29.471 INFO:tasks.workunit.client.0.vm03.stdout:5/574: write d4/d16/d19/d23/db8/fba [8855557,118953] 0 2026-03-10T14:08:29.479 INFO:tasks.workunit.client.0.vm03.stdout:1/479: truncate d0/d18/d1d/f5e 2708955 0 2026-03-10T14:08:29.480 INFO:tasks.workunit.client.0.vm03.stdout:3/473: symlink d1d/d29/d41/d45/d95/l96 0 2026-03-10T14:08:29.484 INFO:tasks.workunit.client.0.vm03.stdout:9/488: creat d2/d29/d33/d55/d72/d9d/fa0 x:0 0 0 2026-03-10T14:08:29.489 INFO:tasks.workunit.client.0.vm03.stdout:2/458: unlink d5/d10/d1c/d50/f69 0 2026-03-10T14:08:29.494 INFO:tasks.workunit.client.0.vm03.stdout:6/454: link f0 d8/d11/d18/d54/f8b 0 2026-03-10T14:08:29.502 INFO:tasks.workunit.client.0.vm03.stdout:7/384: dread d5/f47 [0,4194304] 0 2026-03-10T14:08:29.502 INFO:tasks.workunit.client.0.vm03.stdout:7/385: stat d5/d9/d14/d21/c34 0 2026-03-10T14:08:29.510 INFO:tasks.workunit.client.0.vm03.stdout:4/505: stat d5/d47/d5b/f79 0 2026-03-10T14:08:29.510 INFO:tasks.workunit.client.0.vm03.stdout:4/506: readlink d5/d9/db/l54 0 2026-03-10T14:08:29.511 INFO:tasks.workunit.client.0.vm03.stdout:5/575: creat d4/d13/d43/fbb x:0 0 0 2026-03-10T14:08:29.519 INFO:tasks.workunit.client.0.vm03.stdout:3/474: dwrite 
d1d/d29/f87 [0,4194304] 0 2026-03-10T14:08:29.520 INFO:tasks.workunit.client.0.vm03.stdout:3/475: fdatasync d1d/d33/d65/d5d/f88 0 2026-03-10T14:08:29.520 INFO:tasks.workunit.client.0.vm03.stdout:3/476: stat d1d/d33/f5e 0 2026-03-10T14:08:29.521 INFO:tasks.workunit.client.0.vm03.stdout:3/477: write d1d/d29/d41/d45/d55/f81 [581814,70947] 0 2026-03-10T14:08:29.533 INFO:tasks.workunit.client.0.vm03.stdout:8/489: truncate da/d36/d40/f47 3575973 0 2026-03-10T14:08:29.533 INFO:tasks.workunit.client.0.vm03.stdout:8/490: dread - da/d3a/f9a zero size 2026-03-10T14:08:29.537 INFO:tasks.workunit.client.0.vm03.stdout:2/459: rename d5/d2a/f37 to d5/d10/d1c/d54/f8e 0 2026-03-10T14:08:29.543 INFO:tasks.workunit.client.0.vm03.stdout:0/457: symlink d3/d11/d2c/d4a/d4b/d89/l92 0 2026-03-10T14:08:29.543 INFO:tasks.workunit.client.0.vm03.stdout:6/455: readlink d8/l20 0 2026-03-10T14:08:29.543 INFO:tasks.workunit.client.0.vm03.stdout:6/456: stat d8/db/d49 0 2026-03-10T14:08:29.549 INFO:tasks.workunit.client.0.vm03.stdout:5/576: read d4/d6/fa [2653565,25039] 0 2026-03-10T14:08:29.564 INFO:tasks.workunit.client.0.vm03.stdout:1/480: write d0/d42/f8e [952045,94298] 0 2026-03-10T14:08:29.572 INFO:tasks.workunit.client.0.vm03.stdout:0/458: write d3/d4d/d47/f74 [637039,112582] 0 2026-03-10T14:08:29.572 INFO:tasks.workunit.client.0.vm03.stdout:0/459: chown d3/d4d/d30/f8b 31424837 1 2026-03-10T14:08:29.575 INFO:tasks.workunit.client.0.vm03.stdout:6/457: fsync d8/d11/d18/f34 0 2026-03-10T14:08:29.578 INFO:tasks.workunit.client.0.vm03.stdout:7/386: mknod d5/d9/d14/d26/d36/c80 0 2026-03-10T14:08:29.580 INFO:tasks.workunit.client.0.vm03.stdout:4/507: mkdir d5/d9/db/da4 0 2026-03-10T14:08:29.584 INFO:tasks.workunit.client.0.vm03.stdout:3/478: mknod d1d/d29/d8a/c97 0 2026-03-10T14:08:29.586 INFO:tasks.workunit.client.0.vm03.stdout:9/489: write d2/f37 [1884059,91557] 0 2026-03-10T14:08:29.594 INFO:tasks.workunit.client.0.vm03.stdout:1/481: chown d0/d2/d34/f73 1139830332 1 2026-03-10T14:08:29.605 
INFO:tasks.workunit.client.0.vm03.stdout:7/387: symlink d5/d9/d14/d26/d36/d51/d7b/l81 0 2026-03-10T14:08:29.608 INFO:tasks.workunit.client.0.vm03.stdout:0/460: write d3/d4d/d47/f5f [277514,42522] 0 2026-03-10T14:08:29.609 INFO:tasks.workunit.client.0.vm03.stdout:0/461: fsync d3/d11/f8e 0 2026-03-10T14:08:29.611 INFO:tasks.workunit.client.0.vm03.stdout:4/508: symlink d5/d9/db/d19/d34/la5 0 2026-03-10T14:08:29.614 INFO:tasks.workunit.client.0.vm03.stdout:5/577: truncate d4/d16/fac 3783496 0 2026-03-10T14:08:29.618 INFO:tasks.workunit.client.0.vm03.stdout:3/479: unlink d1d/d33/d47/d53/d68/f90 0 2026-03-10T14:08:29.624 INFO:tasks.workunit.client.0.vm03.stdout:9/490: symlink d2/d14/d2b/d79/d81/la1 0 2026-03-10T14:08:29.624 INFO:tasks.workunit.client.0.vm03.stdout:8/491: link da/fd da/d36/d40/d50/d70/d7d/f9c 0 2026-03-10T14:08:29.625 INFO:tasks.workunit.client.0.vm03.stdout:1/482: creat d0/d2/df/d16/d20/f97 x:0 0 0 2026-03-10T14:08:29.625 INFO:tasks.workunit.client.0.vm03.stdout:8/492: chown da/d36/d40/d50/d70/d99 55084 1 2026-03-10T14:08:29.626 INFO:tasks.workunit.client.0.vm03.stdout:9/491: read d2/d14/d2b/f68 [3755642,39929] 0 2026-03-10T14:08:29.636 INFO:tasks.workunit.client.0.vm03.stdout:7/388: rename d5/d9/d14/d26/c5d to d5/d9/d14/d21/d6f/c82 0 2026-03-10T14:08:29.637 INFO:tasks.workunit.client.0.vm03.stdout:0/462: creat d3/d46/d54/d79/f93 x:0 0 0 2026-03-10T14:08:29.641 INFO:tasks.workunit.client.0.vm03.stdout:4/509: creat d5/d9/db/d19/d38/d53/d55/fa6 x:0 0 0 2026-03-10T14:08:29.641 INFO:tasks.workunit.client.0.vm03.stdout:4/510: chown d5/d9/d2b/c66 88434909 1 2026-03-10T14:08:29.643 INFO:tasks.workunit.client.0.vm03.stdout:6/458: write d8/db/d12/d51/d5c/f69 [1175514,83550] 0 2026-03-10T14:08:29.644 INFO:tasks.workunit.client.0.vm03.stdout:6/459: read - d8/db/d12/d51/d5c/d60/f82 zero size 2026-03-10T14:08:29.654 INFO:tasks.workunit.client.0.vm03.stdout:2/460: getdents d5/d10/d1f 0 2026-03-10T14:08:29.658 INFO:tasks.workunit.client.0.vm03.stdout:8/493: mknod 
da/d36/c9d 0 2026-03-10T14:08:29.660 INFO:tasks.workunit.client.0.vm03.stdout:9/492: creat d2/d14/d2b/d43/fa2 x:0 0 0 2026-03-10T14:08:29.664 INFO:tasks.workunit.client.0.vm03.stdout:7/389: mkdir d5/d9/d14/d26/d36/d51/d7b/d83 0 2026-03-10T14:08:29.682 INFO:tasks.workunit.client.0.vm03.stdout:2/461: mkdir d5/d10/d31/d8f 0 2026-03-10T14:08:29.685 INFO:tasks.workunit.client.0.vm03.stdout:5/578: truncate d4/d16/d19/d23/db8/f77 4062433 0 2026-03-10T14:08:29.688 INFO:tasks.workunit.client.0.vm03.stdout:8/494: dread - da/d3c/d4b/d69/f83 zero size 2026-03-10T14:08:29.694 INFO:tasks.workunit.client.0.vm03.stdout:7/390: mkdir d5/d9/d3e/d84 0 2026-03-10T14:08:29.695 INFO:tasks.workunit.client.0.vm03.stdout:7/391: truncate d5/d9/f42 4217692 0 2026-03-10T14:08:29.703 INFO:tasks.workunit.client.0.vm03.stdout:4/511: mkdir d5/d9/db/da7 0 2026-03-10T14:08:29.708 INFO:tasks.workunit.client.0.vm03.stdout:3/480: creat d1d/d33/d47/f98 x:0 0 0 2026-03-10T14:08:29.712 INFO:tasks.workunit.client.0.vm03.stdout:5/579: mkdir d4/d40/dbc 0 2026-03-10T14:08:29.715 INFO:tasks.workunit.client.0.vm03.stdout:5/580: dwrite d4/d13/d1f/d8c/fa4 [0,4194304] 0 2026-03-10T14:08:29.732 INFO:tasks.workunit.client.0.vm03.stdout:8/495: fsync da/d24/f25 0 2026-03-10T14:08:29.741 INFO:tasks.workunit.client.0.vm03.stdout:8/496: dread da/f15 [4194304,4194304] 0 2026-03-10T14:08:29.743 INFO:tasks.workunit.client.0.vm03.stdout:9/493: mknod d2/d14/ca3 0 2026-03-10T14:08:29.747 INFO:tasks.workunit.client.0.vm03.stdout:7/392: creat d5/d9/d35/f85 x:0 0 0 2026-03-10T14:08:29.751 INFO:tasks.workunit.client.0.vm03.stdout:0/463: creat d3/f94 x:0 0 0 2026-03-10T14:08:29.754 INFO:tasks.workunit.client.0.vm03.stdout:4/512: rmdir d5/d9/db/d19/d99 39 2026-03-10T14:08:29.757 INFO:tasks.workunit.client.0.vm03.stdout:4/513: dwrite d5/d9/db/f48 [0,4194304] 0 2026-03-10T14:08:29.767 INFO:tasks.workunit.client.0.vm03.stdout:3/481: mkdir d1d/d39/d51/d72/d99 0 2026-03-10T14:08:29.768 INFO:tasks.workunit.client.0.vm03.stdout:3/482: 
readlink d1d/l3b 0 2026-03-10T14:08:29.769 INFO:tasks.workunit.client.0.vm03.stdout:2/462: mknod d5/d10/d17/c90 0 2026-03-10T14:08:29.773 INFO:tasks.workunit.client.0.vm03.stdout:1/483: getdents d0/d2/df/d16/d20 0 2026-03-10T14:08:29.784 INFO:tasks.workunit.client.0.vm03.stdout:7/393: rename d5/d9/d14/d26/d36/c80 to d5/d9/d14/d26/d39/c86 0 2026-03-10T14:08:29.789 INFO:tasks.workunit.client.0.vm03.stdout:0/464: creat d3/d11/d2c/d4a/f95 x:0 0 0 2026-03-10T14:08:29.794 INFO:tasks.workunit.client.0.vm03.stdout:6/460: getdents d8/d1b/d1c 0 2026-03-10T14:08:29.796 INFO:tasks.workunit.client.0.vm03.stdout:4/514: write d5/d9/f11 [4844749,114875] 0 2026-03-10T14:08:29.804 INFO:tasks.workunit.client.0.vm03.stdout:5/581: link d4/d6/de/f14 d4/d16/d19/d4a/fbd 0 2026-03-10T14:08:29.806 INFO:tasks.workunit.client.0.vm03.stdout:1/484: write d0/d2/f46 [4727314,57104] 0 2026-03-10T14:08:29.820 INFO:tasks.workunit.client.0.vm03.stdout:4/515: creat d5/d9/db/d19/d38/d53/d55/fa8 x:0 0 0 2026-03-10T14:08:29.825 INFO:tasks.workunit.client.0.vm03.stdout:5/582: rmdir d4/d40/d4e 39 2026-03-10T14:08:29.831 INFO:tasks.workunit.client.0.vm03.stdout:8/497: creat da/d3c/f9e x:0 0 0 2026-03-10T14:08:29.836 INFO:tasks.workunit.client.0.vm03.stdout:2/463: sync 2026-03-10T14:08:29.850 INFO:tasks.workunit.client.0.vm03.stdout:3/483: write d1d/f32 [2197633,120243] 0 2026-03-10T14:08:29.853 INFO:tasks.workunit.client.0.vm03.stdout:1/485: write d0/d42/f66 [359351,104947] 0 2026-03-10T14:08:29.855 INFO:tasks.workunit.client.0.vm03.stdout:0/465: write d3/d16/d21/d3c/f5b [411318,29119] 0 2026-03-10T14:08:29.861 INFO:tasks.workunit.client.0.vm03.stdout:4/516: read d5/d9/db/f20 [8123949,53237] 0 2026-03-10T14:08:29.863 INFO:tasks.workunit.client.0.vm03.stdout:4/517: dread d5/d9/f31 [0,4194304] 0 2026-03-10T14:08:29.866 INFO:tasks.workunit.client.0.vm03.stdout:8/498: mknod da/d36/c9f 0 2026-03-10T14:08:29.867 INFO:tasks.workunit.client.0.vm03.stdout:8/499: dread - da/d3a/d44/d64/f91 zero size 
2026-03-10T14:08:29.868 INFO:tasks.workunit.client.0.vm03.stdout:9/494: getdents d2/d29/d33/d60/d8c 0 2026-03-10T14:08:29.877 INFO:tasks.workunit.client.0.vm03.stdout:3/484: unlink d1d/d29/f3f 0 2026-03-10T14:08:29.883 INFO:tasks.workunit.client.0.vm03.stdout:2/464: dwrite d5/d10/d1f/d4f/f55 [0,4194304] 0 2026-03-10T14:08:29.888 INFO:tasks.workunit.client.0.vm03.stdout:1/486: write d0/d2/d34/f3e [556151,35637] 0 2026-03-10T14:08:29.895 INFO:tasks.workunit.client.0.vm03.stdout:4/518: symlink d5/d47/d62/la9 0 2026-03-10T14:08:29.900 INFO:tasks.workunit.client.0.vm03.stdout:9/495: mknod d2/d29/d33/d41/ca4 0 2026-03-10T14:08:29.904 INFO:tasks.workunit.client.0.vm03.stdout:9/496: dread d2/d29/d33/d60/d8c/f98 [0,4194304] 0 2026-03-10T14:08:29.907 INFO:tasks.workunit.client.0.vm03.stdout:7/394: getdents d5/d9/d14/d26/d36 0 2026-03-10T14:08:29.926 INFO:tasks.workunit.client.0.vm03.stdout:6/461: getdents d8/d3b 0 2026-03-10T14:08:29.929 INFO:tasks.workunit.client.0.vm03.stdout:5/583: creat d4/d6/fbe x:0 0 0 2026-03-10T14:08:29.929 INFO:tasks.workunit.client.0.vm03.stdout:8/500: creat da/d3c/d51/d75/fa0 x:0 0 0 2026-03-10T14:08:29.929 INFO:tasks.workunit.client.0.vm03.stdout:4/519: read d5/d9/db/f24 [2601805,66500] 0 2026-03-10T14:08:29.930 INFO:tasks.workunit.client.0.vm03.stdout:8/501: write f6 [2267042,81193] 0 2026-03-10T14:08:29.931 INFO:tasks.workunit.client.0.vm03.stdout:8/502: dread da/d24/f43 [0,4194304] 0 2026-03-10T14:08:29.931 INFO:tasks.workunit.client.0.vm03.stdout:8/503: fdatasync da/f16 0 2026-03-10T14:08:29.942 INFO:tasks.workunit.client.0.vm03.stdout:3/485: mknod d1d/d33/d47/c9a 0 2026-03-10T14:08:29.945 INFO:tasks.workunit.client.0.vm03.stdout:0/466: creat d3/d16/d21/f96 x:0 0 0 2026-03-10T14:08:29.946 INFO:tasks.workunit.client.0.vm03.stdout:5/584: dread - d4/d13/d1f/d84/f99 zero size 2026-03-10T14:08:29.948 INFO:tasks.workunit.client.0.vm03.stdout:4/520: mkdir d5/d9/db/d19/d38/d7b/daa 0 2026-03-10T14:08:29.948 
INFO:tasks.workunit.client.0.vm03.stdout:8/504: creat da/d36/d40/d50/d70/d7d/fa1 x:0 0 0 2026-03-10T14:08:29.949 INFO:tasks.workunit.client.0.vm03.stdout:3/486: creat d1d/d29/f9b x:0 0 0 2026-03-10T14:08:29.952 INFO:tasks.workunit.client.0.vm03.stdout:1/487: truncate d0/d2/d34/f56 2379062 0 2026-03-10T14:08:29.954 INFO:tasks.workunit.client.0.vm03.stdout:0/467: fdatasync d3/d46/f55 0 2026-03-10T14:08:29.959 INFO:tasks.workunit.client.0.vm03.stdout:9/497: dwrite d2/d14/d2b/d79/d81/f9f [4194304,4194304] 0 2026-03-10T14:08:29.969 INFO:tasks.workunit.client.0.vm03.stdout:8/505: mknod da/d24/d49/ca2 0 2026-03-10T14:08:29.969 INFO:tasks.workunit.client.0.vm03.stdout:8/506: chown da/d24/l27 210572 1 2026-03-10T14:08:29.971 INFO:tasks.workunit.client.0.vm03.stdout:2/465: rename d5/d10/d1f/d4f/f55 to d5/d10/d1c/d40/d59/d7b/f91 0 2026-03-10T14:08:29.975 INFO:tasks.workunit.client.0.vm03.stdout:5/585: creat d4/d40/dbc/fbf x:0 0 0 2026-03-10T14:08:29.979 INFO:tasks.workunit.client.0.vm03.stdout:8/507: creat da/d3a/fa3 x:0 0 0 2026-03-10T14:08:29.981 INFO:tasks.workunit.client.0.vm03.stdout:3/487: symlink d1d/d39/d51/d72/d99/l9c 0 2026-03-10T14:08:29.983 INFO:tasks.workunit.client.0.vm03.stdout:7/395: rename d5/d9/d14/d26/d36/d51/f6c to d5/d9/d14/d26/d36/d51/d7b/f87 0 2026-03-10T14:08:29.989 INFO:tasks.workunit.client.0.vm03.stdout:0/468: creat d3/d4d/d30/f97 x:0 0 0 2026-03-10T14:08:29.991 INFO:tasks.workunit.client.0.vm03.stdout:9/498: creat d2/d29/d33/fa5 x:0 0 0 2026-03-10T14:08:29.992 INFO:tasks.workunit.client.0.vm03.stdout:3/488: link d1d/d33/f80 d1d/d29/d41/d45/d95/f9d 0 2026-03-10T14:08:29.994 INFO:tasks.workunit.client.0.vm03.stdout:7/396: truncate d5/d9/d14/d26/d39/f45 1472374 0 2026-03-10T14:08:29.994 INFO:tasks.workunit.client.0.vm03.stdout:7/397: readlink d5/d9/d35/l57 0 2026-03-10T14:08:29.997 INFO:tasks.workunit.client.0.vm03.stdout:7/398: dread d5/f66 [0,4194304] 0 2026-03-10T14:08:29.997 INFO:tasks.workunit.client.0.vm03.stdout:7/399: dread - 
d5/d9/d14/d26/d39/f63 zero size 2026-03-10T14:08:29.998 INFO:tasks.workunit.client.0.vm03.stdout:7/400: chown d5/d9/d14/d26/l73 191179044 1 2026-03-10T14:08:30.000 INFO:tasks.workunit.client.0.vm03.stdout:3/489: creat d1d/d29/d41/d45/d5b/f9e x:0 0 0 2026-03-10T14:08:30.002 INFO:tasks.workunit.client.0.vm03.stdout:6/462: rename d8/d1b/d1c/d84 to d8/db/d12/d51/d8c 0 2026-03-10T14:08:30.002 INFO:tasks.workunit.client.0.vm03.stdout:6/463: stat d8/d11/d7a/l89 0 2026-03-10T14:08:30.005 INFO:tasks.workunit.client.0.vm03.stdout:8/508: getdents da/d36/d40/d50/d70 0 2026-03-10T14:08:30.007 INFO:tasks.workunit.client.0.vm03.stdout:3/490: symlink d1d/d59/l9f 0 2026-03-10T14:08:30.008 INFO:tasks.workunit.client.0.vm03.stdout:5/586: sync 2026-03-10T14:08:30.009 INFO:tasks.workunit.client.0.vm03.stdout:5/587: stat d4/f17 0 2026-03-10T14:08:30.009 INFO:tasks.workunit.client.0.vm03.stdout:5/588: read - d4/d13/d43/fbb zero size 2026-03-10T14:08:30.011 INFO:tasks.workunit.client.0.vm03.stdout:5/589: dread d4/d13/d1f/d8c/da7/fab [0,4194304] 0 2026-03-10T14:08:30.016 INFO:tasks.workunit.client.0.vm03.stdout:4/521: dwrite d5/d9/db/f24 [4194304,4194304] 0 2026-03-10T14:08:30.018 INFO:tasks.workunit.client.0.vm03.stdout:2/466: getdents d5/d10/d1c/d40/d59/d7b 0 2026-03-10T14:08:30.020 INFO:tasks.workunit.client.0.vm03.stdout:1/488: dwrite d0/d2/d34/f73 [0,4194304] 0 2026-03-10T14:08:30.045 INFO:tasks.workunit.client.0.vm03.stdout:0/469: write d3/d4d/f2a [2214,69523] 0 2026-03-10T14:08:30.048 INFO:tasks.workunit.client.0.vm03.stdout:7/401: mknod d5/d9/d14/d26/d36/d51/d7b/d83/c88 0 2026-03-10T14:08:30.051 INFO:tasks.workunit.client.0.vm03.stdout:9/499: dwrite d2/f2c [0,4194304] 0 2026-03-10T14:08:30.054 INFO:tasks.workunit.client.0.vm03.stdout:3/491: creat d1d/d39/d51/fa0 x:0 0 0 2026-03-10T14:08:30.063 INFO:tasks.workunit.client.0.vm03.stdout:5/590: symlink d4/d13/d43/lc0 0 2026-03-10T14:08:30.064 INFO:tasks.workunit.client.0.vm03.stdout:5/591: write d4/d16/d19/f25 [57343,127843] 0 
2026-03-10T14:08:30.076 INFO:tasks.workunit.client.0.vm03.stdout:9/500: chown d2/d29/d33/d41/d46/l8f 5818 1 2026-03-10T14:08:30.076 INFO:tasks.workunit.client.0.vm03.stdout:3/492: mkdir d1d/d39/da1 0 2026-03-10T14:08:30.078 INFO:tasks.workunit.client.0.vm03.stdout:5/592: creat d4/d13/d1f/fc1 x:0 0 0 2026-03-10T14:08:30.078 INFO:tasks.workunit.client.0.vm03.stdout:5/593: stat d4/d6/de/f14 0 2026-03-10T14:08:30.079 INFO:tasks.workunit.client.0.vm03.stdout:1/489: getdents d0/d2/df/d16/d20 0 2026-03-10T14:08:30.082 INFO:tasks.workunit.client.0.vm03.stdout:8/509: creat da/d58/d6c/fa4 x:0 0 0 2026-03-10T14:08:30.086 INFO:tasks.workunit.client.0.vm03.stdout:1/490: mknod d0/d2/df/d27/d7e/d81/c98 0 2026-03-10T14:08:30.087 INFO:tasks.workunit.client.0.vm03.stdout:1/491: readlink d0/d2/df/d27/d7e/l89 0 2026-03-10T14:08:30.090 INFO:tasks.workunit.client.0.vm03.stdout:1/492: link d0/d18/f74 d0/d2/df/d27/f99 0 2026-03-10T14:08:30.097 INFO:tasks.workunit.client.0.vm03.stdout:1/493: rename d0/d2/fe to d0/d2/d34/f9a 0 2026-03-10T14:08:30.098 INFO:tasks.workunit.client.0.vm03.stdout:5/594: read d4/d13/d8f/f91 [398865,128332] 0 2026-03-10T14:08:30.099 INFO:tasks.workunit.client.0.vm03.stdout:1/494: dwrite d0/d18/d1d/f84 [0,4194304] 0 2026-03-10T14:08:30.103 INFO:tasks.workunit.client.0.vm03.stdout:9/501: sync 2026-03-10T14:08:30.103 INFO:tasks.workunit.client.0.vm03.stdout:6/464: dwrite d8/d11/d18/d54/f8b [4194304,4194304] 0 2026-03-10T14:08:30.109 INFO:tasks.workunit.client.0.vm03.stdout:6/465: chown d8/db/d12/d51/d5c/d60 13837 1 2026-03-10T14:08:30.113 INFO:tasks.workunit.client.0.vm03.stdout:9/502: dwrite d2/f2c [0,4194304] 0 2026-03-10T14:08:30.121 INFO:tasks.workunit.client.0.vm03.stdout:7/402: dread d5/d9/f22 [0,4194304] 0 2026-03-10T14:08:30.144 INFO:tasks.workunit.client.0.vm03.stdout:5/595: dread d4/d13/f4b [4194304,4194304] 0 2026-03-10T14:08:30.144 INFO:tasks.workunit.client.0.vm03.stdout:5/596: chown d4/d16/c1e 589 1 2026-03-10T14:08:30.147 
INFO:tasks.workunit.client.0.vm03.stdout:4/522: dwrite d5/fe [0,4194304] 0 2026-03-10T14:08:30.151 INFO:tasks.workunit.client.0.vm03.stdout:2/467: write d5/d10/f22 [934458,129644] 0 2026-03-10T14:08:30.155 INFO:tasks.workunit.client.0.vm03.stdout:4/523: dwrite d5/d9/db/d19/d34/f5d [0,4194304] 0 2026-03-10T14:08:30.158 INFO:tasks.workunit.client.0.vm03.stdout:6/466: creat d8/db/d12/d51/d5c/f8d x:0 0 0 2026-03-10T14:08:30.164 INFO:tasks.workunit.client.0.vm03.stdout:0/470: write d3/d46/f8c [3437399,91136] 0 2026-03-10T14:08:30.168 INFO:tasks.workunit.client.0.vm03.stdout:3/493: dwrite d1d/d29/d41/d45/d95/f9d [0,4194304] 0 2026-03-10T14:08:30.172 INFO:tasks.workunit.client.0.vm03.stdout:9/503: creat d2/d29/d33/d55/d72/d9d/fa6 x:0 0 0 2026-03-10T14:08:30.180 INFO:tasks.workunit.client.0.vm03.stdout:2/468: rename d5/d10/d31/d8f to d5/d10/d1c/d40/d92 0 2026-03-10T14:08:30.186 INFO:tasks.workunit.client.0.vm03.stdout:4/524: fsync d5/d47/f46 0 2026-03-10T14:08:30.186 INFO:tasks.workunit.client.0.vm03.stdout:4/525: dwrite d5/fe [0,4194304] 0 2026-03-10T14:08:30.191 INFO:tasks.workunit.client.0.vm03.stdout:8/510: write da/d58/d5f/f71 [221172,70093] 0 2026-03-10T14:08:30.196 INFO:tasks.workunit.client.0.vm03.stdout:0/471: mknod d3/d11/d2c/d4a/d4b/c98 0 2026-03-10T14:08:30.209 INFO:tasks.workunit.client.0.vm03.stdout:3/494: dread fb [0,4194304] 0 2026-03-10T14:08:30.209 INFO:tasks.workunit.client.0.vm03.stdout:3/495: chown d1d/d39 9148373 1 2026-03-10T14:08:30.210 INFO:tasks.workunit.client.0.vm03.stdout:3/496: fsync d1d/d39/d51/f82 0 2026-03-10T14:08:30.227 INFO:tasks.workunit.client.0.vm03.stdout:7/403: dread d5/d9/f7a [0,4194304] 0 2026-03-10T14:08:30.234 INFO:tasks.workunit.client.0.vm03.stdout:6/467: write d8/d1b/f5f [310290,113389] 0 2026-03-10T14:08:30.236 INFO:tasks.workunit.client.0.vm03.stdout:6/468: dread d8/db/d12/f26 [4194304,4194304] 0 2026-03-10T14:08:30.236 INFO:tasks.workunit.client.0.vm03.stdout:6/469: write d8/db/d12/f40 [1498004,66039] 0 
2026-03-10T14:08:30.241 INFO:tasks.workunit.client.0.vm03.stdout:9/504: rmdir d2/d14/d2b 39 2026-03-10T14:08:30.254 INFO:tasks.workunit.client.0.vm03.stdout:6/470: dread d8/db/d12/f7c [0,4194304] 0 2026-03-10T14:08:30.256 INFO:tasks.workunit.client.0.vm03.stdout:6/471: read d8/db/d12/d51/d5c/f69 [42056,124602] 0 2026-03-10T14:08:30.258 INFO:tasks.workunit.client.0.vm03.stdout:5/597: creat d4/fc2 x:0 0 0 2026-03-10T14:08:30.259 INFO:tasks.workunit.client.0.vm03.stdout:2/469: mkdir d5/d10/d1c/d93 0 2026-03-10T14:08:30.260 INFO:tasks.workunit.client.0.vm03.stdout:2/470: fsync d5/d2a/f45 0 2026-03-10T14:08:30.263 INFO:tasks.workunit.client.0.vm03.stdout:2/471: dwrite d5/d10/d1c/d54/d5f/f82 [0,4194304] 0 2026-03-10T14:08:30.264 INFO:tasks.workunit.client.0.vm03.stdout:2/472: readlink d5/d10/d1f/l48 0 2026-03-10T14:08:30.270 INFO:tasks.workunit.client.0.vm03.stdout:3/497: creat d1d/d29/d41/d45/d95/fa2 x:0 0 0 2026-03-10T14:08:30.282 INFO:tasks.workunit.client.0.vm03.stdout:7/404: truncate d5/d9/f22 1588373 0 2026-03-10T14:08:30.287 INFO:tasks.workunit.client.0.vm03.stdout:6/472: creat d8/db/d49/d6c/f8e x:0 0 0 2026-03-10T14:08:30.291 INFO:tasks.workunit.client.0.vm03.stdout:5/598: creat d4/d16/d19/d23/db8/fc3 x:0 0 0 2026-03-10T14:08:30.291 INFO:tasks.workunit.client.0.vm03.stdout:5/599: dread - d4/d16/d19/d4a/faa zero size 2026-03-10T14:08:30.296 INFO:tasks.workunit.client.0.vm03.stdout:2/473: mknod d5/d35/c94 0 2026-03-10T14:08:30.296 INFO:tasks.workunit.client.0.vm03.stdout:1/495: symlink d0/d2/d34/l9b 0 2026-03-10T14:08:30.300 INFO:tasks.workunit.client.0.vm03.stdout:7/405: mkdir d5/d9/d14/d26/d5f/d89 0 2026-03-10T14:08:30.304 INFO:tasks.workunit.client.0.vm03.stdout:9/505: fsync d2/d14/d2b/d43/f56 0 2026-03-10T14:08:30.340 INFO:tasks.workunit.client.0.vm03.stdout:8/511: dwrite da/d3a/d44/d64/f7b [0,4194304] 0 2026-03-10T14:08:30.342 INFO:tasks.workunit.client.0.vm03.stdout:8/512: dread da/f1f [0,4194304] 0 2026-03-10T14:08:30.344 
INFO:tasks.workunit.client.0.vm03.stdout:5/600: truncate d4/d16/d19/f79 1665586 0 2026-03-10T14:08:30.345 INFO:tasks.workunit.client.0.vm03.stdout:5/601: chown d4/d13/d1f/d8c/fa4 3104149 1 2026-03-10T14:08:30.345 INFO:tasks.workunit.client.0.vm03.stdout:8/513: dwrite da/f16 [0,4194304] 0 2026-03-10T14:08:30.357 INFO:tasks.workunit.client.0.vm03.stdout:7/406: fdatasync d5/d9/d14/d26/d5f/f68 0 2026-03-10T14:08:30.362 INFO:tasks.workunit.client.0.vm03.stdout:4/526: getdents d5/d47/d62/d87 0 2026-03-10T14:08:30.364 INFO:tasks.workunit.client.0.vm03.stdout:4/527: dread d5/d9/db/f48 [0,4194304] 0 2026-03-10T14:08:30.372 INFO:tasks.workunit.client.0.vm03.stdout:0/472: getdents d3/d11/d2c/d4a/d7b 0 2026-03-10T14:08:30.378 INFO:tasks.workunit.client.0.vm03.stdout:8/514: mkdir da/d36/d4d/da5 0 2026-03-10T14:08:30.378 INFO:tasks.workunit.client.0.vm03.stdout:8/515: fsync f2 0 2026-03-10T14:08:30.378 INFO:tasks.workunit.client.0.vm03.stdout:3/498: link l16 d1d/d33/d47/d53/d68/la3 0 2026-03-10T14:08:30.378 INFO:tasks.workunit.client.0.vm03.stdout:2/474: mknod d5/d10/d1c/d40/d59/d85/c95 0 2026-03-10T14:08:30.389 INFO:tasks.workunit.client.0.vm03.stdout:6/473: link d8/d1b/d1c/c1d d8/db/d49/c8f 0 2026-03-10T14:08:30.397 INFO:tasks.workunit.client.0.vm03.stdout:3/499: fsync d1d/f2b 0 2026-03-10T14:08:30.397 INFO:tasks.workunit.client.0.vm03.stdout:3/500: chown d1d/d29 0 1 2026-03-10T14:08:30.397 INFO:tasks.workunit.client.0.vm03.stdout:3/501: chown d1d/d39/d51/d72 1984 1 2026-03-10T14:08:30.404 INFO:tasks.workunit.client.0.vm03.stdout:9/506: rename d2/d29/d33/d41/d5c to d2/d29/da7 0 2026-03-10T14:08:30.405 INFO:tasks.workunit.client.0.vm03.stdout:9/507: chown d2/d14/d2b/d79/d81/f9f 16663 1 2026-03-10T14:08:30.422 INFO:tasks.workunit.client.0.vm03.stdout:3/502: rmdir d1d/d29/d41 39 2026-03-10T14:08:30.427 INFO:tasks.workunit.client.0.vm03.stdout:5/602: dwrite d4/f93 [0,4194304] 0 2026-03-10T14:08:30.431 INFO:tasks.workunit.client.0.vm03.stdout:5/603: dwrite d4/d13/d1f/fc1 
[0,4194304] 0 2026-03-10T14:08:30.433 INFO:tasks.workunit.client.0.vm03.stdout:5/604: truncate d4/d16/d19/d6e/f95 601576 0 2026-03-10T14:08:30.434 INFO:tasks.workunit.client.0.vm03.stdout:5/605: write d4/d16/d19/d4a/faa [960507,18101] 0 2026-03-10T14:08:30.436 INFO:tasks.workunit.client.0.vm03.stdout:0/473: rename d3/d16/f39 to d3/d16/d21/d3c/f99 0 2026-03-10T14:08:30.437 INFO:tasks.workunit.client.0.vm03.stdout:1/496: write d0/d2/df/d16/d20/f5a [3216123,54662] 0 2026-03-10T14:08:30.437 INFO:tasks.workunit.client.0.vm03.stdout:1/497: chown d0/d2/df/d27 7 1 2026-03-10T14:08:30.445 INFO:tasks.workunit.client.0.vm03.stdout:9/508: unlink d2/d14/d2b/c75 0 2026-03-10T14:08:30.451 INFO:tasks.workunit.client.0.vm03.stdout:7/407: write d5/d9/f17 [1578610,92572] 0 2026-03-10T14:08:30.452 INFO:tasks.workunit.client.0.vm03.stdout:2/475: link d5/d10/d1f/f5e d5/d10/d1c/d50/d74/d83/f96 0 2026-03-10T14:08:30.452 INFO:tasks.workunit.client.0.vm03.stdout:2/476: truncate d5/d2a/f45 4332091 0 2026-03-10T14:08:30.453 INFO:tasks.workunit.client.0.vm03.stdout:2/477: write d5/d10/d17/f8d [569767,13156] 0 2026-03-10T14:08:30.460 INFO:tasks.workunit.client.0.vm03.stdout:2/478: dread d5/d35/f49 [0,4194304] 0 2026-03-10T14:08:30.461 INFO:tasks.workunit.client.0.vm03.stdout:5/606: rmdir d4/d13/d1f/d8c/da7 39 2026-03-10T14:08:30.463 INFO:tasks.workunit.client.0.vm03.stdout:8/516: write da/d24/f43 [543908,59880] 0 2026-03-10T14:08:30.471 INFO:tasks.workunit.client.0.vm03.stdout:4/528: truncate d5/fe 3829539 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:7/408: mknod d5/d9/d14/d26/d39/c8a 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:3/503: unlink d1d/d39/d51/d72/c89 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:3/504: readlink d1d/d39/d51/d72/d99/l9c 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:9/509: unlink d2/f54 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:8/517: symlink da/d36/la6 0 
2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:0/474: mkdir d3/d16/d21/d9a 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:0/475: dwrite d3/d11/d2c/d4a/f95 [0,4194304] 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:2/479: mknod d5/d2a/d8a/c97 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:8/518: symlink da/d36/d4d/la7 0 2026-03-10T14:08:30.489 INFO:tasks.workunit.client.0.vm03.stdout:3/505: rmdir d1d/d33/d65 39 2026-03-10T14:08:30.490 INFO:tasks.workunit.client.0.vm03.stdout:1/498: link d0/c59 d0/d2/df/d16/d20/c9c 0 2026-03-10T14:08:30.498 INFO:tasks.workunit.client.0.vm03.stdout:4/529: dread d5/d9/db/f20 [4194304,4194304] 0 2026-03-10T14:08:30.502 INFO:tasks.workunit.client.0.vm03.stdout:4/530: dwrite d5/d9/db/d19/d38/d53/f59 [0,4194304] 0 2026-03-10T14:08:30.512 INFO:tasks.workunit.client.0.vm03.stdout:7/409: fsync d5/d9/d35/f52 0 2026-03-10T14:08:30.514 INFO:tasks.workunit.client.0.vm03.stdout:9/510: read d2/d29/d33/d41/f50 [415893,10349] 0 2026-03-10T14:08:30.516 INFO:tasks.workunit.client.0.vm03.stdout:1/499: fdatasync d0/f24 0 2026-03-10T14:08:30.520 INFO:tasks.workunit.client.0.vm03.stdout:1/500: dwrite d0/d2/df/f43 [4194304,4194304] 0 2026-03-10T14:08:30.532 INFO:tasks.workunit.client.0.vm03.stdout:6/474: dwrite d8/d11/d18/f34 [4194304,4194304] 0 2026-03-10T14:08:30.548 INFO:tasks.workunit.client.0.vm03.stdout:7/410: mknod d5/d9/d14/d21/d6f/c8b 0 2026-03-10T14:08:30.552 INFO:tasks.workunit.client.0.vm03.stdout:3/506: rmdir d1d/d29/d41/d45/d95 39 2026-03-10T14:08:30.558 INFO:tasks.workunit.client.0.vm03.stdout:6/475: dread - d8/db/d12/f72 zero size 2026-03-10T14:08:30.559 INFO:tasks.workunit.client.0.vm03.stdout:3/507: symlink d1d/d59/la4 0 2026-03-10T14:08:30.560 INFO:tasks.workunit.client.0.vm03.stdout:1/501: mknod d0/d2/df/c9d 0 2026-03-10T14:08:30.564 INFO:tasks.workunit.client.0.vm03.stdout:5/607: sync 2026-03-10T14:08:30.564 INFO:tasks.workunit.client.0.vm03.stdout:8/519: sync 
2026-03-10T14:08:30.566 INFO:tasks.workunit.client.0.vm03.stdout:8/520: dread - da/d36/d40/d50/d70/d7d/fa1 zero size 2026-03-10T14:08:30.570 INFO:tasks.workunit.client.0.vm03.stdout:3/508: creat d1d/d33/d65/d48/fa5 x:0 0 0 2026-03-10T14:08:30.571 INFO:tasks.workunit.client.0.vm03.stdout:8/521: dwrite da/d24/f43 [0,4194304] 0 2026-03-10T14:08:30.572 INFO:tasks.workunit.client.0.vm03.stdout:8/522: stat da/d58 0 2026-03-10T14:08:30.572 INFO:tasks.workunit.client.0.vm03.stdout:8/523: chown da/d24/f32 51513576 1 2026-03-10T14:08:30.573 INFO:tasks.workunit.client.0.vm03.stdout:8/524: chown da/d36/d40/d50/d70/d99 3599 1 2026-03-10T14:08:30.575 INFO:tasks.workunit.client.0.vm03.stdout:5/608: dwrite d4/d35/f8e [0,4194304] 0 2026-03-10T14:08:30.579 INFO:tasks.workunit.client.0.vm03.stdout:5/609: chown d4/c5e 4574568 1 2026-03-10T14:08:30.579 INFO:tasks.workunit.client.0.vm03.stdout:5/610: stat d4/d16/d19/d23/db8/fb0 0 2026-03-10T14:08:30.580 INFO:tasks.workunit.client.0.vm03.stdout:9/511: sync 2026-03-10T14:08:30.580 INFO:tasks.workunit.client.0.vm03.stdout:7/411: sync 2026-03-10T14:08:30.581 INFO:tasks.workunit.client.0.vm03.stdout:3/509: dwrite d1d/f4a [0,4194304] 0 2026-03-10T14:08:30.584 INFO:tasks.workunit.client.0.vm03.stdout:3/510: dread - d1d/d39/d51/fa0 zero size 2026-03-10T14:08:30.593 INFO:tasks.workunit.client.0.vm03.stdout:7/412: dwrite d5/d9/d35/f7c [0,4194304] 0 2026-03-10T14:08:30.611 INFO:tasks.workunit.client.0.vm03.stdout:8/525: rmdir da/d36/d4d 39 2026-03-10T14:08:30.615 INFO:tasks.workunit.client.0.vm03.stdout:2/480: write d5/d10/d1c/f3c [2508640,29545] 0 2026-03-10T14:08:30.618 INFO:tasks.workunit.client.0.vm03.stdout:0/476: dwrite d3/d16/f3e [0,4194304] 0 2026-03-10T14:08:30.624 INFO:tasks.workunit.client.0.vm03.stdout:3/511: rename d1d/d29/d8a to d1d/d29/d41/d45/d55/d6e/da6 0 2026-03-10T14:08:30.635 INFO:tasks.workunit.client.0.vm03.stdout:0/477: dwrite d3/d46/d54/d79/f88 [0,4194304] 0 2026-03-10T14:08:30.635 
INFO:tasks.workunit.client.0.vm03.stdout:7/413: creat d5/d9/d14/d26/d5f/f8c x:0 0 0 2026-03-10T14:08:30.641 INFO:tasks.workunit.client.0.vm03.stdout:1/502: creat d0/d2/d71/d90/f9e x:0 0 0 2026-03-10T14:08:30.648 INFO:tasks.workunit.client.0.vm03.stdout:6/476: getdents d8/db/d12/d51/d5c/d60 0 2026-03-10T14:08:30.648 INFO:tasks.workunit.client.0.vm03.stdout:2/481: creat d5/d10/d1c/d50/f98 x:0 0 0 2026-03-10T14:08:30.650 INFO:tasks.workunit.client.0.vm03.stdout:5/611: symlink d4/lc4 0 2026-03-10T14:08:30.652 INFO:tasks.workunit.client.0.vm03.stdout:9/512: fdatasync d2/d14/f1b 0 2026-03-10T14:08:30.658 INFO:tasks.workunit.client.0.vm03.stdout:1/503: mkdir d0/d42/d9f 0 2026-03-10T14:08:30.659 INFO:tasks.workunit.client.0.vm03.stdout:6/477: creat d8/db/d49/d6c/f90 x:0 0 0 2026-03-10T14:08:30.660 INFO:tasks.workunit.client.0.vm03.stdout:6/478: dread - d8/db/d12/d51/d8c/f8a zero size 2026-03-10T14:08:30.660 INFO:tasks.workunit.client.0.vm03.stdout:6/479: fdatasync d8/d1b/f5f 0 2026-03-10T14:08:30.661 INFO:tasks.workunit.client.0.vm03.stdout:8/526: getdents da/d3c/d51/d8b 0 2026-03-10T14:08:30.663 INFO:tasks.workunit.client.0.vm03.stdout:8/527: dread da/d24/f52 [0,4194304] 0 2026-03-10T14:08:30.668 INFO:tasks.workunit.client.0.vm03.stdout:2/482: rmdir d5/d10/d17 39 2026-03-10T14:08:30.668 INFO:tasks.workunit.client.0.vm03.stdout:2/483: chown d5/d10/d1c/d40/d59/d85 272 1 2026-03-10T14:08:30.673 INFO:tasks.workunit.client.0.vm03.stdout:3/512: unlink d1d/d29/d41/d45/d55/d6e/c76 0 2026-03-10T14:08:30.678 INFO:tasks.workunit.client.0.vm03.stdout:1/504: unlink d0/d2/df/d27/d7e/f92 0 2026-03-10T14:08:30.683 INFO:tasks.workunit.client.0.vm03.stdout:6/480: truncate d8/db/d12/d51/d5c/f68 4998326 0 2026-03-10T14:08:30.686 INFO:tasks.workunit.client.0.vm03.stdout:6/481: dwrite d8/db/d12/d51/d8c/f8a [0,4194304] 0 2026-03-10T14:08:30.699 INFO:tasks.workunit.client.0.vm03.stdout:6/482: chown d8/d11/d18/d79/d80/c85 0 1 2026-03-10T14:08:30.699 INFO:tasks.workunit.client.0.vm03.stdout:5/612: 
mkdir d4/d16/d19/d6e/da3/dc5 0 2026-03-10T14:08:30.704 INFO:tasks.workunit.client.0.vm03.stdout:9/513: unlink d2/l10 0 2026-03-10T14:08:30.706 INFO:tasks.workunit.client.0.vm03.stdout:4/531: write d5/d9/db/f67 [315999,128002] 0 2026-03-10T14:08:30.710 INFO:tasks.workunit.client.0.vm03.stdout:3/513: read d1d/d33/f3a [1139875,118070] 0 2026-03-10T14:08:30.714 INFO:tasks.workunit.client.0.vm03.stdout:7/414: link d5/d9/d14/d26/d5f/f78 d5/d9/d14/d26/f8d 0 2026-03-10T14:08:30.714 INFO:tasks.workunit.client.0.vm03.stdout:7/415: chown d5/d9/d14/c13 140 1 2026-03-10T14:08:30.715 INFO:tasks.workunit.client.0.vm03.stdout:7/416: readlink d5/d9/d35/l57 0 2026-03-10T14:08:30.766 INFO:tasks.workunit.client.0.vm03.stdout:2/484: getdents d5/d10/d1c/d50/d74 0 2026-03-10T14:08:30.767 INFO:tasks.workunit.client.0.vm03.stdout:0/478: fsync d3/d16/f31 0 2026-03-10T14:08:30.771 INFO:tasks.workunit.client.0.vm03.stdout:3/514: link d1d/c1e d1d/d39/d51/d72/d99/ca7 0 2026-03-10T14:08:30.772 INFO:tasks.workunit.client.0.vm03.stdout:2/485: truncate d5/d10/d1c/f7e 1313638 0 2026-03-10T14:08:30.774 INFO:tasks.workunit.client.0.vm03.stdout:9/514: getdents d2/d29/d33/d55/d72/d9d 0 2026-03-10T14:08:30.776 INFO:tasks.workunit.client.0.vm03.stdout:8/528: write da/f15 [5605236,69172] 0 2026-03-10T14:08:30.776 INFO:tasks.workunit.client.0.vm03.stdout:8/529: read f5 [1289658,8708] 0 2026-03-10T14:08:30.777 INFO:tasks.workunit.client.0.vm03.stdout:8/530: truncate da/d3a/d44/d64/f68 4895661 0 2026-03-10T14:08:30.786 INFO:tasks.workunit.client.0.vm03.stdout:8/531: dread f2 [0,4194304] 0 2026-03-10T14:08:30.791 INFO:tasks.workunit.client.0.vm03.stdout:3/515: sync 2026-03-10T14:08:30.791 INFO:tasks.workunit.client.0.vm03.stdout:3/516: stat d1d/d39/d51/fa0 0 2026-03-10T14:08:30.796 INFO:tasks.workunit.client.0.vm03.stdout:1/505: write d0/d2/d34/f5f [662420,38969] 0 2026-03-10T14:08:30.797 INFO:tasks.workunit.client.0.vm03.stdout:6/483: write d8/db/d49/d6c/d32/f41 [3745219,85158] 0 2026-03-10T14:08:30.798 
INFO:tasks.workunit.client.0.vm03.stdout:5/613: write d4/d13/d1f/f74 [2371840,90476] 0 2026-03-10T14:08:30.802 INFO:tasks.workunit.client.0.vm03.stdout:5/614: dwrite d4/d13/d1f/d8c/fa6 [0,4194304] 0 2026-03-10T14:08:30.804 INFO:tasks.workunit.client.0.vm03.stdout:5/615: write d4/d16/d19/d23/db8/fb0 [808329,32833] 0 2026-03-10T14:08:30.807 INFO:tasks.workunit.client.0.vm03.stdout:5/616: dwrite d4/d6/fbe [0,4194304] 0 2026-03-10T14:08:30.824 INFO:tasks.workunit.client.0.vm03.stdout:4/532: write f2 [782650,13099] 0 2026-03-10T14:08:30.830 INFO:tasks.workunit.client.0.vm03.stdout:7/417: write d5/d9/d14/d26/f5c [485132,23361] 0 2026-03-10T14:08:30.834 INFO:tasks.workunit.client.0.vm03.stdout:2/486: mknod d5/c99 0 2026-03-10T14:08:30.836 INFO:tasks.workunit.client.0.vm03.stdout:9/515: mknod d2/d29/d9a/ca8 0 2026-03-10T14:08:30.840 INFO:tasks.workunit.client.0.vm03.stdout:6/484: symlink d8/db/d49/d76/l91 0 2026-03-10T14:08:30.846 INFO:tasks.workunit.client.0.vm03.stdout:0/479: link d3/d4d/d30/f69 d3/d16/d21/d3c/f9b 0 2026-03-10T14:08:30.850 INFO:tasks.workunit.client.0.vm03.stdout:4/533: mkdir d5/d47/d5b/dab 0 2026-03-10T14:08:30.856 INFO:tasks.workunit.client.0.vm03.stdout:7/418: unlink d5/d9/d3e/l40 0 2026-03-10T14:08:30.860 INFO:tasks.workunit.client.0.vm03.stdout:6/485: rmdir d8/d11/d18/d54 39 2026-03-10T14:08:30.860 INFO:tasks.workunit.client.0.vm03.stdout:6/486: chown d8/d1b 398 1 2026-03-10T14:08:30.873 INFO:tasks.workunit.client.0.vm03.stdout:3/517: dwrite f17 [4194304,4194304] 0 2026-03-10T14:08:30.882 INFO:tasks.workunit.client.0.vm03.stdout:1/506: write d0/d2/d34/f9a [665226,92960] 0 2026-03-10T14:08:30.886 INFO:tasks.workunit.client.0.vm03.stdout:6/487: creat d8/db/d12/d51/d5c/d60/f92 x:0 0 0 2026-03-10T14:08:30.887 INFO:tasks.workunit.client.0.vm03.stdout:6/488: write f3 [7006221,17581] 0 2026-03-10T14:08:30.889 INFO:tasks.workunit.client.0.vm03.stdout:0/480: mknod d3/d11/c9c 0 2026-03-10T14:08:30.890 INFO:tasks.workunit.client.0.vm03.stdout:8/532: write 
da/d36/d6a/f6b [790345,113650] 0 2026-03-10T14:08:30.894 INFO:tasks.workunit.client.0.vm03.stdout:9/516: dwrite d2/d14/d2b/d43/f45 [0,4194304] 0 2026-03-10T14:08:30.895 INFO:tasks.workunit.client.0.vm03.stdout:0/481: dwrite d3/d16/f34 [0,4194304] 0 2026-03-10T14:08:30.897 INFO:tasks.workunit.client.0.vm03.stdout:9/517: chown d2/d29/d38/c94 26706131 1 2026-03-10T14:08:30.910 INFO:tasks.workunit.client.0.vm03.stdout:2/487: fsync d5/d10/d17/f18 0 2026-03-10T14:08:30.915 INFO:tasks.workunit.client.0.vm03.stdout:1/507: creat d0/d2/df/d27/d7e/fa0 x:0 0 0 2026-03-10T14:08:30.916 INFO:tasks.workunit.client.0.vm03.stdout:1/508: chown d0/d2/df/f76 2 1 2026-03-10T14:08:30.919 INFO:tasks.workunit.client.0.vm03.stdout:1/509: dwrite d0/d2/d71/d90/f9e [0,4194304] 0 2026-03-10T14:08:30.933 INFO:tasks.workunit.client.0.vm03.stdout:5/617: rename d4/d16/d19/d23/db8/f77 to d4/d40/fc6 0 2026-03-10T14:08:30.936 INFO:tasks.workunit.client.0.vm03.stdout:3/518: dread d1d/f62 [0,4194304] 0 2026-03-10T14:08:30.937 INFO:tasks.workunit.client.0.vm03.stdout:4/534: dwrite d5/d9/db/f29 [0,4194304] 0 2026-03-10T14:08:30.940 INFO:tasks.workunit.client.0.vm03.stdout:6/489: creat d8/db/d49/d6c/d83/f93 x:0 0 0 2026-03-10T14:08:30.947 INFO:tasks.workunit.client.0.vm03.stdout:8/533: rmdir da/d3c/d51 39 2026-03-10T14:08:30.947 INFO:tasks.workunit.client.0.vm03.stdout:8/534: chown da/d36/c3e 546519 1 2026-03-10T14:08:30.947 INFO:tasks.workunit.client.0.vm03.stdout:8/535: stat da/d3a 0 2026-03-10T14:08:30.947 INFO:tasks.workunit.client.0.vm03.stdout:8/536: write da/d58/d6c/f76 [417504,125309] 0 2026-03-10T14:08:30.947 INFO:tasks.workunit.client.0.vm03.stdout:0/482: mknod d3/d11/d2c/d4a/d4b/d89/c9d 0 2026-03-10T14:08:30.947 INFO:tasks.workunit.client.0.vm03.stdout:0/483: chown d3/d11/d2c/d4a 1938170924 1 2026-03-10T14:08:30.956 INFO:tasks.workunit.client.0.vm03.stdout:9/518: fdatasync d2/d29/d33/d41/f50 0 2026-03-10T14:08:30.965 INFO:tasks.workunit.client.0.vm03.stdout:7/419: rename d5/d9/d14/d21/d28/f37 to 
d5/d9/d14/d26/d5f/f8e 0 2026-03-10T14:08:30.966 INFO:tasks.workunit.client.0.vm03.stdout:5/618: stat d4/d40/f5b 0 2026-03-10T14:08:30.967 INFO:tasks.workunit.client.0.vm03.stdout:5/619: read d4/d35/f92 [350784,53224] 0 2026-03-10T14:08:30.967 INFO:tasks.workunit.client.0.vm03.stdout:5/620: fsync d4/d16/f2d 0 2026-03-10T14:08:30.968 INFO:tasks.workunit.client.0.vm03.stdout:5/621: truncate d4/d16/d19/d6e/f57 4314546 0 2026-03-10T14:08:30.972 INFO:tasks.workunit.client.0.vm03.stdout:8/537: mkdir da/d58/da8 0 2026-03-10T14:08:30.975 INFO:tasks.workunit.client.0.vm03.stdout:8/538: write da/d58/d6c/fa4 [657777,76951] 0 2026-03-10T14:08:30.975 INFO:tasks.workunit.client.0.vm03.stdout:8/539: dread da/d24/f52 [0,4194304] 0 2026-03-10T14:08:30.976 INFO:tasks.workunit.client.0.vm03.stdout:0/484: sync 2026-03-10T14:08:30.978 INFO:tasks.workunit.client.0.vm03.stdout:3/519: dread d1d/d29/d41/d45/d55/f81 [0,4194304] 0 2026-03-10T14:08:30.981 INFO:tasks.workunit.client.0.vm03.stdout:8/540: dwrite da/d3c/f9e [0,4194304] 0 2026-03-10T14:08:30.985 INFO:tasks.workunit.client.0.vm03.stdout:9/519: dwrite d2/fc [4194304,4194304] 0 2026-03-10T14:08:30.990 INFO:tasks.workunit.client.0.vm03.stdout:9/520: chown d2/d29/da7/c74 508636727 1 2026-03-10T14:08:31.000 INFO:tasks.workunit.client.0.vm03.stdout:2/488: rename d5/d10/d1f/f3a to d5/d10/d1c/d93/f9a 0 2026-03-10T14:08:31.011 INFO:tasks.workunit.client.0.vm03.stdout:4/535: creat d5/d9/db/d19/d99/fac x:0 0 0 2026-03-10T14:08:31.015 INFO:tasks.workunit.client.0.vm03.stdout:0/485: creat d3/d16/f9e x:0 0 0 2026-03-10T14:08:31.018 INFO:tasks.workunit.client.0.vm03.stdout:6/490: rename d8/db/df/f63 to d8/db/d49/d6c/d32/f94 0 2026-03-10T14:08:31.029 INFO:tasks.workunit.client.0.vm03.stdout:4/536: readlink d5/d47/l69 0 2026-03-10T14:08:31.029 INFO:tasks.workunit.client.0.vm03.stdout:0/486: mknod d3/d11/d2c/d4a/d4b/c9f 0 2026-03-10T14:08:31.029 INFO:tasks.workunit.client.0.vm03.stdout:3/520: symlink d1d/d39/da1/la8 0 2026-03-10T14:08:31.029 
INFO:tasks.workunit.client.0.vm03.stdout:3/521: fsync d1d/f66 0 2026-03-10T14:08:31.029 INFO:tasks.workunit.client.0.vm03.stdout:8/541: creat da/d36/d4d/da5/fa9 x:0 0 0 2026-03-10T14:08:31.029 INFO:tasks.workunit.client.0.vm03.stdout:6/491: mknod d8/db/d12/d51/d86/c95 0 2026-03-10T14:08:31.029 INFO:tasks.workunit.client.0.vm03.stdout:0/487: mkdir d3/d4d/da0 0 2026-03-10T14:08:31.029 INFO:tasks.workunit.client.0.vm03.stdout:0/488: readlink d3/d4d/l71 0 2026-03-10T14:08:31.029 INFO:tasks.workunit.client.0.vm03.stdout:3/522: unlink d1d/d29/d41/d45/d5b/d7e/c93 0 2026-03-10T14:08:31.030 INFO:tasks.workunit.client.0.vm03.stdout:2/489: getdents d5 0 2026-03-10T14:08:31.046 INFO:tasks.workunit.client.0.vm03.stdout:1/510: dwrite d0/d2/df/d16/d41/f68 [0,4194304] 0 2026-03-10T14:08:31.048 INFO:tasks.workunit.client.0.vm03.stdout:7/420: write d5/d9/d35/f69 [887917,60488] 0 2026-03-10T14:08:31.051 INFO:tasks.workunit.client.0.vm03.stdout:9/521: write d2/d14/f64 [280261,79403] 0 2026-03-10T14:08:31.052 INFO:tasks.workunit.client.0.vm03.stdout:5/622: dwrite d4/d6/de/f14 [0,4194304] 0 2026-03-10T14:08:31.065 INFO:tasks.workunit.client.0.vm03.stdout:6/492: symlink d8/db/d12/d51/d8c/l96 0 2026-03-10T14:08:31.071 INFO:tasks.workunit.client.0.vm03.stdout:4/537: getdents d5/d9/d2b 0 2026-03-10T14:08:31.074 INFO:tasks.workunit.client.0.vm03.stdout:1/511: rename d0/d2/d34/l9b to d0/d2/d71/la1 0 2026-03-10T14:08:31.075 INFO:tasks.workunit.client.0.vm03.stdout:7/421: fdatasync d5/d9/d14/d26/f38 0 2026-03-10T14:08:31.077 INFO:tasks.workunit.client.0.vm03.stdout:6/493: fsync d8/d11/d18/d54/f8b 0 2026-03-10T14:08:31.086 INFO:tasks.workunit.client.0.vm03.stdout:3/523: creat d1d/d29/d41/d45/d55/d6e/d83/d91/fa9 x:0 0 0 2026-03-10T14:08:31.086 INFO:tasks.workunit.client.0.vm03.stdout:8/542: symlink da/d3c/d51/d8b/laa 0 2026-03-10T14:08:31.086 INFO:tasks.workunit.client.0.vm03.stdout:8/543: fsync da/f62 0 2026-03-10T14:08:31.093 INFO:tasks.workunit.client.0.vm03.stdout:0/489: creat d3/fa1 x:0 0 0 
2026-03-10T14:08:31.093 INFO:tasks.workunit.client.0.vm03.stdout:0/490: dread - d3/d16/f9e zero size 2026-03-10T14:08:31.093 INFO:tasks.workunit.client.0.vm03.stdout:0/491: fdatasync d3/d46/d54/d79/f8d 0 2026-03-10T14:08:31.093 INFO:tasks.workunit.client.0.vm03.stdout:0/492: stat d3/f19 0 2026-03-10T14:08:31.096 INFO:tasks.workunit.client.0.vm03.stdout:1/512: dread d0/d2/df/d16/d20/f5a [0,4194304] 0 2026-03-10T14:08:31.099 INFO:tasks.workunit.client.0.vm03.stdout:3/524: fsync d1d/d59/f69 0 2026-03-10T14:08:31.102 INFO:tasks.workunit.client.0.vm03.stdout:8/544: mkdir da/d24/d49/dab 0 2026-03-10T14:08:31.106 INFO:tasks.workunit.client.0.vm03.stdout:5/623: creat d4/fc7 x:0 0 0 2026-03-10T14:08:31.110 INFO:tasks.workunit.client.0.vm03.stdout:2/490: getdents d5/d2a 0 2026-03-10T14:08:31.117 INFO:tasks.workunit.client.0.vm03.stdout:8/545: rename da/d36/d40/d50/d70/l7e to da/d24/lac 0 2026-03-10T14:08:31.119 INFO:tasks.workunit.client.0.vm03.stdout:7/422: creat d5/f8f x:0 0 0 2026-03-10T14:08:31.120 INFO:tasks.workunit.client.0.vm03.stdout:1/513: sync 2026-03-10T14:08:31.122 INFO:tasks.workunit.client.0.vm03.stdout:6/494: creat d8/d11/f97 x:0 0 0 2026-03-10T14:08:31.127 INFO:tasks.workunit.client.0.vm03.stdout:9/522: truncate d2/f15 380956 0 2026-03-10T14:08:31.129 INFO:tasks.workunit.client.0.vm03.stdout:2/491: creat d5/d35/f9b x:0 0 0 2026-03-10T14:08:31.130 INFO:tasks.workunit.client.0.vm03.stdout:2/492: write d5/d10/d1c/d50/f98 [34947,53839] 0 2026-03-10T14:08:31.131 INFO:tasks.workunit.client.0.vm03.stdout:9/523: sync 2026-03-10T14:08:31.131 INFO:tasks.workunit.client.0.vm03.stdout:4/538: getdents d5/d47/d62 0 2026-03-10T14:08:31.143 INFO:tasks.workunit.client.0.vm03.stdout:0/493: write d3/d17/f56 [706335,1751] 0 2026-03-10T14:08:31.146 INFO:tasks.workunit.client.0.vm03.stdout:7/423: creat d5/d9/d14/d26/d36/d51/d7b/f90 x:0 0 0 2026-03-10T14:08:31.147 INFO:tasks.workunit.client.0.vm03.stdout:3/525: truncate d1d/d29/d41/d45/d55/f81 421954 0 2026-03-10T14:08:31.152 
INFO:tasks.workunit.client.0.vm03.stdout:3/526: sync 2026-03-10T14:08:31.152 INFO:tasks.workunit.client.0.vm03.stdout:3/527: chown fe 5869 1 2026-03-10T14:08:31.153 INFO:tasks.workunit.client.0.vm03.stdout:5/624: symlink d4/lc8 0 2026-03-10T14:08:31.169 INFO:tasks.workunit.client.0.vm03.stdout:2/493: rmdir d5/d10/d1c/d50 39 2026-03-10T14:08:31.183 INFO:tasks.workunit.client.0.vm03.stdout:4/539: creat d5/d9/db/d19/d38/d53/d55/fad x:0 0 0 2026-03-10T14:08:31.187 INFO:tasks.workunit.client.0.vm03.stdout:8/546: symlink da/d3c/lad 0 2026-03-10T14:08:31.213 INFO:tasks.workunit.client.0.vm03.stdout:0/494: rename d3/d17/d5c to d3/d11/d66/da2 0 2026-03-10T14:08:31.225 INFO:tasks.workunit.client.0.vm03.stdout:7/424: mknod d5/d9/d14/d26/d36/d51/d7b/d83/c91 0 2026-03-10T14:08:31.227 INFO:tasks.workunit.client.0.vm03.stdout:1/514: creat d0/d42/d9f/fa2 x:0 0 0 2026-03-10T14:08:31.228 INFO:tasks.workunit.client.0.vm03.stdout:1/515: fsync d0/d2/df/d16/d41/f68 0 2026-03-10T14:08:31.229 INFO:tasks.workunit.client.0.vm03.stdout:5/625: creat d4/d13/d1f/d84/fc9 x:0 0 0 2026-03-10T14:08:31.234 INFO:tasks.workunit.client.0.vm03.stdout:4/540: creat d5/d47/d62/d87/fae x:0 0 0 2026-03-10T14:08:31.235 INFO:tasks.workunit.client.0.vm03.stdout:4/541: dread - d5/d6e/f93 zero size 2026-03-10T14:08:31.237 INFO:tasks.workunit.client.0.vm03.stdout:5/626: dwrite d4/d16/f2d [0,4194304] 0 2026-03-10T14:08:31.238 INFO:tasks.workunit.client.0.vm03.stdout:8/547: creat da/d3c/d4b/d4c/fae x:0 0 0 2026-03-10T14:08:31.243 INFO:tasks.workunit.client.0.vm03.stdout:5/627: fsync d4/f37 0 2026-03-10T14:08:31.254 INFO:tasks.workunit.client.0.vm03.stdout:6/495: write d8/db/d12/d51/d5c/f68 [5854162,111726] 0 2026-03-10T14:08:31.254 INFO:tasks.workunit.client.0.vm03.stdout:6/496: chown d8/db/d49/d6c 61 1 2026-03-10T14:08:31.255 INFO:tasks.workunit.client.0.vm03.stdout:6/497: write d8/db/d49/d6c/d32/f4b [5742076,30405] 0 2026-03-10T14:08:31.256 INFO:tasks.workunit.client.0.vm03.stdout:6/498: readlink d8/d11/d18/l78 0 
2026-03-10T14:08:31.261 INFO:tasks.workunit.client.0.vm03.stdout:2/494: rename d5/d10/d1c/d93 to d5/d10/d1c/d54/d9c 0 2026-03-10T14:08:31.266 INFO:tasks.workunit.client.0.vm03.stdout:7/425: chown d5/d9/d3e/c75 0 1 2026-03-10T14:08:31.268 INFO:tasks.workunit.client.0.vm03.stdout:3/528: creat d1d/d29/d41/d45/d95/faa x:0 0 0 2026-03-10T14:08:31.273 INFO:tasks.workunit.client.0.vm03.stdout:3/529: dwrite d1d/d59/f69 [0,4194304] 0 2026-03-10T14:08:31.278 INFO:tasks.workunit.client.0.vm03.stdout:1/516: creat d0/d2/df/d27/d7e/d81/fa3 x:0 0 0 2026-03-10T14:08:31.278 INFO:tasks.workunit.client.0.vm03.stdout:1/517: dread - d0/d2/df/d27/f52 zero size 2026-03-10T14:08:31.279 INFO:tasks.workunit.client.0.vm03.stdout:9/524: creat d2/d29/d33/fa9 x:0 0 0 2026-03-10T14:08:31.285 INFO:tasks.workunit.client.0.vm03.stdout:8/548: read - da/d36/f77 zero size 2026-03-10T14:08:31.286 INFO:tasks.workunit.client.0.vm03.stdout:5/628: stat d4/d16/f1c 0 2026-03-10T14:08:31.286 INFO:tasks.workunit.client.0.vm03.stdout:5/629: readlink d4/d16/d19/d23/db8/la9 0 2026-03-10T14:08:31.287 INFO:tasks.workunit.client.0.vm03.stdout:5/630: write d4/d13/d43/f94 [253088,14483] 0 2026-03-10T14:08:31.288 INFO:tasks.workunit.client.0.vm03.stdout:6/499: rmdir d8/db/d12 39 2026-03-10T14:08:31.289 INFO:tasks.workunit.client.0.vm03.stdout:5/631: dread d4/d16/d19/d6e/f95 [0,4194304] 0 2026-03-10T14:08:31.295 INFO:tasks.workunit.client.0.vm03.stdout:7/426: mkdir d5/d9/d14/d26/d39/d92 0 2026-03-10T14:08:31.296 INFO:tasks.workunit.client.0.vm03.stdout:5/632: dwrite d4/d13/d1f/fb1 [0,4194304] 0 2026-03-10T14:08:31.297 INFO:tasks.workunit.client.0.vm03.stdout:5/633: write d4/d13/d1f/fb1 [358839,25666] 0 2026-03-10T14:08:31.298 INFO:tasks.workunit.client.0.vm03.stdout:5/634: chown d4/d35 4682725 1 2026-03-10T14:08:31.304 INFO:tasks.workunit.client.0.vm03.stdout:3/530: fdatasync d1d/d29/f2e 0 2026-03-10T14:08:31.311 INFO:tasks.workunit.client.0.vm03.stdout:3/531: dwrite d1d/d29/d41/d45/d55/d6e/d83/d91/fa9 [0,4194304] 0 
2026-03-10T14:08:31.315 INFO:tasks.workunit.client.0.vm03.stdout:3/532: chown d1d/d29/d41/d45/d55/d6e/d83/d91/fa9 71595 1 2026-03-10T14:08:31.325 INFO:tasks.workunit.client.0.vm03.stdout:9/525: mkdir d2/d29/d33/d41/daa 0 2026-03-10T14:08:31.330 INFO:tasks.workunit.client.0.vm03.stdout:1/518: dread d0/d2/f1a [0,4194304] 0 2026-03-10T14:08:31.331 INFO:tasks.workunit.client.0.vm03.stdout:8/549: unlink da/d58/d6c/f76 0 2026-03-10T14:08:31.363 INFO:tasks.workunit.client.0.vm03.stdout:5/635: dread d4/d6/f63 [0,4194304] 0 2026-03-10T14:08:31.363 INFO:tasks.workunit.client.0.vm03.stdout:5/636: stat d4/d16/d19/d6e 0 2026-03-10T14:08:31.366 INFO:tasks.workunit.client.0.vm03.stdout:8/550: mknod da/d3c/d51/caf 0 2026-03-10T14:08:31.366 INFO:tasks.workunit.client.0.vm03.stdout:7/427: dread d5/d9/d14/f41 [0,4194304] 0 2026-03-10T14:08:31.368 INFO:tasks.workunit.client.0.vm03.stdout:4/542: truncate d5/d9/db/f29 8129505 0 2026-03-10T14:08:31.372 INFO:tasks.workunit.client.0.vm03.stdout:2/495: dwrite d5/d10/d31/f7d [4194304,4194304] 0 2026-03-10T14:08:31.376 INFO:tasks.workunit.client.0.vm03.stdout:3/533: fdatasync d1d/d29/d41/f56 0 2026-03-10T14:08:31.382 INFO:tasks.workunit.client.0.vm03.stdout:0/495: getdents d3/d4d/d30 0 2026-03-10T14:08:31.386 INFO:tasks.workunit.client.0.vm03.stdout:5/637: readlink d4/d13/l38 0 2026-03-10T14:08:31.394 INFO:tasks.workunit.client.0.vm03.stdout:7/428: unlink c1 0 2026-03-10T14:08:31.396 INFO:tasks.workunit.client.0.vm03.stdout:7/429: chown d5/d9/d14/d26/d36/d51/d7b/d83/c88 480570067 1 2026-03-10T14:08:31.396 INFO:tasks.workunit.client.0.vm03.stdout:6/500: link d8/d11/d18/l78 d8/d3b/l98 0 2026-03-10T14:08:31.401 INFO:tasks.workunit.client.0.vm03.stdout:2/496: truncate d5/d10/d17/f18 183606 0 2026-03-10T14:08:31.402 INFO:tasks.workunit.client.0.vm03.stdout:2/497: readlink d5/l11 0 2026-03-10T14:08:31.402 INFO:tasks.workunit.client.0.vm03.stdout:7/430: dwrite d5/d9/d35/f7c [0,4194304] 0 2026-03-10T14:08:31.403 
INFO:tasks.workunit.client.0.vm03.stdout:2/498: readlink d5/d10/d1c/d40/d59/l67 0 2026-03-10T14:08:31.404 INFO:tasks.workunit.client.0.vm03.stdout:7/431: dread - d5/d9/d14/d26/d39/f6a zero size 2026-03-10T14:08:31.408 INFO:tasks.workunit.client.0.vm03.stdout:3/534: creat d1d/d39/d51/d72/fab x:0 0 0 2026-03-10T14:08:31.409 INFO:tasks.workunit.client.0.vm03.stdout:3/535: write d1d/d29/d41/d45/d55/f85 [2972190,69361] 0 2026-03-10T14:08:31.415 INFO:tasks.workunit.client.0.vm03.stdout:0/496: dread d3/d46/f63 [0,4194304] 0 2026-03-10T14:08:31.420 INFO:tasks.workunit.client.0.vm03.stdout:8/551: creat da/d36/d40/d50/d70/d99/fb0 x:0 0 0 2026-03-10T14:08:31.423 INFO:tasks.workunit.client.0.vm03.stdout:8/552: dwrite da/d3a/d44/f97 [0,4194304] 0 2026-03-10T14:08:31.435 INFO:tasks.workunit.client.0.vm03.stdout:2/499: truncate d5/d10/f60 287663 0 2026-03-10T14:08:31.437 INFO:tasks.workunit.client.0.vm03.stdout:7/432: mknod d5/d9/d14/d26/d36/d51/c93 0 2026-03-10T14:08:31.438 INFO:tasks.workunit.client.0.vm03.stdout:9/526: getdents d2/d29/d38 0 2026-03-10T14:08:31.444 INFO:tasks.workunit.client.0.vm03.stdout:4/543: dwrite d5/d9/db/ff [0,4194304] 0 2026-03-10T14:08:31.456 INFO:tasks.workunit.client.0.vm03.stdout:3/536: truncate d1d/d33/f5e 332386 0 2026-03-10T14:08:31.462 INFO:tasks.workunit.client.0.vm03.stdout:0/497: creat d3/d4d/d47/fa3 x:0 0 0 2026-03-10T14:08:31.466 INFO:tasks.workunit.client.0.vm03.stdout:1/519: getdents d0/d2/d71/d90 0 2026-03-10T14:08:31.473 INFO:tasks.workunit.client.0.vm03.stdout:3/537: fsync d1d/f2b 0 2026-03-10T14:08:31.475 INFO:tasks.workunit.client.0.vm03.stdout:2/500: rename d5/d35/c94 to d5/d2a/d8a/c9d 0 2026-03-10T14:08:31.481 INFO:tasks.workunit.client.0.vm03.stdout:3/538: dread fc [0,4194304] 0 2026-03-10T14:08:31.481 INFO:tasks.workunit.client.0.vm03.stdout:3/539: chown d1d/d33/l4b 89 1 2026-03-10T14:08:31.484 INFO:tasks.workunit.client.0.vm03.stdout:9/527: dread d2/d29/da7/f93 [0,4194304] 0 2026-03-10T14:08:31.486 
INFO:tasks.workunit.client.0.vm03.stdout:5/638: write d4/d13/d1f/d8c/f9c [710380,74812] 0 2026-03-10T14:08:31.498 INFO:tasks.workunit.client.0.vm03.stdout:1/520: mkdir d0/d18/d3b/d50/da4 0 2026-03-10T14:08:31.502 INFO:tasks.workunit.client.0.vm03.stdout:5/639: dread d4/d6/de/f4f [0,4194304] 0 2026-03-10T14:08:31.503 INFO:tasks.workunit.client.0.vm03.stdout:6/501: link d8/db/l28 d8/db/d49/d76/l99 0 2026-03-10T14:08:31.512 INFO:tasks.workunit.client.0.vm03.stdout:4/544: dread d5/d9/db/d19/d38/d53/d55/f75 [0,4194304] 0 2026-03-10T14:08:31.512 INFO:tasks.workunit.client.0.vm03.stdout:4/545: stat d5/d9/d2b/f63 0 2026-03-10T14:08:31.513 INFO:tasks.workunit.client.0.vm03.stdout:4/546: write d5/d9/db/d19/d34/f9b [912281,8513] 0 2026-03-10T14:08:31.522 INFO:tasks.workunit.client.0.vm03.stdout:3/540: readlink d1d/d33/l7d 0 2026-03-10T14:08:31.522 INFO:tasks.workunit.client.0.vm03.stdout:3/541: dread - d1d/d33/d65/d48/f8b zero size 2026-03-10T14:08:31.544 INFO:tasks.workunit.client.0.vm03.stdout:8/553: dwrite da/d24/f32 [0,4194304] 0 2026-03-10T14:08:31.564 INFO:tasks.workunit.client.0.vm03.stdout:7/433: creat d5/d9/f94 x:0 0 0 2026-03-10T14:08:31.568 INFO:tasks.workunit.client.0.vm03.stdout:9/528: fdatasync d2/d29/f35 0 2026-03-10T14:08:31.571 INFO:tasks.workunit.client.0.vm03.stdout:9/529: chown d2/d29/d9a/ca8 7 1 2026-03-10T14:08:31.575 INFO:tasks.workunit.client.0.vm03.stdout:4/547: write d5/d47/d5b/f84 [748614,61302] 0 2026-03-10T14:08:31.581 INFO:tasks.workunit.client.0.vm03.stdout:8/554: mkdir da/d36/d40/d50/db1 0 2026-03-10T14:08:31.584 INFO:tasks.workunit.client.0.vm03.stdout:2/501: symlink d5/d10/d1f/l9e 0 2026-03-10T14:08:31.589 INFO:tasks.workunit.client.0.vm03.stdout:3/542: mkdir d1d/d33/d47/dac 0 2026-03-10T14:08:31.594 INFO:tasks.workunit.client.0.vm03.stdout:3/543: dwrite d1d/d39/d51/fa0 [0,4194304] 0 2026-03-10T14:08:31.597 INFO:tasks.workunit.client.0.vm03.stdout:5/640: dread d4/d40/f5b [4194304,4194304] 0 2026-03-10T14:08:31.597 
INFO:tasks.workunit.client.0.vm03.stdout:5/641: readlink d4/d13/d43/l9d 0 2026-03-10T14:08:31.608 INFO:tasks.workunit.client.0.vm03.stdout:0/498: getdents d3 0 2026-03-10T14:08:31.612 INFO:tasks.workunit.client.0.vm03.stdout:0/499: dwrite d3/d46/d54/d79/f8d [0,4194304] 0 2026-03-10T14:08:31.623 INFO:tasks.workunit.client.0.vm03.stdout:6/502: write d8/db/d49/d6c/d32/f94 [411775,50928] 0 2026-03-10T14:08:31.633 INFO:tasks.workunit.client.0.vm03.stdout:7/434: creat d5/d9/d3e/d84/f95 x:0 0 0 2026-03-10T14:08:31.634 INFO:tasks.workunit.client.0.vm03.stdout:3/544: symlink d1d/d33/d47/d53/lad 0 2026-03-10T14:08:31.635 INFO:tasks.workunit.client.0.vm03.stdout:9/530: creat d2/d29/d33/d41/daa/fab x:0 0 0 2026-03-10T14:08:31.638 INFO:tasks.workunit.client.0.vm03.stdout:1/521: link d0/d2/c51 d0/d2/df/d27/ca5 0 2026-03-10T14:08:31.640 INFO:tasks.workunit.client.0.vm03.stdout:4/548: mkdir d5/d9/db/d19/d38/d7b/daa/daf 0 2026-03-10T14:08:31.642 INFO:tasks.workunit.client.0.vm03.stdout:8/555: unlink da/d24/c8c 0 2026-03-10T14:08:31.647 INFO:tasks.workunit.client.0.vm03.stdout:8/556: read - da/d3a/d44/d64/f91 zero size 2026-03-10T14:08:31.647 INFO:tasks.workunit.client.0.vm03.stdout:8/557: chown da/d3c/d51/d85 3 1 2026-03-10T14:08:31.647 INFO:tasks.workunit.client.0.vm03.stdout:8/558: truncate da/d3c/f9e 4598713 0 2026-03-10T14:08:31.648 INFO:tasks.workunit.client.0.vm03.stdout:7/435: symlink d5/d9/d3e/d84/l96 0 2026-03-10T14:08:31.649 INFO:tasks.workunit.client.0.vm03.stdout:7/436: truncate d5/d9/d14/d26/d5f/f6b 76408 0 2026-03-10T14:08:31.651 INFO:tasks.workunit.client.0.vm03.stdout:3/545: mkdir d1d/d33/d65/d5d/dae 0 2026-03-10T14:08:31.653 INFO:tasks.workunit.client.0.vm03.stdout:5/642: mkdir d4/dca 0 2026-03-10T14:08:31.657 INFO:tasks.workunit.client.0.vm03.stdout:2/502: truncate d5/d10/f16 2552277 0 2026-03-10T14:08:31.658 INFO:tasks.workunit.client.0.vm03.stdout:2/503: stat d5/d2a/f6e 0 2026-03-10T14:08:31.661 INFO:tasks.workunit.client.0.vm03.stdout:4/549: fdatasync 
d5/d47/f46 0 2026-03-10T14:08:31.665 INFO:tasks.workunit.client.0.vm03.stdout:6/503: unlink d8/db/d12/f6a 0 2026-03-10T14:08:31.665 INFO:tasks.workunit.client.0.vm03.stdout:6/504: chown d8/d11/d18/d54 25409 1 2026-03-10T14:08:31.677 INFO:tasks.workunit.client.0.vm03.stdout:2/504: chown d5/d10/d31/c77 716306909 1 2026-03-10T14:08:31.678 INFO:tasks.workunit.client.0.vm03.stdout:2/505: truncate d5/d10/d1c/d40/d59/d7b/f8c 920188 0 2026-03-10T14:08:31.679 INFO:tasks.workunit.client.0.vm03.stdout:7/437: write d5/d9/f1f [3034832,36301] 0 2026-03-10T14:08:31.683 INFO:tasks.workunit.client.0.vm03.stdout:7/438: dwrite d5/d9/d14/d26/d39/f45 [0,4194304] 0 2026-03-10T14:08:31.687 INFO:tasks.workunit.client.0.vm03.stdout:5/643: write d4/f82 [1725871,36237] 0 2026-03-10T14:08:31.688 INFO:tasks.workunit.client.0.vm03.stdout:3/546: dwrite d1d/d29/d41/d45/d55/f63 [0,4194304] 0 2026-03-10T14:08:31.690 INFO:tasks.workunit.client.0.vm03.stdout:3/547: readlink d1d/d59/l94 0 2026-03-10T14:08:31.690 INFO:tasks.workunit.client.0.vm03.stdout:3/548: read d1d/d39/d51/fa0 [631186,27151] 0 2026-03-10T14:08:31.691 INFO:tasks.workunit.client.0.vm03.stdout:3/549: read - d1d/d33/d47/d53/d68/f86 zero size 2026-03-10T14:08:31.692 INFO:tasks.workunit.client.0.vm03.stdout:3/550: write d1d/d29/d41/d45/d55/f63 [382956,120931] 0 2026-03-10T14:08:31.703 INFO:tasks.workunit.client.0.vm03.stdout:0/500: creat d3/d16/d21/fa4 x:0 0 0 2026-03-10T14:08:31.710 INFO:tasks.workunit.client.0.vm03.stdout:6/505: creat d8/db/d12/f9a x:0 0 0 2026-03-10T14:08:31.721 INFO:tasks.workunit.client.0.vm03.stdout:7/439: dread d5/f32 [0,4194304] 0 2026-03-10T14:08:31.725 INFO:tasks.workunit.client.0.vm03.stdout:0/501: read d3/d16/d21/d3c/f99 [2081890,130216] 0 2026-03-10T14:08:31.725 INFO:tasks.workunit.client.0.vm03.stdout:0/502: chown d3/d11 63 1 2026-03-10T14:08:31.730 INFO:tasks.workunit.client.0.vm03.stdout:7/440: dread d5/d9/f1f [0,4194304] 0 2026-03-10T14:08:31.739 INFO:tasks.workunit.client.0.vm03.stdout:3/551: fsync fb 0 
2026-03-10T14:08:31.741 INFO:tasks.workunit.client.0.vm03.stdout:9/531: link d2/d14/d2b/d79/c99 d2/d29/d33/d41/d95/cac 0 2026-03-10T14:08:31.745 INFO:tasks.workunit.client.0.vm03.stdout:1/522: creat d0/d18/fa6 x:0 0 0 2026-03-10T14:08:31.754 INFO:tasks.workunit.client.0.vm03.stdout:6/506: rename d8/d11/d18/f7d to d8/db/d12/d64/f9b 0 2026-03-10T14:08:31.763 INFO:tasks.workunit.client.0.vm03.stdout:0/503: fsync d3/d16/d21/d3c/f99 0 2026-03-10T14:08:31.764 INFO:tasks.workunit.client.0.vm03.stdout:0/504: write d3/fa1 [394129,40901] 0 2026-03-10T14:08:31.775 INFO:tasks.workunit.client.0.vm03.stdout:4/550: link d5/d9/db/d19/l89 d5/d6e/lb0 0 2026-03-10T14:08:31.783 INFO:tasks.workunit.client.0.vm03.stdout:8/559: getdents da/d58/d5f/d67 0 2026-03-10T14:08:31.784 INFO:tasks.workunit.client.0.vm03.stdout:6/507: rename d8/d11/f97 to d8/d3b/f9c 0 2026-03-10T14:08:31.786 INFO:tasks.workunit.client.0.vm03.stdout:5/644: creat d4/d40/fcb x:0 0 0 2026-03-10T14:08:31.788 INFO:tasks.workunit.client.0.vm03.stdout:0/505: mknod d3/d46/d5e/ca5 0 2026-03-10T14:08:31.789 INFO:tasks.workunit.client.0.vm03.stdout:0/506: readlink d3/d16/d21/l29 0 2026-03-10T14:08:31.796 INFO:tasks.workunit.client.0.vm03.stdout:0/507: dread d3/d11/d2c/d4a/d4b/f7d [0,4194304] 0 2026-03-10T14:08:31.796 INFO:tasks.workunit.client.0.vm03.stdout:9/532: fsync d2/d14/f96 0 2026-03-10T14:08:31.803 INFO:tasks.workunit.client.0.vm03.stdout:1/523: sync 2026-03-10T14:08:31.803 INFO:tasks.workunit.client.0.vm03.stdout:0/508: sync 2026-03-10T14:08:31.804 INFO:tasks.workunit.client.0.vm03.stdout:1/524: fsync d0/d2/d34/f5f 0 2026-03-10T14:08:31.805 INFO:tasks.workunit.client.0.vm03.stdout:2/506: dwrite d5/d10/d17/f20 [0,4194304] 0 2026-03-10T14:08:31.806 INFO:tasks.workunit.client.0.vm03.stdout:1/525: dread - d0/d2/d34/f88 zero size 2026-03-10T14:08:31.817 INFO:tasks.workunit.client.0.vm03.stdout:7/441: dwrite d5/f48 [0,4194304] 0 2026-03-10T14:08:31.821 INFO:tasks.workunit.client.0.vm03.stdout:3/552: creat d1d/d29/faf x:0 0 
0 2026-03-10T14:08:31.828 INFO:tasks.workunit.client.0.vm03.stdout:4/551: mknod d5/d9/db/d19/d38/d7b/daa/daf/cb1 0 2026-03-10T14:08:31.830 INFO:tasks.workunit.client.0.vm03.stdout:8/560: fdatasync da/d24/f2d 0 2026-03-10T14:08:31.833 INFO:tasks.workunit.client.0.vm03.stdout:6/508: mknod d8/db/c9d 0 2026-03-10T14:08:31.841 INFO:tasks.workunit.client.0.vm03.stdout:1/526: rename d0/d2/df/d16/d20/f97 to d0/d2/df/d27/d7e/d81/fa7 0 2026-03-10T14:08:31.846 INFO:tasks.workunit.client.0.vm03.stdout:0/509: unlink d3/d11/d2c/d4a/d4b/l8a 0 2026-03-10T14:08:31.851 INFO:tasks.workunit.client.0.vm03.stdout:2/507: creat d5/f9f x:0 0 0 2026-03-10T14:08:31.854 INFO:tasks.workunit.client.0.vm03.stdout:3/553: dread d1d/d29/f6b [0,4194304] 0 2026-03-10T14:08:31.857 INFO:tasks.workunit.client.0.vm03.stdout:3/554: dwrite d1d/d29/f9b [0,4194304] 0 2026-03-10T14:08:31.862 INFO:tasks.workunit.client.0.vm03.stdout:9/533: mknod d2/d29/d33/d55/d9c/cad 0 2026-03-10T14:08:31.868 INFO:tasks.workunit.client.0.vm03.stdout:5/645: mknod d4/d40/da5/ccc 0 2026-03-10T14:08:31.871 INFO:tasks.workunit.client.0.vm03.stdout:0/510: truncate d3/d4d/d30/f32 332905 0 2026-03-10T14:08:31.874 INFO:tasks.workunit.client.0.vm03.stdout:7/442: creat d5/d9/d14/d26/d5f/d89/f97 x:0 0 0 2026-03-10T14:08:31.877 INFO:tasks.workunit.client.0.vm03.stdout:3/555: fsync d1d/d29/f6b 0 2026-03-10T14:08:31.877 INFO:tasks.workunit.client.0.vm03.stdout:3/556: stat lf 0 2026-03-10T14:08:31.878 INFO:tasks.workunit.client.0.vm03.stdout:3/557: write d1d/d39/d51/d72/f8e [891937,37767] 0 2026-03-10T14:08:31.878 INFO:tasks.workunit.client.0.vm03.stdout:3/558: readlink d1d/d39/da1/la8 0 2026-03-10T14:08:31.882 INFO:tasks.workunit.client.0.vm03.stdout:4/552: mkdir d5/d47/d62/d8a/da0/db2 0 2026-03-10T14:08:31.888 INFO:tasks.workunit.client.0.vm03.stdout:4/553: dread d5/d6e/f8c [0,4194304] 0 2026-03-10T14:08:31.888 INFO:tasks.workunit.client.0.vm03.stdout:1/527: link d0/d2/df/d16/d20/f38 d0/d2/df/d16/d41/fa8 0 2026-03-10T14:08:31.889 
INFO:tasks.workunit.client.0.vm03.stdout:0/511: readlink d3/d11/l1c 0 2026-03-10T14:08:31.891 INFO:tasks.workunit.client.0.vm03.stdout:2/508: truncate d5/d35/f81 1516152 0 2026-03-10T14:08:31.895 INFO:tasks.workunit.client.0.vm03.stdout:9/534: creat d2/d29/d33/d41/d95/fae x:0 0 0 2026-03-10T14:08:31.901 INFO:tasks.workunit.client.0.vm03.stdout:9/535: stat d2/d29/d33/f70 0 2026-03-10T14:08:31.901 INFO:tasks.workunit.client.0.vm03.stdout:5/646: mknod d4/d13/ccd 0 2026-03-10T14:08:31.901 INFO:tasks.workunit.client.0.vm03.stdout:8/561: rename da/d3c/d4b/c54 to da/d3c/cb2 0 2026-03-10T14:08:31.901 INFO:tasks.workunit.client.0.vm03.stdout:8/562: write da/d3c/f48 [3172704,80113] 0 2026-03-10T14:08:31.901 INFO:tasks.workunit.client.0.vm03.stdout:1/528: truncate d0/d18/d3b/f4c 478203 0 2026-03-10T14:08:31.904 INFO:tasks.workunit.client.0.vm03.stdout:5/647: dwrite d4/d13/fad [0,4194304] 0 2026-03-10T14:08:31.905 INFO:tasks.workunit.client.0.vm03.stdout:8/563: dwrite da/d3a/d44/d64/f68 [4194304,4194304] 0 2026-03-10T14:08:31.907 INFO:tasks.workunit.client.0.vm03.stdout:8/564: write da/d24/f32 [1669138,31247] 0 2026-03-10T14:08:31.909 INFO:tasks.workunit.client.0.vm03.stdout:4/554: sync 2026-03-10T14:08:31.911 INFO:tasks.workunit.client.0.vm03.stdout:7/443: mkdir d5/d98 0 2026-03-10T14:08:31.912 INFO:tasks.workunit.client.0.vm03.stdout:3/559: mknod d1d/d33/cb0 0 2026-03-10T14:08:31.913 INFO:tasks.workunit.client.0.vm03.stdout:7/444: stat d5/d9/d14/d26/d36 0 2026-03-10T14:08:31.914 INFO:tasks.workunit.client.0.vm03.stdout:7/445: read d5/d9/d14/f41 [541264,75984] 0 2026-03-10T14:08:31.932 INFO:tasks.workunit.client.0.vm03.stdout:9/536: dread d2/d14/d2b/f68 [0,4194304] 0 2026-03-10T14:08:31.944 INFO:tasks.workunit.client.0.vm03.stdout:5/648: read d4/d40/fc6 [2085950,115467] 0 2026-03-10T14:08:31.944 INFO:tasks.workunit.client.0.vm03.stdout:5/649: chown d4/d16/d19/d23/d3f/l49 4066 1 2026-03-10T14:08:31.951 INFO:tasks.workunit.client.0.vm03.stdout:3/560: mknod 
d1d/d29/d41/d45/d55/cb1 0 2026-03-10T14:08:31.966 INFO:tasks.workunit.client.0.vm03.stdout:6/509: write d8/db/f3c [977172,120797] 0 2026-03-10T14:08:31.978 INFO:tasks.workunit.client.0.vm03.stdout:7/446: rmdir d5/d9/d14 39 2026-03-10T14:08:31.982 INFO:tasks.workunit.client.0.vm03.stdout:8/565: symlink da/d36/d6a/lb3 0 2026-03-10T14:08:31.983 INFO:tasks.workunit.client.0.vm03.stdout:2/509: symlink d5/d10/d1c/la0 0 2026-03-10T14:08:31.994 INFO:tasks.workunit.client.0.vm03.stdout:6/510: creat d8/db/d12/f9e x:0 0 0 2026-03-10T14:08:31.994 INFO:tasks.workunit.client.0.vm03.stdout:6/511: fdatasync d8/db/f3c 0 2026-03-10T14:08:31.994 INFO:tasks.workunit.client.0.vm03.stdout:8/566: sync 2026-03-10T14:08:31.998 INFO:tasks.workunit.client.0.vm03.stdout:8/567: dwrite da/d3c/f48 [4194304,4194304] 0 2026-03-10T14:08:32.006 INFO:tasks.workunit.client.0.vm03.stdout:4/555: truncate d5/d9/d2b/f63 34582 0 2026-03-10T14:08:32.007 INFO:tasks.workunit.client.0.vm03.stdout:9/537: write d2/f83 [1328234,99951] 0 2026-03-10T14:08:32.013 INFO:tasks.workunit.client.0.vm03.stdout:2/510: dread - d5/d10/d1c/d40/d59/f70 zero size 2026-03-10T14:08:32.016 INFO:tasks.workunit.client.0.vm03.stdout:3/561: mkdir d1d/d29/db2 0 2026-03-10T14:08:32.016 INFO:tasks.workunit.client.0.vm03.stdout:3/562: readlink d1d/d59/l9f 0 2026-03-10T14:08:32.019 INFO:tasks.workunit.client.0.vm03.stdout:0/512: rename d3/d16/d21/c41 to d3/d11/ca6 0 2026-03-10T14:08:32.019 INFO:tasks.workunit.client.0.vm03.stdout:0/513: readlink d3/l4e 0 2026-03-10T14:08:32.020 INFO:tasks.workunit.client.0.vm03.stdout:0/514: truncate d3/d11/d2c/d4a/d7b/f87 284934 0 2026-03-10T14:08:32.021 INFO:tasks.workunit.client.0.vm03.stdout:0/515: stat d3/d16/d21/c40 0 2026-03-10T14:08:32.029 INFO:tasks.workunit.client.0.vm03.stdout:6/512: chown d8/d1b/d1c/c1d 8868654 1 2026-03-10T14:08:32.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:32.032+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 8 ==== 
mgrmap(e 27) v1 ==== 49900+0+0 (secure 0 0 0) 0x7f8df4062510 con 0x7f8e04071980 2026-03-10T14:08:32.033 INFO:tasks.workunit.client.0.vm03.stdout:6/513: dwrite d8/db/d49/d6c/d32/f94 [0,4194304] 0 2026-03-10T14:08:32.035 INFO:tasks.workunit.client.0.vm03.stdout:6/514: truncate d8/db/d49/d6c/d32/f94 4544142 0 2026-03-10T14:08:32.052 INFO:tasks.workunit.client.0.vm03.stdout:8/568: fdatasync da/d24/f28 0 2026-03-10T14:08:32.056 INFO:tasks.workunit.client.0.vm03.stdout:4/556: symlink d5/d47/d5b/lb3 0 2026-03-10T14:08:32.066 INFO:tasks.workunit.client.0.vm03.stdout:4/557: dwrite d5/d9/db/d19/d38/d53/d55/fa8 [0,4194304] 0 2026-03-10T14:08:32.066 INFO:tasks.workunit.client.0.vm03.stdout:1/529: getdents d0/d18/d3b 0 2026-03-10T14:08:32.066 INFO:tasks.workunit.client.0.vm03.stdout:1/530: readlink d0/d2/df/d27/d7e/d81/l82 0 2026-03-10T14:08:32.066 INFO:tasks.workunit.client.0.vm03.stdout:1/531: write d0/d2/df/d16/d41/f68 [73084,111687] 0 2026-03-10T14:08:32.067 INFO:tasks.workunit.client.0.vm03.stdout:8/569: sync 2026-03-10T14:08:32.069 INFO:tasks.workunit.client.0.vm03.stdout:9/538: dread d2/d14/d2b/d43/f7c [0,4194304] 0 2026-03-10T14:08:32.073 INFO:tasks.workunit.client.0.vm03.stdout:8/570: read da/f15 [2158356,35521] 0 2026-03-10T14:08:32.075 INFO:tasks.workunit.client.0.vm03.stdout:7/447: unlink d5/d9/d14/d26/d5f/d89/f97 0 2026-03-10T14:08:32.078 INFO:tasks.workunit.client.0.vm03.stdout:4/558: dread d5/d9/db/f24 [4194304,4194304] 0 2026-03-10T14:08:32.079 INFO:tasks.workunit.client.0.vm03.stdout:4/559: chown d5/d47/d62/d87/fae 14503126 1 2026-03-10T14:08:32.082 INFO:tasks.workunit.client.0.vm03.stdout:2/511: dread d5/fa [0,4194304] 0 2026-03-10T14:08:32.085 INFO:tasks.workunit.client.0.vm03.stdout:4/560: dwrite d5/d9/db/f2a [0,4194304] 0 2026-03-10T14:08:32.095 INFO:tasks.workunit.client.0.vm03.stdout:5/650: rename d4/d16/d19/d6e/f85 to d4/d13/d1f/fce 0 2026-03-10T14:08:32.099 INFO:tasks.workunit.client.0.vm03.stdout:9/539: dread d2/d29/d33/d41/f53 [0,4194304] 0 
2026-03-10T14:08:32.108 INFO:tasks.workunit.client.0.vm03.stdout:8/571: read da/d24/f43 [2193559,101116] 0 2026-03-10T14:08:32.109 INFO:tasks.workunit.client.0.vm03.stdout:8/572: chown da/d36/d40/d50/d70/d99/fb0 4249994 1 2026-03-10T14:08:32.111 INFO:tasks.workunit.client.0.vm03.stdout:7/448: unlink d5/d9/d14/d26/d39/c8a 0 2026-03-10T14:08:32.125 INFO:tasks.workunit.client.0.vm03.stdout:0/516: rename d3/d4d/d47/fa3 to d3/d4d/d8f/fa7 0 2026-03-10T14:08:32.131 INFO:tasks.workunit.client.0.vm03.stdout:5/651: dwrite d4/d13/d1f/fce [0,4194304] 0 2026-03-10T14:08:32.135 INFO:tasks.workunit.client.0.vm03.stdout:5/652: write d4/d13/fad [628813,108459] 0 2026-03-10T14:08:32.135 INFO:tasks.workunit.client.0.vm03.stdout:5/653: stat d4/dca 0 2026-03-10T14:08:32.145 INFO:tasks.workunit.client.0.vm03.stdout:0/517: dread d3/f9 [0,4194304] 0 2026-03-10T14:08:32.149 INFO:tasks.workunit.client.0.vm03.stdout:1/532: link d0/d2/df/d27/d7e/d81/fa7 d0/d2/df/d16/d20/fa9 0 2026-03-10T14:08:32.164 INFO:tasks.workunit.client.0.vm03.stdout:6/515: creat d8/d11/d18/f9f x:0 0 0 2026-03-10T14:08:32.168 INFO:tasks.workunit.client.0.vm03.stdout:6/516: chown d8/d11 113 1 2026-03-10T14:08:32.175 INFO:tasks.workunit.client.0.vm03.stdout:5/654: creat d4/d13/d43/fcf x:0 0 0 2026-03-10T14:08:32.183 INFO:tasks.workunit.client.0.vm03.stdout:8/573: mkdir da/d24/db4 0 2026-03-10T14:08:32.184 INFO:tasks.workunit.client.0.vm03.stdout:3/563: getdents d1d/d59 0 2026-03-10T14:08:32.189 INFO:tasks.workunit.client.0.vm03.stdout:2/512: creat d5/d10/d1f/d4f/d76/fa1 x:0 0 0 2026-03-10T14:08:32.190 INFO:tasks.workunit.client.0.vm03.stdout:2/513: truncate d5/d10/d1c/d40/f86 545629 0 2026-03-10T14:08:32.192 INFO:tasks.workunit.client.0.vm03.stdout:5/655: write d4/d13/d43/f58 [6767983,43235] 0 2026-03-10T14:08:32.193 INFO:tasks.workunit.client.0.vm03.stdout:5/656: stat d4/d16/d19/d6e/c86 0 2026-03-10T14:08:32.197 INFO:tasks.workunit.client.0.vm03.stdout:1/533: mkdir d0/d18/daa 0 2026-03-10T14:08:32.203 
INFO:tasks.workunit.client.0.vm03.stdout:5/657: dread d4/d6/fa [0,4194304] 0 2026-03-10T14:08:32.204 INFO:tasks.workunit.client.0.vm03.stdout:5/658: dread - d4/d13/d43/fbb zero size 2026-03-10T14:08:32.206 INFO:tasks.workunit.client.0.vm03.stdout:8/574: creat da/d3c/d4b/d69/fb5 x:0 0 0 2026-03-10T14:08:32.207 INFO:tasks.workunit.client.0.vm03.stdout:8/575: dread - da/d3a/d44/f88 zero size 2026-03-10T14:08:32.211 INFO:tasks.workunit.client.0.vm03.stdout:3/564: mknod d1d/d39/da1/cb3 0 2026-03-10T14:08:32.216 INFO:tasks.workunit.client.0.vm03.stdout:3/565: dwrite d1d/d29/d41/d45/f6a [4194304,4194304] 0 2026-03-10T14:08:32.219 INFO:tasks.workunit.client.0.vm03.stdout:3/566: chown d1d/d33 138830630 1 2026-03-10T14:08:32.219 INFO:tasks.workunit.client.0.vm03.stdout:3/567: dread - d1d/d29/d41/d45/d95/faa zero size 2026-03-10T14:08:32.221 INFO:tasks.workunit.client.0.vm03.stdout:7/449: rename d5/d9/d35/l4a to d5/d9/l99 0 2026-03-10T14:08:32.224 INFO:tasks.workunit.client.0.vm03.stdout:8/576: sync 2026-03-10T14:08:32.225 INFO:tasks.workunit.client.0.vm03.stdout:8/577: write da/d3a/d44/d64/f7b [1224883,46517] 0 2026-03-10T14:08:32.228 INFO:tasks.workunit.client.0.vm03.stdout:7/450: dwrite d5/d9/f17 [0,4194304] 0 2026-03-10T14:08:32.230 INFO:tasks.workunit.client.0.vm03.stdout:7/451: stat d5/f66 0 2026-03-10T14:08:32.237 INFO:tasks.workunit.client.0.vm03.stdout:2/514: dwrite d5/d10/d1c/d50/f98 [0,4194304] 0 2026-03-10T14:08:32.241 INFO:tasks.workunit.client.0.vm03.stdout:6/517: mkdir d8/d11/da0 0 2026-03-10T14:08:32.250 INFO:tasks.workunit.client.0.vm03.stdout:9/540: write d2/f15 [972933,119495] 0 2026-03-10T14:08:32.255 INFO:tasks.workunit.client.0.vm03.stdout:9/541: dwrite d2/d29/d33/fa5 [0,4194304] 0 2026-03-10T14:08:32.256 INFO:tasks.workunit.client.0.vm03.stdout:9/542: chown d2/d14/d2b/d79/d81/f9f 13366 1 2026-03-10T14:08:32.265 INFO:tasks.workunit.client.0.vm03.stdout:8/578: symlink da/d36/d40/d50/lb6 0 2026-03-10T14:08:32.270 
INFO:tasks.workunit.client.0.vm03.stdout:0/518: dwrite d3/d46/f63 [0,4194304] 0 2026-03-10T14:08:32.273 INFO:tasks.workunit.client.0.vm03.stdout:0/519: dread d3/d46/f82 [0,4194304] 0 2026-03-10T14:08:32.279 INFO:tasks.workunit.client.0.vm03.stdout:7/452: mkdir d5/d9/d14/d21/d9a 0 2026-03-10T14:08:32.284 INFO:tasks.workunit.client.0.vm03.stdout:2/515: mkdir d5/d10/d1c/d40/d59/da2 0 2026-03-10T14:08:32.299 INFO:tasks.workunit.client.0.vm03.stdout:4/561: write d5/d9/d2b/f63 [565417,118454] 0 2026-03-10T14:08:32.306 INFO:tasks.workunit.client.0.vm03.stdout:5/659: getdents d4/dca 0 2026-03-10T14:08:32.306 INFO:tasks.workunit.client.0.vm03.stdout:5/660: write d4/d13/d1f/f74 [2212965,107797] 0 2026-03-10T14:08:32.306 INFO:tasks.workunit.client.0.vm03.stdout:3/568: mkdir d1d/d29/d41/d45/db4 0 2026-03-10T14:08:32.306 INFO:tasks.workunit.client.0.vm03.stdout:5/661: dwrite d4/d16/d19/d23/db8/fb0 [0,4194304] 0 2026-03-10T14:08:32.308 INFO:tasks.workunit.client.0.vm03.stdout:9/543: creat d2/d29/d33/d60/d8c/faf x:0 0 0 2026-03-10T14:08:32.310 INFO:tasks.workunit.client.0.vm03.stdout:8/579: mkdir da/d3a/d44/d64/db7 0 2026-03-10T14:08:32.319 INFO:tasks.workunit.client.0.vm03.stdout:6/518: dread d8/db/d12/d51/d5c/f69 [0,4194304] 0 2026-03-10T14:08:32.322 INFO:tasks.workunit.client.0.vm03.stdout:3/569: dread d1d/d33/f5e [0,4194304] 0 2026-03-10T14:08:32.325 INFO:tasks.workunit.client.0.vm03.stdout:7/453: unlink d5/d9/d14/d21/d28/l65 0 2026-03-10T14:08:32.326 INFO:tasks.workunit.client.0.vm03.stdout:4/562: sync 2026-03-10T14:08:32.340 INFO:tasks.workunit.client.0.vm03.stdout:8/580: creat da/d3c/d51/d75/fb8 x:0 0 0 2026-03-10T14:08:32.340 INFO:tasks.workunit.client.0.vm03.stdout:4/563: dwrite f2 [0,4194304] 0 2026-03-10T14:08:32.343 INFO:tasks.workunit.client.0.vm03.stdout:9/544: fsync d2/f15 0 2026-03-10T14:08:32.353 INFO:tasks.workunit.client.0.vm03.stdout:0/520: mknod d3/d4d/da0/ca8 0 2026-03-10T14:08:32.367 INFO:tasks.workunit.client.0.vm03.stdout:2/516: write d5/d2a/f6d 
[924059,35932] 0 2026-03-10T14:08:32.368 INFO:tasks.workunit.client.0.vm03.stdout:3/570: creat d1d/d29/d41/d45/d5b/fb5 x:0 0 0 2026-03-10T14:08:32.368 INFO:tasks.workunit.client.0.vm03.stdout:7/454: unlink d5/l8 0 2026-03-10T14:08:32.368 INFO:tasks.workunit.client.0.vm03.stdout:7/455: fdatasync d5/d9/d14/d26/d39/f45 0 2026-03-10T14:08:32.368 INFO:tasks.workunit.client.0.vm03.stdout:7/456: fsync d5/d9/d35/f69 0 2026-03-10T14:08:32.368 INFO:tasks.workunit.client.0.vm03.stdout:8/581: mknod da/d3c/d51/d75/cb9 0 2026-03-10T14:08:32.368 INFO:tasks.workunit.client.0.vm03.stdout:7/457: truncate d5/d9/d3e/f4e 884457 0 2026-03-10T14:08:32.368 INFO:tasks.workunit.client.0.vm03.stdout:5/662: rename d4/d16/d19/d6e/f57 to d4/d13/fd0 0 2026-03-10T14:08:32.368 INFO:tasks.workunit.client.0.vm03.stdout:8/582: mkdir da/d3c/d4b/d4c/dba 0 2026-03-10T14:08:32.369 INFO:tasks.workunit.client.0.vm03.stdout:4/564: mkdir d5/db4 0 2026-03-10T14:08:32.369 INFO:tasks.workunit.client.0.vm03.stdout:2/517: mkdir d5/d10/da3 0 2026-03-10T14:08:32.370 INFO:tasks.workunit.client.0.vm03.stdout:9/545: symlink d2/lb0 0 2026-03-10T14:08:32.370 INFO:tasks.workunit.client.0.vm03.stdout:9/546: stat d2/d14/f1a 0 2026-03-10T14:08:32.370 INFO:tasks.workunit.client.0.vm03.stdout:6/519: link d8/fd d8/db/d12/fa1 0 2026-03-10T14:08:32.371 INFO:tasks.workunit.client.0.vm03.stdout:9/547: chown d2/d29/d33/d55/d72 6 1 2026-03-10T14:08:32.371 INFO:tasks.workunit.client.0.vm03.stdout:9/548: fsync d2/d14/f1b 0 2026-03-10T14:08:32.371 INFO:tasks.workunit.client.0.vm03.stdout:9/549: fdatasync d2/d14/f1b 0 2026-03-10T14:08:32.373 INFO:tasks.workunit.client.0.vm03.stdout:5/663: mkdir d4/d16/d19/d6e/d7f/dd1 0 2026-03-10T14:08:32.377 INFO:tasks.workunit.client.0.vm03.stdout:9/550: mknod d2/d14/d2b/d79/cb1 0 2026-03-10T14:08:32.377 INFO:tasks.workunit.client.0.vm03.stdout:9/551: chown d2/d29/d33/c40 94083 1 2026-03-10T14:08:32.379 INFO:tasks.workunit.client.0.vm03.stdout:5/664: creat d4/d13/d8f/fd2 x:0 0 0 
2026-03-10T14:08:32.380 INFO:tasks.workunit.client.0.vm03.stdout:5/665: write d4/d13/d43/f72 [912088,127654] 0 2026-03-10T14:08:32.386 INFO:tasks.workunit.client.0.vm03.stdout:8/583: mkdir da/d3c/d51/d85/dbb 0 2026-03-10T14:08:32.390 INFO:tasks.workunit.client.0.vm03.stdout:1/534: write d0/d18/d1d/f5e [398526,118398] 0 2026-03-10T14:08:32.393 INFO:tasks.workunit.client.0.vm03.stdout:7/458: sync 2026-03-10T14:08:32.393 INFO:tasks.workunit.client.0.vm03.stdout:4/565: sync 2026-03-10T14:08:32.396 INFO:tasks.workunit.client.0.vm03.stdout:5/666: dwrite d4/d13/d1f/fc1 [0,4194304] 0 2026-03-10T14:08:32.397 INFO:tasks.workunit.client.0.vm03.stdout:5/667: fdatasync d4/fc7 0 2026-03-10T14:08:32.397 INFO:tasks.workunit.client.0.vm03.stdout:6/520: unlink d8/d11/c22 0 2026-03-10T14:08:32.397 INFO:tasks.workunit.client.0.vm03.stdout:9/552: symlink d2/d14/d2b/d79/d8a/lb2 0 2026-03-10T14:08:32.397 INFO:tasks.workunit.client.0.vm03.stdout:3/571: getdents d1d/d33/d47/d53/d68 0 2026-03-10T14:08:32.398 INFO:tasks.workunit.client.0.vm03.stdout:3/572: write d1d/d29/faf [747313,130561] 0 2026-03-10T14:08:32.398 INFO:tasks.workunit.client.0.vm03.stdout:4/566: dread d5/d9/f31 [0,4194304] 0 2026-03-10T14:08:32.417 INFO:tasks.workunit.client.0.vm03.stdout:0/521: dwrite d3/d4d/d47/f48 [0,4194304] 0 2026-03-10T14:08:32.422 INFO:tasks.workunit.client.0.vm03.stdout:3/573: dread d1d/d29/d41/f4e [0,4194304] 0 2026-03-10T14:08:32.427 INFO:tasks.workunit.client.0.vm03.stdout:8/584: rename da/d36/d40/d50 to da/d58/d5f/dbc 0 2026-03-10T14:08:32.433 INFO:tasks.workunit.client.0.vm03.stdout:8/585: write da/d24/f2d [4553610,108916] 0 2026-03-10T14:08:32.433 INFO:tasks.workunit.client.0.vm03.stdout:8/586: chown f2 1683 1 2026-03-10T14:08:32.433 INFO:tasks.workunit.client.0.vm03.stdout:2/518: creat d5/d10/fa4 x:0 0 0 2026-03-10T14:08:32.433 INFO:tasks.workunit.client.0.vm03.stdout:9/553: mknod d2/d29/d33/d55/cb3 0 2026-03-10T14:08:32.435 INFO:tasks.workunit.client.0.vm03.stdout:0/522: mkdir d3/d46/da9 0 
2026-03-10T14:08:32.440 INFO:tasks.workunit.client.0.vm03.stdout:4/567: dread d5/f7 [0,4194304] 0 2026-03-10T14:08:32.441 INFO:tasks.workunit.client.0.vm03.stdout:3/574: mkdir d1d/d29/d41/d45/db4/db6 0 2026-03-10T14:08:32.443 INFO:tasks.workunit.client.0.vm03.stdout:7/459: creat d5/d9/d14/f9b x:0 0 0 2026-03-10T14:08:32.447 INFO:tasks.workunit.client.0.vm03.stdout:7/460: write d5/d9/d35/f7c [807905,52900] 0 2026-03-10T14:08:32.447 INFO:tasks.workunit.client.0.vm03.stdout:3/575: read d1d/f26 [292404,2204] 0 2026-03-10T14:08:32.449 INFO:tasks.workunit.client.0.vm03.stdout:0/523: rename d3/d4d/d30/f32 to d3/d11/d2c/d4a/d4b/faa 0 2026-03-10T14:08:32.451 INFO:tasks.workunit.client.0.vm03.stdout:7/461: unlink d5/d9/d14/d21/d6f/c8b 0 2026-03-10T14:08:32.451 INFO:tasks.workunit.client.0.vm03.stdout:7/462: dread - d5/d9/d14/d26/d36/d51/d7b/f87 zero size 2026-03-10T14:08:32.452 INFO:tasks.workunit.client.0.vm03.stdout:6/521: link d8/d11/c16 d8/d11/d18/d79/d80/ca2 0 2026-03-10T14:08:32.454 INFO:tasks.workunit.client.0.vm03.stdout:3/576: mkdir d1d/d29/d41/d45/d55/d6e/da6/db7 0 2026-03-10T14:08:32.455 INFO:tasks.workunit.client.0.vm03.stdout:7/463: mkdir d5/d9/d14/d26/d36/d51/d7b/d9c 0 2026-03-10T14:08:32.456 INFO:tasks.workunit.client.0.vm03.stdout:3/577: fdatasync d1d/d39/d51/d72/f8e 0 2026-03-10T14:08:32.459 INFO:tasks.workunit.client.0.vm03.stdout:6/522: dwrite d8/db/d12/d51/d5c/f8d [0,4194304] 0 2026-03-10T14:08:32.459 INFO:tasks.workunit.client.0.vm03.stdout:7/464: creat d5/d9/d14/d21/d6f/f9d x:0 0 0 2026-03-10T14:08:32.460 INFO:tasks.workunit.client.0.vm03.stdout:6/523: chown d8/d3b 3817359 1 2026-03-10T14:08:32.461 INFO:tasks.workunit.client.0.vm03.stdout:3/578: creat d1d/d29/d41/d45/d5b/fb8 x:0 0 0 2026-03-10T14:08:32.471 INFO:tasks.workunit.client.0.vm03.stdout:5/668: dwrite d4/d16/f71 [0,4194304] 0 2026-03-10T14:08:32.482 INFO:tasks.workunit.client.0.vm03.stdout:6/524: dread d8/d1b/f29 [0,4194304] 0 2026-03-10T14:08:32.488 
INFO:tasks.workunit.client.0.vm03.stdout:3/579: mkdir d1d/d33/d65/d5d/dae/db9 0 2026-03-10T14:08:32.491 INFO:tasks.workunit.client.0.vm03.stdout:1/535: truncate d0/d2/d34/f9a 4005995 0 2026-03-10T14:08:32.494 INFO:tasks.workunit.client.0.vm03.stdout:2/519: dwrite d5/d10/d1c/d40/d59/d7b/f73 [0,4194304] 0 2026-03-10T14:08:32.501 INFO:tasks.workunit.client.0.vm03.stdout:7/465: rmdir d5/d9/d14/d21/d28 39 2026-03-10T14:08:32.501 INFO:tasks.workunit.client.0.vm03.stdout:8/587: truncate da/f33 148172 0 2026-03-10T14:08:32.507 INFO:tasks.workunit.client.0.vm03.stdout:4/568: dwrite d5/f7 [0,4194304] 0 2026-03-10T14:08:32.508 INFO:tasks.workunit.client.0.vm03.stdout:9/554: truncate d2/f2c 2288482 0 2026-03-10T14:08:32.509 INFO:tasks.workunit.client.0.vm03.stdout:4/569: fdatasync d5/d9/db/d19/d34/f5d 0 2026-03-10T14:08:32.515 INFO:tasks.workunit.client.0.vm03.stdout:5/669: fdatasync d4/d6/de/f65 0 2026-03-10T14:08:32.522 INFO:tasks.workunit.client.0.vm03.stdout:8/588: dread f2 [0,4194304] 0 2026-03-10T14:08:32.526 INFO:tasks.workunit.client.0.vm03.stdout:8/589: fsync da/d24/f2d 0 2026-03-10T14:08:32.526 INFO:tasks.workunit.client.0.vm03.stdout:5/670: dread d4/d13/d43/f94 [0,4194304] 0 2026-03-10T14:08:32.526 INFO:tasks.workunit.client.0.vm03.stdout:5/671: dwrite d4/d13/d1f/d8c/f9c [0,4194304] 0 2026-03-10T14:08:32.533 INFO:tasks.workunit.client.0.vm03.stdout:3/580: truncate d1d/d29/d41/f4e 2171817 0 2026-03-10T14:08:32.534 INFO:tasks.workunit.client.0.vm03.stdout:6/525: unlink d8/db/c19 0 2026-03-10T14:08:32.547 INFO:tasks.workunit.client.0.vm03.stdout:6/526: chown d8/d1b 2184806 1 2026-03-10T14:08:32.547 INFO:tasks.workunit.client.0.vm03.stdout:8/590: creat da/d3c/d51/d85/fbd x:0 0 0 2026-03-10T14:08:32.548 INFO:tasks.workunit.client.0.vm03.stdout:6/527: mkdir d8/db/d49/d6c/d83/da3 0 2026-03-10T14:08:32.555 INFO:tasks.workunit.client.0.vm03.stdout:4/570: creat d5/db4/fb5 x:0 0 0 2026-03-10T14:08:32.556 INFO:tasks.workunit.client.0.vm03.stdout:5/672: mknod 
d4/d16/d19/d6e/da3/dc5/cd3 0 2026-03-10T14:08:32.560 INFO:tasks.workunit.client.0.vm03.stdout:0/524: link d3/d4d/d30/f45 d3/d16/d21/d9a/fab 0 2026-03-10T14:08:32.565 INFO:tasks.workunit.client.0.vm03.stdout:0/525: dwrite d3/d46/d54/d79/f88 [0,4194304] 0 2026-03-10T14:08:32.568 INFO:tasks.workunit.client.0.vm03.stdout:0/526: chown d3/d16 447179 1 2026-03-10T14:08:32.585 INFO:tasks.workunit.client.0.vm03.stdout:8/591: link da/d3c/d51/d75/fb8 da/d24/d49/fbe 0 2026-03-10T14:08:32.585 INFO:tasks.workunit.client.0.vm03.stdout:9/555: getdents d2/d14 0 2026-03-10T14:08:32.585 INFO:tasks.workunit.client.0.vm03.stdout:8/592: truncate da/d3a/fa3 336860 0 2026-03-10T14:08:32.592 INFO:tasks.workunit.client.0.vm03.stdout:8/593: mknod da/cbf 0 2026-03-10T14:08:32.593 INFO:tasks.workunit.client.0.vm03.stdout:8/594: unlink da/d24/l27 0 2026-03-10T14:08:32.595 INFO:tasks.workunit.client.0.vm03.stdout:7/466: sync 2026-03-10T14:08:32.600 INFO:tasks.workunit.client.0.vm03.stdout:8/595: dwrite da/d24/f32 [4194304,4194304] 0 2026-03-10T14:08:32.604 INFO:tasks.workunit.client.0.vm03.stdout:8/596: dread da/d24/f32 [0,4194304] 0 2026-03-10T14:08:32.612 INFO:tasks.workunit.client.0.vm03.stdout:3/581: sync 2026-03-10T14:08:32.612 INFO:tasks.workunit.client.0.vm03.stdout:5/673: sync 2026-03-10T14:08:32.612 INFO:tasks.workunit.client.0.vm03.stdout:9/556: sync 2026-03-10T14:08:32.613 INFO:tasks.workunit.client.0.vm03.stdout:9/557: dread - d2/d29/d33/d55/d72/d9d/fa0 zero size 2026-03-10T14:08:32.617 INFO:tasks.workunit.client.0.vm03.stdout:3/582: dwrite d1d/d29/f87 [0,4194304] 0 2026-03-10T14:08:32.619 INFO:tasks.workunit.client.0.vm03.stdout:3/583: fdatasync d1d/d33/f80 0 2026-03-10T14:08:32.622 INFO:tasks.workunit.client.0.vm03.stdout:1/536: rename d0/d2/d34 to d0/d2/df/dab 0 2026-03-10T14:08:32.629 INFO:tasks.workunit.client.0.vm03.stdout:6/528: write d8/db/d49/d6c/f66 [1057064,85140] 0 2026-03-10T14:08:32.629 INFO:tasks.workunit.client.0.vm03.stdout:0/527: getdents d3/d16/d21/d9a 0 
2026-03-10T14:08:32.629 INFO:tasks.workunit.client.0.vm03.stdout:4/571: write d5/d9/db/f20 [5928540,68859] 0 2026-03-10T14:08:32.630 INFO:tasks.workunit.client.0.vm03.stdout:3/584: dwrite d1d/f66 [0,4194304] 0 2026-03-10T14:08:32.631 INFO:tasks.workunit.client.0.vm03.stdout:2/520: dwrite d5/d10/d17/f28 [4194304,4194304] 0 2026-03-10T14:08:32.634 INFO:tasks.workunit.client.0.vm03.stdout:0/528: write d3/d16/d21/fa4 [317380,57980] 0 2026-03-10T14:08:32.634 INFO:tasks.workunit.client.0.vm03.stdout:0/529: write d3/d46/d54/d79/f93 [70706,16715] 0 2026-03-10T14:08:32.647 INFO:tasks.workunit.client.0.vm03.stdout:5/674: mknod d4/d6/de/cd4 0 2026-03-10T14:08:32.647 INFO:tasks.workunit.client.0.vm03.stdout:8/597: read da/f15 [11184886,47795] 0 2026-03-10T14:08:32.663 INFO:tasks.workunit.client.0.vm03.stdout:1/537: readlink d0/l63 0 2026-03-10T14:08:32.664 INFO:tasks.workunit.client.0.vm03.stdout:6/529: creat d8/d11/d7a/fa4 x:0 0 0 2026-03-10T14:08:32.664 INFO:tasks.workunit.client.0.vm03.stdout:6/530: dread - d8/d11/d18/f9f zero size 2026-03-10T14:08:32.672 INFO:tasks.workunit.client.0.vm03.stdout:4/572: rename d5/d47/d62/d8a/da0/db2 to d5/d6e/db6 0 2026-03-10T14:08:32.677 INFO:tasks.workunit.client.0.vm03.stdout:0/530: rename d3/d46/d54 to d3/d46/dac 0 2026-03-10T14:08:32.678 INFO:tasks.workunit.client.0.vm03.stdout:0/531: dwrite d3/d4d/f2a [0,4194304] 0 2026-03-10T14:08:32.689 INFO:tasks.workunit.client.0.vm03.stdout:8/598: truncate da/f1f 4547648 0 2026-03-10T14:08:32.697 INFO:tasks.workunit.client.0.vm03.stdout:3/585: mkdir d1d/d33/d47/dac/dba 0 2026-03-10T14:08:32.697 INFO:tasks.workunit.client.0.vm03.stdout:3/586: fdatasync d1d/d29/d41/d45/d5b/fb5 0 2026-03-10T14:08:32.700 INFO:tasks.workunit.client.0.vm03.stdout:3/587: dread d1d/d29/f87 [0,4194304] 0 2026-03-10T14:08:32.701 INFO:tasks.workunit.client.0.vm03.stdout:3/588: chown d1d/d29/d41/d45/d55/d6e/da6/c97 6 1 2026-03-10T14:08:32.704 INFO:tasks.workunit.client.0.vm03.stdout:0/532: rmdir d3/d16 39 
2026-03-10T14:08:32.709 INFO:tasks.workunit.client.0.vm03.stdout:4/573: dread d5/d9/db/f29 [0,4194304] 0 2026-03-10T14:08:32.709 INFO:tasks.workunit.client.0.vm03.stdout:4/574: stat d5/d9/f25 0 2026-03-10T14:08:32.717 INFO:tasks.workunit.client.0.vm03.stdout:9/558: write d2/d14/f61 [3009440,99725] 0 2026-03-10T14:08:32.717 INFO:tasks.workunit.client.0.vm03.stdout:9/559: stat d2/d29/da7 0 2026-03-10T14:08:32.724 INFO:tasks.workunit.client.0.vm03.stdout:2/521: creat d5/d10/d1f/fa5 x:0 0 0 2026-03-10T14:08:32.728 INFO:tasks.workunit.client.0.vm03.stdout:2/522: dwrite d5/d10/d1c/d40/f86 [0,4194304] 0 2026-03-10T14:08:32.738 INFO:tasks.workunit.client.0.vm03.stdout:5/675: dwrite d4/d35/f9b [0,4194304] 0 2026-03-10T14:08:32.744 INFO:tasks.workunit.client.0.vm03.stdout:6/531: truncate d8/db/d49/d6c/d32/f41 3638803 0 2026-03-10T14:08:32.744 INFO:tasks.workunit.client.0.vm03.stdout:6/532: chown d8/db/d12/d51/d8c 393 1 2026-03-10T14:08:32.745 INFO:tasks.workunit.client.0.vm03.stdout:6/533: chown d8/d1b/d1c/c30 217076 1 2026-03-10T14:08:32.745 INFO:tasks.workunit.client.0.vm03.stdout:6/534: stat d8/db/d12/d51/d5c/f69 0 2026-03-10T14:08:32.750 INFO:tasks.workunit.client.0.vm03.stdout:7/467: getdents d5/d9/d14/d26/d5f/d89 0 2026-03-10T14:08:32.751 INFO:tasks.workunit.client.0.vm03.stdout:7/468: write d5/d9/d35/f69 [699222,101915] 0 2026-03-10T14:08:32.756 INFO:tasks.workunit.client.0.vm03.stdout:9/560: fsync d2/d14/d2b/d34/f7a 0 2026-03-10T14:08:32.761 INFO:tasks.workunit.client.0.vm03.stdout:2/523: mknod d5/d10/d31/ca6 0 2026-03-10T14:08:32.761 INFO:tasks.workunit.client.0.vm03.stdout:2/524: write d5/f2d [3975384,32531] 0 2026-03-10T14:08:32.767 INFO:tasks.workunit.client.0.vm03.stdout:5/676: mkdir d4/d6/de/dd5 0 2026-03-10T14:08:32.775 INFO:tasks.workunit.client.0.vm03.stdout:6/535: symlink d8/d1b/d1c/la5 0 2026-03-10T14:08:32.775 INFO:tasks.workunit.client.0.vm03.stdout:6/536: stat d8/d11/d18 0 2026-03-10T14:08:32.775 INFO:tasks.workunit.client.0.vm03.stdout:6/537: write 
d8/db/d12/d51/d5c/f68 [6272563,27779] 0 2026-03-10T14:08:32.778 INFO:tasks.workunit.client.0.vm03.stdout:9/561: dread d2/d14/f1a [0,4194304] 0 2026-03-10T14:08:32.785 INFO:tasks.workunit.client.0.vm03.stdout:5/677: dread d4/d6/de/f14 [4194304,4194304] 0 2026-03-10T14:08:32.785 INFO:tasks.workunit.client.0.vm03.stdout:5/678: fsync d4/d16/d19/f25 0 2026-03-10T14:08:32.799 INFO:tasks.workunit.client.0.vm03.stdout:9/562: sync 2026-03-10T14:08:32.799 INFO:tasks.workunit.client.0.vm03.stdout:7/469: read d5/d9/d35/f69 [143183,119352] 0 2026-03-10T14:08:32.799 INFO:tasks.workunit.client.0.vm03.stdout:7/470: stat d5/d9/d35/f52 0 2026-03-10T14:08:32.800 INFO:tasks.workunit.client.0.vm03.stdout:0/533: write d3/d4d/d47/f48 [1065642,120917] 0 2026-03-10T14:08:32.804 INFO:tasks.workunit.client.0.vm03.stdout:9/563: dread d2/d29/f35 [0,4194304] 0 2026-03-10T14:08:32.807 INFO:tasks.workunit.client.0.vm03.stdout:8/599: rename da/d3c/d51/d8b/laa to da/d36/d6a/lc0 0 2026-03-10T14:08:32.808 INFO:tasks.workunit.client.0.vm03.stdout:8/600: write da/d3a/d44/f46 [3579068,12487] 0 2026-03-10T14:08:32.816 INFO:tasks.workunit.client.0.vm03.stdout:2/525: unlink d5/d10/d1c/f5b 0 2026-03-10T14:08:32.842 INFO:tasks.workunit.client.0.vm03.stdout:5/679: write d4/d35/f92 [1071690,7817] 0 2026-03-10T14:08:32.844 INFO:tasks.workunit.client.0.vm03.stdout:4/575: creat d5/d47/d5b/d64/fb7 x:0 0 0 2026-03-10T14:08:32.844 INFO:tasks.workunit.client.0.vm03.stdout:5/680: read d4/d6/fbe [672450,61147] 0 2026-03-10T14:08:32.847 INFO:tasks.workunit.client.0.vm03.stdout:1/538: mknod d0/d2/df/cac 0 2026-03-10T14:08:32.858 INFO:tasks.workunit.client.0.vm03.stdout:0/534: fdatasync d3/d17/f6d 0 2026-03-10T14:08:32.866 INFO:tasks.workunit.client.0.vm03.stdout:0/535: sync 2026-03-10T14:08:32.870 INFO:tasks.workunit.client.0.vm03.stdout:7/471: creat d5/d9/d14/d26/d36/d51/d7b/d83/f9e x:0 0 0 2026-03-10T14:08:32.876 INFO:tasks.workunit.client.0.vm03.stdout:6/538: rename d8/db/d12/f9e to d8/d11/d7a/fa6 0 
2026-03-10T14:08:32.880 INFO:tasks.workunit.client.0.vm03.stdout:9/564: dwrite d2/d29/d33/d41/f6e [0,4194304] 0 2026-03-10T14:08:32.882 INFO:tasks.workunit.client.0.vm03.stdout:9/565: chown d2/d14/f96 17623071 1 2026-03-10T14:08:32.882 INFO:tasks.workunit.client.0.vm03.stdout:9/566: write d2/d29/d33/d41/f57 [1734161,89724] 0 2026-03-10T14:08:32.898 INFO:tasks.workunit.client.0.vm03.stdout:3/589: link d1d/d33/c3d d1d/cbb 0 2026-03-10T14:08:32.905 INFO:tasks.workunit.client.0.vm03.stdout:5/681: creat d4/d40/da5/fd6 x:0 0 0 2026-03-10T14:08:32.910 INFO:tasks.workunit.client.0.vm03.stdout:1/539: fsync d0/f48 0 2026-03-10T14:08:32.911 INFO:tasks.workunit.client.0.vm03.stdout:1/540: dread d0/d18/d3b/f4c [0,4194304] 0 2026-03-10T14:08:32.923 INFO:tasks.workunit.client.0.vm03.stdout:0/536: mknod d3/d4d/d47/cad 0 2026-03-10T14:08:32.923 INFO:tasks.workunit.client.0.vm03.stdout:0/537: dread - d3/d11/f8e zero size 2026-03-10T14:08:32.929 INFO:tasks.workunit.client.0.vm03.stdout:7/472: write d5/f47 [786539,96539] 0 2026-03-10T14:08:32.935 INFO:tasks.workunit.client.0.vm03.stdout:2/526: rename d5/d10/d1c to d5/d10/d1f/d4f/d76/da7 0 2026-03-10T14:08:32.938 INFO:tasks.workunit.client.0.vm03.stdout:6/539: mkdir d8/d3b/da7 0 2026-03-10T14:08:32.942 INFO:tasks.workunit.client.0.vm03.stdout:8/601: truncate da/d24/f3d 738976 0 2026-03-10T14:08:32.950 INFO:tasks.workunit.client.0.vm03.stdout:8/602: sync 2026-03-10T14:08:32.952 INFO:tasks.workunit.client.0.vm03.stdout:2/527: symlink d5/d10/d1f/d4f/d76/da7/d40/d92/la8 0 2026-03-10T14:08:32.952 INFO:tasks.workunit.client.0.vm03.stdout:8/603: sync 2026-03-10T14:08:32.953 INFO:tasks.workunit.client.0.vm03.stdout:2/528: stat d5/d10/d17/c7a 0 2026-03-10T14:08:32.956 INFO:tasks.workunit.client.0.vm03.stdout:9/567: rename d2/d14/c90 to d2/d29/d33/d60/cb4 0 2026-03-10T14:08:32.959 INFO:tasks.workunit.client.0.vm03.stdout:6/540: creat d8/db/d12/d64/fa8 x:0 0 0 2026-03-10T14:08:32.961 INFO:tasks.workunit.client.0.vm03.stdout:3/590: truncate 
d1d/d29/d41/f4e 3151057 0 2026-03-10T14:08:32.964 INFO:tasks.workunit.client.0.vm03.stdout:4/576: rmdir d5/d9/db/da4 0 2026-03-10T14:08:32.965 INFO:tasks.workunit.client.0.vm03.stdout:5/682: link d4/fc7 d4/d16/d19/d4a/fd7 0 2026-03-10T14:08:32.966 INFO:tasks.workunit.client.0.vm03.stdout:1/541: mkdir d0/d2/df/d91/dad 0 2026-03-10T14:08:32.969 INFO:tasks.workunit.client.0.vm03.stdout:7/473: truncate d5/d9/d3e/f4e 833856 0 2026-03-10T14:08:32.972 INFO:tasks.workunit.client.0.vm03.stdout:1/542: dwrite d0/d2/df/dab/f88 [0,4194304] 0 2026-03-10T14:08:32.973 INFO:tasks.workunit.client.0.vm03.stdout:0/538: rename d3/d11/d2c/d4a/d4b/d89/l92 to d3/d11/d76/lae 0 2026-03-10T14:08:32.977 INFO:tasks.workunit.client.0.vm03.stdout:1/543: write d0/d2/df/d27/f52 [180371,55776] 0 2026-03-10T14:08:32.982 INFO:tasks.workunit.client.0.vm03.stdout:0/539: dwrite d3/d11/d66/f86 [0,4194304] 0 2026-03-10T14:08:32.994 INFO:tasks.workunit.client.0.vm03.stdout:3/591: creat d1d/d39/d51/d72/d99/fbc x:0 0 0 2026-03-10T14:08:32.999 INFO:tasks.workunit.client.0.vm03.stdout:5/683: symlink d4/d40/da5/ld8 0 2026-03-10T14:08:32.999 INFO:tasks.workunit.client.0.vm03.stdout:3/592: read d1d/d29/d41/d45/d55/d6e/d83/d91/fa9 [1620605,11814] 0 2026-03-10T14:08:33.001 INFO:tasks.workunit.client.0.vm03.stdout:5/684: dwrite d4/d16/d19/d4a/f81 [0,4194304] 0 2026-03-10T14:08:33.008 INFO:tasks.workunit.client.0.vm03.stdout:8/604: creat da/d58/da8/fc1 x:0 0 0 2026-03-10T14:08:33.026 INFO:tasks.workunit.client.0.vm03.stdout:4/577: link d5/d47/d62/d87/fae d5/d9/fb8 0 2026-03-10T14:08:33.027 INFO:tasks.workunit.client.0.vm03.stdout:5/685: stat d4/c4c 0 2026-03-10T14:08:33.034 INFO:tasks.workunit.client.0.vm03.stdout:0/540: rmdir d3/d46/dac 39 2026-03-10T14:08:33.035 INFO:tasks.workunit.client.0.vm03.stdout:3/593: mknod d1d/d33/d47/cbd 0 2026-03-10T14:08:33.042 INFO:tasks.workunit.client.0.vm03.stdout:7/474: creat d5/d9/d14/d26/f9f x:0 0 0 2026-03-10T14:08:33.046 INFO:tasks.workunit.client.0.vm03.stdout:1/544: rmdir 
d0/d18/d3b/d50/da4 0 2026-03-10T14:08:33.047 INFO:tasks.workunit.client.0.vm03.stdout:1/545: chown d0/d2/df/d91 58 1 2026-03-10T14:08:33.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.055+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mon.1 v2:192.168.123.104:3300/0 9 ==== mgrmap(e 28) v1 ==== 99710+0+0 (secure 0 0 0) 0x7f8df4079490 con 0x7f8e04071980 2026-03-10T14:08:33.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.055+0000 7f8e017fa700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f8dec07f780 0x7f8dec081b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:33.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.055+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 --> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8dec082140 con 0x7f8dec07f780 2026-03-10T14:08:33.055 INFO:tasks.workunit.client.0.vm03.stdout:0/541: symlink d3/d11/d66/laf 0 2026-03-10T14:08:33.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.063+0000 7f8e037fe700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f8dec07f780 0x7f8dec081b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:33.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.064+0000 7f8e037fe700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f8dec07f780 0x7f8dec081b60 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f8e04072ff0 tx=0x7f8dfc014040 comp rx=0 tx=0).ready entity=mgr.14704 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:08:33.067 
INFO:tasks.workunit.client.0.vm03.stdout:2/529: dwrite d5/d10/d17/f18 [0,4194304] 0 2026-03-10T14:08:33.068 INFO:tasks.workunit.client.0.vm03.stdout:3/594: creat d1d/d29/d41/d45/d55/d6e/d83/d91/fbe x:0 0 0 2026-03-10T14:08:33.077 INFO:tasks.workunit.client.0.vm03.stdout:6/541: dwrite d8/db/d49/d6c/d32/f41 [0,4194304] 0 2026-03-10T14:08:33.086 INFO:tasks.workunit.client.0.vm03.stdout:5/686: symlink d4/d40/ld9 0 2026-03-10T14:08:33.098 INFO:tasks.workunit.client.0.vm03.stdout:8/605: dwrite da/d24/f25 [0,4194304] 0 2026-03-10T14:08:33.100 INFO:tasks.workunit.client.0.vm03.stdout:9/568: creat d2/d14/fb5 x:0 0 0 2026-03-10T14:08:33.104 INFO:tasks.workunit.client.0.vm03.stdout:1/546: unlink d0/d18/d1d/f84 0 2026-03-10T14:08:33.108 INFO:tasks.workunit.client.0.vm03.stdout:0/542: chown d3/d11/c14 1006778 1 2026-03-10T14:08:33.108 INFO:tasks.workunit.client.0.vm03.stdout:1/547: chown d0/d18/d1d/l40 242390 1 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: Active manager daemon vm03.rwbbep restarted 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: Activating manager daemon vm03.rwbbep 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: mgrmap e27: vm03.rwbbep(active, starting, since 0.0162317s) 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.? 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.rwbbep/crt"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.? 
192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 
192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:08:33.108 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.rwbbep/key"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T14:08:33.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: Standby manager daemon vm04.ywwcto started 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.? 192.168.123.104:0/1975973764' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/crt"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.? 192.168.123.104:0/1975973764' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.? 192.168.123.104:0/1975973764' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/key"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.? 
192.168.123.104:0/1975973764' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: Manager daemon vm03.rwbbep is now available 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:08:33.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:32 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:08:33.115 INFO:tasks.workunit.client.0.vm03.stdout:3/595: fdatasync d1d/f26 0 2026-03-10T14:08:33.115 
INFO:tasks.workunit.client.0.vm03.stdout:3/596: chown d1d/d29/d41/f75 821367 1 2026-03-10T14:08:33.117 INFO:tasks.workunit.client.0.vm03.stdout:7/475: symlink d5/la0 0 2026-03-10T14:08:33.119 INFO:tasks.workunit.client.0.vm03.stdout:8/606: rename da/d3a/d44/d64 to da/d3c/d51/d75/dc2 0 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "2/23 daemons upgraded", 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: "message": "", 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.126+0000 7f8e017fa700 1 -- 192.168.123.103:0/319916102 <== mgr.14704 v2:192.168.123.103:6800/2419158435 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+337 (secure 0 0 0) 0x7f8dec082140 con 0x7f8dec07f780 2026-03-10T14:08:33.127 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.128+0000 7f8deaffd700 1 -- 192.168.123.103:0/319916102 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f8dec07f780 msgr2=0x7f8dec081b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:33.127 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.128+0000 7f8deaffd700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f8dec07f780 0x7f8dec081b60 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f8e04072ff0 tx=0x7f8dfc014040 comp rx=0 tx=0).stop 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.128+0000 7f8deaffd700 1 -- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04071980 msgr2=0x7f8e04082580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.128+0000 7f8deaffd700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04071980 0x7f8e04082580 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f8df4009d30 tx=0x7f8df400e3b0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.129+0000 7f8deaffd700 1 -- 192.168.123.103:0/319916102 shutdown_connections 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.129+0000 7f8deaffd700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:6828/3678786563,v1:192.168.123.104:6829/3678786563] conn(0x7f8dec074180 0x7f8dec076630 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.129+0000 7f8deaffd700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8e04071980 0x7f8e04082580 secure :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f8df4009d30 tx=0x7f8df400e3b0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.129+0000 7f8deaffd700 1 --2- 192.168.123.103:0/319916102 >> 
[v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f8dec07f780 0x7f8dec081b60 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.129+0000 7f8deaffd700 1 --2- 192.168.123.103:0/319916102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8e04082ac0 0x7f8e04082f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.129+0000 7f8deaffd700 1 -- 192.168.123.103:0/319916102 >> 192.168.123.103:0/319916102 conn(0x7f8e0406d1a0 msgr2=0x7f8e040764b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.129+0000 7f8deaffd700 1 -- 192.168.123.103:0/319916102 shutdown_connections 2026-03-10T14:08:33.128 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.129+0000 7f8deaffd700 1 -- 192.168.123.103:0/319916102 wait complete. 
2026-03-10T14:08:33.132 INFO:tasks.workunit.client.0.vm03.stdout:9/569: unlink d2/d29/d33/d60/d8c/faf 0 2026-03-10T14:08:33.149 INFO:tasks.workunit.client.0.vm03.stdout:1/548: unlink d0/d18/l6e 0 2026-03-10T14:08:33.157 INFO:tasks.workunit.client.0.vm03.stdout:4/578: truncate d5/d47/d5b/f7c 2158406 0 2026-03-10T14:08:33.157 INFO:tasks.workunit.client.0.vm03.stdout:4/579: chown d5/d47/f46 10881 1 2026-03-10T14:08:33.158 INFO:tasks.workunit.client.0.vm03.stdout:4/580: chown d5/d47/d5b/c68 1 1 2026-03-10T14:08:33.159 INFO:tasks.workunit.client.0.vm03.stdout:2/530: write f4 [1269630,36342] 0 2026-03-10T14:08:33.169 INFO:tasks.workunit.client.0.vm03.stdout:3/597: mkdir d1d/d29/d41/d45/d55/dbf 0 2026-03-10T14:08:33.176 INFO:tasks.workunit.client.0.vm03.stdout:6/542: unlink d8/l9 0 2026-03-10T14:08:33.182 INFO:tasks.workunit.client.0.vm03.stdout:5/687: creat d4/d16/d19/d6e/d7f/dd1/fda x:0 0 0 2026-03-10T14:08:33.210 INFO:tasks.workunit.client.0.vm03.stdout:9/570: dread d2/d29/f35 [0,4194304] 0 2026-03-10T14:08:33.217 INFO:tasks.workunit.client.0.vm03.stdout:0/543: creat d3/d46/dac/fb0 x:0 0 0 2026-03-10T14:08:33.229 INFO:tasks.workunit.client.0.vm03.stdout:2/531: creat d5/d10/d31/fa9 x:0 0 0 2026-03-10T14:08:33.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.233+0000 7f2022072700 1 -- 192.168.123.103:0/97749884 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c071980 msgr2=0x7f201c071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.233+0000 7f2022072700 1 --2- 192.168.123.103:0/97749884 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c071980 0x7f201c071d90 secure :-1 s=READY pgs=344 cs=0 l=1 rev1=1 crypto rx=0x7f200c007780 tx=0x7f200c00c050 comp rx=0 tx=0).stop 2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.234+0000 7f2022072700 1 -- 192.168.123.103:0/97749884 
shutdown_connections 2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.234+0000 7f2022072700 1 --2- 192.168.123.103:0/97749884 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f201c072360 0x7f201c0770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.234+0000 7f2022072700 1 --2- 192.168.123.103:0/97749884 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c071980 0x7f201c071d90 unknown :-1 s=CLOSED pgs=344 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.234+0000 7f2022072700 1 -- 192.168.123.103:0/97749884 >> 192.168.123.103:0/97749884 conn(0x7f201c06d1a0 msgr2=0x7f201c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.234+0000 7f2022072700 1 -- 192.168.123.103:0/97749884 shutdown_connections 2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.234+0000 7f2022072700 1 -- 192.168.123.103:0/97749884 wait complete. 
2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.234+0000 7f2022072700 1 Processor -- start 2026-03-10T14:08:33.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.234+0000 7f2022072700 1 -- start start 2026-03-10T14:08:33.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.235+0000 7f2022072700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f201c072360 0x7f201c131340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:33.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.235+0000 7f2022072700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c131880 0x7f201c07f500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:33.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.235+0000 7f2022072700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f201c131d80 con 0x7f201c131880 2026-03-10T14:08:33.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.235+0000 7f2022072700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f201c131ef0 con 0x7f201c072360 2026-03-10T14:08:33.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.235+0000 7f201affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c131880 0x7f201c07f500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:33.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.235+0000 7f201affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c131880 0x7f201c07f500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:59456/0 (socket says 192.168.123.103:59456) 2026-03-10T14:08:33.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.235+0000 7f201affd700 1 -- 192.168.123.103:0/3948667953 learned_addr learned my addr 192.168.123.103:0/3948667953 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:08:33.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.235+0000 7f201b7fe700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f201c072360 0x7f201c131340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:33.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.236+0000 7f201affd700 1 -- 192.168.123.103:0/3948667953 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f201c072360 msgr2=0x7f201c131340 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:33.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.236+0000 7f201affd700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f201c072360 0x7f201c131340 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.236+0000 7f201affd700 1 -- 192.168.123.103:0/3948667953 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f200c007430 con 0x7f201c131880 2026-03-10T14:08:33.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.237+0000 7f201affd700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c131880 0x7f201c07f500 secure :-1 s=READY pgs=345 cs=0 l=1 rev1=1 crypto rx=0x7f201400bf40 tx=0x7f2014009f50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:08:33.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.237+0000 7f2018ff9700 1 -- 192.168.123.103:0/3948667953 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f201400cac0 con 0x7f201c131880 2026-03-10T14:08:33.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.237+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f201c07faa0 con 0x7f201c131880 2026-03-10T14:08:33.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.237+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f201c07ffc0 con 0x7f201c131880 2026-03-10T14:08:33.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.238+0000 7f2018ff9700 1 -- 192.168.123.103:0/3948667953 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f201400cc20 con 0x7f201c131880 2026-03-10T14:08:33.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.238+0000 7f2018ff9700 1 -- 192.168.123.103:0/3948667953 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2014012750 con 0x7f201c131880 2026-03-10T14:08:33.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.238+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2008005320 con 0x7f201c131880 2026-03-10T14:08:33.239 INFO:tasks.workunit.client.0.vm03.stdout:3/598: mkdir d1d/d29/d41/dc0 0 2026-03-10T14:08:33.240 INFO:tasks.workunit.client.0.vm03.stdout:3/599: write d1d/d29/d41/d45/d95/faa [928625,104492] 0 2026-03-10T14:08:33.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.241+0000 7f2018ff9700 1 -- 192.168.123.103:0/3948667953 
<== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 28) v1 ==== 99710+0+0 (secure 0 0 0) 0x7f20140128b0 con 0x7f201c131880 2026-03-10T14:08:33.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.242+0000 7f2018ff9700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f20040772b0 0x7f2004079760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:08:33.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.242+0000 7f2018ff9700 1 -- 192.168.123.103:0/3948667953 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f2014061d10 con 0x7f201c131880 2026-03-10T14:08:33.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.244+0000 7f2018ff9700 1 -- 192.168.123.103:0/3948667953 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f201405e110 con 0x7f201c131880 2026-03-10T14:08:33.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.247+0000 7f201b7fe700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f20040772b0 0x7f2004079760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:08:33.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.258+0000 7f201b7fe700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f20040772b0 0x7f2004079760 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f200c007400 tx=0x7f200c015040 comp rx=0 tx=0).ready entity=mgr.14704 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:08:33.298 INFO:tasks.workunit.client.0.vm03.stdout:1/549: symlink d0/d2/df/d16/lae 0 
2026-03-10T14:08:33.300 INFO:tasks.workunit.client.0.vm03.stdout:4/581: mkdir d5/d9/db/da7/db9 0 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: Active manager daemon vm03.rwbbep restarted 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: Activating manager daemon vm03.rwbbep 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: mgrmap e27: vm03.rwbbep(active, starting, since 0.0162317s) 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.rwbbep/crt"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.? 
192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 
192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:08:33.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.rwbbep/key"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: Standby manager daemon vm04.ywwcto started 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.104:0/1975973764' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/crt"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.104:0/1975973764' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.104:0/1975973764' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/key"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.? 
192.168.123.104:0/1975973764' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: Manager daemon vm03.rwbbep is now available 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:08:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:32 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:08:33.321 INFO:tasks.workunit.client.0.vm03.stdout:2/532: write d5/d10/d1f/d4f/d76/da7/d40/f4b [1680577,109558] 
0 2026-03-10T14:08:33.324 INFO:tasks.workunit.client.0.vm03.stdout:2/533: stat d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 0 2026-03-10T14:08:33.325 INFO:tasks.workunit.client.0.vm03.stdout:7/476: dwrite d5/d9/d14/d26/d5f/f8e [4194304,4194304] 0 2026-03-10T14:08:33.325 INFO:tasks.workunit.client.0.vm03.stdout:2/534: write d5/d10/d31/f7d [4531129,53825] 0 2026-03-10T14:08:33.381 INFO:tasks.workunit.client.0.vm03.stdout:9/571: fdatasync d2/d14/d2b/f2d 0 2026-03-10T14:08:33.400 INFO:tasks.workunit.client.0.vm03.stdout:1/550: rename d0/d2/df/d16/d20/f72 to d0/d2/df/d27/faf 0 2026-03-10T14:08:33.406 INFO:tasks.workunit.client.0.vm03.stdout:6/543: write d8/d11/f2a [3062165,80968] 0 2026-03-10T14:08:33.413 INFO:tasks.workunit.client.0.vm03.stdout:5/688: write d4/d13/d1f/d8c/da7/fab [771972,77916] 0 2026-03-10T14:08:33.424 INFO:tasks.workunit.client.0.vm03.stdout:2/535: dread d5/d10/f22 [0,4194304] 0 2026-03-10T14:08:33.447 INFO:tasks.workunit.client.0.vm03.stdout:3/600: write d1d/d29/d41/f4e [1036911,36406] 0 2026-03-10T14:08:33.452 INFO:tasks.workunit.client.0.vm03.stdout:3/601: read d1d/d29/f2e [3327343,35083] 0 2026-03-10T14:08:33.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.453+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f2008005190 con 0x7f201c131880 2026-03-10T14:08:33.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.458+0000 7f2018ff9700 1 -- 192.168.123.103:0/3948667953 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f2014019090 con 0x7f201c131880 2026-03-10T14:08:33.457 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_OK 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 >> 
[v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f20040772b0 msgr2=0x7f2004079760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f20040772b0 0x7f2004079760 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f200c007400 tx=0x7f200c015040 comp rx=0 tx=0).stop 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c131880 msgr2=0x7f201c07f500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c131880 0x7f201c07f500 secure :-1 s=READY pgs=345 cs=0 l=1 rev1=1 crypto rx=0x7f201400bf40 tx=0x7f2014009f50 comp rx=0 tx=0).stop 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 shutdown_connections 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f201c072360 0x7f201c131340 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:6800/2419158435,v1:192.168.123.103:6801/2419158435] conn(0x7f20040772b0 0x7f2004079760 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 --2- 192.168.123.103:0/3948667953 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f201c131880 0x7f201c07f500 unknown :-1 s=CLOSED pgs=345 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 >> 192.168.123.103:0/3948667953 conn(0x7f201c06d1a0 msgr2=0x7f201c0764d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 shutdown_connections 2026-03-10T14:08:33.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:08:33.462+0000 7f2022072700 1 -- 192.168.123.103:0/3948667953 wait complete. 2026-03-10T14:08:33.524 INFO:tasks.workunit.client.0.vm03.stdout:7/477: dwrite d5/d9/d14/d26/d39/f45 [0,4194304] 0 2026-03-10T14:08:33.525 INFO:tasks.workunit.client.0.vm03.stdout:7/478: write d5/d9/d35/f85 [634980,27731] 0 2026-03-10T14:08:33.554 INFO:tasks.workunit.client.0.vm03.stdout:8/607: getdents da/d36 0 2026-03-10T14:08:33.566 INFO:tasks.workunit.client.0.vm03.stdout:0/544: write d3/d16/d21/d9a/fab [821653,15076] 0 2026-03-10T14:08:33.568 INFO:tasks.workunit.client.0.vm03.stdout:4/582: rename d5/d9/db/f67 to d5/d47/d5b/fba 0 2026-03-10T14:08:33.571 INFO:tasks.workunit.client.0.vm03.stdout:1/551: read d0/f48 [1064301,71184] 0 2026-03-10T14:08:33.571 INFO:tasks.workunit.client.0.vm03.stdout:1/552: dread - d0/d2/df/d27/d7e/d81/fa3 zero size 2026-03-10T14:08:33.575 INFO:tasks.workunit.client.0.vm03.stdout:5/689: dread d4/d13/d43/f94 [0,4194304] 0 2026-03-10T14:08:33.577 INFO:tasks.workunit.client.0.vm03.stdout:5/690: dwrite d4/d16/f2d [0,4194304] 0 2026-03-10T14:08:33.580 INFO:tasks.workunit.client.0.vm03.stdout:5/691: dread - d4/d16/d19/d4a/fb5 zero size 2026-03-10T14:08:33.580 
INFO:tasks.workunit.client.0.vm03.stdout:6/544: write d8/db/d12/f5d [1197945,70427] 0 2026-03-10T14:08:33.581 INFO:tasks.workunit.client.0.vm03.stdout:6/545: chown d8/db/d49/d6c/f8e 0 1 2026-03-10T14:08:33.583 INFO:tasks.workunit.client.0.vm03.stdout:2/536: dread d5/fa [0,4194304] 0 2026-03-10T14:08:33.586 INFO:tasks.workunit.client.0.vm03.stdout:3/602: dread - d1d/d39/d51/f71 zero size 2026-03-10T14:08:33.588 INFO:tasks.workunit.client.0.vm03.stdout:3/603: chown d1d/d29/d41/d45/d55/c67 61 1 2026-03-10T14:08:33.590 INFO:tasks.workunit.client.0.vm03.stdout:5/692: dwrite d4/d16/d19/d4a/fb2 [0,4194304] 0 2026-03-10T14:08:33.591 INFO:tasks.workunit.client.0.vm03.stdout:5/693: chown d4/d13/d1f/c87 2592 1 2026-03-10T14:08:33.593 INFO:tasks.workunit.client.0.vm03.stdout:3/604: dread d1d/d29/d41/f4e [0,4194304] 0 2026-03-10T14:08:33.603 INFO:tasks.workunit.client.0.vm03.stdout:4/583: mknod d5/d9/db/da7/cbb 0 2026-03-10T14:08:33.608 INFO:tasks.workunit.client.0.vm03.stdout:0/545: dread d3/d46/f6c [0,4194304] 0 2026-03-10T14:08:33.612 INFO:tasks.workunit.client.0.vm03.stdout:6/546: symlink d8/db/d49/d58/la9 0 2026-03-10T14:08:33.613 INFO:tasks.workunit.client.0.vm03.stdout:2/537: creat d5/d10/d1f/d4f/d76/da7/d40/d59/faa x:0 0 0 2026-03-10T14:08:33.616 INFO:tasks.workunit.client.0.vm03.stdout:2/538: dwrite d5/d2a/f45 [4194304,4194304] 0 2026-03-10T14:08:33.684 INFO:tasks.workunit.client.0.vm03.stdout:7/479: dwrite d5/f66 [0,4194304] 0 2026-03-10T14:08:33.694 INFO:tasks.workunit.client.0.vm03.stdout:9/572: rename d2/d29/d33/d41/f57 to d2/d29/d33/d55/fb6 0 2026-03-10T14:08:33.694 INFO:tasks.workunit.client.0.vm03.stdout:9/573: stat d2/lb0 0 2026-03-10T14:08:33.711 INFO:tasks.workunit.client.0.vm03.stdout:3/605: chown c1a 14763595 1 2026-03-10T14:08:33.714 INFO:tasks.workunit.client.0.vm03.stdout:8/608: getdents da/d3c/d51/d75/dc2/db7 0 2026-03-10T14:08:33.715 INFO:tasks.workunit.client.0.vm03.stdout:8/609: write da/d3c/d4b/d4c/f95 [837895,107750] 0 2026-03-10T14:08:33.742 
INFO:tasks.workunit.client.0.vm03.stdout:4/584: truncate d5/d9/f16 943979 0 2026-03-10T14:08:33.766 INFO:tasks.workunit.client.0.vm03.stdout:6/547: fsync d8/d1b/d1c/f73 0 2026-03-10T14:08:33.791 INFO:tasks.workunit.client.0.vm03.stdout:5/694: rmdir d4/d40/d4e 39 2026-03-10T14:08:33.795 INFO:tasks.workunit.client.0.vm03.stdout:3/606: mkdir d1d/d33/d47/dac/dc1 0 2026-03-10T14:08:33.801 INFO:tasks.workunit.client.0.vm03.stdout:8/610: stat da/c14 0 2026-03-10T14:08:33.802 INFO:tasks.workunit.client.0.vm03.stdout:8/611: dread - da/d3c/d4b/d4c/fae zero size 2026-03-10T14:08:33.802 INFO:tasks.workunit.client.0.vm03.stdout:8/612: read - da/d58/da8/fc1 zero size 2026-03-10T14:08:33.831 INFO:tasks.workunit.client.0.vm03.stdout:6/548: dread d8/db/d49/f4a [0,4194304] 0 2026-03-10T14:08:33.833 INFO:tasks.workunit.client.0.vm03.stdout:2/539: mkdir d5/d10/da3/dab 0 2026-03-10T14:08:33.833 INFO:tasks.workunit.client.0.vm03.stdout:5/695: rmdir d4/d16/d19/d23 39 2026-03-10T14:08:33.833 INFO:tasks.workunit.client.0.vm03.stdout:3/607: mknod d1d/d33/d47/d53/cc2 0 2026-03-10T14:08:33.833 INFO:tasks.workunit.client.0.vm03.stdout:8/613: mknod da/d58/d5f/dbc/d70/cc3 0 2026-03-10T14:08:33.833 INFO:tasks.workunit.client.0.vm03.stdout:1/553: link d0/d2/df/d16/d20/fa9 d0/d2/df/fb0 0 2026-03-10T14:08:33.833 INFO:tasks.workunit.client.0.vm03.stdout:4/585: getdents d5/d9/db/d19/d38/d53/d71 0 2026-03-10T14:08:33.835 INFO:tasks.workunit.client.0.vm03.stdout:2/540: stat d5/d10/d1f/d4f/d76/da7/d40/d59/d7b 0 2026-03-10T14:08:33.835 INFO:tasks.workunit.client.0.vm03.stdout:1/554: write d0/d2/df/d27/f32 [4446735,35954] 0 2026-03-10T14:08:33.836 INFO:tasks.workunit.client.0.vm03.stdout:5/696: write d4/d13/d1f/d8c/f9c [3722036,27170] 0 2026-03-10T14:08:33.850 INFO:tasks.workunit.client.0.vm03.stdout:6/549: creat d8/db/d49/d6c/d32/faa x:0 0 0 2026-03-10T14:08:33.872 INFO:tasks.workunit.client.0.vm03.stdout:9/574: link d2/d29/f35 d2/d29/d9a/fb7 0 2026-03-10T14:08:33.875 
INFO:tasks.workunit.client.0.vm03.stdout:0/546: write d3/d46/f55 [225069,33534] 0 2026-03-10T14:08:33.882 INFO:tasks.workunit.client.0.vm03.stdout:3/608: creat d1d/d29/d41/d45/d5b/d7e/fc3 x:0 0 0 2026-03-10T14:08:33.936 INFO:tasks.workunit.client.0.vm03.stdout:5/697: fdatasync d4/d13/d43/f94 0 2026-03-10T14:08:33.942 INFO:tasks.workunit.client.0.vm03.stdout:9/575: dread d2/d14/f64 [0,4194304] 0 2026-03-10T14:08:33.944 INFO:tasks.workunit.client.0.vm03.stdout:7/480: creat d5/d9/d14/d26/d36/fa1 x:0 0 0 2026-03-10T14:08:33.945 INFO:tasks.workunit.client.0.vm03.stdout:7/481: chown d5/d9/d3e/c4f 6 1 2026-03-10T14:08:33.945 INFO:tasks.workunit.client.0.vm03.stdout:7/482: write d5/d9/d35/f69 [999357,130247] 0 2026-03-10T14:08:33.949 INFO:tasks.workunit.client.0.vm03.stdout:2/541: mkdir d5/dac 0 2026-03-10T14:08:33.954 INFO:tasks.workunit.client.0.vm03.stdout:2/542: chown d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/f91 16656456 1 2026-03-10T14:08:33.965 INFO:tasks.workunit.client.0.vm03.stdout:8/614: truncate da/d58/d6c/fa4 98771 0 2026-03-10T14:08:33.972 INFO:tasks.workunit.client.0.vm03.stdout:6/550: mknod d8/db/d49/d76/cab 0 2026-03-10T14:08:33.979 INFO:tasks.workunit.client.0.vm03.stdout:0/547: link d3/d16/f31 d3/d17/fb1 0 2026-03-10T14:08:33.979 INFO:tasks.workunit.client.0.vm03.stdout:8/615: creat da/d58/fc4 x:0 0 0 2026-03-10T14:08:33.979 INFO:tasks.workunit.client.0.vm03.stdout:2/543: read d5/d10/d1f/d4f/d76/da7/d50/d74/d83/f96 [307348,94684] 0 2026-03-10T14:08:33.980 INFO:tasks.workunit.client.0.vm03.stdout:4/586: link d5/d47/d5b/c5f d5/d9/db/da7/db9/cbc 0 2026-03-10T14:08:33.981 INFO:tasks.workunit.client.0.vm03.stdout:4/587: write d5/d9/db/ff [4429567,26862] 0 2026-03-10T14:08:33.983 INFO:tasks.workunit.client.0.vm03.stdout:1/555: getdents d0/d2/df/d16 0 2026-03-10T14:08:33.985 INFO:tasks.workunit.client.0.vm03.stdout:1/556: read - d0/d18/fa6 zero size 2026-03-10T14:08:33.991 INFO:tasks.workunit.client.0.vm03.stdout:3/609: getdents d1d/d39/da1 0 2026-03-10T14:08:33.994 
INFO:tasks.workunit.client.0.vm03.stdout:0/548: creat d3/d16/d21/d9a/fb2 x:0 0 0 2026-03-10T14:08:33.997 INFO:tasks.workunit.client.0.vm03.stdout:2/544: creat d5/d35/fad x:0 0 0 2026-03-10T14:08:33.997 INFO:tasks.workunit.client.0.vm03.stdout:4/588: mkdir d5/d9/db/d19/d38/d7b/dbd 0 2026-03-10T14:08:33.998 INFO:tasks.workunit.client.0.vm03.stdout:6/551: dwrite d8/d1b/f7f [0,4194304] 0 2026-03-10T14:08:34.001 INFO:tasks.workunit.client.0.vm03.stdout:6/552: read - d8/d3b/f9c zero size 2026-03-10T14:08:34.001 INFO:tasks.workunit.client.0.vm03.stdout:6/553: readlink d8/d11/d7a/l89 0 2026-03-10T14:08:34.002 INFO:tasks.workunit.client.0.vm03.stdout:5/698: link d4/d6/c3e d4/d35/cdb 0 2026-03-10T14:08:34.010 INFO:tasks.workunit.client.0.vm03.stdout:8/616: symlink da/d58/d6c/lc5 0 2026-03-10T14:08:34.015 INFO:tasks.workunit.client.0.vm03.stdout:8/617: write da/d58/d5f/dbc/d70/d7d/fa1 [887694,39223] 0 2026-03-10T14:08:34.015 INFO:tasks.workunit.client.0.vm03.stdout:3/610: creat d1d/d33/d47/d53/d68/fc4 x:0 0 0 2026-03-10T14:08:34.018 INFO:tasks.workunit.client.0.vm03.stdout:2/545: chown d5/d10/f60 9990 1 2026-03-10T14:08:34.024 INFO:tasks.workunit.client.0.vm03.stdout:4/589: mkdir d5/d47/d5b/dbe 0 2026-03-10T14:08:34.028 INFO:tasks.workunit.client.0.vm03.stdout:9/576: link d2/d14/f1a d2/d14/d2b/d79/fb8 0 2026-03-10T14:08:34.028 INFO:tasks.workunit.client.0.vm03.stdout:1/557: creat d0/d18/daa/fb1 x:0 0 0 2026-03-10T14:08:34.032 INFO:tasks.workunit.client.0.vm03.stdout:6/554: mknod d8/d11/d7a/cac 0 2026-03-10T14:08:34.034 INFO:tasks.workunit.client.0.vm03.stdout:8/618: mkdir da/d3c/d51/d75/dc2/dc6 0 2026-03-10T14:08:34.037 INFO:tasks.workunit.client.0.vm03.stdout:3/611: rename l13 to d1d/d33/d47/dac/lc5 0 2026-03-10T14:08:34.041 INFO:tasks.workunit.client.0.vm03.stdout:4/590: mknod d5/d9/db/d19/d99/cbf 0 2026-03-10T14:08:34.043 INFO:tasks.workunit.client.0.vm03.stdout:9/577: creat d2/d29/d33/d41/daa/fb9 x:0 0 0 2026-03-10T14:08:34.047 
INFO:tasks.workunit.client.0.vm03.stdout:1/558: symlink d0/d18/d3b/d50/lb2 0 2026-03-10T14:08:34.049 INFO:tasks.workunit.client.0.vm03.stdout:6/555: read - d8/db/d12/d51/f81 zero size 2026-03-10T14:08:34.050 INFO:tasks.workunit.client.0.vm03.stdout:1/559: dwrite d0/d2/df/d27/d7e/d81/f8d [0,4194304] 0 2026-03-10T14:08:34.053 INFO:tasks.workunit.client.0.vm03.stdout:6/556: write d8/db/d49/d6c/d83/f87 [347165,3430] 0 2026-03-10T14:08:34.065 INFO:tasks.workunit.client.0.vm03.stdout:5/699: mkdir d4/d6/ddc 0 2026-03-10T14:08:34.065 INFO:tasks.workunit.client.0.vm03.stdout:5/700: dread - d4/d40/da5/fd6 zero size 2026-03-10T14:08:34.066 INFO:tasks.workunit.client.0.vm03.stdout:5/701: write d4/d13/d1f/d8c/f9c [3690849,76906] 0 2026-03-10T14:08:34.080 INFO:tasks.workunit.client.0.vm03.stdout:7/483: write d5/d9/d14/d21/d28/f46 [380214,61964] 0 2026-03-10T14:08:34.083 INFO:tasks.workunit.client.0.vm03.stdout:2/546: mknod d5/d10/d1f/d4f/d76/da7/d54/d5f/cae 0 2026-03-10T14:08:34.085 INFO:tasks.workunit.client.0.vm03.stdout:4/591: mkdir d5/d9/db/da7/dc0 0 2026-03-10T14:08:34.085 INFO:tasks.workunit.client.0.vm03.stdout:4/592: stat d5/d47/d62/f80 0 2026-03-10T14:08:34.086 INFO:tasks.workunit.client.0.vm03.stdout:3/612: unlink d1d/d33/f92 0 2026-03-10T14:08:34.104 INFO:tasks.workunit.client.0.vm03.stdout:3/613: dread d1d/d33/f3a [0,4194304] 0 2026-03-10T14:08:34.104 INFO:tasks.workunit.client.0.vm03.stdout:3/614: readlink d1d/d29/d41/d45/d5b/d7e/l8d 0 2026-03-10T14:08:34.104 INFO:tasks.workunit.client.0.vm03.stdout:4/593: read d5/d9/db/d19/f51 [3773298,125239] 0 2026-03-10T14:08:34.105 INFO:tasks.workunit.client.0.vm03.stdout:4/594: stat d5/d9/db/d19/d34/f5d 0 2026-03-10T14:08:34.108 INFO:tasks.workunit.client.0.vm03.stdout:8/619: sync 2026-03-10T14:08:34.108 INFO:tasks.workunit.client.0.vm03.stdout:0/549: sync 2026-03-10T14:08:34.144 INFO:tasks.workunit.client.0.vm03.stdout:0/550: dread d3/d17/f56 [0,4194304] 0 2026-03-10T14:08:34.175 
INFO:tasks.workunit.client.0.vm03.stdout:9/578: unlink d2/d14/c28 0 2026-03-10T14:08:34.198 INFO:tasks.workunit.client.0.vm03.stdout:3/615: symlink d1d/d29/d41/d45/d55/lc6 0 2026-03-10T14:08:34.215 INFO:tasks.workunit.client.0.vm03.stdout:4/595: creat d5/d96/fc1 x:0 0 0 2026-03-10T14:08:34.221 INFO:tasks.workunit.client.0.vm03.stdout:4/596: dwrite d5/d47/d5b/d64/d85/f9d [0,4194304] 0 2026-03-10T14:08:34.225 INFO:tasks.workunit.client.0.vm03.stdout:4/597: stat d5/d9/db/d19/d38/d53/c7d 0 2026-03-10T14:08:34.233 INFO:tasks.workunit.client.0.vm03.stdout:6/557: readlink d8/db/d49/d76/l99 0 2026-03-10T14:08:34.254 INFO:tasks.workunit.client.0.vm03.stdout:1/560: rename d0/d2/l37 to d0/d2/df/d27/lb3 0 2026-03-10T14:08:34.278 INFO:tasks.workunit.client.0.vm03.stdout:5/702: mknod d4/d13/d1f/d8c/cdd 0 2026-03-10T14:08:34.300 INFO:tasks.workunit.client.0.vm03.stdout:2/547: mknod d5/d10/d1f/d4f/d76/da7/d54/d5f/caf 0 2026-03-10T14:08:34.300 INFO:tasks.workunit.client.0.vm03.stdout:0/551: rmdir d3/d4d/d47 39 2026-03-10T14:08:34.318 INFO:tasks.workunit.client.0.vm03.stdout:9/579: symlink d2/d29/d33/d55/d9c/lba 0 2026-03-10T14:08:34.333 INFO:tasks.workunit.client.0.vm03.stdout:4/598: mkdir d5/d6e/db6/dc2 0 2026-03-10T14:08:34.335 INFO:tasks.workunit.client.0.vm03.stdout:6/558: creat d8/d11/d7a/fad x:0 0 0 2026-03-10T14:08:34.345 INFO:tasks.workunit.client.0.vm03.stdout:5/703: fdatasync d4/d6/f63 0 2026-03-10T14:08:34.351 INFO:tasks.workunit.client.0.vm03.stdout:7/484: link d5/d9/f10 d5/d9/d14/d21/d6f/fa2 0 2026-03-10T14:08:34.367 INFO:tasks.workunit.client.0.vm03.stdout:2/548: dwrite d5/d10/d31/f3d [0,4194304] 0 2026-03-10T14:08:34.384 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:34 vm03.local ceph-mon[49718]: mgrmap e28: vm03.rwbbep(active, since 1.03609s), standbys: vm04.ywwcto 2026-03-10T14:08:34.384 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:34 vm03.local ceph-mon[49718]: from='client.24465 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:08:34.384 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:34 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm04.ywwcto", "id": "vm04.ywwcto"}]: dispatch 2026-03-10T14:08:34.384 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:34 vm03.local ceph-mon[49718]: pgmap v3: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail 2026-03-10T14:08:34.384 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:34 vm03.local ceph-mon[49718]: from='client.? 192.168.123.103:0/3948667953' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:08:34.448 INFO:tasks.workunit.client.0.vm03.stdout:3/616: mkdir d1d/d33/d47/dac/dba/dc7 0 2026-03-10T14:08:34.448 INFO:tasks.workunit.client.0.vm03.stdout:3/617: readlink l14 0 2026-03-10T14:08:34.450 INFO:tasks.workunit.client.0.vm03.stdout:4/599: creat d5/d47/d62/fc3 x:0 0 0 2026-03-10T14:08:34.451 INFO:tasks.workunit.client.0.vm03.stdout:4/600: chown d5/d47/d5b/d64/f82 41572 1 2026-03-10T14:08:34.456 INFO:tasks.workunit.client.0.vm03.stdout:4/601: dwrite d5/d9/db/d19/d99/fac [0,4194304] 0 2026-03-10T14:08:34.460 INFO:tasks.workunit.client.0.vm03.stdout:6/559: mknod d8/db/d12/cae 0 2026-03-10T14:08:34.471 INFO:tasks.workunit.client.0.vm03.stdout:6/560: dread d8/db/d12/d51/d5c/f68 [0,4194304] 0 2026-03-10T14:08:34.477 INFO:tasks.workunit.client.0.vm03.stdout:8/620: symlink da/d24/d78/lc7 0 2026-03-10T14:08:34.478 INFO:tasks.workunit.client.0.vm03.stdout:8/621: write da/d36/d4d/da5/fa9 [56038,130650] 0 2026-03-10T14:08:34.490 INFO:tasks.workunit.client.0.vm03.stdout:0/552: creat d3/d4d/d8f/fb3 x:0 0 0 2026-03-10T14:08:34.507 INFO:tasks.workunit.client.0.vm03.stdout:9/580: mknod d2/d14/cbb 0 2026-03-10T14:08:34.514 INFO:tasks.workunit.client.0.vm03.stdout:4/602: readlink d5/l6a 0 2026-03-10T14:08:34.518 
INFO:tasks.workunit.client.0.vm03.stdout:8/622: creat da/d58/d5f/d67/fc8 x:0 0 0 2026-03-10T14:08:34.519 INFO:tasks.workunit.client.0.vm03.stdout:8/623: write da/d3c/f48 [802249,49108] 0 2026-03-10T14:08:34.520 INFO:tasks.workunit.client.0.vm03.stdout:1/561: mknod d0/d2/df/d91/cb4 0 2026-03-10T14:08:34.530 INFO:tasks.workunit.client.0.vm03.stdout:5/704: creat d4/d16/d19/d23/d3f/fde x:0 0 0 2026-03-10T14:08:34.531 INFO:tasks.workunit.client.0.vm03.stdout:5/705: readlink d4/d6/de/l33 0 2026-03-10T14:08:34.531 INFO:tasks.workunit.client.0.vm03.stdout:5/706: stat d4/d13/d1f/c42 0 2026-03-10T14:08:34.539 INFO:tasks.workunit.client.0.vm03.stdout:0/553: dwrite d3/d4d/d47/f48 [0,4194304] 0 2026-03-10T14:08:34.539 INFO:tasks.workunit.client.0.vm03.stdout:0/554: chown d3/d11/d2c/l6e 1 1 2026-03-10T14:08:34.543 INFO:tasks.workunit.client.0.vm03.stdout:9/581: mkdir d2/d29/d33/d60/d8c/dbc 0 2026-03-10T14:08:34.547 INFO:tasks.workunit.client.0.vm03.stdout:3/618: mknod d1d/d29/cc8 0 2026-03-10T14:08:34.547 INFO:tasks.workunit.client.0.vm03.stdout:3/619: chown d1d/d39/d51/c5f 266729 1 2026-03-10T14:08:34.552 INFO:tasks.workunit.client.0.vm03.stdout:3/620: dwrite d1d/f4a [0,4194304] 0 2026-03-10T14:08:34.562 INFO:tasks.workunit.client.0.vm03.stdout:4/603: truncate d5/d9/db/d19/f3a 946868 0 2026-03-10T14:08:34.562 INFO:tasks.workunit.client.0.vm03.stdout:8/624: chown da/d58/l5c 94846767 1 2026-03-10T14:08:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:34 vm04.local ceph-mon[55966]: mgrmap e28: vm03.rwbbep(active, since 1.03609s), standbys: vm04.ywwcto 2026-03-10T14:08:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:34 vm04.local ceph-mon[55966]: from='client.24465 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:08:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:34 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' 
cmd=[{"prefix": "mgr metadata", "who": "vm04.ywwcto", "id": "vm04.ywwcto"}]: dispatch 2026-03-10T14:08:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:34 vm04.local ceph-mon[55966]: pgmap v3: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail 2026-03-10T14:08:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:34 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/3948667953' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:08:34.568 INFO:tasks.workunit.client.0.vm03.stdout:1/562: fsync d0/d2/df/dab/f6d 0 2026-03-10T14:08:34.574 INFO:tasks.workunit.client.0.vm03.stdout:7/485: creat d5/d9/d14/d21/fa3 x:0 0 0 2026-03-10T14:08:34.581 INFO:tasks.workunit.client.0.vm03.stdout:7/486: dread d5/d9/d14/d21/d6f/fa2 [4194304,4194304] 0 2026-03-10T14:08:34.581 INFO:tasks.workunit.client.0.vm03.stdout:7/487: stat d5/la0 0 2026-03-10T14:08:34.594 INFO:tasks.workunit.client.0.vm03.stdout:0/555: creat d3/d11/d2c/d4a/d4b/d89/fb4 x:0 0 0 2026-03-10T14:08:34.605 INFO:tasks.workunit.client.0.vm03.stdout:2/549: truncate d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 3444134 0 2026-03-10T14:08:34.605 INFO:tasks.workunit.client.0.vm03.stdout:7/488: dread d5/d9/f7a [0,4194304] 0 2026-03-10T14:08:34.605 INFO:tasks.workunit.client.0.vm03.stdout:9/582: stat d2/d14/c17 0 2026-03-10T14:08:34.607 INFO:tasks.workunit.client.0.vm03.stdout:3/621: sync 2026-03-10T14:08:34.625 INFO:tasks.workunit.client.0.vm03.stdout:4/604: rename d5/d47/d62/d87 to d5/d47/d5b/dbe/dc4 0 2026-03-10T14:08:34.625 INFO:tasks.workunit.client.0.vm03.stdout:6/561: truncate d8/d11/d18/f34 5422383 0 2026-03-10T14:08:34.631 INFO:tasks.workunit.client.0.vm03.stdout:5/707: mknod d4/d6/ddc/cdf 0 2026-03-10T14:08:34.631 INFO:tasks.workunit.client.0.vm03.stdout:4/605: sync 2026-03-10T14:08:34.658 INFO:tasks.workunit.client.0.vm03.stdout:5/708: dread d4/d13/fd0 [0,4194304] 0 2026-03-10T14:08:34.658 
INFO:tasks.workunit.client.0.vm03.stdout:0/556: dwrite d3/d4d/d30/f7a [0,4194304] 0 2026-03-10T14:08:34.661 INFO:tasks.workunit.client.0.vm03.stdout:5/709: chown d4/d40 108759647 1 2026-03-10T14:08:34.665 INFO:tasks.workunit.client.0.vm03.stdout:8/625: rename da/d3a/d44/f88 to da/d3c/d51/d85/fc9 0 2026-03-10T14:08:34.679 INFO:tasks.workunit.client.0.vm03.stdout:1/563: truncate d0/d2/df/d27/d7e/d81/fa7 227208 0 2026-03-10T14:08:34.680 INFO:tasks.workunit.client.0.vm03.stdout:9/583: truncate d2/f2f 1007097 0 2026-03-10T14:08:34.694 INFO:tasks.workunit.client.0.vm03.stdout:0/557: dwrite d3/f19 [4194304,4194304] 0 2026-03-10T14:08:34.711 INFO:tasks.workunit.client.0.vm03.stdout:0/558: mkdir d3/d11/d76/db5 0 2026-03-10T14:08:34.715 INFO:tasks.workunit.client.0.vm03.stdout:0/559: read d3/d46/f55 [176945,71576] 0 2026-03-10T14:08:34.723 INFO:tasks.workunit.client.0.vm03.stdout:5/710: unlink d4/d40/d4e/l64 0 2026-03-10T14:08:34.723 INFO:tasks.workunit.client.0.vm03.stdout:8/626: dwrite da/d3c/d51/d85/fc9 [0,4194304] 0 2026-03-10T14:08:34.724 INFO:tasks.workunit.client.0.vm03.stdout:8/627: readlink da/d3c/d4b/d69/l94 0 2026-03-10T14:08:34.725 INFO:tasks.workunit.client.0.vm03.stdout:0/560: dwrite d3/d16/f9e [0,4194304] 0 2026-03-10T14:08:34.740 INFO:tasks.workunit.client.0.vm03.stdout:4/606: truncate d5/d9/db/f20 7750154 0 2026-03-10T14:08:34.740 INFO:tasks.workunit.client.0.vm03.stdout:9/584: write d2/d29/d33/d60/d8c/f98 [1477349,65696] 0 2026-03-10T14:08:34.740 INFO:tasks.workunit.client.0.vm03.stdout:2/550: getdents d5/d10/d1f 0 2026-03-10T14:08:34.740 INFO:tasks.workunit.client.0.vm03.stdout:3/622: getdents d1d/d29/d41/d45/d5b 0 2026-03-10T14:08:34.745 INFO:tasks.workunit.client.0.vm03.stdout:9/585: chown d2/d29/d33/d41/daa/fb9 15372 1 2026-03-10T14:08:34.747 INFO:tasks.workunit.client.0.vm03.stdout:1/564: symlink d0/d2/d71/d90/lb5 0 2026-03-10T14:08:34.750 INFO:tasks.workunit.client.0.vm03.stdout:6/562: dread d8/db/df/f10 [0,4194304] 0 2026-03-10T14:08:34.759 
INFO:tasks.workunit.client.0.vm03.stdout:7/489: rename d5/d9/f42 to d5/d9/d14/d26/d36/fa4 0 2026-03-10T14:08:34.774 INFO:tasks.workunit.client.0.vm03.stdout:8/628: creat da/d58/d5f/d67/fca x:0 0 0 2026-03-10T14:08:34.776 INFO:tasks.workunit.client.0.vm03.stdout:0/561: mkdir d3/d11/d2c/d4a/d4b/d89/db6 0 2026-03-10T14:08:34.777 INFO:tasks.workunit.client.0.vm03.stdout:0/562: stat d3/d46/d5e/f7c 0 2026-03-10T14:08:34.779 INFO:tasks.workunit.client.0.vm03.stdout:0/563: write d3/d46/f8c [6386638,99808] 0 2026-03-10T14:08:34.795 INFO:tasks.workunit.client.0.vm03.stdout:2/551: mkdir d5/d10/d1f/d4f/d76/da7/d40/db0 0 2026-03-10T14:08:34.795 INFO:tasks.workunit.client.0.vm03.stdout:3/623: read d1d/d29/f87 [1275620,39970] 0 2026-03-10T14:08:34.799 INFO:tasks.workunit.client.0.vm03.stdout:9/586: mkdir d2/d29/da7/dbd 0 2026-03-10T14:08:34.816 INFO:tasks.workunit.client.0.vm03.stdout:5/711: creat d4/d16/da0/fe0 x:0 0 0 2026-03-10T14:08:34.817 INFO:tasks.workunit.client.0.vm03.stdout:8/629: creat da/d3c/d51/d85/fcb x:0 0 0 2026-03-10T14:08:34.832 INFO:tasks.workunit.client.0.vm03.stdout:6/563: rename d8/db/d49/d58/c75 to d8/d11/d18/d54/caf 0 2026-03-10T14:08:34.832 INFO:tasks.workunit.client.0.vm03.stdout:2/552: rename d5/d10/d1f to d5/d10/d1f/d4f/d76/da7/d40/d59/da2/db1 22 2026-03-10T14:08:34.833 INFO:tasks.workunit.client.0.vm03.stdout:2/553: write d5/d10/d1f/d4f/d76/fa1 [400763,99143] 0 2026-03-10T14:08:34.840 INFO:tasks.workunit.client.0.vm03.stdout:7/490: symlink d5/d9/la5 0 2026-03-10T14:08:34.841 INFO:tasks.workunit.client.0.vm03.stdout:1/565: mkdir d0/d2/db6 0 2026-03-10T14:08:34.847 INFO:tasks.workunit.client.0.vm03.stdout:0/564: mkdir d3/d11/d2c/db7 0 2026-03-10T14:08:34.848 INFO:tasks.workunit.client.0.vm03.stdout:0/565: write d3/d16/f34 [3582388,27545] 0 2026-03-10T14:08:34.848 INFO:tasks.workunit.client.0.vm03.stdout:0/566: read d3/d4d/d30/f7a [2889091,97568] 0 2026-03-10T14:08:34.853 INFO:tasks.workunit.client.0.vm03.stdout:7/491: dwrite d5/d9/d35/f69 [0,4194304] 0 
2026-03-10T14:08:34.853 INFO:tasks.workunit.client.0.vm03.stdout:0/567: dread - d3/d16/d21/d9a/fb2 zero size 2026-03-10T14:08:34.858 INFO:tasks.workunit.client.0.vm03.stdout:0/568: dread d3/d11/d66/f86 [0,4194304] 0 2026-03-10T14:08:34.868 INFO:tasks.workunit.client.0.vm03.stdout:7/492: dwrite d5/d9/d14/d26/d36/fa1 [0,4194304] 0 2026-03-10T14:08:34.871 INFO:tasks.workunit.client.0.vm03.stdout:5/712: mkdir d4/d13/d1f/de1 0 2026-03-10T14:08:34.871 INFO:tasks.workunit.client.0.vm03.stdout:5/713: truncate d4/d6/de/f89 5125223 0 2026-03-10T14:08:34.876 INFO:tasks.workunit.client.0.vm03.stdout:7/493: dread - d5/d9/d14/d21/d6f/f9d zero size 2026-03-10T14:08:34.881 INFO:tasks.workunit.client.0.vm03.stdout:0/569: dwrite d3/d4d/d47/f74 [0,4194304] 0 2026-03-10T14:08:34.904 INFO:tasks.workunit.client.0.vm03.stdout:3/624: mknod d1d/d29/d41/d45/db4/db6/cc9 0 2026-03-10T14:08:34.913 INFO:tasks.workunit.client.0.vm03.stdout:3/625: dwrite d1d/d33/f80 [0,4194304] 0 2026-03-10T14:08:34.924 INFO:tasks.workunit.client.0.vm03.stdout:3/626: dwrite d1d/d39/d51/d72/fab [0,4194304] 0 2026-03-10T14:08:34.936 INFO:tasks.workunit.client.0.vm03.stdout:6/564: unlink d8/db/d49/f4a 0 2026-03-10T14:08:34.941 INFO:tasks.workunit.client.0.vm03.stdout:2/554: creat d5/d35/fb2 x:0 0 0 2026-03-10T14:08:34.942 INFO:tasks.workunit.client.0.vm03.stdout:2/555: stat d5/d35/l6f 0 2026-03-10T14:08:34.948 INFO:tasks.workunit.client.0.vm03.stdout:1/566: fdatasync d0/d2/df/d16/d41/f68 0 2026-03-10T14:08:34.962 INFO:tasks.workunit.client.0.vm03.stdout:5/714: symlink d4/d16/d19/d23/db8/le2 0 2026-03-10T14:08:34.963 INFO:tasks.workunit.client.0.vm03.stdout:5/715: dread - d4/d13/d1f/d84/f99 zero size 2026-03-10T14:08:34.968 INFO:tasks.workunit.client.0.vm03.stdout:7/494: dwrite d5/d9/d14/d26/d36/d51/d7b/d83/f9e [0,4194304] 0 2026-03-10T14:08:34.973 INFO:tasks.workunit.client.0.vm03.stdout:4/607: link d5/d6e/lb0 d5/d47/d5b/lc5 0 2026-03-10T14:08:34.986 INFO:tasks.workunit.client.0.vm03.stdout:8/630: symlink 
da/d3c/d51/d75/dc2/dc6/lcc 0 2026-03-10T14:08:34.991 INFO:tasks.workunit.client.0.vm03.stdout:5/716: chown d4/d35/cdb 0 1 2026-03-10T14:08:34.993 INFO:tasks.workunit.client.0.vm03.stdout:2/556: chown d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 7152 1 2026-03-10T14:08:35.006 INFO:tasks.workunit.client.0.vm03.stdout:8/631: mknod da/d3c/d51/d8b/ccd 0 2026-03-10T14:08:35.008 INFO:tasks.workunit.client.0.vm03.stdout:3/627: link d1d/d33/d47/d53/d68/fc4 d1d/d29/d41/dc0/fca 0 2026-03-10T14:08:35.008 INFO:tasks.workunit.client.0.vm03.stdout:3/628: read - d1d/d29/d41/d45/d95/fa2 zero size 2026-03-10T14:08:35.009 INFO:tasks.workunit.client.0.vm03.stdout:3/629: write d1d/d29/d41/d45/d95/f9d [2878841,15569] 0 2026-03-10T14:08:35.012 INFO:tasks.workunit.client.0.vm03.stdout:5/717: creat d4/d16/da0/fe3 x:0 0 0 2026-03-10T14:08:35.015 INFO:tasks.workunit.client.0.vm03.stdout:7/495: mkdir d5/d9/da6 0 2026-03-10T14:08:35.018 INFO:tasks.workunit.client.0.vm03.stdout:0/570: creat d3/d16/d21/fb8 x:0 0 0 2026-03-10T14:08:35.023 INFO:tasks.workunit.client.0.vm03.stdout:6/565: sync 2026-03-10T14:08:35.036 INFO:tasks.workunit.client.0.vm03.stdout:8/632: dread da/f12 [0,4194304] 0 2026-03-10T14:08:35.036 INFO:tasks.workunit.client.0.vm03.stdout:6/566: dwrite d8/d11/d7a/fad [0,4194304] 0 2026-03-10T14:08:35.036 INFO:tasks.workunit.client.0.vm03.stdout:3/630: symlink d1d/d29/d41/d45/d55/lcb 0 2026-03-10T14:08:35.036 INFO:tasks.workunit.client.0.vm03.stdout:3/631: dwrite d1d/d39/d51/d72/d99/fbc [0,4194304] 0 2026-03-10T14:08:35.044 INFO:tasks.workunit.client.0.vm03.stdout:5/718: mknod d4/d13/d43/ce4 0 2026-03-10T14:08:35.044 INFO:tasks.workunit.client.0.vm03.stdout:5/719: read d4/d13/fd0 [3236773,98075] 0 2026-03-10T14:08:35.045 INFO:tasks.workunit.client.0.vm03.stdout:5/720: write d4/f93 [956426,45812] 0 2026-03-10T14:08:35.045 INFO:tasks.workunit.client.0.vm03.stdout:5/721: stat d4/d6/de/dd5 0 2026-03-10T14:08:35.045 INFO:tasks.workunit.client.0.vm03.stdout:5/722: readlink d4/l2a 0 
2026-03-10T14:08:35.046 INFO:tasks.workunit.client.0.vm03.stdout:5/723: readlink d4/d16/l41 0 2026-03-10T14:08:35.047 INFO:tasks.workunit.client.0.vm03.stdout:2/557: truncate d5/d35/f81 2109943 0 2026-03-10T14:08:35.049 INFO:tasks.workunit.client.0.vm03.stdout:0/571: mkdir d3/d11/d2c/d4a/d4b/d89/db9 0 2026-03-10T14:08:35.052 INFO:tasks.workunit.client.0.vm03.stdout:0/572: dread d3/d4d/d47/f5f [0,4194304] 0 2026-03-10T14:08:35.055 INFO:tasks.workunit.client.0.vm03.stdout:8/633: mkdir da/d3a/dce 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:3/632: mknod d1d/d33/d65/d48/ccc 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:5/724: creat d4/d13/d1f/fe5 x:0 0 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:5/725: readlink d4/d13/d1f/d8c/da7/lb4 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:2/558: rmdir d5/d2a/d8a 39 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:7/496: mkdir d5/d9/da7 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:8/634: mknod da/d36/d4d/da5/ccf 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:8/635: write da/d24/f28 [1130204,80573] 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:7/497: unlink d5/f47 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:5/726: mknod d4/d16/da0/ce6 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:6/567: link d8/d3b/l7e d8/db/d12/d51/d8c/lb0 0 2026-03-10T14:08:35.087 INFO:tasks.workunit.client.0.vm03.stdout:6/568: stat d8/db/d12/f40 0 2026-03-10T14:08:35.088 INFO:tasks.workunit.client.0.vm03.stdout:0/573: mknod d3/cba 0 2026-03-10T14:08:35.090 INFO:tasks.workunit.client.0.vm03.stdout:7/498: mkdir d5/d9/d35/da8 0 2026-03-10T14:08:35.090 INFO:tasks.workunit.client.0.vm03.stdout:3/633: rename d1d/d29/d41/d45/d55/l84 to d1d/d29/db2/lcd 0 2026-03-10T14:08:35.091 INFO:tasks.workunit.client.0.vm03.stdout:2/559: creat d5/d10/da3/dab/fb3 x:0 0 0 
2026-03-10T14:08:35.091 INFO:tasks.workunit.client.0.vm03.stdout:8/636: dread da/f33 [0,4194304] 0 2026-03-10T14:08:35.092 INFO:tasks.workunit.client.0.vm03.stdout:2/560: dread - d5/d10/d1f/fa5 zero size 2026-03-10T14:08:35.095 INFO:tasks.workunit.client.0.vm03.stdout:0/574: chown d3/f94 1088371 1 2026-03-10T14:08:35.095 INFO:tasks.workunit.client.0.vm03.stdout:5/727: link d4/d13/d43/f72 d4/d6/de/fe7 0 2026-03-10T14:08:35.095 INFO:tasks.workunit.client.0.vm03.stdout:5/728: dread - d4/fc7 zero size 2026-03-10T14:08:35.099 INFO:tasks.workunit.client.0.vm03.stdout:3/634: fsync d1d/d29/f44 0 2026-03-10T14:08:35.099 INFO:tasks.workunit.client.0.vm03.stdout:7/499: truncate d5/f6 7714813 0 2026-03-10T14:08:35.101 INFO:tasks.workunit.client.0.vm03.stdout:8/637: rename da/d58/d5f/d67/fca to da/d24/d49/fd0 0 2026-03-10T14:08:35.109 INFO:tasks.workunit.client.0.vm03.stdout:0/575: creat d3/d11/d2c/d4a/d4b/d89/fbb x:0 0 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:5/729: mkdir d4/d16/d19/d6e/d7f/dd1/de8 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:8/638: mkdir da/d58/da8/dd1 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:8/639: truncate da/d58/fc4 1032415 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:8/640: stat da/c19 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:8/641: readlink da/d58/l5c 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:0/576: fsync d3/d16/f34 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:7/500: symlink d5/d9/d14/d21/d28/la9 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:0/577: stat d3/d16/d21/fb8 0 2026-03-10T14:08:35.156 INFO:tasks.workunit.client.0.vm03.stdout:5/730: rename d4/f93 to d4/d13/d1f/de1/fe9 0 2026-03-10T14:08:35.157 INFO:tasks.workunit.client.0.vm03.stdout:8/642: mkdir da/d3c/d4b/dd2 0 2026-03-10T14:08:35.157 INFO:tasks.workunit.client.0.vm03.stdout:0/578: dwrite d3/d16/f3e 
[0,4194304] 0 2026-03-10T14:08:35.164 INFO:tasks.workunit.client.0.vm03.stdout:8/643: link da/d24/lac da/d3c/d51/d75/dc2/db7/ld3 0 2026-03-10T14:08:35.176 INFO:tasks.workunit.client.0.vm03.stdout:8/644: mkdir da/d3c/d4b/d4c/dba/dd4 0 2026-03-10T14:08:35.176 INFO:tasks.workunit.client.0.vm03.stdout:8/645: chown da/d3c/f48 207 1 2026-03-10T14:08:35.176 INFO:tasks.workunit.client.0.vm03.stdout:8/646: stat da/d24 0 2026-03-10T14:08:35.176 INFO:tasks.workunit.client.0.vm03.stdout:8/647: read da/f30 [3299,32669] 0 2026-03-10T14:08:35.176 INFO:tasks.workunit.client.0.vm03.stdout:8/648: chown da/d3c/d51/caf 2 1 2026-03-10T14:08:35.177 INFO:tasks.workunit.client.0.vm03.stdout:8/649: getdents da/d3c/d4b/d69 0 2026-03-10T14:08:35.178 INFO:tasks.workunit.client.0.vm03.stdout:8/650: write da/d36/f42 [3741075,14286] 0 2026-03-10T14:08:35.179 INFO:tasks.workunit.client.0.vm03.stdout:8/651: creat da/d3a/fd5 x:0 0 0 2026-03-10T14:08:35.186 INFO:tasks.workunit.client.0.vm03.stdout:8/652: mkdir da/d36/d4d/da5/dd6 0 2026-03-10T14:08:35.186 INFO:tasks.workunit.client.0.vm03.stdout:8/653: mkdir da/d24/d49/dab/dd7 0 2026-03-10T14:08:35.186 INFO:tasks.workunit.client.0.vm03.stdout:8/654: write da/d3c/f48 [3498374,15415] 0 2026-03-10T14:08:35.186 INFO:tasks.workunit.client.0.vm03.stdout:8/655: creat da/d58/d6c/fd8 x:0 0 0 2026-03-10T14:08:35.187 INFO:tasks.workunit.client.0.vm03.stdout:8/656: creat da/d3c/d51/fd9 x:0 0 0 2026-03-10T14:08:35.206 INFO:tasks.workunit.client.0.vm03.stdout:3/635: sync 2026-03-10T14:08:35.206 INFO:tasks.workunit.client.0.vm03.stdout:5/731: sync 2026-03-10T14:08:35.220 INFO:tasks.workunit.client.0.vm03.stdout:9/587: write d2/d14/f96 [137407,119810] 0 2026-03-10T14:08:35.235 INFO:tasks.workunit.client.0.vm03.stdout:9/588: link d2/d29/d33/d41/d95/cac d2/d14/d2b/d34/cbe 0 2026-03-10T14:08:35.258 INFO:tasks.workunit.client.0.vm03.stdout:1/567: write d0/d18/d3b/f3d [704775,44053] 0 2026-03-10T14:08:35.268 INFO:tasks.workunit.client.0.vm03.stdout:1/568: write d0/d2/f46 
[4790501,34356] 0 2026-03-10T14:08:35.272 INFO:tasks.workunit.client.0.vm03.stdout:4/608: write d5/d47/d5b/dbe/dc4/fae [652607,67078] 0 2026-03-10T14:08:35.307 INFO:tasks.workunit.client.0.vm03.stdout:3/636: getdents d1d/d33/d65/d48 0 2026-03-10T14:08:35.327 INFO:tasks.workunit.client.0.vm03.stdout:4/609: rename d5/d9/db/d19/d38/d53/l81 to d5/d9/db/da7/lc6 0 2026-03-10T14:08:35.329 INFO:tasks.workunit.client.0.vm03.stdout:3/637: mkdir d1d/d29/d41/d45/d95/dce 0 2026-03-10T14:08:35.330 INFO:tasks.workunit.client.0.vm03.stdout:3/638: readlink d1d/d33/l7d 0 2026-03-10T14:08:35.343 INFO:tasks.workunit.client.0.vm03.stdout:6/569: dwrite d8/db/d12/f7c [4194304,4194304] 0 2026-03-10T14:08:35.345 INFO:tasks.workunit.client.0.vm03.stdout:6/570: chown d8/db/d12/d51/d5c/f8d 81547792 1 2026-03-10T14:08:35.372 INFO:tasks.workunit.client.0.vm03.stdout:2/561: truncate d5/d10/d17/f18 439709 0 2026-03-10T14:08:35.374 INFO:tasks.workunit.client.0.vm03.stdout:2/562: read - d5/f9f zero size 2026-03-10T14:08:35.406 INFO:tasks.workunit.client.0.vm03.stdout:0/579: dwrite d3/d16/f2d [0,4194304] 0 2026-03-10T14:08:35.408 INFO:tasks.workunit.client.0.vm03.stdout:7/501: truncate d5/d9/d14/d26/d36/fa4 2421126 0 2026-03-10T14:08:35.468 INFO:tasks.workunit.client.0.vm03.stdout:8/657: write da/d3c/d51/d75/fb8 [953198,99431] 0 2026-03-10T14:08:35.477 INFO:tasks.workunit.client.0.vm03.stdout:5/732: truncate d4/d13/d8f/f91 3784125 0 2026-03-10T14:08:35.477 INFO:tasks.workunit.client.0.vm03.stdout:4/610: symlink d5/d6e/lc7 0 2026-03-10T14:08:35.479 INFO:tasks.workunit.client.0.vm03.stdout:3/639: mkdir d1d/d33/d47/d53/d68/dcf 0 2026-03-10T14:08:35.479 INFO:tasks.workunit.client.0.vm03.stdout:6/571: fdatasync d8/db/d12/d51/f81 0 2026-03-10T14:08:35.482 INFO:tasks.workunit.client.0.vm03.stdout:6/572: chown d8/db/f3c 22 1 2026-03-10T14:08:35.482 INFO:tasks.workunit.client.0.vm03.stdout:6/573: chown d8/db/d49/d6c/d32 25283407 1 2026-03-10T14:08:35.483 INFO:tasks.workunit.client.0.vm03.stdout:0/580: write 
d3/f19 [3222382,69997] 0 2026-03-10T14:08:35.488 INFO:tasks.workunit.client.0.vm03.stdout:9/589: dwrite d2/f2f [0,4194304] 0 2026-03-10T14:08:35.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:35 vm04.local ceph-mon[55966]: pgmap v4: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail 2026-03-10T14:08:35.505 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:35 vm04.local ceph-mon[55966]: mgrmap e29: vm03.rwbbep(active, since 2s), standbys: vm04.ywwcto 2026-03-10T14:08:35.512 INFO:tasks.workunit.client.0.vm03.stdout:8/658: creat da/d58/d5f/dbc/d70/d7d/fda x:0 0 0 2026-03-10T14:08:35.516 INFO:tasks.workunit.client.0.vm03.stdout:1/569: dwrite d0/d18/d3b/f4c [0,4194304] 0 2026-03-10T14:08:35.522 INFO:tasks.workunit.client.0.vm03.stdout:1/570: dread d0/d2/df/d16/f4f [0,4194304] 0 2026-03-10T14:08:35.527 INFO:tasks.workunit.client.0.vm03.stdout:5/733: mkdir d4/d6/de/dea 0 2026-03-10T14:08:35.531 INFO:tasks.workunit.client.0.vm03.stdout:5/734: dread d4/d16/d19/d4a/fb2 [0,4194304] 0 2026-03-10T14:08:35.532 INFO:tasks.workunit.client.0.vm03.stdout:5/735: chown d4/d16/d19/d23/db8 229272693 1 2026-03-10T14:08:35.542 INFO:tasks.workunit.client.0.vm03.stdout:3/640: mknod d1d/d29/d41/cd0 0 2026-03-10T14:08:35.561 INFO:tasks.workunit.client.0.vm03.stdout:0/581: mkdir d3/d11/d76/dbc 0 2026-03-10T14:08:35.562 INFO:tasks.workunit.client.0.vm03.stdout:1/571: mkdir d0/d18/d3b/db7 0 2026-03-10T14:08:35.579 INFO:tasks.workunit.client.0.vm03.stdout:2/563: rename d5/d10/d1f/d4f/d76/da7/d50 to d5/db4 0 2026-03-10T14:08:35.580 INFO:tasks.workunit.client.0.vm03.stdout:2/564: dread - d5/d35/fad zero size 2026-03-10T14:08:35.586 INFO:tasks.workunit.client.0.vm03.stdout:6/574: getdents d8/d3b/da7 0 2026-03-10T14:08:35.592 INFO:tasks.workunit.client.0.vm03.stdout:0/582: chown d3/d4d/d30/f97 487 1 2026-03-10T14:08:35.593 INFO:tasks.workunit.client.0.vm03.stdout:1/572: sync 2026-03-10T14:08:35.600 INFO:tasks.workunit.client.0.vm03.stdout:5/736: write 
d4/d13/d43/f94 [190298,85725] 0 2026-03-10T14:08:35.603 INFO:tasks.workunit.client.0.vm03.stdout:4/611: dwrite d5/fe [4194304,4194304] 0 2026-03-10T14:08:35.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:35 vm03.local ceph-mon[49718]: pgmap v4: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail 2026-03-10T14:08:35.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:35 vm03.local ceph-mon[49718]: mgrmap e29: vm03.rwbbep(active, since 2s), standbys: vm04.ywwcto 2026-03-10T14:08:35.623 INFO:tasks.workunit.client.0.vm03.stdout:3/641: dwrite d1d/d39/d51/f82 [4194304,4194304] 0 2026-03-10T14:08:35.626 INFO:tasks.workunit.client.0.vm03.stdout:8/659: rmdir da/d3c/d4b/d4c/dba/dd4 0 2026-03-10T14:08:35.628 INFO:tasks.workunit.client.0.vm03.stdout:7/502: write d5/f6 [3934451,23581] 0 2026-03-10T14:08:35.629 INFO:tasks.workunit.client.0.vm03.stdout:3/642: chown d1d/d33/d47/l50 25474 1 2026-03-10T14:08:35.648 INFO:tasks.workunit.client.0.vm03.stdout:5/737: dread d4/d13/d1f/f70 [0,4194304] 0 2026-03-10T14:08:35.650 INFO:tasks.workunit.client.0.vm03.stdout:1/573: dread d0/d2/f14 [0,4194304] 0 2026-03-10T14:08:35.651 INFO:tasks.workunit.client.0.vm03.stdout:1/574: truncate d0/d18/daa/fb1 714545 0 2026-03-10T14:08:35.652 INFO:tasks.workunit.client.0.vm03.stdout:1/575: dread - d0/d2/df/dab/f6d zero size 2026-03-10T14:08:35.653 INFO:tasks.workunit.client.0.vm03.stdout:1/576: stat d0/d2/df/d91 0 2026-03-10T14:08:35.659 INFO:tasks.workunit.client.0.vm03.stdout:2/565: unlink d5/d10/f16 0 2026-03-10T14:08:35.665 INFO:tasks.workunit.client.0.vm03.stdout:0/583: dwrite d3/d46/f63 [0,4194304] 0 2026-03-10T14:08:35.668 INFO:tasks.workunit.client.0.vm03.stdout:1/577: sync 2026-03-10T14:08:35.670 INFO:tasks.workunit.client.0.vm03.stdout:7/503: unlink d5/d9/f94 0 2026-03-10T14:08:35.671 INFO:tasks.workunit.client.0.vm03.stdout:7/504: stat d5/d9/d3e/d84/l96 0 2026-03-10T14:08:35.691 INFO:tasks.workunit.client.0.vm03.stdout:9/590: getdents d2/d29 0 
2026-03-10T14:08:35.699 INFO:tasks.workunit.client.0.vm03.stdout:5/738: creat d4/d40/da5/feb x:0 0 0 2026-03-10T14:08:35.702 INFO:tasks.workunit.client.0.vm03.stdout:2/566: mknod d5/d10/d1f/d4f/d76/da7/d40/d92/cb5 0 2026-03-10T14:08:35.705 INFO:tasks.workunit.client.0.vm03.stdout:0/584: rename d3/d46/d5e/c67 to d3/d11/d76/cbd 0 2026-03-10T14:08:35.705 INFO:tasks.workunit.client.0.vm03.stdout:9/591: dwrite d2/d14/d2b/d43/f45 [0,4194304] 0 2026-03-10T14:08:35.714 INFO:tasks.workunit.client.0.vm03.stdout:9/592: chown d2/d14/d2b/d43/f56 2 1 2026-03-10T14:08:35.736 INFO:tasks.workunit.client.0.vm03.stdout:4/612: dwrite d5/f3c [0,4194304] 0 2026-03-10T14:08:35.737 INFO:tasks.workunit.client.0.vm03.stdout:1/578: dread d0/d2/df/f6c [0,4194304] 0 2026-03-10T14:08:35.741 INFO:tasks.workunit.client.0.vm03.stdout:3/643: mknod d1d/d33/d65/d5d/dae/db9/cd1 0 2026-03-10T14:08:35.742 INFO:tasks.workunit.client.0.vm03.stdout:6/575: getdents d8/db/d49/d6c/d83 0 2026-03-10T14:08:35.744 INFO:tasks.workunit.client.0.vm03.stdout:3/644: stat d1d/d29/c46 0 2026-03-10T14:08:35.753 INFO:tasks.workunit.client.0.vm03.stdout:3/645: write d1d/d29/d41/d45/d95/fa2 [901891,31649] 0 2026-03-10T14:08:35.756 INFO:tasks.workunit.client.0.vm03.stdout:5/739: rename d4/d13/d1f/l5f to d4/d6/de/lec 0 2026-03-10T14:08:35.757 INFO:tasks.workunit.client.0.vm03.stdout:4/613: symlink d5/d9/db/da7/lc8 0 2026-03-10T14:08:35.762 INFO:tasks.workunit.client.0.vm03.stdout:1/579: dwrite d0/d2/df/d27/d7e/d81/f8d [0,4194304] 0 2026-03-10T14:08:35.786 INFO:tasks.workunit.client.0.vm03.stdout:8/660: getdents da/d3c 0 2026-03-10T14:08:35.791 INFO:tasks.workunit.client.0.vm03.stdout:0/585: creat d3/d4d/da0/fbe x:0 0 0 2026-03-10T14:08:35.791 INFO:tasks.workunit.client.0.vm03.stdout:1/580: creat d0/d2/df/d16/d20/fb8 x:0 0 0 2026-03-10T14:08:35.792 INFO:tasks.workunit.client.0.vm03.stdout:6/576: symlink d8/d11/da0/lb1 0 2026-03-10T14:08:35.796 INFO:tasks.workunit.client.0.vm03.stdout:4/614: mknod d5/d47/d5b/dab/cc9 0 
2026-03-10T14:08:35.801 INFO:tasks.workunit.client.0.vm03.stdout:8/661: dwrite da/d58/da8/fc1 [0,4194304] 0 2026-03-10T14:08:35.805 INFO:tasks.workunit.client.0.vm03.stdout:9/593: rmdir d2/d29/d33/d60/d8c/dbc 0 2026-03-10T14:08:35.805 INFO:tasks.workunit.client.0.vm03.stdout:3/646: symlink d1d/d29/d41/d45/d95/dce/ld2 0 2026-03-10T14:08:35.808 INFO:tasks.workunit.client.0.vm03.stdout:5/740: truncate d4/f9f 1605340 0 2026-03-10T14:08:35.808 INFO:tasks.workunit.client.0.vm03.stdout:6/577: write d8/d11/d18/d54/f5b [263612,60636] 0 2026-03-10T14:08:35.810 INFO:tasks.workunit.client.0.vm03.stdout:6/578: dread - d8/db/d49/d6c/d32/faa zero size 2026-03-10T14:08:35.820 INFO:tasks.workunit.client.0.vm03.stdout:7/505: dread d5/d9/f1f [0,4194304] 0 2026-03-10T14:08:35.821 INFO:tasks.workunit.client.0.vm03.stdout:4/615: creat d5/d47/d5b/dbe/dc4/fca x:0 0 0 2026-03-10T14:08:35.821 INFO:tasks.workunit.client.0.vm03.stdout:8/662: creat da/d58/d5f/fdb x:0 0 0 2026-03-10T14:08:35.822 INFO:tasks.workunit.client.0.vm03.stdout:2/567: rename d5/d10/d31/f7d to d5/d10/d17/fb6 0 2026-03-10T14:08:35.823 INFO:tasks.workunit.client.0.vm03.stdout:3/647: dread d1d/d39/d51/f82 [4194304,4194304] 0 2026-03-10T14:08:35.826 INFO:tasks.workunit.client.0.vm03.stdout:3/648: dread - d1d/d29/d41/d45/d5b/fb8 zero size 2026-03-10T14:08:35.835 INFO:tasks.workunit.client.0.vm03.stdout:5/741: rmdir d4/d13/d43 39 2026-03-10T14:08:35.835 INFO:tasks.workunit.client.0.vm03.stdout:9/594: creat d2/d29/d38/fbf x:0 0 0 2026-03-10T14:08:35.838 INFO:tasks.workunit.client.0.vm03.stdout:8/663: rmdir da/d36/d4d 39 2026-03-10T14:08:35.838 INFO:tasks.workunit.client.0.vm03.stdout:0/586: creat d3/fbf x:0 0 0 2026-03-10T14:08:35.861 INFO:tasks.workunit.client.0.vm03.stdout:6/579: dread d8/d1b/d1c/f50 [0,4194304] 0 2026-03-10T14:08:35.861 INFO:tasks.workunit.client.0.vm03.stdout:8/664: dread da/d3c/d51/d75/dc2/f7b [0,4194304] 0 2026-03-10T14:08:35.877 INFO:tasks.workunit.client.0.vm03.stdout:4/616: write d5/d9/db/f28 
[1468624,123808] 0 2026-03-10T14:08:35.882 INFO:tasks.workunit.client.0.vm03.stdout:9/595: mkdir d2/d14/d2b/d43/dc0 0 2026-03-10T14:08:35.893 INFO:tasks.workunit.client.0.vm03.stdout:6/580: mknod d8/db/d12/d64/cb2 0 2026-03-10T14:08:35.893 INFO:tasks.workunit.client.0.vm03.stdout:2/568: write d5/d10/d1f/d4f/d76/da7/d40/f52 [778034,8543] 0 2026-03-10T14:08:35.899 INFO:tasks.workunit.client.0.vm03.stdout:1/581: link d0/d2/df/d27/f99 d0/d2/fb9 0 2026-03-10T14:08:35.899 INFO:tasks.workunit.client.0.vm03.stdout:0/587: dwrite d3/f9 [0,4194304] 0 2026-03-10T14:08:35.930 INFO:tasks.workunit.client.0.vm03.stdout:7/506: rename d5/d9/d14/d21/d28/f46 to d5/d9/d14/d21/d6f/faa 0 2026-03-10T14:08:35.931 INFO:tasks.workunit.client.0.vm03.stdout:4/617: mknod d5/d9/db/d19/d38/d7b/ccb 0 2026-03-10T14:08:35.937 INFO:tasks.workunit.client.0.vm03.stdout:0/588: unlink d3/d11/d2c/c1d 0 2026-03-10T14:08:35.940 INFO:tasks.workunit.client.0.vm03.stdout:8/665: dread f5 [0,4194304] 0 2026-03-10T14:08:35.946 INFO:tasks.workunit.client.0.vm03.stdout:3/649: rename d1d/d33/d47/c9a to d1d/d29/d41/cd3 0 2026-03-10T14:08:35.947 INFO:tasks.workunit.client.0.vm03.stdout:3/650: write d1d/d29/d41/d45/d5b/fb5 [230915,82868] 0 2026-03-10T14:08:35.949 INFO:tasks.workunit.client.0.vm03.stdout:3/651: chown d1d/d33/f80 796469 1 2026-03-10T14:08:35.952 INFO:tasks.workunit.client.0.vm03.stdout:1/582: dread d0/d2/df/f31 [0,4194304] 0 2026-03-10T14:08:35.959 INFO:tasks.workunit.client.0.vm03.stdout:8/666: write da/d3c/d51/d85/fcb [954390,98863] 0 2026-03-10T14:08:35.959 INFO:tasks.workunit.client.0.vm03.stdout:2/569: dwrite d5/d10/d1f/f3e [0,4194304] 0 2026-03-10T14:08:35.959 INFO:tasks.workunit.client.0.vm03.stdout:3/652: mkdir d1d/d29/d41/d45/d55/d6e/d83/d91/dd4 0 2026-03-10T14:08:35.963 INFO:tasks.workunit.client.0.vm03.stdout:7/507: sync 2026-03-10T14:08:35.963 INFO:tasks.workunit.client.0.vm03.stdout:2/570: dread - d5/d10/fa4 zero size 2026-03-10T14:08:35.969 INFO:tasks.workunit.client.0.vm03.stdout:4/618: 
dread d5/f7 [0,4194304] 0 2026-03-10T14:08:35.977 INFO:tasks.workunit.client.0.vm03.stdout:5/742: dwrite d4/d40/dbc/fbf [0,4194304] 0 2026-03-10T14:08:35.983 INFO:tasks.workunit.client.0.vm03.stdout:4/619: dread - d5/db4/fb5 zero size 2026-03-10T14:08:35.992 INFO:tasks.workunit.client.0.vm03.stdout:2/571: dwrite d5/d10/da3/dab/fb3 [0,4194304] 0 2026-03-10T14:08:35.992 INFO:tasks.workunit.client.0.vm03.stdout:0/589: dwrite d3/d16/f34 [0,4194304] 0 2026-03-10T14:08:35.993 INFO:tasks.workunit.client.0.vm03.stdout:3/653: mkdir d1d/d29/d41/d45/d55/d6e/d83/d91/dd5 0 2026-03-10T14:08:36.008 INFO:tasks.workunit.client.0.vm03.stdout:6/581: truncate d8/d11/d7a/fad 3095166 0 2026-03-10T14:08:36.014 INFO:tasks.workunit.client.0.vm03.stdout:1/583: write d0/f48 [2297859,94220] 0 2026-03-10T14:08:36.023 INFO:tasks.workunit.client.0.vm03.stdout:2/572: dwrite d5/d10/da3/dab/fb3 [0,4194304] 0 2026-03-10T14:08:36.025 INFO:tasks.workunit.client.0.vm03.stdout:9/596: rename d2/d29/d33/d55/l7d to d2/d14/d2b/lc1 0 2026-03-10T14:08:36.026 INFO:tasks.workunit.client.0.vm03.stdout:4/620: mkdir d5/d9/db/da7/dcc 0 2026-03-10T14:08:36.026 INFO:tasks.workunit.client.0.vm03.stdout:0/590: truncate d3/d16/d21/f78 866171 0 2026-03-10T14:08:36.026 INFO:tasks.workunit.client.0.vm03.stdout:8/667: creat da/d24/d49/dab/dd7/fdc x:0 0 0 2026-03-10T14:08:36.028 INFO:tasks.workunit.client.0.vm03.stdout:8/668: write da/d3a/fd5 [699043,87970] 0 2026-03-10T14:08:36.039 INFO:tasks.workunit.client.0.vm03.stdout:5/743: mknod d4/d35/ced 0 2026-03-10T14:08:36.041 INFO:tasks.workunit.client.0.vm03.stdout:9/597: dread d2/d14/f64 [0,4194304] 0 2026-03-10T14:08:36.041 INFO:tasks.workunit.client.0.vm03.stdout:1/584: rename d0/d2/df/d91/dad to d0/d2/df/d16/d41/dba 0 2026-03-10T14:08:36.041 INFO:tasks.workunit.client.0.vm03.stdout:5/744: chown d4/d6/fa 219 1 2026-03-10T14:08:36.041 INFO:tasks.workunit.client.0.vm03.stdout:4/621: creat d5/d47/d5b/dbe/fcd x:0 0 0 2026-03-10T14:08:36.047 
INFO:tasks.workunit.client.0.vm03.stdout:0/591: read d3/d17/f56 [255226,106087] 0 2026-03-10T14:08:36.055 INFO:tasks.workunit.client.0.vm03.stdout:6/582: read d8/db/d49/d6c/d32/f3e [414170,98798] 0 2026-03-10T14:08:36.055 INFO:tasks.workunit.client.0.vm03.stdout:4/622: unlink d5/d9/db/d19/d34/la5 0 2026-03-10T14:08:36.055 INFO:tasks.workunit.client.0.vm03.stdout:5/745: creat d4/d16/d19/d23/fee x:0 0 0 2026-03-10T14:08:36.059 INFO:tasks.workunit.client.0.vm03.stdout:2/573: symlink d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/lb7 0 2026-03-10T14:08:36.062 INFO:tasks.workunit.client.0.vm03.stdout:1/585: mknod d0/d2/df/d27/d7e/d81/cbb 0 2026-03-10T14:08:36.063 INFO:tasks.workunit.client.0.vm03.stdout:0/592: creat d3/d16/d21/d3c/fc0 x:0 0 0 2026-03-10T14:08:36.068 INFO:tasks.workunit.client.0.vm03.stdout:3/654: dread d1d/d29/d41/d45/d55/f63 [0,4194304] 0 2026-03-10T14:08:36.068 INFO:tasks.workunit.client.0.vm03.stdout:7/508: dwrite d5/d9/d14/f41 [0,4194304] 0 2026-03-10T14:08:36.071 INFO:tasks.workunit.client.0.vm03.stdout:6/583: rename d8/db/d12/d51/f81 to d8/db/df/fb3 0 2026-03-10T14:08:36.074 INFO:tasks.workunit.client.0.vm03.stdout:1/586: chown d0/d2/df/d16/d20/c9c 153 1 2026-03-10T14:08:36.080 INFO:tasks.workunit.client.0.vm03.stdout:8/669: dread da/f2e [0,4194304] 0 2026-03-10T14:08:36.084 INFO:tasks.workunit.client.0.vm03.stdout:5/746: dread d4/d13/d43/f94 [0,4194304] 0 2026-03-10T14:08:36.094 INFO:tasks.workunit.client.0.vm03.stdout:4/623: rename d5/d47/l69 to d5/d9/db/d19/d38/d53/d55/lce 0 2026-03-10T14:08:36.094 INFO:tasks.workunit.client.0.vm03.stdout:6/584: creat d8/d11/d18/d79/fb4 x:0 0 0 2026-03-10T14:08:36.095 INFO:tasks.workunit.client.0.vm03.stdout:7/509: chown d5/d9/d14/d26/f38 6 1 2026-03-10T14:08:36.099 INFO:tasks.workunit.client.0.vm03.stdout:3/655: mknod d1d/d29/d41/d45/d55/dbf/cd6 0 2026-03-10T14:08:36.108 INFO:tasks.workunit.client.0.vm03.stdout:8/670: mkdir da/ddd 0 2026-03-10T14:08:36.108 INFO:tasks.workunit.client.0.vm03.stdout:3/656: truncate 
d1d/d29/d41/d45/d95/faa 1649830 0 2026-03-10T14:08:36.117 INFO:tasks.workunit.client.0.vm03.stdout:6/585: mkdir d8/db/d49/d76/db5 0 2026-03-10T14:08:36.119 INFO:tasks.workunit.client.0.vm03.stdout:6/586: dread - d8/db/d12/f72 zero size 2026-03-10T14:08:36.128 INFO:tasks.workunit.client.0.vm03.stdout:7/510: sync 2026-03-10T14:08:36.128 INFO:tasks.workunit.client.0.vm03.stdout:8/671: sync 2026-03-10T14:08:36.129 INFO:tasks.workunit.client.0.vm03.stdout:8/672: stat da/f2e 0 2026-03-10T14:08:36.130 INFO:tasks.workunit.client.0.vm03.stdout:6/587: dwrite d8/db/d12/f7c [0,4194304] 0 2026-03-10T14:08:36.134 INFO:tasks.workunit.client.0.vm03.stdout:3/657: dwrite d1d/d29/d41/d45/d5b/fb5 [0,4194304] 0 2026-03-10T14:08:36.136 INFO:tasks.workunit.client.0.vm03.stdout:8/673: write da/d58/d5f/d67/fc8 [31873,22362] 0 2026-03-10T14:08:36.159 INFO:tasks.workunit.client.0.vm03.stdout:2/574: rmdir d5/d10/d1f/d4f 39 2026-03-10T14:08:36.168 INFO:tasks.workunit.client.0.vm03.stdout:0/593: rename d3/f19 to d3/d11/d66/da2/fc1 0 2026-03-10T14:08:36.174 INFO:tasks.workunit.client.0.vm03.stdout:1/587: getdents d0/d2 0 2026-03-10T14:08:36.176 INFO:tasks.workunit.client.0.vm03.stdout:9/598: dwrite d2/d14/d2b/d43/fa2 [0,4194304] 0 2026-03-10T14:08:36.178 INFO:tasks.workunit.client.0.vm03.stdout:1/588: readlink d0/d2/df/d27/d7e/l89 0 2026-03-10T14:08:36.190 INFO:tasks.workunit.client.0.vm03.stdout:6/588: mkdir d8/db/d12/d51/d8c/db6 0 2026-03-10T14:08:36.191 INFO:tasks.workunit.client.0.vm03.stdout:4/624: dwrite d5/d9/db/f20 [4194304,4194304] 0 2026-03-10T14:08:36.192 INFO:tasks.workunit.client.0.vm03.stdout:4/625: readlink d5/l6a 0 2026-03-10T14:08:36.192 INFO:tasks.workunit.client.0.vm03.stdout:8/674: readlink da/lc 0 2026-03-10T14:08:36.193 INFO:tasks.workunit.client.0.vm03.stdout:5/747: dread d4/d16/d19/d23/db8/fba [0,4194304] 0 2026-03-10T14:08:36.203 INFO:tasks.workunit.client.0.vm03.stdout:9/599: rename d2/d14/d2b/d79/d81/la1 to d2/d29/d33/d41/lc2 0 2026-03-10T14:08:36.210 
INFO:tasks.workunit.client.0.vm03.stdout:4/626: mknod d5/d9/db/da7/ccf 0 2026-03-10T14:08:36.210 INFO:tasks.workunit.client.0.vm03.stdout:4/627: chown d5/d47/l40 30205157 1 2026-03-10T14:08:36.217 INFO:tasks.workunit.client.0.vm03.stdout:8/675: truncate da/fd 3254527 0 2026-03-10T14:08:36.218 INFO:tasks.workunit.client.0.vm03.stdout:2/575: truncate d5/f9 2905174 0 2026-03-10T14:08:36.222 INFO:tasks.workunit.client.0.vm03.stdout:7/511: rmdir d5/d9/da6 0 2026-03-10T14:08:36.229 INFO:tasks.workunit.client.0.vm03.stdout:9/600: rename d2/d14/d2b/d43/f56 to d2/d29/d33/d6d/d87/fc3 0 2026-03-10T14:08:36.230 INFO:tasks.workunit.client.0.vm03.stdout:4/628: chown d5/d47/d5b/c5f 171161119 1 2026-03-10T14:08:36.232 INFO:tasks.workunit.client.0.vm03.stdout:3/658: getdents d1d/d29/d41/d45/d5b 0 2026-03-10T14:08:36.267 INFO:tasks.workunit.client.0.vm03.stdout:5/748: dwrite d4/d16/d19/d4a/fbd [4194304,4194304] 0 2026-03-10T14:08:36.276 INFO:tasks.workunit.client.0.vm03.stdout:3/659: dread d1d/d29/f9b [0,4194304] 0 2026-03-10T14:08:36.278 INFO:tasks.workunit.client.0.vm03.stdout:6/589: creat d8/db/d12/d51/d8c/db6/fb7 x:0 0 0 2026-03-10T14:08:36.291 INFO:tasks.workunit.client.0.vm03.stdout:8/676: dread da/d24/f52 [0,4194304] 0 2026-03-10T14:08:36.294 INFO:tasks.workunit.client.0.vm03.stdout:7/512: mkdir d5/d9/d35/dab 0 2026-03-10T14:08:36.295 INFO:tasks.workunit.client.0.vm03.stdout:7/513: readlink d5/d9/d14/d21/d28/la9 0 2026-03-10T14:08:36.296 INFO:tasks.workunit.client.0.vm03.stdout:4/629: mkdir d5/d47/d62/d8a/dd0 0 2026-03-10T14:08:36.296 INFO:tasks.workunit.client.0.vm03.stdout:4/630: dread - d5/db4/fb5 zero size 2026-03-10T14:08:36.297 INFO:tasks.workunit.client.0.vm03.stdout:4/631: chown d5/d9/db/d19/d38/d53/d55/fa8 20290773 1 2026-03-10T14:08:36.298 INFO:tasks.workunit.client.0.vm03.stdout:4/632: write d5/d9/db/f3d [943025,40514] 0 2026-03-10T14:08:36.319 INFO:tasks.workunit.client.0.vm03.stdout:2/576: symlink d5/db4/d74/d83/lb8 0 2026-03-10T14:08:36.319 
INFO:tasks.workunit.client.0.vm03.stdout:2/577: dread - d5/d35/f9b zero size 2026-03-10T14:08:36.324 INFO:tasks.workunit.client.0.vm03.stdout:0/594: getdents d3/d4d/d47 0 2026-03-10T14:08:36.331 INFO:tasks.workunit.client.0.vm03.stdout:5/749: dwrite d4/d6/de/fe7 [0,4194304] 0 2026-03-10T14:08:36.334 INFO:tasks.workunit.client.0.vm03.stdout:1/589: getdents d0/d2/df/dab 0 2026-03-10T14:08:36.336 INFO:tasks.workunit.client.0.vm03.stdout:4/633: creat d5/d9/d2b/fd1 x:0 0 0 2026-03-10T14:08:36.339 INFO:tasks.workunit.client.0.vm03.stdout:3/660: mknod d1d/d29/d41/d45/cd7 0 2026-03-10T14:08:36.343 INFO:tasks.workunit.client.0.vm03.stdout:6/590: mknod d8/d3b/da7/cb8 0 2026-03-10T14:08:36.344 INFO:tasks.workunit.client.0.vm03.stdout:2/578: truncate d5/db4/d74/d83/f96 686860 0 2026-03-10T14:08:36.361 INFO:tasks.workunit.client.0.vm03.stdout:0/595: creat d3/d46/dac/fc2 x:0 0 0 2026-03-10T14:08:36.364 INFO:tasks.workunit.client.0.vm03.stdout:8/677: mknod da/d3c/d51/d85/dbb/cde 0 2026-03-10T14:08:36.365 INFO:tasks.workunit.client.0.vm03.stdout:5/750: stat d4/d40/f96 0 2026-03-10T14:08:36.371 INFO:tasks.workunit.client.0.vm03.stdout:1/590: write d0/d2/df/d27/d7e/fa0 [791955,25748] 0 2026-03-10T14:08:36.371 INFO:tasks.workunit.client.0.vm03.stdout:7/514: mknod d5/d9/d35/dab/cac 0 2026-03-10T14:08:36.371 INFO:tasks.workunit.client.0.vm03.stdout:4/634: mkdir d5/db4/dd2 0 2026-03-10T14:08:36.374 INFO:tasks.workunit.client.0.vm03.stdout:3/661: dwrite d1d/d59/f69 [0,4194304] 0 2026-03-10T14:08:36.375 INFO:tasks.workunit.client.0.vm03.stdout:6/591: rename d8/db/d12/d64/cb2 to d8/db/d49/d58/cb9 0 2026-03-10T14:08:36.393 INFO:tasks.workunit.client.0.vm03.stdout:5/751: write d4/d13/d1f/d84/fc9 [457182,8919] 0 2026-03-10T14:08:36.394 INFO:tasks.workunit.client.0.vm03.stdout:5/752: chown d4/f37 26843191 1 2026-03-10T14:08:36.394 INFO:tasks.workunit.client.0.vm03.stdout:9/601: getdents d2/d29/d33/d41/d95 0 2026-03-10T14:08:36.400 INFO:tasks.workunit.client.0.vm03.stdout:4/635: mknod 
d5/d96/cd3 0 2026-03-10T14:08:36.403 INFO:tasks.workunit.client.0.vm03.stdout:7/515: mknod d5/d9/d14/d26/d5f/cad 0 2026-03-10T14:08:36.403 INFO:tasks.workunit.client.0.vm03.stdout:7/516: read - d5/d9/d14/d26/f64 zero size 2026-03-10T14:08:36.406 INFO:tasks.workunit.client.0.vm03.stdout:3/662: creat d1d/d33/d65/d5d/dae/db9/fd8 x:0 0 0 2026-03-10T14:08:36.414 INFO:tasks.workunit.client.0.vm03.stdout:0/596: write d3/d17/f56 [496474,49247] 0 2026-03-10T14:08:36.415 INFO:tasks.workunit.client.0.vm03.stdout:8/678: creat da/d24/db4/fdf x:0 0 0 2026-03-10T14:08:36.419 INFO:tasks.workunit.client.0.vm03.stdout:5/753: mknod d4/d13/d1f/de1/cef 0 2026-03-10T14:08:36.419 INFO:tasks.workunit.client.0.vm03.stdout:5/754: readlink d4/lc8 0 2026-03-10T14:08:36.420 INFO:tasks.workunit.client.0.vm03.stdout:5/755: chown d4/d40/d4e/f5c 3490476 1 2026-03-10T14:08:36.423 INFO:tasks.workunit.client.0.vm03.stdout:7/517: stat d5/d9/f22 0 2026-03-10T14:08:36.424 INFO:tasks.workunit.client.0.vm03.stdout:6/592: unlink d8/db/d12/d64/f9b 0 2026-03-10T14:08:36.425 INFO:tasks.workunit.client.0.vm03.stdout:3/663: unlink d1d/d39/d51/d72/d99/l9c 0 2026-03-10T14:08:36.429 INFO:tasks.workunit.client.0.vm03.stdout:2/579: fdatasync d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 0 2026-03-10T14:08:36.435 INFO:tasks.workunit.client.0.vm03.stdout:6/593: dread d8/db/d49/d6c/d32/f4b [0,4194304] 0 2026-03-10T14:08:36.436 INFO:tasks.workunit.client.0.vm03.stdout:6/594: readlink d8/db/d49/d76/l99 0 2026-03-10T14:08:36.438 INFO:tasks.workunit.client.0.vm03.stdout:0/597: rename d3/d4d/d8f/fa7 to d3/d16/d21/d9a/fc3 0 2026-03-10T14:08:36.439 INFO:tasks.workunit.client.0.vm03.stdout:0/598: write d3/d16/f2d [172502,82728] 0 2026-03-10T14:08:36.446 INFO:tasks.workunit.client.0.vm03.stdout:4/636: mknod d5/d9/db/da7/db9/cd4 0 2026-03-10T14:08:36.451 INFO:tasks.workunit.client.0.vm03.stdout:7/518: mknod d5/d9/d14/d21/d6f/cae 0 2026-03-10T14:08:36.452 INFO:tasks.workunit.client.0.vm03.stdout:3/664: truncate d1d/d33/d65/d48/f58 1226341 0 
2026-03-10T14:08:36.453 INFO:tasks.workunit.client.0.vm03.stdout:3/665: chown d1d/l1f 2367 1 2026-03-10T14:08:36.457 INFO:tasks.workunit.client.0.vm03.stdout:8/679: write da/d58/f72 [244950,96905] 0 2026-03-10T14:08:36.465 INFO:tasks.workunit.client.0.vm03.stdout:9/602: link d2/d29/d33/d41/d46/l8e d2/d29/d33/d6d/d87/lc4 0 2026-03-10T14:08:36.466 INFO:tasks.workunit.client.0.vm03.stdout:5/756: dwrite d4/f9e [0,4194304] 0 2026-03-10T14:08:36.475 INFO:tasks.workunit.client.0.vm03.stdout:4/637: stat d5/d9/db/da7/lc6 0 2026-03-10T14:08:36.475 INFO:tasks.workunit.client.0.vm03.stdout:1/591: getdents d0/d2/df/d16/d41 0 2026-03-10T14:08:36.475 INFO:tasks.workunit.client.0.vm03.stdout:7/519: symlink d5/d9/d35/laf 0 2026-03-10T14:08:36.477 INFO:tasks.workunit.client.0.vm03.stdout:4/638: chown d5/d47/d5b/dbe/fcd 159242755 1 2026-03-10T14:08:36.480 INFO:tasks.workunit.client.0.vm03.stdout:7/520: write d5/d9/d14/d26/d36/f3a [224873,127966] 0 2026-03-10T14:08:36.481 INFO:tasks.workunit.client.0.vm03.stdout:7/521: write d5/d9/d35/f52 [2782051,39947] 0 2026-03-10T14:08:36.485 INFO:tasks.workunit.client.0.vm03.stdout:7/522: dread d5/d9/d14/d21/d28/f6e [0,4194304] 0 2026-03-10T14:08:36.489 INFO:tasks.workunit.client.0.vm03.stdout:0/599: mknod d3/d46/cc4 0 2026-03-10T14:08:36.493 INFO:tasks.workunit.client.0.vm03.stdout:0/600: dwrite d3/d17/f35 [0,4194304] 0 2026-03-10T14:08:36.509 INFO:tasks.workunit.client.0.vm03.stdout:5/757: symlink d4/d13/d1f/lf0 0 2026-03-10T14:08:36.516 INFO:tasks.workunit.client.0.vm03.stdout:3/666: dwrite fe [4194304,4194304] 0 2026-03-10T14:08:36.520 INFO:tasks.workunit.client.0.vm03.stdout:1/592: symlink d0/d2/df/d16/d41/lbc 0 2026-03-10T14:08:36.526 INFO:tasks.workunit.client.0.vm03.stdout:2/580: creat d5/d2a/fb9 x:0 0 0 2026-03-10T14:08:36.539 INFO:tasks.workunit.client.0.vm03.stdout:6/595: truncate d8/db/d49/d6c/d83/f87 313192 0 2026-03-10T14:08:36.545 INFO:tasks.workunit.client.0.vm03.stdout:0/601: dwrite d3/d16/d21/d9a/fc3 [0,4194304] 0 
2026-03-10T14:08:36.550 INFO:tasks.workunit.client.0.vm03.stdout:5/758: rmdir d4/d13/d43 39 2026-03-10T14:08:36.550 INFO:tasks.workunit.client.0.vm03.stdout:5/759: chown d4/d16/da0/ce6 8708 1 2026-03-10T14:08:36.553 INFO:tasks.workunit.client.0.vm03.stdout:3/667: truncate d1d/d29/d41/d45/d5b/f9e 623408 0 2026-03-10T14:08:36.558 INFO:tasks.workunit.client.0.vm03.stdout:9/603: dwrite d2/d29/d33/fa9 [0,4194304] 0 2026-03-10T14:08:36.563 INFO:tasks.workunit.client.0.vm03.stdout:4/639: dwrite d5/d9/f97 [0,4194304] 0 2026-03-10T14:08:36.564 INFO:tasks.workunit.client.0.vm03.stdout:4/640: chown d5/d9/c26 28354 1 2026-03-10T14:08:36.595 INFO:tasks.workunit.client.0.vm03.stdout:3/668: mknod d1d/d59/cd9 0 2026-03-10T14:08:36.600 INFO:tasks.workunit.client.0.vm03.stdout:1/593: fdatasync d0/d18/d3b/f4c 0 2026-03-10T14:08:36.610 INFO:tasks.workunit.client.0.vm03.stdout:7/523: creat d5/d9/d14/d21/fb0 x:0 0 0 2026-03-10T14:08:36.613 INFO:tasks.workunit.client.0.vm03.stdout:0/602: mknod d3/d11/d2c/db7/cc5 0 2026-03-10T14:08:36.618 INFO:tasks.workunit.client.0.vm03.stdout:3/669: unlink l14 0 2026-03-10T14:08:36.619 INFO:tasks.workunit.client.0.vm03.stdout:0/603: dwrite d3/d16/f3e [4194304,4194304] 0 2026-03-10T14:08:36.635 INFO:tasks.workunit.client.0.vm03.stdout:8/680: write da/d58/d6c/fa4 [1071374,108059] 0 2026-03-10T14:08:36.640 INFO:tasks.workunit.client.0.vm03.stdout:8/681: dread da/d24/f28 [0,4194304] 0 2026-03-10T14:08:36.641 INFO:tasks.workunit.client.0.vm03.stdout:2/581: rename d5/d10/d1f/d4f/d76/da7/d54/d9c to d5/d10/dba 0 2026-03-10T14:08:36.644 INFO:tasks.workunit.client.0.vm03.stdout:7/524: mkdir d5/d9/d14/d26/d39/db1 0 2026-03-10T14:08:36.656 INFO:tasks.workunit.client.0.vm03.stdout:7/525: readlink d5/d9/d14/d26/l43 0 2026-03-10T14:08:36.656 INFO:tasks.workunit.client.0.vm03.stdout:9/604: write d2/d29/d33/f70 [443758,106542] 0 2026-03-10T14:08:36.656 INFO:tasks.workunit.client.0.vm03.stdout:3/670: mknod d1d/d29/d41/d45/d55/d6e/d83/d91/cda 0 2026-03-10T14:08:36.656 
INFO:tasks.workunit.client.0.vm03.stdout:3/671: truncate d1d/d33/d47/d53/d68/fc4 317509 0 2026-03-10T14:08:36.656 INFO:tasks.workunit.client.0.vm03.stdout:8/682: mknod da/d58/d5f/ce0 0 2026-03-10T14:08:36.656 INFO:tasks.workunit.client.0.vm03.stdout:8/683: fdatasync da/d24/f2d 0 2026-03-10T14:08:36.658 INFO:tasks.workunit.client.0.vm03.stdout:9/605: mknod d2/d29/d33/d41/d95/cc5 0 2026-03-10T14:08:36.658 INFO:tasks.workunit.client.0.vm03.stdout:3/672: mknod d1d/d33/d65/d5d/cdb 0 2026-03-10T14:08:36.666 INFO:tasks.workunit.client.0.vm03.stdout:0/604: creat d3/d46/da9/fc6 x:0 0 0 2026-03-10T14:08:36.666 INFO:tasks.workunit.client.0.vm03.stdout:2/582: mkdir d5/d10/d1f/d4f/d76/da7/dbb 0 2026-03-10T14:08:36.666 INFO:tasks.workunit.client.0.vm03.stdout:0/605: dwrite d3/d46/f63 [4194304,4194304] 0 2026-03-10T14:08:36.666 INFO:tasks.workunit.client.0.vm03.stdout:0/606: truncate d3/d16/d21/fb8 479333 0 2026-03-10T14:08:36.673 INFO:tasks.workunit.client.0.vm03.stdout:2/583: mknod d5/db4/cbc 0 2026-03-10T14:08:36.673 INFO:tasks.workunit.client.0.vm03.stdout:0/607: dwrite d3/d11/d2c/d4a/d7b/f87 [0,4194304] 0 2026-03-10T14:08:36.675 INFO:tasks.workunit.client.0.vm03.stdout:2/584: fdatasync d5/d10/d1f/d4f/d76/da7/d40/f52 0 2026-03-10T14:08:36.675 INFO:tasks.workunit.client.0.vm03.stdout:9/606: truncate d2/d14/f30 3405512 0 2026-03-10T14:08:36.683 INFO:tasks.workunit.client.0.vm03.stdout:0/608: getdents d3/d11/d66 0 2026-03-10T14:08:36.706 INFO:tasks.workunit.client.0.vm03.stdout:1/594: sync 2026-03-10T14:08:36.706 INFO:tasks.workunit.client.0.vm03.stdout:2/585: sync 2026-03-10T14:08:36.709 INFO:tasks.workunit.client.0.vm03.stdout:0/609: dread d3/d46/dac/d79/f88 [0,4194304] 0 2026-03-10T14:08:36.713 INFO:tasks.workunit.client.0.vm03.stdout:1/595: rmdir d0/d2/df/d27/d7e 39 2026-03-10T14:08:36.715 INFO:tasks.workunit.client.0.vm03.stdout:2/586: truncate d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 2315382 0 2026-03-10T14:08:36.720 INFO:tasks.workunit.client.0.vm03.stdout:2/587: dwrite d5/f2d 
[0,4194304] 0 2026-03-10T14:08:36.722 INFO:tasks.workunit.client.0.vm03.stdout:0/610: symlink d3/lc7 0 2026-03-10T14:08:36.723 INFO:tasks.workunit.client.0.vm03.stdout:0/611: chown d3/d46/da9 153 1 2026-03-10T14:08:36.724 INFO:tasks.workunit.client.0.vm03.stdout:2/588: mknod d5/d10/da3/cbd 0 2026-03-10T14:08:36.725 INFO:tasks.workunit.client.0.vm03.stdout:1/596: link d0/d2/fb9 d0/d18/d3b/db7/fbd 0 2026-03-10T14:08:36.736 INFO:tasks.workunit.client.0.vm03.stdout:0/612: fdatasync d3/d46/f63 0 2026-03-10T14:08:36.742 INFO:tasks.workunit.client.0.vm03.stdout:2/589: fsync d5/d10/d1f/d4f/d76/da7/d40/d59/f70 0 2026-03-10T14:08:36.748 INFO:tasks.workunit.client.0.vm03.stdout:1/597: symlink d0/d2/df/d16/d20/lbe 0 2026-03-10T14:08:36.748 INFO:tasks.workunit.client.0.vm03.stdout:6/596: write d8/db/d49/d6c/d32/d3a/f5e [3791556,83241] 0 2026-03-10T14:08:36.754 INFO:tasks.workunit.client.0.vm03.stdout:5/760: truncate d4/d35/f9b 2341379 0 2026-03-10T14:08:36.757 INFO:tasks.workunit.client.0.vm03.stdout:6/597: creat d8/db/d12/d64/fba x:0 0 0 2026-03-10T14:08:36.764 INFO:tasks.workunit.client.0.vm03.stdout:4/641: dwrite d5/d9/db/d19/f3a [0,4194304] 0 2026-03-10T14:08:36.783 INFO:tasks.workunit.client.0.vm03.stdout:7/526: rmdir d5/d9/d14/d26 39 2026-03-10T14:08:36.798 INFO:tasks.workunit.client.0.vm03.stdout:3/673: write d1d/d33/d47/d53/d68/fc4 [1125564,54334] 0 2026-03-10T14:08:36.801 INFO:tasks.workunit.client.0.vm03.stdout:8/684: write da/d24/f52 [485842,123433] 0 2026-03-10T14:08:36.801 INFO:tasks.workunit.client.0.vm03.stdout:8/685: chown da/d3c/d4b 61 1 2026-03-10T14:08:36.832 INFO:tasks.workunit.client.0.vm03.stdout:9/607: dwrite d2/d14/d2b/f2d [4194304,4194304] 0 2026-03-10T14:08:36.839 INFO:tasks.workunit.client.0.vm03.stdout:4/642: dread d5/d9/db/d19/d34/f5c [0,4194304] 0 2026-03-10T14:08:36.839 INFO:tasks.workunit.client.0.vm03.stdout:6/598: dread d8/d11/d18/d54/f8b [4194304,4194304] 0 2026-03-10T14:08:36.843 INFO:tasks.workunit.client.0.vm03.stdout:1/598: link 
d0/d2/df/d16/d41/f68 d0/fbf 0 2026-03-10T14:08:36.844 INFO:tasks.workunit.client.0.vm03.stdout:1/599: readlink d0/d2/df/l93 0 2026-03-10T14:08:36.855 INFO:tasks.workunit.client.0.vm03.stdout:4/643: dread d5/d47/d5b/f84 [0,4194304] 0 2026-03-10T14:08:36.857 INFO:tasks.workunit.client.0.vm03.stdout:6/599: dread d8/db/d49/d6c/d32/f3e [0,4194304] 0 2026-03-10T14:08:36.857 INFO:tasks.workunit.client.0.vm03.stdout:4/644: stat d5/d9/db/d19/d34/f58 0 2026-03-10T14:08:36.857 INFO:tasks.workunit.client.0.vm03.stdout:3/674: creat d1d/d29/d41/d45/d95/fdc x:0 0 0 2026-03-10T14:08:36.859 INFO:tasks.workunit.client.0.vm03.stdout:3/675: dread - d1d/d29/d41/d45/d5b/d7e/fc3 zero size 2026-03-10T14:08:36.862 INFO:tasks.workunit.client.0.vm03.stdout:0/613: link d3/d11/d2c/d4a/d4b/c98 d3/d16/d21/cc8 0 2026-03-10T14:08:36.863 INFO:tasks.workunit.client.0.vm03.stdout:0/614: chown d3/d46/dac/d79 10932946 1 2026-03-10T14:08:36.863 INFO:tasks.workunit.client.0.vm03.stdout:2/590: getdents d5/d10/d1f/d4f/d76/da7/d54/d5f 0 2026-03-10T14:08:36.864 INFO:tasks.workunit.client.0.vm03.stdout:3/676: dread d1d/d29/d41/dc0/fca [0,4194304] 0 2026-03-10T14:08:36.870 INFO:tasks.workunit.client.0.vm03.stdout:9/608: creat d2/d29/d33/d60/fc6 x:0 0 0 2026-03-10T14:08:36.871 INFO:tasks.workunit.client.0.vm03.stdout:9/609: chown d2/d29/d33/c36 3399337 1 2026-03-10T14:08:36.873 INFO:tasks.workunit.client.0.vm03.stdout:0/615: dread d3/d16/d21/fa4 [0,4194304] 0 2026-03-10T14:08:36.880 INFO:tasks.workunit.client.0.vm03.stdout:1/600: symlink d0/d2/df/d91/lc0 0 2026-03-10T14:08:36.880 INFO:tasks.workunit.client.0.vm03.stdout:3/677: sync 2026-03-10T14:08:36.887 INFO:tasks.workunit.client.0.vm03.stdout:7/527: mknod d5/d9/d14/d26/d39/db1/cb2 0 2026-03-10T14:08:36.889 INFO:tasks.workunit.client.0.vm03.stdout:6/600: write d8/d11/d18/f34 [3611435,83687] 0 2026-03-10T14:08:36.892 INFO:tasks.workunit.client.0.vm03.stdout:5/761: dwrite d4/d40/d4e/f80 [0,4194304] 0 2026-03-10T14:08:36.896 
INFO:tasks.workunit.client.0.vm03.stdout:4/645: truncate d5/d6e/f8c 417893 0 2026-03-10T14:08:36.898 INFO:tasks.workunit.client.0.vm03.stdout:4/646: chown d5/d9/db/d19/d38/d7b/ccb 22 1 2026-03-10T14:08:36.900 INFO:tasks.workunit.client.0.vm03.stdout:2/591: dwrite d5/d10/d1f/f3e [0,4194304] 0 2026-03-10T14:08:36.903 INFO:tasks.workunit.client.0.vm03.stdout:4/647: stat d5/d96/cd3 0 2026-03-10T14:08:36.903 INFO:tasks.workunit.client.0.vm03.stdout:8/686: dwrite da/d58/d5f/dbc/d70/d7d/f9c [0,4194304] 0 2026-03-10T14:08:36.908 INFO:tasks.workunit.client.0.vm03.stdout:3/678: symlink d1d/d33/d65/d5d/ldd 0 2026-03-10T14:08:36.909 INFO:tasks.workunit.client.0.vm03.stdout:1/601: readlink d0/d2/d71/l85 0 2026-03-10T14:08:36.913 INFO:tasks.workunit.client.0.vm03.stdout:8/687: chown da/d24/d49/dab/dd7/fdc 323012 1 2026-03-10T14:08:36.914 INFO:tasks.workunit.client.0.vm03.stdout:3/679: chown d1d/d29/d41/d45/d55/d6e/d83/d91/dd5 203845 1 2026-03-10T14:08:36.914 INFO:tasks.workunit.client.0.vm03.stdout:3/680: stat d1d/c25 0 2026-03-10T14:08:36.918 INFO:tasks.workunit.client.0.vm03.stdout:7/528: symlink d5/d9/d14/d26/d39/lb3 0 2026-03-10T14:08:36.919 INFO:tasks.workunit.client.0.vm03.stdout:3/681: fsync fe 0 2026-03-10T14:08:36.926 INFO:tasks.workunit.client.0.vm03.stdout:6/601: symlink d8/db/d49/d6c/lbb 0 2026-03-10T14:08:36.934 INFO:tasks.workunit.client.0.vm03.stdout:9/610: rename d2/d29/d33/d60/cb4 to d2/d29/da7/cc7 0 2026-03-10T14:08:36.934 INFO:tasks.workunit.client.0.vm03.stdout:9/611: chown d2/d14/d2b/d79/d8a 90 1 2026-03-10T14:08:36.935 INFO:tasks.workunit.client.0.vm03.stdout:2/592: mkdir d5/d10/d1f/d4f/dbe 0 2026-03-10T14:08:36.935 INFO:tasks.workunit.client.0.vm03.stdout:4/648: dread - d5/d9/db/d19/d38/d53/d55/fa6 zero size 2026-03-10T14:08:36.942 INFO:tasks.workunit.client.0.vm03.stdout:9/612: rename d2/d29/da7/c74 to d2/d14/d2b/d34/cc8 0 2026-03-10T14:08:36.943 INFO:tasks.workunit.client.0.vm03.stdout:7/529: mkdir d5/d9/d14/d26/d36/db4 0 2026-03-10T14:08:36.943 
INFO:tasks.workunit.client.0.vm03.stdout:9/613: write d2/d29/d33/f70 [733969,106222] 0 2026-03-10T14:08:36.946 INFO:tasks.workunit.client.0.vm03.stdout:7/530: stat d5/d9/d3e/c4c 0 2026-03-10T14:08:36.948 INFO:tasks.workunit.client.0.vm03.stdout:2/593: creat d5/d10/d1f/d4f/d76/da7/d40/d59/da2/fbf x:0 0 0 2026-03-10T14:08:36.948 INFO:tasks.workunit.client.0.vm03.stdout:4/649: truncate d5/d96/fc1 788589 0 2026-03-10T14:08:36.953 INFO:tasks.workunit.client.0.vm03.stdout:7/531: truncate d5/f53 1571143 0 2026-03-10T14:08:36.954 INFO:tasks.workunit.client.0.vm03.stdout:4/650: mkdir d5/d47/d5b/dbe/dc4/dd5 0 2026-03-10T14:08:36.955 INFO:tasks.workunit.client.0.vm03.stdout:2/594: mknod d5/d10/d1f/d4f/d76/da7/d54/d5f/cc0 0 2026-03-10T14:08:36.963 INFO:tasks.workunit.client.0.vm03.stdout:9/614: rename d2/ce to d2/d29/d33/d41/daa/cc9 0 2026-03-10T14:08:36.968 INFO:tasks.workunit.client.0.vm03.stdout:7/532: unlink d5/d9/d35/f69 0 2026-03-10T14:08:36.969 INFO:tasks.workunit.client.0.vm03.stdout:4/651: creat d5/d9/fd6 x:0 0 0 2026-03-10T14:08:36.969 INFO:tasks.workunit.client.0.vm03.stdout:9/615: readlink d2/d29/d33/d41/d46/l8f 0 2026-03-10T14:08:36.969 INFO:tasks.workunit.client.0.vm03.stdout:7/533: rmdir d5/d9/d14/d26/d36/d51 39 2026-03-10T14:08:36.971 INFO:tasks.workunit.client.0.vm03.stdout:4/652: creat d5/d9/db/d19/d99/fd7 x:0 0 0 2026-03-10T14:08:36.974 INFO:tasks.workunit.client.0.vm03.stdout:1/602: dread d0/d2/df/f1b [0,4194304] 0 2026-03-10T14:08:36.975 INFO:tasks.workunit.client.0.vm03.stdout:9/616: rmdir d2/d14/d2b/d43/dc0 0 2026-03-10T14:08:36.976 INFO:tasks.workunit.client.0.vm03.stdout:2/595: sync 2026-03-10T14:08:36.979 INFO:tasks.workunit.client.0.vm03.stdout:4/653: mknod d5/d47/cd8 0 2026-03-10T14:08:36.980 INFO:tasks.workunit.client.0.vm03.stdout:9/617: dread d2/d29/d33/fa9 [0,4194304] 0 2026-03-10T14:08:36.984 INFO:tasks.workunit.client.0.vm03.stdout:1/603: fdatasync d0/d2/df/d27/d7e/f95 0 2026-03-10T14:08:37.013 INFO:tasks.workunit.client.0.vm03.stdout:4/654: 
dread d5/d96/fc1 [0,4194304] 0 2026-03-10T14:08:37.016 INFO:tasks.workunit.client.0.vm03.stdout:3/682: truncate d1d/d29/d41/d45/f6a 840621 0 2026-03-10T14:08:37.018 INFO:tasks.workunit.client.0.vm03.stdout:5/762: dwrite d4/d16/d19/d23/d3f/fb9 [0,4194304] 0 2026-03-10T14:08:37.018 INFO:tasks.workunit.client.0.vm03.stdout:7/534: dread d5/d9/f10 [0,4194304] 0 2026-03-10T14:08:37.019 INFO:tasks.workunit.client.0.vm03.stdout:6/602: dwrite d8/db/d12/f40 [0,4194304] 0 2026-03-10T14:08:37.020 INFO:tasks.workunit.client.0.vm03.stdout:6/603: write d8/db/f3c [1332955,113260] 0 2026-03-10T14:08:37.020 INFO:tasks.workunit.client.0.vm03.stdout:8/688: dwrite da/d58/d5f/f71 [0,4194304] 0 2026-03-10T14:08:37.024 INFO:tasks.workunit.client.0.vm03.stdout:6/604: write d8/d11/d18/d79/fb4 [501801,3465] 0 2026-03-10T14:08:37.029 INFO:tasks.workunit.client.0.vm03.stdout:9/618: symlink d2/d14/d2b/d43/lca 0 2026-03-10T14:08:37.036 INFO:tasks.workunit.client.0.vm03.stdout:9/619: write d2/d29/d33/d55/d72/d9d/fa0 [577451,47151] 0 2026-03-10T14:08:37.037 INFO:tasks.workunit.client.0.vm03.stdout:4/655: symlink d5/d6e/ld9 0 2026-03-10T14:08:37.041 INFO:tasks.workunit.client.0.vm03.stdout:5/763: read d4/d16/fac [108160,16029] 0 2026-03-10T14:08:37.042 INFO:tasks.workunit.client.0.vm03.stdout:5/764: readlink d4/d6/de/l7c 0 2026-03-10T14:08:37.044 INFO:tasks.workunit.client.0.vm03.stdout:5/765: readlink d4/lc8 0 2026-03-10T14:08:37.051 INFO:tasks.workunit.client.0.vm03.stdout:4/656: dread d5/d47/f46 [0,4194304] 0 2026-03-10T14:08:37.055 INFO:tasks.workunit.client.0.vm03.stdout:6/605: dread d8/db/d12/f57 [0,4194304] 0 2026-03-10T14:08:37.057 INFO:tasks.workunit.client.0.vm03.stdout:3/683: mkdir d1d/d29/d41/d45/db4/dde 0 2026-03-10T14:08:37.063 INFO:tasks.workunit.client.0.vm03.stdout:8/689: creat da/d3c/d51/d75/dc2/fe1 x:0 0 0 2026-03-10T14:08:37.070 INFO:tasks.workunit.client.0.vm03.stdout:9/620: readlink d2/l12 0 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 
vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:35] ENGINE Bus STARTING 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:35] ENGINE Serving on https://192.168.123.103:7150 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:35] ENGINE Client ('192.168.123.103', 42552) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:35] ENGINE Serving on http://192.168.123.103:8765 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: [10/Mar/2026:14:08:35] ENGINE Bus STARTED 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: pgmap v5: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:37.077 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:08:37.077 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:36 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:08:37.080 INFO:tasks.workunit.client.0.vm03.stdout:9/621: dwrite d2/f2f [0,4194304] 0 2026-03-10T14:08:37.084 INFO:tasks.workunit.client.0.vm03.stdout:5/766: creat d4/d16/d19/d67/ff1 x:0 0 0 2026-03-10T14:08:37.103 INFO:tasks.workunit.client.0.vm03.stdout:1/604: dwrite d0/d2/df/f1b [0,4194304] 0 2026-03-10T14:08:37.109 INFO:tasks.workunit.client.0.vm03.stdout:6/606: dread f5 [4194304,4194304] 0 2026-03-10T14:08:37.109 INFO:tasks.workunit.client.0.vm03.stdout:6/607: chown d8/db/d49/d6c 183 1 2026-03-10T14:08:37.111 INFO:tasks.workunit.client.0.vm03.stdout:3/684: creat d1d/d29/d41/d45/d5b/fdf x:0 0 0 2026-03-10T14:08:37.120 INFO:tasks.workunit.client.0.vm03.stdout:8/690: mknod da/d24/d78/ce2 0 2026-03-10T14:08:37.122 INFO:tasks.workunit.client.0.vm03.stdout:7/535: creat d5/d9/d14/d26/d39/d92/fb5 x:0 0 0 2026-03-10T14:08:37.124 INFO:tasks.workunit.client.0.vm03.stdout:2/596: getdents d5/d10 0 2026-03-10T14:08:37.130 INFO:tasks.workunit.client.0.vm03.stdout:9/622: dread d2/d29/d33/fa9 [0,4194304] 0 2026-03-10T14:08:37.131 INFO:tasks.workunit.client.0.vm03.stdout:5/767: symlink d4/d13/d1f/d84/lf2 0 2026-03-10T14:08:37.134 INFO:tasks.workunit.client.0.vm03.stdout:1/605: fdatasync d0/d2/df/dab/f56 0 2026-03-10T14:08:37.139 INFO:tasks.workunit.client.0.vm03.stdout:3/685: mkdir d1d/d39/da1/de0 0 2026-03-10T14:08:37.139 INFO:tasks.workunit.client.0.vm03.stdout:8/691: fdatasync da/fe 0 2026-03-10T14:08:37.141 INFO:tasks.workunit.client.0.vm03.stdout:2/597: mkdir d5/db4/d74/d83/dc1 0 2026-03-10T14:08:37.149 INFO:tasks.workunit.client.0.vm03.stdout:5/768: stat d4/d13/d8f/f91 0 2026-03-10T14:08:37.149 INFO:tasks.workunit.client.0.vm03.stdout:6/608: sync 2026-03-10T14:08:37.149 INFO:tasks.workunit.client.0.vm03.stdout:7/536: sync 
2026-03-10T14:08:37.149 INFO:tasks.workunit.client.0.vm03.stdout:3/686: sync 2026-03-10T14:08:37.156 INFO:tasks.workunit.client.0.vm03.stdout:4/657: rename d5/f7 to d5/d47/d5b/d64/fda 0 2026-03-10T14:08:37.157 INFO:tasks.workunit.client.0.vm03.stdout:9/623: dread d2/d14/f61 [0,4194304] 0 2026-03-10T14:08:37.161 INFO:tasks.workunit.client.0.vm03.stdout:9/624: sync 2026-03-10T14:08:37.161 INFO:tasks.workunit.client.0.vm03.stdout:9/625: write d2/f15 [919056,5030] 0 2026-03-10T14:08:37.162 INFO:tasks.workunit.client.0.vm03.stdout:9/626: read d2/d14/f61 [2179252,79398] 0 2026-03-10T14:08:37.164 INFO:tasks.workunit.client.0.vm03.stdout:9/627: sync 2026-03-10T14:08:37.164 INFO:tasks.workunit.client.0.vm03.stdout:9/628: sync 2026-03-10T14:08:37.166 INFO:tasks.workunit.client.0.vm03.stdout:9/629: chown d2/d29/d33/d41/d95 16019 1 2026-03-10T14:08:37.168 INFO:tasks.workunit.client.0.vm03.stdout:9/630: stat d2/d29/d33/fa9 0 2026-03-10T14:08:37.174 INFO:tasks.workunit.client.0.vm03.stdout:0/616: write d3/d4d/d30/f7a [1385212,37102] 0 2026-03-10T14:08:37.179 INFO:tasks.workunit.client.0.vm03.stdout:2/598: mkdir d5/d10/d1f/d4f/d76/da7/d40/dc2 0 2026-03-10T14:08:37.183 INFO:tasks.workunit.client.0.vm03.stdout:2/599: stat d5/d10/d1f/d4f/d76/l79 0 2026-03-10T14:08:37.187 INFO:tasks.workunit.client.0.vm03.stdout:6/609: truncate d8/db/df/f37 4773985 0 2026-03-10T14:08:37.188 INFO:tasks.workunit.client.0.vm03.stdout:3/687: truncate d1d/d29/d41/f56 2649408 0 2026-03-10T14:08:37.188 INFO:tasks.workunit.client.0.vm03.stdout:3/688: chown l5 968 1 2026-03-10T14:08:37.197 INFO:tasks.workunit.client.0.vm03.stdout:1/606: link d0/d2/df/f1b d0/d2/d71/d90/fc1 0 2026-03-10T14:08:37.206 INFO:tasks.workunit.client.0.vm03.stdout:1/607: dread - d0/d2/f2a zero size 2026-03-10T14:08:37.218 INFO:tasks.workunit.client.0.vm03.stdout:1/608: dread d0/d2/df/d16/f4f [0,4194304] 0 2026-03-10T14:08:37.221 INFO:tasks.workunit.client.0.vm03.stdout:0/617: read - d3/d46/d5e/f7c zero size 2026-03-10T14:08:37.221 
INFO:tasks.workunit.client.0.vm03.stdout:5/769: dwrite d4/d16/d19/d4a/faa [0,4194304] 0 2026-03-10T14:08:37.227 INFO:tasks.workunit.client.0.vm03.stdout:5/770: truncate d4/d40/da5/fd6 918687 0 2026-03-10T14:08:37.229 INFO:tasks.workunit.client.0.vm03.stdout:7/537: mknod d5/d98/cb6 0 2026-03-10T14:08:37.232 INFO:tasks.workunit.client.0.vm03.stdout:6/610: dread - d8/db/d12/d51/d5c/d60/f92 zero size 2026-03-10T14:08:37.234 INFO:tasks.workunit.client.0.vm03.stdout:6/611: write d8/db/d12/d64/fa8 [496882,102159] 0 2026-03-10T14:08:37.238 INFO:tasks.workunit.client.0.vm03.stdout:3/689: mkdir d1d/d39/d51/de1 0 2026-03-10T14:08:37.239 INFO:tasks.workunit.client.0.vm03.stdout:3/690: readlink d1d/d33/d47/d53/d68/l77 0 2026-03-10T14:08:37.248 INFO:tasks.workunit.client.0.vm03.stdout:1/609: creat d0/d2/df/d16/d20/fc2 x:0 0 0 2026-03-10T14:08:37.253 INFO:tasks.workunit.client.0.vm03.stdout:7/538: fsync d5/d9/d14/d26/d36/d51/d7b/d83/f9e 0 2026-03-10T14:08:37.255 INFO:tasks.workunit.client.0.vm03.stdout:5/771: dwrite d4/d13/d43/fbb [0,4194304] 0 2026-03-10T14:08:37.264 INFO:tasks.workunit.client.0.vm03.stdout:2/600: chown d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 839 1 2026-03-10T14:08:37.275 INFO:tasks.workunit.client.0.vm03.stdout:6/612: rmdir d8/d11/d18/d79/d80 39 2026-03-10T14:08:37.281 INFO:tasks.workunit.client.0.vm03.stdout:4/658: rename d5/d9/db/d19/f3a to d5/d9/db/d19/fdb 0 2026-03-10T14:08:37.281 INFO:tasks.workunit.client.0.vm03.stdout:4/659: dread - d5/d47/f65 zero size 2026-03-10T14:08:37.281 INFO:tasks.workunit.client.0.vm03.stdout:6/613: readlink d8/db/l28 0 2026-03-10T14:08:37.281 INFO:tasks.workunit.client.0.vm03.stdout:8/692: getdents da/d58 0 2026-03-10T14:08:37.281 INFO:tasks.workunit.client.0.vm03.stdout:3/691: stat d1d/d29/d41/d45/d55/f81 0 2026-03-10T14:08:37.281 INFO:tasks.workunit.client.0.vm03.stdout:8/693: dread - da/d58/d6c/fd8 zero size 2026-03-10T14:08:37.281 INFO:tasks.workunit.client.0.vm03.stdout:0/618: rmdir d3/d4d/d30 39 2026-03-10T14:08:37.282 
INFO:tasks.workunit.client.0.vm03.stdout:5/772: mkdir d4/d16/d19/d23/d3f/df3 0 2026-03-10T14:08:37.282 INFO:tasks.workunit.client.0.vm03.stdout:2/601: chown d5/d10/d17/c27 1 1 2026-03-10T14:08:37.284 INFO:tasks.workunit.client.0.vm03.stdout:4/660: mknod d5/db4/cdc 0 2026-03-10T14:08:37.284 INFO:tasks.workunit.client.0.vm03.stdout:8/694: truncate da/d24/d49/f89 858489 0 2026-03-10T14:08:37.285 INFO:tasks.workunit.client.0.vm03.stdout:0/619: write d3/d11/d2c/d4a/d7b/f87 [3518381,17056] 0 2026-03-10T14:08:37.288 INFO:tasks.workunit.client.0.vm03.stdout:6/614: rename d8/db/d49/d6c/d83/f93 to d8/db/df/fbc 0 2026-03-10T14:08:37.288 INFO:tasks.workunit.client.0.vm03.stdout:9/631: getdents d2/d29/d33/d41/daa 0 2026-03-10T14:08:37.288 INFO:tasks.workunit.client.0.vm03.stdout:7/539: creat d5/d9/d14/d26/d36/db4/fb7 x:0 0 0 2026-03-10T14:08:37.289 INFO:tasks.workunit.client.0.vm03.stdout:3/692: creat d1d/d33/d47/dac/dc1/fe2 x:0 0 0 2026-03-10T14:08:37.289 INFO:tasks.workunit.client.0.vm03.stdout:5/773: write d4/f37 [3447835,31229] 0 2026-03-10T14:08:37.292 INFO:tasks.workunit.client.0.vm03.stdout:3/693: read d1d/d29/d41/d45/d95/faa [1429006,22570] 0 2026-03-10T14:08:37.293 INFO:tasks.workunit.client.0.vm03.stdout:2/602: creat d5/d10/d1f/d4f/d76/da7/d54/fc3 x:0 0 0 2026-03-10T14:08:37.293 INFO:tasks.workunit.client.0.vm03.stdout:6/615: symlink d8/db/d12/d51/d5c/lbd 0 2026-03-10T14:08:37.294 INFO:tasks.workunit.client.0.vm03.stdout:9/632: rmdir d2/d29/d9a 39 2026-03-10T14:08:37.301 INFO:tasks.workunit.client.0.vm03.stdout:5/774: mkdir d4/d16/df4 0 2026-03-10T14:08:37.306 INFO:tasks.workunit.client.0.vm03.stdout:1/610: dwrite d0/d2/df/dab/f56 [0,4194304] 0 2026-03-10T14:08:37.308 INFO:tasks.workunit.client.0.vm03.stdout:1/611: readlink d0/d2/df/d27/l7a 0 2026-03-10T14:08:37.308 INFO:tasks.workunit.client.0.vm03.stdout:3/694: creat d1d/d29/d41/d45/d5b/fe3 x:0 0 0 2026-03-10T14:08:37.313 INFO:tasks.workunit.client.0.vm03.stdout:6/616: dread d8/d11/d18/d54/f8b [0,4194304] 0 
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:35] ENGINE Bus STARTING
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:35] ENGINE Serving on https://192.168.123.103:7150
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:35] ENGINE Client ('192.168.123.103', 42552) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:35] ENGINE Serving on http://192.168.123.103:8765
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: [10/Mar/2026:14:08:35] ENGINE Bus STARTED
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: pgmap v5: 65 pgs: 65 active+clean; 3.0 GiB data, 10 GiB used, 110 GiB / 120 GiB avail
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:08:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:36 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:08:37.314 INFO:tasks.workunit.client.0.vm03.stdout:6/617: dread - d8/db/d12/f9a zero size
2026-03-10T14:08:37.319 INFO:tasks.workunit.client.0.vm03.stdout:2/603: creat d5/db4/d74/d83/fc4 x:0 0 0
2026-03-10T14:08:37.323 INFO:tasks.workunit.client.0.vm03.stdout:8/695: unlink da/d24/d49/f89 0
2026-03-10T14:08:37.333 INFO:tasks.workunit.client.0.vm03.stdout:4/661: write d5/d9/db/f29 [5233927,78002] 0
2026-03-10T14:08:37.335 INFO:tasks.workunit.client.0.vm03.stdout:7/540: rename d5/d9/d14/d26/c2b to d5/d9/d14/d26/d5f/d89/cb8 0
2026-03-10T14:08:37.336 INFO:tasks.workunit.client.0.vm03.stdout:6/618: dread d8/db/d12/d64/fa8 [0,4194304] 0
2026-03-10T14:08:37.336 INFO:tasks.workunit.client.0.vm03.stdout:3/695: dread - d1d/d33/d47/f98 zero size
2026-03-10T14:08:37.338 INFO:tasks.workunit.client.0.vm03.stdout:2/604: mknod d5/d10/d1f/d4f/d76/da7/d40/cc5 0
2026-03-10T14:08:37.339 INFO:tasks.workunit.client.0.vm03.stdout:6/619: dread - d8/db/d49/d6c/d32/faa zero size
2026-03-10T14:08:37.364 INFO:tasks.workunit.client.0.vm03.stdout:8/696: truncate f2 8117420 0
2026-03-10T14:08:37.367 INFO:tasks.workunit.client.0.vm03.stdout:8/697: dread - da/d3c/d4b/d69/f83 zero size
2026-03-10T14:08:37.368 INFO:tasks.workunit.client.0.vm03.stdout:5/775: symlink d4/d40/lf5 0
2026-03-10T14:08:37.371 INFO:tasks.workunit.client.0.vm03.stdout:4/662: rmdir d5/d9/db/d19/d38/d53 39
2026-03-10T14:08:37.372 INFO:tasks.workunit.client.0.vm03.stdout:8/698: dread da/d3a/fa3 [0,4194304] 0
2026-03-10T14:08:37.373 INFO:tasks.workunit.client.0.vm03.stdout:8/699: rename da/d24/d49 to da/d24/d49/de3 22
2026-03-10T14:08:37.376 INFO:tasks.workunit.client.0.vm03.stdout:3/696: mkdir d1d/d29/d41/d45/db4/de4 0
2026-03-10T14:08:37.389 INFO:tasks.workunit.client.0.vm03.stdout:2/605: mknod d5/d10/d1f/d4f/cc6 0
2026-03-10T14:08:37.390 INFO:tasks.workunit.client.0.vm03.stdout:0/620: write d3/d4d/f49 [4133067,80370] 0
2026-03-10T14:08:37.397 INFO:tasks.workunit.client.0.vm03.stdout:6/620: mknod d8/d3b/cbe 0
2026-03-10T14:08:37.403 INFO:tasks.workunit.client.0.vm03.stdout:9/633: symlink d2/d29/d33/d60/d7f/lcb 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:8/700: creat da/d3c/d4b/d69/fe4 x:0 0 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:1/612: getdents d0/d18/d3b 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:5/776: mknod d4/d13/d1f/d8c/cf6 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:6/621: truncate d8/db/d49/d6c/d32/f94 5196434 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:4/663: symlink d5/d9/db/d19/d38/d53/ldd 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:9/634: dwrite d2/d29/d33/d55/d72/d9d/fa0 [0,4194304] 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:9/635: write d2/d14/d2b/f2d [9418013,113836] 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:7/541: link d5/d9/d14/d26/l43 d5/d9/d3e/d84/lb9 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:6/622: chown d8/l2b 383689 1
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:7/542: write d5/d9/d14/d21/fa3 [185784,61609] 0
2026-03-10T14:08:37.425 INFO:tasks.workunit.client.0.vm03.stdout:7/543: dread - d5/d9/d14/f9b zero size
2026-03-10T14:08:37.429 INFO:tasks.workunit.client.0.vm03.stdout:7/544: write d5/d9/d14/d21/d6f/f9d [363451,107197] 0
2026-03-10T14:08:37.437 INFO:tasks.workunit.client.0.vm03.stdout:4/664: creat d5/d6e/fde x:0 0 0
2026-03-10T14:08:37.437 INFO:tasks.workunit.client.0.vm03.stdout:5/777: fdatasync d4/d13/d1f/d8c/fa6 0
2026-03-10T14:08:37.437 INFO:tasks.workunit.client.0.vm03.stdout:1/613: mkdir d0/d2/df/dab/dc3 0
2026-03-10T14:08:37.437 INFO:tasks.workunit.client.0.vm03.stdout:5/778: write d4/d6/de/fe7 [4756877,25097] 0
2026-03-10T14:08:37.437 INFO:tasks.workunit.client.0.vm03.stdout:5/779: chown d4/d16/da0/ce6 2504833 1
2026-03-10T14:08:37.441 INFO:tasks.workunit.client.0.vm03.stdout:9/636: fsync d2/fc 0
2026-03-10T14:08:37.442 INFO:tasks.workunit.client.0.vm03.stdout:8/701: readlink da/d36/d4d/la7 0
2026-03-10T14:08:37.442 INFO:tasks.workunit.client.0.vm03.stdout:9/637: chown d2/d14/cbb 0 1
2026-03-10T14:08:37.447 INFO:tasks.workunit.client.0.vm03.stdout:6/623: unlink d8/d11/d18/f9f 0
2026-03-10T14:08:37.447 INFO:tasks.workunit.client.0.vm03.stdout:7/545: creat d5/d9/d14/d26/d39/d92/fba x:0 0 0
2026-03-10T14:08:37.456 INFO:tasks.workunit.client.0.vm03.stdout:9/638: creat d2/d14/d2b/d43/fcc x:0 0 0
2026-03-10T14:08:37.458 INFO:tasks.workunit.client.0.vm03.stdout:7/546: symlink d5/d9/d14/d26/d36/db4/lbb 0
2026-03-10T14:08:37.460 INFO:tasks.workunit.client.0.vm03.stdout:5/780: creat d4/d16/d19/d23/d3f/df3/ff7 x:0 0 0
2026-03-10T14:08:37.467 INFO:tasks.workunit.client.0.vm03.stdout:9/639: truncate d2/d29/d33/d41/daa/fab 978666 0
2026-03-10T14:08:37.468 INFO:tasks.workunit.client.0.vm03.stdout:7/547: mknod d5/d9/d14/d26/d36/d51/cbc 0
2026-03-10T14:08:37.474 INFO:tasks.workunit.client.0.vm03.stdout:4/665: getdents d5/d9/db/d19 0
2026-03-10T14:08:37.476 INFO:tasks.workunit.client.0.vm03.stdout:7/548: creat d5/d9/d14/d26/d5f/d89/fbd x:0 0 0
2026-03-10T14:08:37.478 INFO:tasks.workunit.client.0.vm03.stdout:4/666: truncate d5/d6e/f93 199849 0
2026-03-10T14:08:37.482 INFO:tasks.workunit.client.0.vm03.stdout:7/549: symlink d5/d9/d14/d26/d36/d51/d7b/d83/lbe 0
2026-03-10T14:08:37.484 INFO:tasks.workunit.client.0.vm03.stdout:7/550: stat d5/d9/d3e/c58 0
2026-03-10T14:08:37.485 INFO:tasks.workunit.client.0.vm03.stdout:2/606: dwrite d5/d10/f22 [0,4194304] 0
2026-03-10T14:08:37.494 INFO:tasks.workunit.client.0.vm03.stdout:3/697: write d1d/d29/d41/d45/d5b/f9e [1255469,99758] 0
2026-03-10T14:08:37.494 INFO:tasks.workunit.client.0.vm03.stdout:7/551: symlink d5/d9/d14/d26/d39/lbf 0
2026-03-10T14:08:37.498 INFO:tasks.workunit.client.0.vm03.stdout:0/621: dwrite d3/d16/d21/f6a [0,4194304] 0
2026-03-10T14:08:37.498 INFO:tasks.workunit.client.0.vm03.stdout:2/607: creat d5/d10/d1f/d4f/fc7 x:0 0 0
2026-03-10T14:08:37.501 INFO:tasks.workunit.client.0.vm03.stdout:3/698: truncate fc 1832040 0
2026-03-10T14:08:37.508 INFO:tasks.workunit.client.0.vm03.stdout:7/552: truncate d5/d9/d14/d26/d39/f63 722055 0
2026-03-10T14:08:37.514 INFO:tasks.workunit.client.0.vm03.stdout:8/702: dwrite da/d24/d49/f66 [4194304,4194304] 0
2026-03-10T14:08:37.515 INFO:tasks.workunit.client.0.vm03.stdout:1/614: dwrite d0/d42/f7f [0,4194304] 0
2026-03-10T14:08:37.517 INFO:tasks.workunit.client.0.vm03.stdout:8/703: write da/d36/d4d/da5/fa9 [586813,126349] 0
2026-03-10T14:08:37.528 INFO:tasks.workunit.client.0.vm03.stdout:6/624: dwrite d8/db/df/f37 [0,4194304] 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:0/622: stat d3/d11/l23 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:5/781: dwrite d4/d16/f7b [0,4194304] 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:2/608: mkdir d5/d10/da3/dab/dc8 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:6/625: stat d8/d11/d18/d79/c70 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:6/626: chown d8/db/d12/d51/d8c/f8a 40445750 1
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:3/699: mkdir d1d/d39/da1/de0/de5 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:1/615: getdents d0/d2/df/dab/dc3 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:1/616: chown d0/d2/df/d27/f94 122147 1
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:2/609: symlink d5/dac/lc9 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:7/553: creat d5/d9/fc0 x:0 0 0
2026-03-10T14:08:37.540 INFO:tasks.workunit.client.0.vm03.stdout:6/627: dwrite d8/db/d12/d51/d5c/f8d [0,4194304] 0
2026-03-10T14:08:37.545 INFO:tasks.workunit.client.0.vm03.stdout:6/628: chown d8/db/df/f37 6537 1
2026-03-10T14:08:37.555 INFO:tasks.workunit.client.0.vm03.stdout:6/629: dwrite d8/db/d12/fa1 [4194304,4194304] 0
2026-03-10T14:08:37.560 INFO:tasks.workunit.client.0.vm03.stdout:6/630: readlink d8/db/l28 0
2026-03-10T14:08:37.564 INFO:tasks.workunit.client.0.vm03.stdout:1/617: unlink d0/d2/d71/d90/lb5 0
2026-03-10T14:08:37.585 INFO:tasks.workunit.client.0.vm03.stdout:9/640: rename d2/d29/d33/d55 to d2/d29/dcd 0
2026-03-10T14:08:37.585 INFO:tasks.workunit.client.0.vm03.stdout:0/623: getdents d3/d11 0
2026-03-10T14:08:37.585 INFO:tasks.workunit.client.0.vm03.stdout:5/782: read d4/d6/de/f4f [412445,99511] 0
2026-03-10T14:08:37.585 INFO:tasks.workunit.client.0.vm03.stdout:0/624: dread d3/d17/f56 [0,4194304] 0
2026-03-10T14:08:37.585 INFO:tasks.workunit.client.0.vm03.stdout:9/641: write d2/d29/d33/d41/f6e [2035287,124245] 0
2026-03-10T14:08:37.585 INFO:tasks.workunit.client.0.vm03.stdout:9/642: readlink d2/d29/dcd/d9c/lba 0
2026-03-10T14:08:37.585 INFO:tasks.workunit.client.0.vm03.stdout:1/618: mkdir d0/d18/d1d/dc4 0
2026-03-10T14:08:37.598 INFO:tasks.workunit.client.0.vm03.stdout:9/643: mkdir d2/d29/d33/d41/daa/dce 0
2026-03-10T14:08:37.599 INFO:tasks.workunit.client.0.vm03.stdout:4/667: sync
2026-03-10T14:08:37.599 INFO:tasks.workunit.client.0.vm03.stdout:3/700: sync
2026-03-10T14:08:37.600 INFO:tasks.workunit.client.0.vm03.stdout:4/668: chown d5/db4/dd2 2016445464 1
2026-03-10T14:08:37.607 INFO:tasks.workunit.client.0.vm03.stdout:0/625: symlink d3/d11/d2c/lc9 0
2026-03-10T14:08:37.607 INFO:tasks.workunit.client.0.vm03.stdout:9/644: creat d2/d29/d33/d41/daa/fcf x:0 0 0
2026-03-10T14:08:37.609 INFO:tasks.workunit.client.0.vm03.stdout:3/701: fdatasync d1d/d33/d47/d53/d68/f86 0
2026-03-10T14:08:37.610 INFO:tasks.workunit.client.0.vm03.stdout:3/702: dread - d1d/d39/d51/f71 zero size
2026-03-10T14:08:37.610 INFO:tasks.workunit.client.0.vm03.stdout:4/669: truncate d5/d9/db/d19/d38/d53/d55/f61 4089109 0
2026-03-10T14:08:37.615 INFO:tasks.workunit.client.0.vm03.stdout:1/619: creat d0/d2/df/d27/fc5 x:0 0 0
2026-03-10T14:08:37.617 INFO:tasks.workunit.client.0.vm03.stdout:0/626: fdatasync d3/d4d/f2a 0
2026-03-10T14:08:37.617 INFO:tasks.workunit.client.0.vm03.stdout:0/627: write d3/d4d/f49 [3653422,101048] 0
2026-03-10T14:08:37.620 INFO:tasks.workunit.client.0.vm03.stdout:0/628: dread - d3/d16/d21/d9a/fb2 zero size
2026-03-10T14:08:37.620 INFO:tasks.workunit.client.0.vm03.stdout:1/620: creat d0/d2/df/d16/d20/fc6 x:0 0 0
2026-03-10T14:08:37.624 INFO:tasks.workunit.client.0.vm03.stdout:4/670: dwrite d5/d47/d5b/d64/fda [0,4194304] 0
2026-03-10T14:08:37.624 INFO:tasks.workunit.client.0.vm03.stdout:0/629: symlink d3/d4d/lca 0
2026-03-10T14:08:37.630 INFO:tasks.workunit.client.0.vm03.stdout:0/630: truncate d3/f94 735679 0
2026-03-10T14:08:37.640 INFO:tasks.workunit.client.0.vm03.stdout:0/631: stat d3/d16/d21/fa4 0
2026-03-10T14:08:37.641 INFO:tasks.workunit.client.0.vm03.stdout:4/671: getdents d5/d9/db 0
2026-03-10T14:08:37.641 INFO:tasks.workunit.client.0.vm03.stdout:0/632: symlink d3/d11/d76/lcb 0
2026-03-10T14:08:37.641 INFO:tasks.workunit.client.0.vm03.stdout:0/633: dread - d3/d16/d21/f96 zero size
2026-03-10T14:08:37.671 INFO:tasks.workunit.client.0.vm03.stdout:3/703: sync
2026-03-10T14:08:37.672 INFO:tasks.workunit.client.0.vm03.stdout:0/634: creat d3/d4d/fcc x:0 0 0
2026-03-10T14:08:37.672 INFO:tasks.workunit.client.0.vm03.stdout:4/672: sync
2026-03-10T14:08:37.674 INFO:tasks.workunit.client.0.vm03.stdout:1/621: dread d0/d2/df/d16/d20/fa9 [0,4194304] 0
2026-03-10T14:08:37.678 INFO:tasks.workunit.client.0.vm03.stdout:1/622: read d0/fbf [1320963,83381] 0
2026-03-10T14:08:37.700 INFO:tasks.workunit.client.0.vm03.stdout:4/673: symlink d5/d9/db/ldf 0
2026-03-10T14:08:37.700 INFO:tasks.workunit.client.0.vm03.stdout:0/635: fdatasync d3/d16/f34 0
2026-03-10T14:08:37.700 INFO:tasks.workunit.client.0.vm03.stdout:0/636: dwrite d3/d17/fb1 [0,4194304] 0
2026-03-10T14:08:37.701 INFO:tasks.workunit.client.0.vm03.stdout:0/637: stat d3/d11/d2c/d4a/l6f 0
2026-03-10T14:08:37.701 INFO:tasks.workunit.client.0.vm03.stdout:1/623: rename d0/d2/df/d16/d20/fc6 to d0/d2/df/dab/fc7 0
2026-03-10T14:08:37.727 INFO:tasks.workunit.client.0.vm03.stdout:1/624: dread d0/d2/df/d27/f58 [0,4194304] 0
2026-03-10T14:08:37.730 INFO:tasks.workunit.client.0.vm03.stdout:3/704: dread d1d/d39/d51/f82 [4194304,4194304] 0
2026-03-10T14:08:37.743 INFO:tasks.workunit.client.0.vm03.stdout:3/705: rename d1d/d33/d65/d5d/dae/db9/cd1 to d1d/d29/d41/d45/d55/d6e/d83/d91/dd4/ce6 0
2026-03-10T14:08:37.744 INFO:tasks.workunit.client.0.vm03.stdout:3/706: readlink d1d/l30 0
2026-03-10T14:08:37.744 INFO:tasks.workunit.client.0.vm03.stdout:3/707: fdatasync d1d/f66 0
2026-03-10T14:08:37.746 INFO:tasks.workunit.client.0.vm03.stdout:7/554: getdents d5/d9/d14/d26/d5f/d89 0
2026-03-10T14:08:37.751 INFO:tasks.workunit.client.0.vm03.stdout:2/610: getdents d5/d10/da3/dab 0
2026-03-10T14:08:37.754 INFO:tasks.workunit.client.0.vm03.stdout:8/704: write da/f12 [1023343,60470] 0
2026-03-10T14:08:37.760 INFO:tasks.workunit.client.0.vm03.stdout:8/705: dwrite da/d3c/f48 [4194304,4194304] 0
2026-03-10T14:08:37.766 INFO:tasks.workunit.client.0.vm03.stdout:2/611: dread d5/fa [0,4194304] 0
2026-03-10T14:08:37.771 INFO:tasks.workunit.client.0.vm03.stdout:6/631: dwrite d8/db/df/fbc [0,4194304] 0
2026-03-10T14:08:37.778 INFO:tasks.workunit.client.0.vm03.stdout:5/783: write d4/fc2 [32879,130374] 0
2026-03-10T14:08:37.784 INFO:tasks.workunit.client.0.vm03.stdout:9/645: write d2/d14/d2b/d34/f59 [4869127,39126] 0
2026-03-10T14:08:37.804 INFO:tasks.workunit.client.0.vm03.stdout:7/555: creat d5/d9/d3e/d84/fc1 x:0 0 0
2026-03-10T14:08:37.806 INFO:tasks.workunit.client.0.vm03.stdout:8/706: rmdir da/d3a/d44 39
2026-03-10T14:08:37.809 INFO:tasks.workunit.client.0.vm03.stdout:0/638: write d3/d16/d21/f78 [692561,28935] 0
2026-03-10T14:08:37.812 INFO:tasks.workunit.client.0.vm03.stdout:1/625: write d0/d2/df/dab/f9a [2439406,82542] 0
2026-03-10T14:08:37.812 INFO:tasks.workunit.client.0.vm03.stdout:3/708: dread fb [0,4194304] 0
2026-03-10T14:08:37.812 INFO:tasks.workunit.client.0.vm03.stdout:1/626: stat d0/d2/df/d16/lae 0
2026-03-10T14:08:37.812 INFO:tasks.workunit.client.0.vm03.stdout:1/627: fdatasync d0/d2/df/dab/f56 0
2026-03-10T14:08:37.820 INFO:tasks.workunit.client.0.vm03.stdout:6/632: rename d8/db/d12/d51 to d8/d11/da0/dbf 0
2026-03-10T14:08:37.821 INFO:tasks.workunit.client.0.vm03.stdout:5/784: symlink d4/d16/d19/d23/lf8 0
2026-03-10T14:08:37.822 INFO:tasks.workunit.client.0.vm03.stdout:6/633: write d8/db/d49/d6c/d32/d3a/f5e [4632443,42086] 0
2026-03-10T14:08:37.824 INFO:tasks.workunit.client.0.vm03.stdout:9/646: mkdir d2/d29/d33/d41/d46/dd0 0
2026-03-10T14:08:37.825 INFO:tasks.workunit.client.0.vm03.stdout:9/647: read - d2/d29/d33/d60/fc6 zero size
2026-03-10T14:08:37.835 INFO:tasks.workunit.client.0.vm03.stdout:5/785: creat d4/d16/d19/d23/db8/ff9 x:0 0 0
2026-03-10T14:08:37.839 INFO:tasks.workunit.client.0.vm03.stdout:7/556: symlink d5/d9/da7/lc2 0
2026-03-10T14:08:37.840 INFO:tasks.workunit.client.0.vm03.stdout:6/634: truncate f0 2498599 0
2026-03-10T14:08:37.840 INFO:tasks.workunit.client.0.vm03.stdout:9/648: sync
2026-03-10T14:08:37.841 INFO:tasks.workunit.client.0.vm03.stdout:7/557: dread - d5/d9/d14/d26/f38 zero size
2026-03-10T14:08:37.842 INFO:tasks.workunit.client.0.vm03.stdout:9/649: write d2/d14/d2b/f68 [2717704,6017] 0
2026-03-10T14:08:37.842 INFO:tasks.workunit.client.0.vm03.stdout:4/674: rename d5/d9/db/d19/d38/l52 to d5/d47/d5b/dbe/dc4/le0 0
2026-03-10T14:08:37.847 INFO:tasks.workunit.client.0.vm03.stdout:0/639: rename d3/d4d/d47/f5f to d3/d4d/d47/fcd 0
2026-03-10T14:08:37.851 INFO:tasks.workunit.client.0.vm03.stdout:5/786: dread d4/f9e [0,4194304] 0
2026-03-10T14:08:37.861 INFO:tasks.workunit.client.0.vm03.stdout:8/707: dread da/d3c/d51/d85/fc9 [0,4194304] 0
2026-03-10T14:08:37.863 INFO:tasks.workunit.client.0.vm03.stdout:3/709: write d1d/d39/d51/f82 [3579876,128326] 0
2026-03-10T14:08:37.865 INFO:tasks.workunit.client.0.vm03.stdout:5/787: dwrite d4/d16/d19/d23/fee [0,4194304] 0
2026-03-10T14:08:37.870 INFO:tasks.workunit.client.0.vm03.stdout:2/612: getdents d5/d10/d1f/d4f/d76/da7/d40 0
2026-03-10T14:08:37.877 INFO:tasks.workunit.client.0.vm03.stdout:9/650: creat d2/d14/d2b/d79/fd1 x:0 0 0
2026-03-10T14:08:37.889 INFO:tasks.workunit.client.0.vm03.stdout:5/788: creat d4/d13/d8f/ffa x:0 0 0
2026-03-10T14:08:37.891 INFO:tasks.workunit.client.0.vm03.stdout:1/628: getdents d0/d42/d9f 0
2026-03-10T14:08:37.893 INFO:tasks.workunit.client.0.vm03.stdout:2/613: sync
2026-03-10T14:08:37.896 INFO:tasks.workunit.client.0.vm03.stdout:3/710: getdents d1d/d39/d51/de1 0
2026-03-10T14:08:37.901 INFO:tasks.workunit.client.0.vm03.stdout:5/789: read d4/d35/f9b [2204648,101833] 0
2026-03-10T14:08:37.904 INFO:tasks.workunit.client.0.vm03.stdout:4/675: truncate d5/d47/d5b/d64/d85/f9d 2301884 0
2026-03-10T14:08:37.910 INFO:tasks.workunit.client.0.vm03.stdout:7/558: dwrite d5/d9/f7a [0,4194304] 0
2026-03-10T14:08:37.910 INFO:tasks.workunit.client.0.vm03.stdout:0/640: dread d3/d16/d21/d3c/f99 [0,4194304] 0
2026-03-10T14:08:37.911 INFO:tasks.workunit.client.0.vm03.stdout:8/708: dwrite f5 [0,4194304] 0
2026-03-10T14:08:37.918 INFO:tasks.workunit.client.0.vm03.stdout:8/709: dread da/f16 [0,4194304] 0
2026-03-10T14:08:37.918 INFO:tasks.workunit.client.0.vm03.stdout:3/711: chown d1d/d29/db2/lcd 4 1
2026-03-10T14:08:37.918 INFO:tasks.workunit.client.0.vm03.stdout:8/710: write da/d24/db4/fdf [857157,13782] 0
2026-03-10T14:08:37.923 INFO:tasks.workunit.client.0.vm03.stdout:5/790: truncate d4/d16/f1c 2636971 0
2026-03-10T14:08:37.925 INFO:tasks.workunit.client.0.vm03.stdout:4/676: creat d5/d9/db/d19/d38/d7b/daa/daf/fe1 x:0 0 0
2026-03-10T14:08:37.925 INFO:tasks.workunit.client.0.vm03.stdout:9/651: creat d2/fd2 x:0 0 0
2026-03-10T14:08:37.928 INFO:tasks.workunit.client.0.vm03.stdout:0/641: creat d3/d46/dac/fce x:0 0 0
2026-03-10T14:08:37.933 INFO:tasks.workunit.client.0.vm03.stdout:1/629: dread d0/d2/df/d16/d41/fa8 [0,4194304] 0
2026-03-10T14:08:37.943 INFO:tasks.workunit.client.0.vm03.stdout:9/652: dread d2/d14/d2b/d34/f59 [0,4194304] 0
2026-03-10T14:08:37.953 INFO:tasks.workunit.client.0.vm03.stdout:2/614: dread d5/d10/d31/f4e [0,4194304] 0
2026-03-10T14:08:37.963 INFO:tasks.workunit.client.0.vm03.stdout:6/635: rename d8/d1b/f5f to d8/d11/da0/dbf/fc0 0
2026-03-10T14:08:37.965 INFO:tasks.workunit.client.0.vm03.stdout:2/615: sync
2026-03-10T14:08:37.965 INFO:tasks.workunit.client.0.vm03.stdout:2/616: chown d5/d10/d1f 1462325817 1
2026-03-10T14:08:37.966 INFO:tasks.workunit.client.0.vm03.stdout:7/559: dwrite d5/d9/d14/d26/d5f/f8c [0,4194304] 0
2026-03-10T14:08:37.967 INFO:tasks.workunit.client.0.vm03.stdout:2/617: fdatasync d5/d10/d1f/d4f/d76/da7/d40/d59/da2/fbf 0
2026-03-10T14:08:37.978 INFO:tasks.workunit.client.0.vm03.stdout:5/791: truncate d4/d40/fc6 3974588 0
2026-03-10T14:08:37.981 INFO:tasks.workunit.client.0.vm03.stdout:8/711: rmdir da/d36 39
2026-03-10T14:08:37.989 INFO:tasks.workunit.client.0.vm03.stdout:0/642: creat d3/d11/d2c/d4a/fcf x:0 0 0
2026-03-10T14:08:37.991 INFO:tasks.workunit.client.0.vm03.stdout:1/630: creat d0/d2/df/dab/fc8 x:0 0 0
2026-03-10T14:08:37.993 INFO:tasks.workunit.client.0.vm03.stdout:9/653: fsync d2/d29/d33/fa9 0
2026-03-10T14:08:38.005 INFO:tasks.workunit.client.0.vm03.stdout:3/712: rename d1d/d29 to d1d/d33/d47/d53/d68/dcf/de7 0
2026-03-10T14:08:38.007 INFO:tasks.workunit.client.0.vm03.stdout:2/618: fdatasync d5/d2a/f34 0
2026-03-10T14:08:38.008 INFO:tasks.workunit.client.0.vm03.stdout:2/619: chown d5/d10/da3 21255122 1
2026-03-10T14:08:38.011 INFO:tasks.workunit.client.0.vm03.stdout:7/560: rmdir d5/d9/d3e 39
2026-03-10T14:08:38.015 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:38 vm03.local ceph-mon[49718]: mgrmap e30: vm03.rwbbep(active, since 4s), standbys: vm04.ywwcto
2026-03-10T14:08:38.015 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:38 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:38.015 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:38 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:38.030 INFO:tasks.workunit.client.0.vm03.stdout:8/712: creat da/d3c/d4b/d4c/fe5 x:0 0 0
2026-03-10T14:08:38.031 INFO:tasks.workunit.client.0.vm03.stdout:5/792: dwrite d4/d13/d43/f58 [0,4194304] 0
2026-03-10T14:08:38.043 INFO:tasks.workunit.client.0.vm03.stdout:8/713: dread f5 [0,4194304] 0
2026-03-10T14:08:38.053 INFO:tasks.workunit.client.0.vm03.stdout:3/713: readlink d1d/d33/d47/d53/d68/la3 0
2026-03-10T14:08:38.056 INFO:tasks.workunit.client.0.vm03.stdout:2/620: creat d5/d10/d1f/d4f/d76/fca x:0 0 0
2026-03-10T14:08:38.058 INFO:tasks.workunit.client.0.vm03.stdout:7/561: unlink d5/d9/d14/d26/d39/db1/cb2 0
2026-03-10T14:08:38.060 INFO:tasks.workunit.client.0.vm03.stdout:4/677: creat d5/d9/db/d19/d38/fe2 x:0 0 0
2026-03-10T14:08:38.065 INFO:tasks.workunit.client.0.vm03.stdout:0/643: write d3/d16/d21/fa4 [297640,40673] 0
2026-03-10T14:08:38.065 INFO:tasks.workunit.client.0.vm03.stdout:1/631: write d0/d2/df/d16/d20/f5a [1126133,97251] 0
2026-03-10T14:08:38.076 INFO:tasks.workunit.client.0.vm03.stdout:5/793: dwrite d4/f17 [0,4194304] 0
2026-03-10T14:08:38.081 INFO:tasks.workunit.client.0.vm03.stdout:5/794: truncate d4/d40/dbc/fbf 4212483 0
2026-03-10T14:08:38.086 INFO:tasks.workunit.client.0.vm03.stdout:5/795: chown d4/d16/d19/d23/d3f/df3 552 1
2026-03-10T14:08:38.089 INFO:tasks.workunit.client.0.vm03.stdout:6/636: rename d8/d3b/l7e to d8/d11/da0/dbf/d8c/lc1 0
2026-03-10T14:08:38.091 INFO:tasks.workunit.client.0.vm03.stdout:9/654: write d2/d29/d33/d41/daa/fab [1547651,75545] 0
2026-03-10T14:08:38.096 INFO:tasks.workunit.client.0.vm03.stdout:3/714: symlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/le8 0
2026-03-10T14:08:38.099 INFO:tasks.workunit.client.0.vm03.stdout:4/678: creat d5/d47/d5b/dbe/dc4/fe3 x:0 0 0
2026-03-10T14:08:38.107 INFO:tasks.workunit.client.0.vm03.stdout:9/655: dread d2/d14/d2b/d43/fa2 [0,4194304] 0
2026-03-10T14:08:38.108 INFO:tasks.workunit.client.0.vm03.stdout:1/632: write d0/fa [4749704,59929] 0
2026-03-10T14:08:38.117 INFO:tasks.workunit.client.0.vm03.stdout:5/796: creat d4/d16/ffb x:0 0 0
2026-03-10T14:08:38.118 INFO:tasks.workunit.client.0.vm03.stdout:5/797: write d4/d35/f92 [414120,26499] 0
2026-03-10T14:08:38.120 INFO:tasks.workunit.client.0.vm03.stdout:5/798: dread - d4/d16/ffb zero size
2026-03-10T14:08:38.120 INFO:tasks.workunit.client.0.vm03.stdout:5/799: readlink d4/d16/d19/d23/lf8 0
2026-03-10T14:08:38.124 INFO:tasks.workunit.client.0.vm03.stdout:8/714: rename da/d3a/fd5 to da/d3a/d44/fe6 0
2026-03-10T14:08:38.125 INFO:tasks.workunit.client.0.vm03.stdout:8/715: write da/d3c/d4b/d4c/fe5 [279853,70030] 0
2026-03-10T14:08:38.125 INFO:tasks.workunit.client.0.vm03.stdout:5/800: dread d4/d40/d4e/f80 [0,4194304] 0
2026-03-10T14:08:38.126 INFO:tasks.workunit.client.0.vm03.stdout:1/633: dread d0/d2/df/d16/d20/f5a [0,4194304] 0
2026-03-10T14:08:38.132 INFO:tasks.workunit.client.0.vm03.stdout:6/637: write d8/d11/da0/dbf/d5c/d60/f82 [423328,18036] 0
2026-03-10T14:08:38.134 INFO:tasks.workunit.client.0.vm03.stdout:7/562: creat d5/d9/d14/d26/d36/d51/d7b/d9c/fc3 x:0 0 0
2026-03-10T14:08:38.135 INFO:tasks.workunit.client.0.vm03.stdout:6/638: write d8/d11/d18/d54/f5b [4395122,117886] 0
2026-03-10T14:08:38.139 INFO:tasks.workunit.client.0.vm03.stdout:7/563: chown d5/d9/d14/d21/d28/c2a 577 1
2026-03-10T14:08:38.143 INFO:tasks.workunit.client.0.vm03.stdout:2/621: symlink d5/d10/d1f/d4f/d76/da7/lcb 0
2026-03-10T14:08:38.146 INFO:tasks.workunit.client.0.vm03.stdout:3/715: creat d1d/d33/d65/d5d/fe9 x:0 0 0
2026-03-10T14:08:38.146 INFO:tasks.workunit.client.0.vm03.stdout:4/679: symlink d5/d47/d5b/dbe/le4 0
2026-03-10T14:08:38.150 INFO:tasks.workunit.client.0.vm03.stdout:9/656: mkdir d2/d29/d33/d60/d8c/dd3 0
2026-03-10T14:08:38.155 INFO:tasks.workunit.client.0.vm03.stdout:6/639: sync
2026-03-10T14:08:38.162 INFO:tasks.workunit.client.0.vm03.stdout:8/716: truncate da/d3c/d51/d75/dc2/f68 3138014 0
2026-03-10T14:08:38.166 INFO:tasks.workunit.client.0.vm03.stdout:5/801: write d4/d6/f63 [3714025,6073] 0
2026-03-10T14:08:38.169 INFO:tasks.workunit.client.0.vm03.stdout:0/644: dwrite d3/f94 [0,4194304] 0
2026-03-10T14:08:38.171 INFO:tasks.workunit.client.0.vm03.stdout:7/564: read - d5/d9/d3e/d84/fc1 zero size
2026-03-10T14:08:38.171 INFO:tasks.workunit.client.0.vm03.stdout:1/634: write d0/d2/df/d16/f61 [232375,25988] 0
2026-03-10T14:08:38.182 INFO:tasks.workunit.client.0.vm03.stdout:4/680: chown d5/d9/db/f2a 460751989 1
2026-03-10T14:08:38.191 INFO:tasks.workunit.client.0.vm03.stdout:3/716: rmdir d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/dbf 39
2026-03-10T14:08:38.194 INFO:tasks.workunit.client.0.vm03.stdout:8/717: mkdir da/d24/d49/dab/dd7/de7 0
2026-03-10T14:08:38.198 INFO:tasks.workunit.client.0.vm03.stdout:5/802: stat d4/d13/c3c 0
2026-03-10T14:08:38.202 INFO:tasks.workunit.client.0.vm03.stdout:3/717: dread d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f81 [0,4194304] 0
2026-03-10T14:08:38.212 INFO:tasks.workunit.client.0.vm03.stdout:9/657: truncate d2/d29/d38/f4b 5178537 0
2026-03-10T14:08:38.212 INFO:tasks.workunit.client.0.vm03.stdout:1/635: fdatasync d0/d2/df/dab/f3e 0
2026-03-10T14:08:38.215 INFO:tasks.workunit.client.0.vm03.stdout:6/640: fsync d8/d11/d7a/fad 0
2026-03-10T14:08:38.216 INFO:tasks.workunit.client.0.vm03.stdout:3/718: sync
2026-03-10T14:08:38.219 INFO:tasks.workunit.client.0.vm03.stdout:3/719: fdatasync d1d/f66 0
2026-03-10T14:08:38.221 INFO:tasks.workunit.client.0.vm03.stdout:8/718: creat da/d3a/d44/fe8 x:0 0 0
2026-03-10T14:08:38.225 INFO:tasks.workunit.client.0.vm03.stdout:3/720: dwrite d1d/d39/d51/f82 [4194304,4194304] 0
2026-03-10T14:08:38.230 INFO:tasks.workunit.client.0.vm03.stdout:0/645: dread d3/d16/f3e [0,4194304] 0
2026-03-10T14:08:38.235 INFO:tasks.workunit.client.0.vm03.stdout:5/803: truncate d4/d13/d1f/f74 2890309 0
2026-03-10T14:08:38.238 INFO:tasks.workunit.client.0.vm03.stdout:4/681: mkdir d5/db4/dd2/de5 0
2026-03-10T14:08:38.239 INFO:tasks.workunit.client.0.vm03.stdout:4/682: chown d5 16791804 1
2026-03-10T14:08:38.239 INFO:tasks.workunit.client.0.vm03.stdout:1/636: mkdir d0/d42/d9f/dc9 0
2026-03-10T14:08:38.239 INFO:tasks.workunit.client.0.vm03.stdout:2/622: creat d5/d2a/fcc x:0 0 0
2026-03-10T14:08:38.240 INFO:tasks.workunit.client.0.vm03.stdout:4/683: chown d5/db4/dd2/de5 0 1
2026-03-10T14:08:38.242 INFO:tasks.workunit.client.0.vm03.stdout:6/641: write d8/db/d12/f57 [1691302,127371] 0
2026-03-10T14:08:38.252 INFO:tasks.workunit.client.0.vm03.stdout:6/642: dread d8/db/df/f37 [0,4194304] 0
2026-03-10T14:08:38.253 INFO:tasks.workunit.client.0.vm03.stdout:8/719: symlink da/d3c/d51/d75/dc2/le9 0
2026-03-10T14:08:38.260 INFO:tasks.workunit.client.0.vm03.stdout:0/646: rmdir d3/d11/d2c/d4a/d4b/d89 39
2026-03-10T14:08:38.269 INFO:tasks.workunit.client.0.vm03.stdout:7/565: creat d5/fc4 x:0 0 0
2026-03-10T14:08:38.276 INFO:tasks.workunit.client.0.vm03.stdout:7/566: write d5/fc4 [282851,34481] 0
2026-03-10T14:08:38.282 INFO:tasks.workunit.client.0.vm03.stdout:6/643: read f0 [1805427,110504] 0
2026-03-10T14:08:38.285 INFO:tasks.workunit.client.0.vm03.stdout:8/720: symlink da/d24/d49/lea 0
2026-03-10T14:08:38.287 INFO:tasks.workunit.client.0.vm03.stdout:0/647: symlink d3/d11/d2c/d4a/ld0 0
2026-03-10T14:08:38.287 INFO:tasks.workunit.client.0.vm03.stdout:8/721: dread - da/d58/d6c/fd8 zero size
2026-03-10T14:08:38.288 INFO:tasks.workunit.client.0.vm03.stdout:9/658: creat d2/d29/d33/fd4 x:0 0 0
2026-03-10T14:08:38.293 INFO:tasks.workunit.client.0.vm03.stdout:1/637: mknod d0/cca 0
2026-03-10T14:08:38.294 INFO:tasks.workunit.client.0.vm03.stdout:5/804: dread d4/d16/f1c [0,4194304] 0
2026-03-10T14:08:38.296 INFO:tasks.workunit.client.0.vm03.stdout:2/623: mknod d5/d10/d1f/ccd 0
2026-03-10T14:08:38.297 INFO:tasks.workunit.client.0.vm03.stdout:2/624: write d5/d10/d1f/d4f/d76/da7/d54/fc3 [962227,129371] 0
2026-03-10T14:08:38.302 INFO:tasks.workunit.client.0.vm03.stdout:7/567: fsync d5/d9/f22 0
2026-03-10T14:08:38.304 INFO:tasks.workunit.client.0.vm03.stdout:6/644: dread - d8/db/df/fb3 zero size
2026-03-10T14:08:38.305 INFO:tasks.workunit.client.0.vm03.stdout:6/645: read f0 [1844551,1920] 0
2026-03-10T14:08:38.311 INFO:tasks.workunit.client.0.vm03.stdout:4/684: write d5/d6e/f9c [1177490,37563] 0
2026-03-10T14:08:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:38 vm04.local ceph-mon[55966]: mgrmap e30: vm03.rwbbep(active, since 4s), standbys: vm04.ywwcto
2026-03-10T14:08:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:38 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:38 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:38.316 INFO:tasks.workunit.client.0.vm03.stdout:1/638: creat d0/d2/df/d16/d41/dba/fcb x:0 0 0
2026-03-10T14:08:38.317 INFO:tasks.workunit.client.0.vm03.stdout:1/639: chown d0/d18/d3b/f53 6965985 1
2026-03-10T14:08:38.323 INFO:tasks.workunit.client.0.vm03.stdout:6/646: symlink d8/db/d49/lc2 0
2026-03-10T14:08:38.326 INFO:tasks.workunit.client.0.vm03.stdout:3/721: getdents d1d/d33/d65/d5d/dae/db9 0
2026-03-10T14:08:38.327 INFO:tasks.workunit.client.0.vm03.stdout:3/722: chown d1d/d33/l3e 2 1
2026-03-10T14:08:38.332 INFO:tasks.workunit.client.0.vm03.stdout:0/648: creat d3/d4d/d8f/fd1 x:0 0 0
2026-03-10T14:08:38.336 INFO:tasks.workunit.client.0.vm03.stdout:1/640: creat d0/d42/fcc x:0 0 0
2026-03-10T14:08:38.338 INFO:tasks.workunit.client.0.vm03.stdout:1/641: dread - d0/d2/df/d16/d20/fc2 zero size
2026-03-10T14:08:38.338 INFO:tasks.workunit.client.0.vm03.stdout:1/642: dread - d0/d2/df/dab/f6d zero size
2026-03-10T14:08:38.338 INFO:tasks.workunit.client.0.vm03.stdout:1/643: dread - d0/d2/df/d16/d20/fc2 zero size
2026-03-10T14:08:38.338 INFO:tasks.workunit.client.0.vm03.stdout:8/722: write f2 [2200542,65230] 0
2026-03-10T14:08:38.339 INFO:tasks.workunit.client.0.vm03.stdout:6/647: chown d8/d11/da0/dbf/fc0 0 1
2026-03-10T14:08:38.345 INFO:tasks.workunit.client.0.vm03.stdout:3/723: sync
2026-03-10T14:08:38.345 INFO:tasks.workunit.client.0.vm03.stdout:1/644: sync
2026-03-10T14:08:38.348 INFO:tasks.workunit.client.0.vm03.stdout:1/645: truncate d0/d18/d3b/f3d 1030539 0
2026-03-10T14:08:38.353 INFO:tasks.workunit.client.0.vm03.stdout:5/805: truncate d4/d13/d43/fbb 2098649 0
2026-03-10T14:08:38.367 INFO:tasks.workunit.client.0.vm03.stdout:2/625: rename d5/d10/d17/c27 to d5/d10/d1f/d4f/d76/da7/cce 0
2026-03-10T14:08:38.373 INFO:tasks.workunit.client.0.vm03.stdout:7/568: creat d5/d9/d14/d21/fc5 x:0 0 0
2026-03-10T14:08:38.376 INFO:tasks.workunit.client.0.vm03.stdout:6/648: dwrite d8/db/d49/d6c/f66 [0,4194304] 0
2026-03-10T14:08:38.380 INFO:tasks.workunit.client.0.vm03.stdout:1/646: symlink d0/d2/df/d27/d7e/d81/lcd 0
2026-03-10T14:08:38.386 INFO:tasks.workunit.client.0.vm03.stdout:5/806: creat d4/d6/ddc/ffc x:0 0 0
2026-03-10T14:08:38.387 INFO:tasks.workunit.client.0.vm03.stdout:9/659: getdents d2/d29/d33/d6d 0
2026-03-10T14:08:38.387 INFO:tasks.workunit.client.0.vm03.stdout:4/685: rename d5/d96/f98 to d5/d47/d5b/dbe/dc4/fe6 0
2026-03-10T14:08:38.388 INFO:tasks.workunit.client.0.vm03.stdout:2/626: mkdir d5/db4/d74/dcf 0
2026-03-10T14:08:38.390 INFO:tasks.workunit.client.0.vm03.stdout:6/649: dwrite d8/d11/da0/dbf/d5c/d60/f82 [0,4194304] 0
2026-03-10T14:08:38.393 INFO:tasks.workunit.client.0.vm03.stdout:5/807: symlink d4/d6/ddc/lfd 0
2026-03-10T14:08:38.393 INFO:tasks.workunit.client.0.vm03.stdout:8/723: creat da/d58/d6c/feb x:0 0 0
2026-03-10T14:08:38.396 INFO:tasks.workunit.client.0.vm03.stdout:5/808: chown d4/d16/d19/d4a/fbd 42 1
2026-03-10T14:08:38.396 INFO:tasks.workunit.client.0.vm03.stdout:7/569: dwrite d5/d9/d14/d26/d36/f3a [0,4194304] 0
2026-03-10T14:08:38.402 INFO:tasks.workunit.client.0.vm03.stdout:5/809: dread - d4/d16/ffb zero size
2026-03-10T14:08:38.403 INFO:tasks.workunit.client.0.vm03.stdout:2/627: mknod d5/d10/d1f/d4f/cd0 0
2026-03-10T14:08:38.403 INFO:tasks.workunit.client.0.vm03.stdout:4/686: stat d5/d9/db/da7/db9/cd4 0
2026-03-10T14:08:38.403 INFO:tasks.workunit.client.0.vm03.stdout:8/724: rmdir da/d3c/d51 39
2026-03-10T14:08:38.406 INFO:tasks.workunit.client.0.vm03.stdout:4/687: stat d5/c9a 0
2026-03-10T14:08:38.406 INFO:tasks.workunit.client.0.vm03.stdout:4/688: chown d5/l6a 13 1
2026-03-10T14:08:38.428 INFO:tasks.workunit.client.0.vm03.stdout:1/647: creat d0/d2/df/fce x:0 0 0
2026-03-10T14:08:38.429 INFO:tasks.workunit.client.0.vm03.stdout:5/810: dread d4/d6/fa [0,4194304] 0
2026-03-10T14:08:38.434 INFO:tasks.workunit.client.0.vm03.stdout:8/725: mkdir da/d58/d5f/dbc/dec 0
2026-03-10T14:08:38.434 INFO:tasks.workunit.client.0.vm03.stdout:9/660: dread d2/d14/d2b/d79/d8a/f8d [0,4194304] 0
2026-03-10T14:08:38.435 INFO:tasks.workunit.client.0.vm03.stdout:1/648: symlink d0/d2/df/d16/lcf 0
2026-03-10T14:08:38.435 INFO:tasks.workunit.client.0.vm03.stdout:1/649: fdatasync d0/d18/daa/fb1 0
2026-03-10T14:08:38.438 INFO:tasks.workunit.client.0.vm03.stdout:9/661: mknod d2/d29/dcd/cd5 0
2026-03-10T14:08:38.441 INFO:tasks.workunit.client.0.vm03.stdout:8/726: read da/d24/f3d [638878,99923] 0
2026-03-10T14:08:38.446 INFO:tasks.workunit.client.0.vm03.stdout:3/724: write d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f81 [662412,128989] 0
2026-03-10T14:08:38.449 INFO:tasks.workunit.client.0.vm03.stdout:3/725: write d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/fb8 [800041,34776] 0
2026-03-10T14:08:38.459 INFO:tasks.workunit.client.0.vm03.stdout:0/649: write d3/d4d/d30/f8b [874066,71761] 0
2026-03-10T14:08:38.461 INFO:tasks.workunit.client.0.vm03.stdout:3/726: rename d1d/d33/d47/d53/cc2 to d1d/d33/d47/d53/cea 0
2026-03-10T14:08:38.461 INFO:tasks.workunit.client.0.vm03.stdout:0/650: read - d3/d16/d21/f96 zero size 2026-03-10T14:08:38.464 INFO:tasks.workunit.client.0.vm03.stdout:5/811: getdents d4 0 2026-03-10T14:08:38.470 INFO:tasks.workunit.client.0.vm03.stdout:6/650: dwrite d8/db/d49/d6c/f8e [0,4194304] 0 2026-03-10T14:08:38.476 INFO:tasks.workunit.client.0.vm03.stdout:3/727: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/feb x:0 0 0 2026-03-10T14:08:38.477 INFO:tasks.workunit.client.0.vm03.stdout:7/570: write d5/d9/f17 [1367755,94853] 0 2026-03-10T14:08:38.478 INFO:tasks.workunit.client.0.vm03.stdout:5/812: mkdir d4/d16/dfe 0 2026-03-10T14:08:38.479 INFO:tasks.workunit.client.0.vm03.stdout:4/689: write d5/d9/db/d19/d38/d53/d55/f76 [199617,36426] 0 2026-03-10T14:08:38.485 INFO:tasks.workunit.client.0.vm03.stdout:4/690: truncate d5/d9/db/d19/d99/fac 4390598 0 2026-03-10T14:08:38.485 INFO:tasks.workunit.client.0.vm03.stdout:1/650: dread d0/d2/df/d27/f99 [0,4194304] 0 2026-03-10T14:08:38.490 INFO:tasks.workunit.client.0.vm03.stdout:9/662: write d2/d29/d33/fa9 [1277777,110111] 0 2026-03-10T14:08:38.491 INFO:tasks.workunit.client.0.vm03.stdout:3/728: mkdir d1d/d33/d65/d5d/dae/db9/dec 0 2026-03-10T14:08:38.491 INFO:tasks.workunit.client.0.vm03.stdout:6/651: creat d8/db/d49/d6c/d32/fc3 x:0 0 0 2026-03-10T14:08:38.493 INFO:tasks.workunit.client.0.vm03.stdout:9/663: stat d2/d29/d33/d6d/d87 0 2026-03-10T14:08:38.493 INFO:tasks.workunit.client.0.vm03.stdout:4/691: symlink d5/d9/db/d19/d34/le7 0 2026-03-10T14:08:38.495 INFO:tasks.workunit.client.0.vm03.stdout:2/628: dwrite d5/d10/d31/fa9 [0,4194304] 0 2026-03-10T14:08:38.498 INFO:tasks.workunit.client.0.vm03.stdout:5/813: truncate d4/d13/d1f/f70 2346147 0 2026-03-10T14:08:38.500 INFO:tasks.workunit.client.0.vm03.stdout:0/651: sync 2026-03-10T14:08:38.508 INFO:tasks.workunit.client.0.vm03.stdout:3/729: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/db6/fed x:0 0 0 2026-03-10T14:08:38.510 INFO:tasks.workunit.client.0.vm03.stdout:8/727: 
dread da/d3c/d51/d75/dc2/f68 [0,4194304] 0 2026-03-10T14:08:38.515 INFO:tasks.workunit.client.0.vm03.stdout:4/692: rename d5/db4/dd2/de5 to d5/d47/d5b/d64/d85/de8 0 2026-03-10T14:08:38.517 INFO:tasks.workunit.client.0.vm03.stdout:2/629: symlink d5/d35/ld1 0 2026-03-10T14:08:38.519 INFO:tasks.workunit.client.0.vm03.stdout:0/652: dwrite d3/d16/f34 [0,4194304] 0 2026-03-10T14:08:38.534 INFO:tasks.workunit.client.0.vm03.stdout:6/652: creat d8/db/d49/d76/fc4 x:0 0 0 2026-03-10T14:08:38.536 INFO:tasks.workunit.client.0.vm03.stdout:1/651: write d0/d2/df/d16/d41/f68 [5026465,73636] 0 2026-03-10T14:08:38.537 INFO:tasks.workunit.client.0.vm03.stdout:6/653: chown d8/db/df/f37 4670 1 2026-03-10T14:08:38.538 INFO:tasks.workunit.client.0.vm03.stdout:1/652: chown d0/d42/d9f/fa2 6818761 1 2026-03-10T14:08:38.544 INFO:tasks.workunit.client.0.vm03.stdout:7/571: getdents d5/d9/d14/d21/d28 0 2026-03-10T14:08:38.566 INFO:tasks.workunit.client.0.vm03.stdout:3/730: rename d1d/d33/d47/d53/d68/dcf/de7/d41/f56 to d1d/d33/d65/d5d/dae/fee 0 2026-03-10T14:08:38.568 INFO:tasks.workunit.client.0.vm03.stdout:8/728: write da/f16 [258518,29281] 0 2026-03-10T14:08:38.571 INFO:tasks.workunit.client.0.vm03.stdout:4/693: dwrite d5/d47/d5b/dbe/dc4/fe6 [4194304,4194304] 0 2026-03-10T14:08:38.580 INFO:tasks.workunit.client.0.vm03.stdout:2/630: truncate d5/d10/d31/f38 1887240 0 2026-03-10T14:08:38.584 INFO:tasks.workunit.client.0.vm03.stdout:9/664: link d2/d14/d2b/d79/c99 d2/d29/d33/d41/d46/dd0/cd6 0 2026-03-10T14:08:38.598 INFO:tasks.workunit.client.0.vm03.stdout:7/572: fdatasync d5/d9/d14/f4d 0 2026-03-10T14:08:38.598 INFO:tasks.workunit.client.0.vm03.stdout:5/814: rename d4/d16/d19/d23/db8/ff9 to d4/d13/d1f/fff 0 2026-03-10T14:08:38.598 INFO:tasks.workunit.client.0.vm03.stdout:4/694: mkdir d5/d47/d62/de9 0 2026-03-10T14:08:38.598 INFO:tasks.workunit.client.0.vm03.stdout:7/573: truncate d5/d9/d14/d26/f38 382559 0 2026-03-10T14:08:38.604 INFO:tasks.workunit.client.0.vm03.stdout:7/574: stat 
d5/d9/d14/d26/d39/f63 0 2026-03-10T14:08:38.604 INFO:tasks.workunit.client.0.vm03.stdout:8/729: link da/d24/d49/lea da/d24/d49/dab/led 0 2026-03-10T14:08:38.620 INFO:tasks.workunit.client.0.vm03.stdout:4/695: sync 2026-03-10T14:08:38.621 INFO:tasks.workunit.client.0.vm03.stdout:0/653: rename d3/d11/d2c/d4a/d4b/c9f to d3/cd2 0 2026-03-10T14:08:38.624 INFO:tasks.workunit.client.0.vm03.stdout:0/654: write d3/d16/d21/f6a [1025815,127457] 0 2026-03-10T14:08:38.626 INFO:tasks.workunit.client.0.vm03.stdout:0/655: dread - d3/d16/d21/d3c/fc0 zero size 2026-03-10T14:08:38.635 INFO:tasks.workunit.client.0.vm03.stdout:5/815: dread d4/d6/fbe [0,4194304] 0 2026-03-10T14:08:38.636 INFO:tasks.workunit.client.0.vm03.stdout:5/816: chown d4/d13/d1f/c42 6292 1 2026-03-10T14:08:38.637 INFO:tasks.workunit.client.0.vm03.stdout:5/817: write d4/d16/d19/d67/ff1 [70899,27086] 0 2026-03-10T14:08:38.678 INFO:tasks.workunit.client.0.vm03.stdout:8/730: rename da/ddd to da/d24/d49/dee 0 2026-03-10T14:08:38.678 INFO:tasks.workunit.client.0.vm03.stdout:2/631: write d5/d10/d1f/f5e [1056055,95524] 0 2026-03-10T14:08:38.679 INFO:tasks.workunit.client.0.vm03.stdout:6/654: write d8/db/f1f [1478579,34795] 0 2026-03-10T14:08:38.679 INFO:tasks.workunit.client.0.vm03.stdout:1/653: write d0/d18/d3b/f53 [1035256,91073] 0 2026-03-10T14:08:38.683 INFO:tasks.workunit.client.0.vm03.stdout:0/656: mkdir d3/d11/d76/dd3 0 2026-03-10T14:08:38.684 INFO:tasks.workunit.client.0.vm03.stdout:9/665: write d2/d14/f30 [3364119,115434] 0 2026-03-10T14:08:38.691 INFO:tasks.workunit.client.0.vm03.stdout:5/818: dread d4/d40/d4e/f5c [0,4194304] 0 2026-03-10T14:08:38.691 INFO:tasks.workunit.client.0.vm03.stdout:6/655: sync 2026-03-10T14:08:38.691 INFO:tasks.workunit.client.0.vm03.stdout:2/632: sync 2026-03-10T14:08:38.692 INFO:tasks.workunit.client.0.vm03.stdout:2/633: chown d5/d10/dba 14 1 2026-03-10T14:08:38.711 INFO:tasks.workunit.client.0.vm03.stdout:2/634: dread d5/f2d [0,4194304] 0 2026-03-10T14:08:38.716 
INFO:tasks.workunit.client.0.vm03.stdout:4/696: creat d5/d9/db/da7/dc0/fea x:0 0 0 2026-03-10T14:08:38.716 INFO:tasks.workunit.client.0.vm03.stdout:3/731: truncate d1d/d33/d65/d5d/dae/fee 1536217 0 2026-03-10T14:08:38.717 INFO:tasks.workunit.client.0.vm03.stdout:7/575: getdents d5/d9/d3e/d84 0 2026-03-10T14:08:38.719 INFO:tasks.workunit.client.0.vm03.stdout:2/635: dwrite d5/d35/fb2 [0,4194304] 0 2026-03-10T14:08:38.727 INFO:tasks.workunit.client.0.vm03.stdout:0/657: dwrite d3/d4d/da0/fbe [0,4194304] 0 2026-03-10T14:08:38.735 INFO:tasks.workunit.client.0.vm03.stdout:1/654: symlink d0/d2/df/dab/ld0 0 2026-03-10T14:08:38.735 INFO:tasks.workunit.client.0.vm03.stdout:3/732: dwrite d1d/d59/f69 [0,4194304] 0 2026-03-10T14:08:38.739 INFO:tasks.workunit.client.0.vm03.stdout:9/666: truncate d2/d14/d2b/d79/d8a/f8d 3394849 0 2026-03-10T14:08:38.754 INFO:tasks.workunit.client.0.vm03.stdout:6/656: mkdir d8/d11/da0/dbf/d5c/d60/dc5 0 2026-03-10T14:08:38.780 INFO:tasks.workunit.client.0.vm03.stdout:6/657: read d8/d11/da0/dbf/d5c/d60/f82 [146784,53650] 0 2026-03-10T14:08:38.780 INFO:tasks.workunit.client.0.vm03.stdout:6/658: write d8/db/f1f [5231211,127252] 0 2026-03-10T14:08:38.780 INFO:tasks.workunit.client.0.vm03.stdout:7/576: fdatasync d5/fb 0 2026-03-10T14:08:38.785 INFO:tasks.workunit.client.0.vm03.stdout:1/655: mknod d0/d18/daa/cd1 0 2026-03-10T14:08:38.786 INFO:tasks.workunit.client.0.vm03.stdout:1/656: dread d0/d2/df/f6c [0,4194304] 0 2026-03-10T14:08:38.788 INFO:tasks.workunit.client.0.vm03.stdout:0/658: truncate d3/d11/d2c/d4a/f95 73626 0 2026-03-10T14:08:38.795 INFO:tasks.workunit.client.0.vm03.stdout:5/819: fdatasync d4/d13/d1f/f74 0 2026-03-10T14:08:38.797 INFO:tasks.workunit.client.0.vm03.stdout:9/667: readlink d2/d14/d2b/d34/l89 0 2026-03-10T14:08:38.810 INFO:tasks.workunit.client.0.vm03.stdout:7/577: mknod d5/d9/d3e/cc6 0 2026-03-10T14:08:38.825 INFO:tasks.workunit.client.0.vm03.stdout:1/657: creat d0/d2/df/d27/d7e/d81/fd2 x:0 0 0 2026-03-10T14:08:38.841 
INFO:tasks.workunit.client.0.vm03.stdout:1/658: truncate d0/d2/df/d16/f4f 466155 0 2026-03-10T14:08:38.850 INFO:tasks.workunit.client.0.vm03.stdout:7/578: creat d5/d9/d14/d26/d5f/fc7 x:0 0 0 2026-03-10T14:08:38.855 INFO:tasks.workunit.client.0.vm03.stdout:2/636: rmdir d5/d10/d1f/d4f/d76/da7/dbb 0 2026-03-10T14:08:38.871 INFO:tasks.workunit.client.0.vm03.stdout:9/668: getdents d2/d29/d33/d6d 0 2026-03-10T14:08:38.871 INFO:tasks.workunit.client.0.vm03.stdout:1/659: dread d0/d18/f25 [0,4194304] 0 2026-03-10T14:08:38.871 INFO:tasks.workunit.client.0.vm03.stdout:1/660: write d0/d42/f7f [1938798,105387] 0 2026-03-10T14:08:38.871 INFO:tasks.workunit.client.0.vm03.stdout:1/661: readlink d0/d2/df/d27/d7e/l89 0 2026-03-10T14:08:38.871 INFO:tasks.workunit.client.0.vm03.stdout:6/659: link d8/l2b d8/d11/da0/dbf/lc6 0 2026-03-10T14:08:38.871 INFO:tasks.workunit.client.0.vm03.stdout:6/660: dwrite d8/db/d12/f7c [4194304,4194304] 0 2026-03-10T14:08:38.871 INFO:tasks.workunit.client.0.vm03.stdout:1/662: creat d0/d2/df/d91/fd3 x:0 0 0 2026-03-10T14:08:38.871 INFO:tasks.workunit.client.0.vm03.stdout:0/659: sync 2026-03-10T14:08:38.879 INFO:tasks.workunit.client.0.vm03.stdout:2/637: sync 2026-03-10T14:08:38.881 INFO:tasks.workunit.client.0.vm03.stdout:9/669: fsync d2/f83 0 2026-03-10T14:08:38.890 INFO:tasks.workunit.client.0.vm03.stdout:4/697: creat d5/d47/d5b/d64/feb x:0 0 0 2026-03-10T14:08:38.891 INFO:tasks.workunit.client.0.vm03.stdout:1/663: fsync d0/d2/df/f1b 0 2026-03-10T14:08:38.895 INFO:tasks.workunit.client.0.vm03.stdout:8/731: dread f6 [0,4194304] 0 2026-03-10T14:08:38.903 INFO:tasks.workunit.client.0.vm03.stdout:0/660: mkdir d3/d11/d2c/db7/dd4 0 2026-03-10T14:08:38.903 INFO:tasks.workunit.client.0.vm03.stdout:0/661: dwrite d3/d16/d21/f6a [4194304,4194304] 0 2026-03-10T14:08:38.923 INFO:tasks.workunit.client.0.vm03.stdout:9/670: dread d2/d29/d38/f51 [0,4194304] 0 2026-03-10T14:08:38.926 INFO:tasks.workunit.client.0.vm03.stdout:1/664: rename d0/d2/df/d16/c79 to 
d0/d2/df/dab/cd4 0 2026-03-10T14:08:38.927 INFO:tasks.workunit.client.0.vm03.stdout:1/665: write d0/d42/fcc [816186,71970] 0 2026-03-10T14:08:38.930 INFO:tasks.workunit.client.0.vm03.stdout:7/579: mkdir d5/d9/d14/d26/d36/d51/dc8 0 2026-03-10T14:08:38.935 INFO:tasks.workunit.client.0.vm03.stdout:2/638: mknod d5/d10/d17/cd2 0 2026-03-10T14:08:38.937 INFO:tasks.workunit.client.0.vm03.stdout:6/661: link d8/db/d12/cae d8/db/d49/d6c/d32/cc7 0 2026-03-10T14:08:38.941 INFO:tasks.workunit.client.0.vm03.stdout:6/662: dwrite d8/db/d49/d76/fc4 [0,4194304] 0 2026-03-10T14:08:38.957 INFO:tasks.workunit.client.0.vm03.stdout:1/666: creat d0/d42/d9f/fd5 x:0 0 0 2026-03-10T14:08:38.960 INFO:tasks.workunit.client.0.vm03.stdout:8/732: symlink da/d58/d6c/lef 0 2026-03-10T14:08:38.966 INFO:tasks.workunit.client.0.vm03.stdout:7/580: creat d5/d9/d14/d26/d36/d51/d7b/fc9 x:0 0 0 2026-03-10T14:08:38.966 INFO:tasks.workunit.client.0.vm03.stdout:2/639: symlink d5/db4/d74/ld3 0 2026-03-10T14:08:38.966 INFO:tasks.workunit.client.0.vm03.stdout:6/663: write d8/db/d49/d6c/d32/f4b [1749509,37219] 0 2026-03-10T14:08:38.968 INFO:tasks.workunit.client.0.vm03.stdout:0/662: getdents d3/d11/d66 0 2026-03-10T14:08:38.973 INFO:tasks.workunit.client.0.vm03.stdout:1/667: creat d0/d18/d3b/d50/fd6 x:0 0 0 2026-03-10T14:08:38.977 INFO:tasks.workunit.client.0.vm03.stdout:6/664: mkdir d8/db/df/dc8 0 2026-03-10T14:08:38.978 INFO:tasks.workunit.client.0.vm03.stdout:3/733: write d1d/d33/d65/d48/fa5 [132605,101356] 0 2026-03-10T14:08:38.978 INFO:tasks.workunit.client.0.vm03.stdout:6/665: write d8/d11/d18/f34 [4625753,33643] 0 2026-03-10T14:08:38.983 INFO:tasks.workunit.client.0.vm03.stdout:2/640: mknod d5/cd4 0 2026-03-10T14:08:38.990 INFO:tasks.workunit.client.0.vm03.stdout:0/663: dread d3/d4d/f22 [0,4194304] 0 2026-03-10T14:08:38.997 INFO:tasks.workunit.client.0.vm03.stdout:3/734: symlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e/lef 0 2026-03-10T14:08:38.997 INFO:tasks.workunit.client.0.vm03.stdout:1/668: mkdir 
d0/d2/df/d27/dd7 0 2026-03-10T14:08:38.999 INFO:tasks.workunit.client.0.vm03.stdout:6/666: fdatasync d8/d1b/f7f 0 2026-03-10T14:08:39.001 INFO:tasks.workunit.client.0.vm03.stdout:6/667: write d8/db/d12/f57 [4484042,102797] 0 2026-03-10T14:08:39.002 INFO:tasks.workunit.client.0.vm03.stdout:5/820: dwrite d4/d13/d1f/d84/f99 [0,4194304] 0 2026-03-10T14:08:39.006 INFO:tasks.workunit.client.0.vm03.stdout:0/664: sync 2026-03-10T14:08:39.016 INFO:tasks.workunit.client.0.vm03.stdout:3/735: dread d1d/d59/f69 [0,4194304] 0 2026-03-10T14:08:39.020 INFO:tasks.workunit.client.0.vm03.stdout:2/641: creat d5/d10/d1f/d4f/d76/da7/d40/db0/fd5 x:0 0 0 2026-03-10T14:08:39.020 INFO:tasks.workunit.client.0.vm03.stdout:5/821: creat d4/d6/ddc/f100 x:0 0 0 2026-03-10T14:08:39.023 INFO:tasks.workunit.client.0.vm03.stdout:0/665: mknod d3/d11/d66/cd5 0 2026-03-10T14:08:39.023 INFO:tasks.workunit.client.0.vm03.stdout:1/669: mknod d0/d42/d9f/dc9/cd8 0 2026-03-10T14:08:39.025 INFO:tasks.workunit.client.0.vm03.stdout:4/698: dwrite d5/d9/db/d19/d34/f6b [0,4194304] 0 2026-03-10T14:08:39.031 INFO:tasks.workunit.client.0.vm03.stdout:7/581: write d5/d9/f22 [1775290,32934] 0 2026-03-10T14:08:39.031 INFO:tasks.workunit.client.0.vm03.stdout:7/582: chown d5/d9/d14/d26/d36/fa1 876400 1 2026-03-10T14:08:39.032 INFO:tasks.workunit.client.0.vm03.stdout:9/671: dwrite d2/fc [0,4194304] 0 2026-03-10T14:08:39.039 INFO:tasks.workunit.client.0.vm03.stdout:8/733: dwrite da/d24/f3d [0,4194304] 0 2026-03-10T14:08:39.041 INFO:tasks.workunit.client.0.vm03.stdout:8/734: write da/d3c/d51/d85/fcb [1255615,104705] 0 2026-03-10T14:08:39.061 INFO:tasks.workunit.client.0.vm03.stdout:1/670: creat d0/d42/fd9 x:0 0 0 2026-03-10T14:08:39.071 INFO:tasks.workunit.client.0.vm03.stdout:7/583: fdatasync d5/d9/d14/d26/d5f/f8e 0 2026-03-10T14:08:39.076 INFO:tasks.workunit.client.0.vm03.stdout:3/736: rmdir d1d/d33 39 2026-03-10T14:08:39.082 INFO:tasks.workunit.client.0.vm03.stdout:1/671: sync 2026-03-10T14:08:39.086 
INFO:tasks.workunit.client.0.vm03.stdout:6/668: write d8/d1b/d1c/f73 [674112,109500] 0 2026-03-10T14:08:39.087 INFO:tasks.workunit.client.0.vm03.stdout:2/642: write d5/d10/d1f/d4f/d76/da7/f3c [2677266,81082] 0 2026-03-10T14:08:39.091 INFO:tasks.workunit.client.0.vm03.stdout:4/699: symlink d5/d47/d5b/d64/d85/lec 0 2026-03-10T14:08:39.091 INFO:tasks.workunit.client.0.vm03.stdout:4/700: readlink d5/d47/d62/l6c 0 2026-03-10T14:08:39.101 INFO:tasks.workunit.client.0.vm03.stdout:7/584: rmdir d5/d9/d14/d26/d5f 39 2026-03-10T14:08:39.108 INFO:tasks.workunit.client.0.vm03.stdout:3/737: write d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/fdf [917821,110256] 0 2026-03-10T14:08:39.108 INFO:tasks.workunit.client.0.vm03.stdout:1/672: mkdir d0/d2/df/d91/dda 0 2026-03-10T14:08:39.114 INFO:tasks.workunit.client.0.vm03.stdout:6/669: unlink d8/db/d12/f26 0 2026-03-10T14:08:39.116 INFO:tasks.workunit.client.0.vm03.stdout:9/672: link d2/d29/d38/fbf d2/d29/d33/d41/d46/dd0/fd7 0 2026-03-10T14:08:39.121 INFO:tasks.workunit.client.0.vm03.stdout:7/585: symlink d5/d9/d35/dab/lca 0 2026-03-10T14:08:39.127 INFO:tasks.workunit.client.0.vm03.stdout:5/822: dread d4/d16/f71 [0,4194304] 0 2026-03-10T14:08:39.127 INFO:tasks.workunit.client.0.vm03.stdout:4/701: mkdir d5/d9/db/d19/d38/ded 0 2026-03-10T14:08:39.128 INFO:tasks.workunit.client.0.vm03.stdout:5/823: readlink d4/d16/d19/d23/d3f/l49 0 2026-03-10T14:08:39.130 INFO:tasks.workunit.client.0.vm03.stdout:5/824: truncate d4/fc2 1175712 0 2026-03-10T14:08:39.133 INFO:tasks.workunit.client.0.vm03.stdout:2/643: link d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/lb7 d5/db4/d74/d83/dc1/ld6 0 2026-03-10T14:08:39.133 INFO:tasks.workunit.client.0.vm03.stdout:8/735: rename da/d36/c9d to da/d3c/d51/d75/dc2/db7/cf0 0 2026-03-10T14:08:39.138 INFO:tasks.workunit.client.0.vm03.stdout:9/673: rename d2/d29/d33/d41/daa/dce to d2/d29/dcd/dd8 0 2026-03-10T14:08:39.139 INFO:tasks.workunit.client.0.vm03.stdout:5/825: truncate d4/d16/d19/d23/db8/fc3 476060 0 2026-03-10T14:08:39.139 
INFO:tasks.workunit.client.0.vm03.stdout:2/644: fdatasync d5/d2a/f6e 0 2026-03-10T14:08:39.142 INFO:tasks.workunit.client.0.vm03.stdout:5/826: mkdir d4/d16/d19/d23/db8/d101 0 2026-03-10T14:08:39.142 INFO:tasks.workunit.client.0.vm03.stdout:2/645: rmdir d5/d10/da3/dab 39 2026-03-10T14:08:39.144 INFO:tasks.workunit.client.0.vm03.stdout:8/736: sync 2026-03-10T14:08:39.149 INFO:tasks.workunit.client.0.vm03.stdout:9/674: rename d2/d29/dcd/d72/c7e to d2/d29/d9a/cd9 0 2026-03-10T14:08:39.149 INFO:tasks.workunit.client.0.vm03.stdout:4/702: dread d5/d9/db/f24 [0,4194304] 0 2026-03-10T14:08:39.150 INFO:tasks.workunit.client.0.vm03.stdout:9/675: readlink d2/d14/d2b/d34/l4a 0 2026-03-10T14:08:39.154 INFO:tasks.workunit.client.0.vm03.stdout:5/827: mknod d4/d16/d19/d23/db8/d101/c102 0 2026-03-10T14:08:39.155 INFO:tasks.workunit.client.0.vm03.stdout:4/703: stat d5/d9/db/d19/d38/d7b/daa/daf/cb1 0 2026-03-10T14:08:39.157 INFO:tasks.workunit.client.0.vm03.stdout:5/828: dread - d4/d16/d19/d6e/d7f/dd1/fda zero size 2026-03-10T14:08:39.159 INFO:tasks.workunit.client.0.vm03.stdout:4/704: mknod d5/d47/d5b/d64/cee 0 2026-03-10T14:08:39.161 INFO:tasks.workunit.client.0.vm03.stdout:4/705: mkdir d5/d9/db/da7/db9/def 0 2026-03-10T14:08:39.163 INFO:tasks.workunit.client.0.vm03.stdout:4/706: creat d5/d9/db/ff0 x:0 0 0 2026-03-10T14:08:39.163 INFO:tasks.workunit.client.0.vm03.stdout:9/676: dread d2/f4c [0,4194304] 0 2026-03-10T14:08:39.164 INFO:tasks.workunit.client.0.vm03.stdout:4/707: chown d5/d9/c74 467627111 1 2026-03-10T14:08:39.168 INFO:tasks.workunit.client.0.vm03.stdout:9/677: symlink d2/d29/dcd/dd8/lda 0 2026-03-10T14:08:39.169 INFO:tasks.workunit.client.0.vm03.stdout:4/708: creat d5/d9/db/d19/d99/ff1 x:0 0 0 2026-03-10T14:08:39.230 INFO:tasks.workunit.client.0.vm03.stdout:3/738: write d1d/d33/d47/d53/d68/dcf/de7/f2e [3038255,12168] 0 2026-03-10T14:08:39.233 INFO:tasks.workunit.client.0.vm03.stdout:3/739: link d1d/d39/d51/d72/d99/fbc d1d/d33/d47/d53/d68/ff0 0 2026-03-10T14:08:39.236 
INFO:tasks.workunit.client.0.vm03.stdout:3/740: creat d1d/d33/ff1 x:0 0 0 2026-03-10T14:08:39.246 INFO:tasks.workunit.client.0.vm03.stdout:3/741: sync 2026-03-10T14:08:39.248 INFO:tasks.workunit.client.0.vm03.stdout:1/673: dwrite d0/d2/df/dab/f73 [0,4194304] 0 2026-03-10T14:08:39.251 INFO:tasks.workunit.client.0.vm03.stdout:3/742: creat d1d/d39/ff2 x:0 0 0 2026-03-10T14:08:39.267 INFO:tasks.workunit.client.0.vm03.stdout:3/743: dread - d1d/d33/d47/d53/d68/dcf/de7/d41/f70 zero size 2026-03-10T14:08:39.271 INFO:tasks.workunit.client.0.vm03.stdout:1/674: dwrite d0/d2/df/d16/d20/f38 [4194304,4194304] 0 2026-03-10T14:08:39.282 INFO:tasks.workunit.client.0.vm03.stdout:1/675: dread d0/f70 [0,4194304] 0 2026-03-10T14:08:39.287 INFO:tasks.workunit.client.0.vm03.stdout:6/670: write d8/db/df/fb3 [489483,114388] 0 2026-03-10T14:08:39.289 INFO:tasks.workunit.client.0.vm03.stdout:6/671: chown d8/d11/da0/dbf 2718940 1 2026-03-10T14:08:39.289 INFO:tasks.workunit.client.0.vm03.stdout:6/672: fdatasync d8/d1b/f7f 0 2026-03-10T14:08:39.297 INFO:tasks.workunit.client.0.vm03.stdout:7/586: dwrite d5/d9/d14/d26/d36/fa4 [0,4194304] 0 2026-03-10T14:08:39.297 INFO:tasks.workunit.client.0.vm03.stdout:1/676: creat d0/d2/df/d27/d7e/d81/fdb x:0 0 0 2026-03-10T14:08:39.298 INFO:tasks.workunit.client.0.vm03.stdout:7/587: write d5/d9/d14/f9b [1021226,129955] 0 2026-03-10T14:08:39.301 INFO:tasks.workunit.client.0.vm03.stdout:6/673: creat d8/d3b/da7/fc9 x:0 0 0 2026-03-10T14:08:39.308 INFO:tasks.workunit.client.0.vm03.stdout:6/674: chown d8/l2f 1340 1 2026-03-10T14:08:39.308 INFO:tasks.workunit.client.0.vm03.stdout:2/646: dwrite d5/d10/dba/f9a [0,4194304] 0 2026-03-10T14:08:39.313 INFO:tasks.workunit.client.0.vm03.stdout:6/675: mkdir d8/d11/da0/dca 0 2026-03-10T14:08:39.316 INFO:tasks.workunit.client.0.vm03.stdout:5/829: dwrite d4/d40/f96 [0,4194304] 0 2026-03-10T14:08:39.318 INFO:tasks.workunit.client.0.vm03.stdout:8/737: read da/d36/d40/f47 [3528653,120297] 0 2026-03-10T14:08:39.319 
INFO:tasks.workunit.client.0.vm03.stdout:6/676: mkdir d8/db/d49/d6c/d32/dcb 0 2026-03-10T14:08:39.336 INFO:tasks.workunit.client.0.vm03.stdout:3/744: fsync d1d/d39/d51/d72/d99/fbc 0 2026-03-10T14:08:39.337 INFO:tasks.workunit.client.0.vm03.stdout:9/678: write d2/d29/d38/f4b [5014678,66991] 0 2026-03-10T14:08:39.342 INFO:tasks.workunit.client.0.vm03.stdout:5/830: mknod d4/d6/ddc/c103 0 2026-03-10T14:08:39.343 INFO:tasks.workunit.client.0.vm03.stdout:2/647: rename d5/d10/l56 to d5/d10/d1f/d4f/d76/da7/d40/d59/ld7 0 2026-03-10T14:08:39.343 INFO:tasks.workunit.client.0.vm03.stdout:0/666: dwrite d3/d11/d2c/d4a/f95 [0,4194304] 0 2026-03-10T14:08:39.348 INFO:tasks.workunit.client.0.vm03.stdout:4/709: dwrite d5/d9/db/d19/d38/f86 [0,4194304] 0 2026-03-10T14:08:39.360 INFO:tasks.workunit.client.0.vm03.stdout:0/667: write d3/d11/d2c/d4a/fcf [690387,38600] 0 2026-03-10T14:08:39.360 INFO:tasks.workunit.client.0.vm03.stdout:4/710: dread - d5/d47/f65 zero size 2026-03-10T14:08:39.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:39 vm03.local ceph-mon[49718]: pgmap v6: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 27 MiB/s rd, 60 MiB/s wr, 206 op/s 2026-03-10T14:08:39.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:39 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:39.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:39 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:39.360 INFO:tasks.workunit.client.0.vm03.stdout:0/668: stat d3/d46/f55 0 2026-03-10T14:08:39.361 INFO:tasks.workunit.client.0.vm03.stdout:4/711: fdatasync d5/d9/db/d19/d99/fd7 0 2026-03-10T14:08:39.406 INFO:tasks.workunit.client.0.vm03.stdout:3/745: fsync d1d/d33/d47/d53/d68/dcf/de7/d41/f4e 0 2026-03-10T14:08:39.422 INFO:tasks.workunit.client.0.vm03.stdout:1/677: write d0/d2/df/dab/f88 [3439808,7755] 0 2026-03-10T14:08:39.422 INFO:tasks.workunit.client.0.vm03.stdout:7/588: 
truncate d5/d9/d14/d26/d5f/f8c 3542506 0 2026-03-10T14:08:39.422 INFO:tasks.workunit.client.0.vm03.stdout:1/678: stat d0/f10 0 2026-03-10T14:08:39.427 INFO:tasks.workunit.client.0.vm03.stdout:9/679: fsync d2/d14/f64 0 2026-03-10T14:08:39.449 INFO:tasks.workunit.client.0.vm03.stdout:8/738: dwrite da/d3c/f9e [0,4194304] 0 2026-03-10T14:08:39.449 INFO:tasks.workunit.client.0.vm03.stdout:6/677: truncate d8/db/d49/d6c/f66 644177 0 2026-03-10T14:08:39.467 INFO:tasks.workunit.client.0.vm03.stdout:8/739: chown da/d3c/d4b/d4c/fe5 2 1 2026-03-10T14:08:39.474 INFO:tasks.workunit.client.0.vm03.stdout:6/678: mknod d8/d3b/da7/ccc 0 2026-03-10T14:08:39.475 INFO:tasks.workunit.client.0.vm03.stdout:6/679: dread - d8/db/d12/d64/fba zero size 2026-03-10T14:08:39.475 INFO:tasks.workunit.client.0.vm03.stdout:6/680: fsync d8/db/d12/f40 0 2026-03-10T14:08:39.476 INFO:tasks.workunit.client.0.vm03.stdout:6/681: chown d8/d11/d7a 63435 1 2026-03-10T14:08:39.481 INFO:tasks.workunit.client.0.vm03.stdout:0/669: mkdir d3/d16/d21/dd6 0 2026-03-10T14:08:39.481 INFO:tasks.workunit.client.0.vm03.stdout:7/589: dwrite d5/d9/d14/f41 [0,4194304] 0 2026-03-10T14:08:39.488 INFO:tasks.workunit.client.0.vm03.stdout:4/712: mknod d5/d47/d62/d8a/dd0/cf2 0 2026-03-10T14:08:39.490 INFO:tasks.workunit.client.0.vm03.stdout:3/746: symlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/dd5/lf3 0 2026-03-10T14:08:39.492 INFO:tasks.workunit.client.0.vm03.stdout:1/679: dwrite d0/d2/df/d27/d7e/d81/fa3 [0,4194304] 0 2026-03-10T14:08:39.492 INFO:tasks.workunit.client.0.vm03.stdout:3/747: dread - d1d/d39/d51/f71 zero size 2026-03-10T14:08:39.494 INFO:tasks.workunit.client.0.vm03.stdout:3/748: stat d1d/d33/d47/d53/d68/dcf/de7/faf 0 2026-03-10T14:08:39.494 INFO:tasks.workunit.client.0.vm03.stdout:3/749: chown d1d/d33/d65/d5d 14 1 2026-03-10T14:08:39.499 INFO:tasks.workunit.client.0.vm03.stdout:2/648: link d5/d10/d1f/d4f/d76/da7/d40/d59/faa d5/db4/d74/dcf/fd8 0 2026-03-10T14:08:39.500 
INFO:tasks.workunit.client.0.vm03.stdout:6/682: chown d8/d3b/l98 2360384 1 2026-03-10T14:08:39.501 INFO:tasks.workunit.client.0.vm03.stdout:7/590: dwrite d5/d9/d14/f9b [0,4194304] 0 2026-03-10T14:08:39.506 INFO:tasks.workunit.client.0.vm03.stdout:0/670: write d3/d17/fb1 [2349523,15278] 0 2026-03-10T14:08:39.506 INFO:tasks.workunit.client.0.vm03.stdout:1/680: write d0/d2/df/d27/faf [1755434,81907] 0 2026-03-10T14:08:39.506 INFO:tasks.workunit.client.0.vm03.stdout:1/681: readlink d0/d2/df/l5d 0 2026-03-10T14:08:39.510 INFO:tasks.workunit.client.0.vm03.stdout:7/591: write d5/d9/d14/d26/d5f/d89/fbd [548998,110348] 0 2026-03-10T14:08:39.512 INFO:tasks.workunit.client.0.vm03.stdout:4/713: dread d5/d9/db/f12 [0,4194304] 0 2026-03-10T14:08:39.528 INFO:tasks.workunit.client.0.vm03.stdout:3/750: creat d1d/d33/d47/d53/d68/dcf/de7/db2/ff4 x:0 0 0 2026-03-10T14:08:39.529 INFO:tasks.workunit.client.0.vm03.stdout:5/831: getdents d4/d13/d1f/d8c 0 2026-03-10T14:08:39.535 INFO:tasks.workunit.client.0.vm03.stdout:8/740: mkdir da/d36/d4d/da5/df1 0 2026-03-10T14:08:39.538 INFO:tasks.workunit.client.0.vm03.stdout:3/751: dread d1d/d33/f80 [0,4194304] 0 2026-03-10T14:08:39.545 INFO:tasks.workunit.client.0.vm03.stdout:3/752: dwrite d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/fb5 [0,4194304] 0 2026-03-10T14:08:39.548 INFO:tasks.workunit.client.0.vm03.stdout:1/682: mknod d0/d2/df/d27/d7e/d81/cdc 0 2026-03-10T14:08:39.557 INFO:tasks.workunit.client.0.vm03.stdout:1/683: dread d0/d2/df/d16/d41/fa8 [4194304,4194304] 0 2026-03-10T14:08:39.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:39 vm04.local ceph-mon[55966]: pgmap v6: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 27 MiB/s rd, 60 MiB/s wr, 206 op/s 2026-03-10T14:08:39.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:39 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:39.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:39 vm04.local 
ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:39.571 INFO:tasks.workunit.client.0.vm03.stdout:3/753: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/dce/ff5 x:0 0 0 2026-03-10T14:08:39.579 INFO:tasks.workunit.client.0.vm03.stdout:1/684: symlink d0/d2/d71/d90/ldd 0 2026-03-10T14:08:39.579 INFO:tasks.workunit.client.0.vm03.stdout:0/671: creat d3/fd7 x:0 0 0 2026-03-10T14:08:39.579 INFO:tasks.workunit.client.0.vm03.stdout:3/754: mkdir d1d/d39/df6 0 2026-03-10T14:08:39.579 INFO:tasks.workunit.client.0.vm03.stdout:3/755: write d1d/d39/d51/d72/fab [2663698,65422] 0 2026-03-10T14:08:39.580 INFO:tasks.workunit.client.0.vm03.stdout:3/756: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/cb1 36540403 1 2026-03-10T14:08:39.582 INFO:tasks.workunit.client.0.vm03.stdout:1/685: unlink d0/d2/df/d27/d7e/f95 0 2026-03-10T14:08:39.584 INFO:tasks.workunit.client.0.vm03.stdout:4/714: creat d5/d9/db/ff3 x:0 0 0 2026-03-10T14:08:39.585 INFO:tasks.workunit.client.0.vm03.stdout:4/715: dread d5/d9/db/f12 [0,4194304] 0 2026-03-10T14:08:39.590 INFO:tasks.workunit.client.0.vm03.stdout:2/649: getdents d5/d10/d1f/d4f/d76/da7 0 2026-03-10T14:08:39.591 INFO:tasks.workunit.client.0.vm03.stdout:8/741: sync 2026-03-10T14:08:39.595 INFO:tasks.workunit.client.0.vm03.stdout:3/757: write d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/f9d [2191887,10963] 0 2026-03-10T14:08:39.598 INFO:tasks.workunit.client.0.vm03.stdout:3/758: readlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/dd5/lf3 0 2026-03-10T14:08:39.598 INFO:tasks.workunit.client.0.vm03.stdout:1/686: mknod d0/d18/d3b/d50/cde 0 2026-03-10T14:08:39.602 INFO:tasks.workunit.client.0.vm03.stdout:8/742: sync 2026-03-10T14:08:39.606 INFO:tasks.workunit.client.0.vm03.stdout:7/592: fdatasync d5/d9/d14/f9b 0 2026-03-10T14:08:39.611 INFO:tasks.workunit.client.0.vm03.stdout:9/680: write d2/f37 [1695068,50365] 0 2026-03-10T14:08:39.614 INFO:tasks.workunit.client.0.vm03.stdout:1/687: dwrite d0/d2/df/d16/f61 [4194304,4194304] 0 
2026-03-10T14:08:39.616 INFO:tasks.workunit.client.0.vm03.stdout:4/716: read - d5/d9/db/d19/d38/d53/f95 zero size 2026-03-10T14:08:39.617 INFO:tasks.workunit.client.0.vm03.stdout:4/717: readlink d5/d47/d62/l9e 0 2026-03-10T14:08:39.621 INFO:tasks.workunit.client.0.vm03.stdout:5/832: write d4/d40/fcb [23032,21287] 0 2026-03-10T14:08:39.624 INFO:tasks.workunit.client.0.vm03.stdout:4/718: sync 2026-03-10T14:08:39.627 INFO:tasks.workunit.client.0.vm03.stdout:6/683: dwrite d8/db/d49/d6c/d32/f3e [0,4194304] 0 2026-03-10T14:08:39.635 INFO:tasks.workunit.client.0.vm03.stdout:1/688: dwrite d0/d18/d3b/d50/fd6 [0,4194304] 0 2026-03-10T14:08:39.640 INFO:tasks.workunit.client.0.vm03.stdout:1/689: fsync d0/d2/df/fce 0 2026-03-10T14:08:39.647 INFO:tasks.workunit.client.0.vm03.stdout:1/690: dread d0/d2/df/dab/f73 [0,4194304] 0 2026-03-10T14:08:39.649 INFO:tasks.workunit.client.0.vm03.stdout:2/650: creat d5/db4/fd9 x:0 0 0 2026-03-10T14:08:39.660 INFO:tasks.workunit.client.0.vm03.stdout:5/833: dread d4/d16/d19/d23/db8/fba [0,4194304] 0 2026-03-10T14:08:39.666 INFO:tasks.workunit.client.0.vm03.stdout:5/834: dread d4/d40/fcb [0,4194304] 0 2026-03-10T14:08:39.671 INFO:tasks.workunit.client.0.vm03.stdout:3/759: mknod d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/cf7 0 2026-03-10T14:08:39.672 INFO:tasks.workunit.client.0.vm03.stdout:4/719: symlink d5/db4/lf4 0 2026-03-10T14:08:39.673 INFO:tasks.workunit.client.0.vm03.stdout:8/743: write da/d3a/d44/fe6 [828856,116905] 0 2026-03-10T14:08:39.673 INFO:tasks.workunit.client.0.vm03.stdout:6/684: chown d8/db/d49/d6c/d32/f94 95562 1 2026-03-10T14:08:39.673 INFO:tasks.workunit.client.0.vm03.stdout:3/760: fdatasync d1d/d33/d47/dac/dc1/fe2 0 2026-03-10T14:08:39.676 INFO:tasks.workunit.client.0.vm03.stdout:3/761: chown d1d/d33/d47/l54 11617 1 2026-03-10T14:08:39.676 INFO:tasks.workunit.client.0.vm03.stdout:4/720: dread - d5/d9/db/d19/d38/d53/d55/fa6 zero size 2026-03-10T14:08:39.682 INFO:tasks.workunit.client.0.vm03.stdout:3/762: write 
d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/fb8 [1657540,8887] 0 2026-03-10T14:08:39.682 INFO:tasks.workunit.client.0.vm03.stdout:1/691: symlink d0/d2/d71/d90/ldf 0 2026-03-10T14:08:39.684 INFO:tasks.workunit.client.0.vm03.stdout:1/692: chown d0/f48 2095969 1 2026-03-10T14:08:39.691 INFO:tasks.workunit.client.0.vm03.stdout:7/593: mknod d5/d9/d14/d26/d36/ccb 0 2026-03-10T14:08:39.695 INFO:tasks.workunit.client.0.vm03.stdout:7/594: chown d5/d9/d14/d26/d36/l3d 6627539 1 2026-03-10T14:08:39.703 INFO:tasks.workunit.client.0.vm03.stdout:1/693: dwrite d0/f48 [0,4194304] 0 2026-03-10T14:08:39.703 INFO:tasks.workunit.client.0.vm03.stdout:6/685: mkdir d8/db/df/dcd 0 2026-03-10T14:08:39.706 INFO:tasks.workunit.client.0.vm03.stdout:8/744: rename da/c7a to da/d58/d5f/dbc/d70/d99/cf2 0 2026-03-10T14:08:39.709 INFO:tasks.workunit.client.0.vm03.stdout:4/721: write d5/d9/db/f20 [1165692,97500] 0 2026-03-10T14:08:39.728 INFO:tasks.workunit.client.0.vm03.stdout:5/835: mknod d4/d16/df4/c104 0 2026-03-10T14:08:39.730 INFO:tasks.workunit.client.0.vm03.stdout:9/681: creat d2/fdb x:0 0 0 2026-03-10T14:08:39.731 INFO:tasks.workunit.client.0.vm03.stdout:3/763: symlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/lf8 0 2026-03-10T14:08:39.735 INFO:tasks.workunit.client.0.vm03.stdout:2/651: dwrite d5/d10/d17/f20 [0,4194304] 0 2026-03-10T14:08:39.737 INFO:tasks.workunit.client.0.vm03.stdout:6/686: symlink d8/db/d49/d58/lce 0 2026-03-10T14:08:39.740 INFO:tasks.workunit.client.0.vm03.stdout:6/687: write d8/db/d12/fa1 [7527408,69520] 0 2026-03-10T14:08:39.766 INFO:tasks.workunit.client.0.vm03.stdout:8/745: truncate da/d3a/f9a 1040110 0 2026-03-10T14:08:39.769 INFO:tasks.workunit.client.0.vm03.stdout:8/746: truncate da/d3c/d51/d75/dc2/fe1 924126 0 2026-03-10T14:08:39.776 INFO:tasks.workunit.client.0.vm03.stdout:3/764: mkdir d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/dd4/df9 0 2026-03-10T14:08:39.776 INFO:tasks.workunit.client.0.vm03.stdout:3/765: readlink d1d/d33/d65/d5d/ldd 0 
2026-03-10T14:08:39.777 INFO:tasks.workunit.client.0.vm03.stdout:2/652: symlink d5/db4/lda 0 2026-03-10T14:08:39.778 INFO:tasks.workunit.client.0.vm03.stdout:2/653: chown d5/d10/dba 25839396 1 2026-03-10T14:08:39.778 INFO:tasks.workunit.client.0.vm03.stdout:3/766: dread - d1d/d33/d47/d53/d68/f86 zero size 2026-03-10T14:08:39.779 INFO:tasks.workunit.client.0.vm03.stdout:3/767: stat d1d/d39/df6 0 2026-03-10T14:08:39.782 INFO:tasks.workunit.client.0.vm03.stdout:2/654: write d5/d10/d31/fa9 [3131077,17962] 0 2026-03-10T14:08:39.783 INFO:tasks.workunit.client.0.vm03.stdout:2/655: stat d5/d10/d1f 0 2026-03-10T14:08:39.784 INFO:tasks.workunit.client.0.vm03.stdout:7/595: dwrite d5/d9/d14/d26/d39/f6a [0,4194304] 0 2026-03-10T14:08:39.788 INFO:tasks.workunit.client.0.vm03.stdout:3/768: truncate d1d/d33/d47/d53/d68/dcf/de7/d41/f75 8488075 0 2026-03-10T14:08:39.790 INFO:tasks.workunit.client.0.vm03.stdout:4/722: write d5/d9/db/f12 [894863,62748] 0 2026-03-10T14:08:39.790 INFO:tasks.workunit.client.0.vm03.stdout:0/672: truncate d3/d4d/d30/f7a 5103558 0 2026-03-10T14:08:39.794 INFO:tasks.workunit.client.0.vm03.stdout:6/688: dread d8/d11/f35 [0,4194304] 0 2026-03-10T14:08:39.794 INFO:tasks.workunit.client.0.vm03.stdout:3/769: write d1d/d33/d47/d53/d68/ff0 [2855001,75493] 0 2026-03-10T14:08:39.799 INFO:tasks.workunit.client.0.vm03.stdout:8/747: creat da/d24/d49/dab/dd7/ff3 x:0 0 0 2026-03-10T14:08:39.802 INFO:tasks.workunit.client.0.vm03.stdout:6/689: chown d8/d11/d18/d54 1 1 2026-03-10T14:08:39.803 INFO:tasks.workunit.client.0.vm03.stdout:6/690: dread - d8/db/d12/f72 zero size 2026-03-10T14:08:39.806 INFO:tasks.workunit.client.0.vm03.stdout:1/694: creat d0/fe0 x:0 0 0 2026-03-10T14:08:39.809 INFO:tasks.workunit.client.0.vm03.stdout:8/748: mknod da/d58/d5f/d67/cf4 0 2026-03-10T14:08:39.809 INFO:tasks.workunit.client.0.vm03.stdout:4/723: dwrite d5/d47/d5b/d64/feb [0,4194304] 0 2026-03-10T14:08:39.811 INFO:tasks.workunit.client.0.vm03.stdout:5/836: dwrite d4/d40/f5b [4194304,4194304] 
0 2026-03-10T14:08:39.813 INFO:tasks.workunit.client.0.vm03.stdout:9/682: dwrite d2/d14/d2b/d34/f59 [0,4194304] 0 2026-03-10T14:08:39.813 INFO:tasks.workunit.client.0.vm03.stdout:3/770: mknod d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/dd5/cfa 0 2026-03-10T14:08:39.837 INFO:tasks.workunit.client.0.vm03.stdout:4/724: readlink d5/d47/l40 0 2026-03-10T14:08:39.839 INFO:tasks.workunit.client.0.vm03.stdout:1/695: fdatasync d0/d2/df/f6c 0 2026-03-10T14:08:39.840 INFO:tasks.workunit.client.0.vm03.stdout:0/673: dwrite d3/d16/d21/fa4 [0,4194304] 0 2026-03-10T14:08:39.840 INFO:tasks.workunit.client.0.vm03.stdout:8/749: rename da/d3c/d4b/d69/l94 to da/d58/d5f/d67/lf5 0 2026-03-10T14:08:39.849 INFO:tasks.workunit.client.0.vm03.stdout:2/656: fsync d5/d10/d1f/d4f/d76/da7/d40/d59/faa 0 2026-03-10T14:08:39.850 INFO:tasks.workunit.client.0.vm03.stdout:0/674: write d3/d11/d2c/d4a/f95 [831910,86244] 0 2026-03-10T14:08:39.865 INFO:tasks.workunit.client.0.vm03.stdout:3/771: dwrite d1d/d39/ff2 [0,4194304] 0 2026-03-10T14:08:39.873 INFO:tasks.workunit.client.0.vm03.stdout:0/675: symlink d3/d46/dac/ld8 0 2026-03-10T14:08:39.873 INFO:tasks.workunit.client.0.vm03.stdout:2/657: dread - d5/d10/d1f/d4f/d76/da7/d40/d59/f63 zero size 2026-03-10T14:08:39.877 INFO:tasks.workunit.client.0.vm03.stdout:7/596: dwrite d5/d9/d14/d26/f64 [0,4194304] 0 2026-03-10T14:08:39.881 INFO:tasks.workunit.client.0.vm03.stdout:3/772: truncate d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/fa2 935822 0 2026-03-10T14:08:39.899 INFO:tasks.workunit.client.0.vm03.stdout:0/676: dwrite d3/d4d/da0/fbe [0,4194304] 0 2026-03-10T14:08:39.905 INFO:tasks.workunit.client.0.vm03.stdout:7/597: mkdir d5/d9/d3e/dcc 0 2026-03-10T14:08:39.913 INFO:tasks.workunit.client.0.vm03.stdout:8/750: dread da/d3c/d4b/f63 [0,4194304] 0 2026-03-10T14:08:39.914 INFO:tasks.workunit.client.0.vm03.stdout:2/658: write d5/d10/d31/f3d [4111340,80711] 0 2026-03-10T14:08:39.916 INFO:tasks.workunit.client.0.vm03.stdout:2/659: readlink d5/d10/d1f/d4f/l87 0 
2026-03-10T14:08:39.920 INFO:tasks.workunit.client.0.vm03.stdout:3/773: unlink d1d/d33/d47/d53/d68/dcf/de7/db2/ff4 0 2026-03-10T14:08:39.922 INFO:tasks.workunit.client.0.vm03.stdout:1/696: creat d0/d2/df/fe1 x:0 0 0 2026-03-10T14:08:39.922 INFO:tasks.workunit.client.0.vm03.stdout:6/691: write d8/db/d12/f72 [678079,56334] 0 2026-03-10T14:08:39.923 INFO:tasks.workunit.client.0.vm03.stdout:5/837: getdents d4/d16/d19/d23/d3f 0 2026-03-10T14:08:39.929 INFO:tasks.workunit.client.0.vm03.stdout:0/677: creat d3/d4d/d47/fd9 x:0 0 0 2026-03-10T14:08:39.934 INFO:tasks.workunit.client.0.vm03.stdout:7/598: unlink d5/f66 0 2026-03-10T14:08:39.934 INFO:tasks.workunit.client.0.vm03.stdout:4/725: truncate d5/d9/db/d19/d38/f86 837557 0 2026-03-10T14:08:39.935 INFO:tasks.workunit.client.0.vm03.stdout:2/660: truncate d5/d10/d17/f28 2792021 0 2026-03-10T14:08:39.935 INFO:tasks.workunit.client.0.vm03.stdout:2/661: stat d5/d10/l29 0 2026-03-10T14:08:39.942 INFO:tasks.workunit.client.0.vm03.stdout:6/692: mknod d8/d11/da0/dbf/d86/ccf 0 2026-03-10T14:08:39.945 INFO:tasks.workunit.client.0.vm03.stdout:7/599: creat d5/d9/d14/d26/d5f/d89/fcd x:0 0 0 2026-03-10T14:08:39.946 INFO:tasks.workunit.client.0.vm03.stdout:9/683: dwrite d2/d29/d33/d41/f50 [0,4194304] 0 2026-03-10T14:08:39.952 INFO:tasks.workunit.client.0.vm03.stdout:2/662: creat d5/d10/da3/dab/fdb x:0 0 0 2026-03-10T14:08:39.959 INFO:tasks.workunit.client.0.vm03.stdout:0/678: truncate d3/f28 387051 0 2026-03-10T14:08:39.971 INFO:tasks.workunit.client.0.vm03.stdout:0/679: dread d3/d4d/da0/fbe [0,4194304] 0 2026-03-10T14:08:39.971 INFO:tasks.workunit.client.0.vm03.stdout:9/684: dwrite d2/f37 [0,4194304] 0 2026-03-10T14:08:39.971 INFO:tasks.workunit.client.0.vm03.stdout:0/680: write d3/d16/d21/d9a/fc3 [523951,46665] 0 2026-03-10T14:08:39.971 INFO:tasks.workunit.client.0.vm03.stdout:6/693: mkdir d8/db/dd0 0 2026-03-10T14:08:39.971 INFO:tasks.workunit.client.0.vm03.stdout:2/663: unlink d5/d10/d17/f33 0 2026-03-10T14:08:39.972 
INFO:tasks.workunit.client.0.vm03.stdout:7/600: truncate d5/d9/d14/d26/d36/d51/d7b/f90 540342 0 2026-03-10T14:08:39.977 INFO:tasks.workunit.client.0.vm03.stdout:0/681: mknod d3/d46/da9/cda 0 2026-03-10T14:08:39.982 INFO:tasks.workunit.client.0.vm03.stdout:6/694: dread d8/db/d49/d6c/f8e [0,4194304] 0 2026-03-10T14:08:39.987 INFO:tasks.workunit.client.0.vm03.stdout:2/664: rmdir d5/d10/dba 39 2026-03-10T14:08:39.988 INFO:tasks.workunit.client.0.vm03.stdout:4/726: getdents d5/d47/d62/d8a/da0 0 2026-03-10T14:08:39.992 INFO:tasks.workunit.client.0.vm03.stdout:7/601: mkdir d5/d9/d14/d26/d5f/dce 0 2026-03-10T14:08:40.002 INFO:tasks.workunit.client.0.vm03.stdout:4/727: chown d5/d9/db/d19/d38/c91 2 1 2026-03-10T14:08:40.002 INFO:tasks.workunit.client.0.vm03.stdout:7/602: readlink d5/l62 0 2026-03-10T14:08:40.002 INFO:tasks.workunit.client.0.vm03.stdout:6/695: creat d8/db/d12/fd1 x:0 0 0 2026-03-10T14:08:40.002 INFO:tasks.workunit.client.0.vm03.stdout:4/728: chown d5/d9/d2b/f63 136631 1 2026-03-10T14:08:40.002 INFO:tasks.workunit.client.0.vm03.stdout:2/665: mknod d5/d10/d1f/d4f/d76/da7/d40/db0/cdc 0 2026-03-10T14:08:40.002 INFO:tasks.workunit.client.0.vm03.stdout:6/696: chown d8/l36 76088132 1 2026-03-10T14:08:40.009 INFO:tasks.workunit.client.0.vm03.stdout:7/603: mknod d5/d9/d3e/dcc/ccf 0 2026-03-10T14:08:40.012 INFO:tasks.workunit.client.0.vm03.stdout:4/729: creat d5/d9/db/d19/d38/d7b/daa/daf/ff5 x:0 0 0 2026-03-10T14:08:40.015 INFO:tasks.workunit.client.0.vm03.stdout:6/697: creat d8/db/d49/d6c/fd2 x:0 0 0 2026-03-10T14:08:40.016 INFO:tasks.workunit.client.0.vm03.stdout:2/666: dread d5/d35/f49 [0,4194304] 0 2026-03-10T14:08:40.018 INFO:tasks.workunit.client.0.vm03.stdout:0/682: getdents d3/d16/d21 0 2026-03-10T14:08:40.030 INFO:tasks.workunit.client.0.vm03.stdout:7/604: mknod d5/d9/d14/d26/d39/d92/cd0 0 2026-03-10T14:08:40.030 INFO:tasks.workunit.client.0.vm03.stdout:4/730: creat d5/d9/db/d19/d34/ff6 x:0 0 0 2026-03-10T14:08:40.030 
INFO:tasks.workunit.client.0.vm03.stdout:6/698: creat d8/db/d49/fd3 x:0 0 0 2026-03-10T14:08:40.031 INFO:tasks.workunit.client.0.vm03.stdout:3/774: sync 2026-03-10T14:08:40.032 INFO:tasks.workunit.client.0.vm03.stdout:7/605: rename d5/fb to d5/d9/d14/d26/d5f/dce/fd1 0 2026-03-10T14:08:40.037 INFO:tasks.workunit.client.0.vm03.stdout:4/731: dread d5/d47/d5b/f79 [0,4194304] 0 2026-03-10T14:08:40.040 INFO:tasks.workunit.client.0.vm03.stdout:5/838: write d4/d13/d8f/fd2 [45747,90351] 0 2026-03-10T14:08:40.042 INFO:tasks.workunit.client.0.vm03.stdout:5/839: read d4/d6/f63 [104759,17790] 0 2026-03-10T14:08:40.044 INFO:tasks.workunit.client.0.vm03.stdout:8/751: dwrite da/d58/d5f/dbc/d70/d99/fb0 [0,4194304] 0 2026-03-10T14:08:40.046 INFO:tasks.workunit.client.0.vm03.stdout:1/697: dwrite d0/d2/df/f31 [0,4194304] 0 2026-03-10T14:08:40.046 INFO:tasks.workunit.client.0.vm03.stdout:0/683: dread d3/d4d/d47/f74 [0,4194304] 0 2026-03-10T14:08:40.056 INFO:tasks.workunit.client.0.vm03.stdout:3/775: truncate d1d/f26 1452016 0 2026-03-10T14:08:40.056 INFO:tasks.workunit.client.0.vm03.stdout:3/776: chown d1d/d33 508 1 2026-03-10T14:08:40.059 INFO:tasks.workunit.client.0.vm03.stdout:3/777: write d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/f9d [2436890,125237] 0 2026-03-10T14:08:40.059 INFO:tasks.workunit.client.0.vm03.stdout:4/732: mknod d5/d47/d5b/cf7 0 2026-03-10T14:08:40.060 INFO:tasks.workunit.client.0.vm03.stdout:3/778: dread - d1d/d33/d47/f98 zero size 2026-03-10T14:08:40.062 INFO:tasks.workunit.client.0.vm03.stdout:3/779: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/c57 74 1 2026-03-10T14:08:40.065 INFO:tasks.workunit.client.0.vm03.stdout:5/840: mkdir d4/d40/dbc/d105 0 2026-03-10T14:08:40.070 INFO:tasks.workunit.client.0.vm03.stdout:1/698: creat d0/d2/df/d27/d7e/d81/fe2 x:0 0 0 2026-03-10T14:08:40.073 INFO:tasks.workunit.client.0.vm03.stdout:5/841: creat d4/d6/ddc/f106 x:0 0 0 2026-03-10T14:08:40.074 INFO:tasks.workunit.client.0.vm03.stdout:6/699: link d8/d11/l74 d8/db/d49/d58/ld4 0 
2026-03-10T14:08:40.076 INFO:tasks.workunit.client.0.vm03.stdout:8/752: fsync da/d36/f77 0 2026-03-10T14:08:40.076 INFO:tasks.workunit.client.0.vm03.stdout:4/733: unlink d5/d9/f25 0 2026-03-10T14:08:40.079 INFO:tasks.workunit.client.0.vm03.stdout:5/842: creat d4/d16/d19/d6e/d7f/dd1/de8/f107 x:0 0 0 2026-03-10T14:08:40.080 INFO:tasks.workunit.client.0.vm03.stdout:8/753: unlink da/d3c/d51/d85/fbd 0 2026-03-10T14:08:40.081 INFO:tasks.workunit.client.0.vm03.stdout:8/754: fsync da/d24/d49/dab/dd7/fdc 0 2026-03-10T14:08:40.082 INFO:tasks.workunit.client.0.vm03.stdout:8/755: readlink da/d3c/d51/d75/dc2/le9 0 2026-03-10T14:08:40.084 INFO:tasks.workunit.client.0.vm03.stdout:5/843: creat d4/d16/d19/d4a/f108 x:0 0 0 2026-03-10T14:08:40.087 INFO:tasks.workunit.client.0.vm03.stdout:8/756: mkdir da/d3a/df6 0 2026-03-10T14:08:40.089 INFO:tasks.workunit.client.0.vm03.stdout:8/757: chown da/d36/c3e 11470 1 2026-03-10T14:08:40.090 INFO:tasks.workunit.client.0.vm03.stdout:4/734: dread d5/d9/f44 [0,4194304] 0 2026-03-10T14:08:40.090 INFO:tasks.workunit.client.0.vm03.stdout:5/844: link d4/d13/d1f/d8c/c8d d4/d6/ddc/c109 0 2026-03-10T14:08:40.093 INFO:tasks.workunit.client.0.vm03.stdout:8/758: mkdir da/df7 0 2026-03-10T14:08:40.096 INFO:tasks.workunit.client.0.vm03.stdout:8/759: getdents da/d3c/d51/d75/dc2/db7 0 2026-03-10T14:08:40.098 INFO:tasks.workunit.client.0.vm03.stdout:8/760: creat da/d24/ff8 x:0 0 0 2026-03-10T14:08:40.100 INFO:tasks.workunit.client.0.vm03.stdout:8/761: rmdir da/d58/d5f/dbc/d70/d99 39 2026-03-10T14:08:40.101 INFO:tasks.workunit.client.0.vm03.stdout:8/762: mkdir da/d3c/d4b/df9 0 2026-03-10T14:08:40.102 INFO:tasks.workunit.client.0.vm03.stdout:8/763: symlink da/df7/lfa 0 2026-03-10T14:08:40.110 INFO:tasks.workunit.client.0.vm03.stdout:8/764: dread da/d24/f43 [0,4194304] 0 2026-03-10T14:08:40.112 INFO:tasks.workunit.client.0.vm03.stdout:8/765: rename da/d24/d49/dab to da/d3a/d44/dfb 0 2026-03-10T14:08:40.116 INFO:tasks.workunit.client.0.vm03.stdout:8/766: creat 
da/d3c/d51/d75/dc2/dc6/ffc x:0 0 0 2026-03-10T14:08:40.117 INFO:tasks.workunit.client.0.vm03.stdout:8/767: write da/d24/f3d [1626671,74345] 0 2026-03-10T14:08:40.119 INFO:tasks.workunit.client.0.vm03.stdout:8/768: chown da/d36/f77 0 1 2026-03-10T14:08:40.122 INFO:tasks.workunit.client.0.vm03.stdout:8/769: unlink da/d24/f32 0 2026-03-10T14:08:40.122 INFO:tasks.workunit.client.0.vm03.stdout:8/770: fdatasync da/f12 0 2026-03-10T14:08:40.129 INFO:tasks.workunit.client.0.vm03.stdout:8/771: rmdir da/d36/d6a 39 2026-03-10T14:08:40.142 INFO:tasks.workunit.client.0.vm03.stdout:8/772: truncate da/d58/f72 1582417 0 2026-03-10T14:08:40.143 INFO:tasks.workunit.client.0.vm03.stdout:8/773: stat da/d24/d49/lea 0 2026-03-10T14:08:40.143 INFO:tasks.workunit.client.0.vm03.stdout:8/774: mkdir da/d3a/dce/dfd 0 2026-03-10T14:08:40.143 INFO:tasks.workunit.client.0.vm03.stdout:8/775: readlink da/d36/d4d/la7 0 2026-03-10T14:08:40.143 INFO:tasks.workunit.client.0.vm03.stdout:8/776: fdatasync da/f16 0 2026-03-10T14:08:40.143 INFO:tasks.workunit.client.0.vm03.stdout:8/777: write da/d3c/d4b/d4c/fe5 [646096,118299] 0 2026-03-10T14:08:40.143 INFO:tasks.workunit.client.0.vm03.stdout:8/778: fdatasync da/d36/d4d/f8f 0 2026-03-10T14:08:40.146 INFO:tasks.workunit.client.0.vm03.stdout:8/779: mknod da/d58/d5f/dbc/db1/cfe 0 2026-03-10T14:08:40.148 INFO:tasks.workunit.client.0.vm03.stdout:8/780: creat da/d36/d4d/da5/dd6/fff x:0 0 0 2026-03-10T14:08:40.151 INFO:tasks.workunit.client.0.vm03.stdout:9/685: sync 2026-03-10T14:08:40.152 INFO:tasks.workunit.client.0.vm03.stdout:9/686: truncate d2/d29/d33/fa9 4755693 0 2026-03-10T14:08:40.153 INFO:tasks.workunit.client.0.vm03.stdout:9/687: dread - d2/d29/d33/d41/f7b zero size 2026-03-10T14:08:40.153 INFO:tasks.workunit.client.0.vm03.stdout:9/688: chown d2/d29/dcd/d9c 488 1 2026-03-10T14:08:40.154 INFO:tasks.workunit.client.0.vm03.stdout:9/689: chown d2/d29/d33/d41/daa/fb9 35088 1 2026-03-10T14:08:40.241 INFO:tasks.workunit.client.0.vm03.stdout:2/667: sync 
2026-03-10T14:08:40.241 INFO:tasks.workunit.client.0.vm03.stdout:2/668: readlink d5/d35/l6f 0 2026-03-10T14:08:40.242 INFO:tasks.workunit.client.0.vm03.stdout:2/669: chown d5/l7 16297542 1 2026-03-10T14:08:40.243 INFO:tasks.workunit.client.0.vm03.stdout:2/670: dread - d5/d10/d1f/d4f/d76/fca zero size 2026-03-10T14:08:40.244 INFO:tasks.workunit.client.0.vm03.stdout:2/671: mknod d5/d35/cdd 0 2026-03-10T14:08:40.344 INFO:tasks.workunit.client.0.vm03.stdout:6/700: truncate d8/db/d49/d6c/f8e 1224676 0 2026-03-10T14:08:40.354 INFO:tasks.workunit.client.0.vm03.stdout:7/606: write d5/f1b [1705588,53181] 0 2026-03-10T14:08:40.383 INFO:tasks.workunit.client.0.vm03.stdout:6/701: rename d8/d11/da0/dbf/d8c/db6/fb7 to d8/d11/d18/d79/d80/fd5 0 2026-03-10T14:08:40.392 INFO:tasks.workunit.client.0.vm03.stdout:3/780: write d1d/d39/d51/f61 [600920,85890] 0 2026-03-10T14:08:40.392 INFO:tasks.workunit.client.0.vm03.stdout:1/699: write d0/d2/df/dab/f73 [1330373,12871] 0 2026-03-10T14:08:40.396 INFO:tasks.workunit.client.0.vm03.stdout:0/684: dwrite d3/d17/f56 [0,4194304] 0 2026-03-10T14:08:40.421 INFO:tasks.workunit.client.0.vm03.stdout:8/781: write da/d3c/f4f [446840,103921] 0 2026-03-10T14:08:40.434 INFO:tasks.workunit.client.0.vm03.stdout:5/845: dwrite d4/d13/d1f/d8c/fa6 [0,4194304] 0 2026-03-10T14:08:40.435 INFO:tasks.workunit.client.0.vm03.stdout:4/735: dwrite d5/d9/db/d19/d34/f5d [0,4194304] 0 2026-03-10T14:08:40.451 INFO:tasks.workunit.client.0.vm03.stdout:9/690: dwrite d2/d29/d33/d41/d95/fae [0,4194304] 0 2026-03-10T14:08:40.458 INFO:tasks.workunit.client.0.vm03.stdout:2/672: write d5/d35/f81 [1078559,96415] 0 2026-03-10T14:08:40.468 INFO:tasks.workunit.client.0.vm03.stdout:2/673: dread - d5/db4/d74/dcf/fd8 zero size 2026-03-10T14:08:40.468 INFO:tasks.workunit.client.0.vm03.stdout:2/674: stat d5/d10/d1f/d4f/d76/da7/d40/d92/la8 0 2026-03-10T14:08:40.473 INFO:tasks.workunit.client.0.vm03.stdout:0/685: read d3/d17/f6d [3455911,27817] 0 2026-03-10T14:08:40.483 
INFO:tasks.workunit.client.0.vm03.stdout:0/686: dread d3/d16/f34 [0,4194304] 0 2026-03-10T14:08:40.492 INFO:tasks.workunit.client.0.vm03.stdout:8/782: mknod da/d3c/d51/d75/c100 0 2026-03-10T14:08:40.513 INFO:tasks.workunit.client.0.vm03.stdout:5/846: mknod d4/d16/df4/c10a 0 2026-03-10T14:08:40.537 INFO:tasks.workunit.client.0.vm03.stdout:4/736: dwrite d5/d47/d5b/dbe/dc4/fae [0,4194304] 0 2026-03-10T14:08:40.550 INFO:tasks.workunit.client.0.vm03.stdout:1/700: mknod d0/d18/d3b/db7/ce3 0 2026-03-10T14:08:40.578 INFO:tasks.workunit.client.0.vm03.stdout:5/847: dread d4/f82 [0,4194304] 0 2026-03-10T14:08:40.587 INFO:tasks.workunit.client.0.vm03.stdout:3/781: mkdir d1d/d33/d47/d53/d68/dcf/de7/d41/dc0/dfb 0 2026-03-10T14:08:40.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:40 vm03.local ceph-mon[49718]: pgmap v7: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 24 MiB/s rd, 55 MiB/s wr, 165 op/s 2026-03-10T14:08:40.611 INFO:tasks.workunit.client.0.vm03.stdout:0/687: dwrite d3/d11/d66/da2/fc1 [0,4194304] 0 2026-03-10T14:08:40.612 INFO:tasks.workunit.client.0.vm03.stdout:0/688: chown d3/d16/d21/d3c/f99 148672 1 2026-03-10T14:08:40.633 INFO:tasks.workunit.client.0.vm03.stdout:8/783: dread f5 [0,4194304] 0 2026-03-10T14:08:40.651 INFO:tasks.workunit.client.0.vm03.stdout:1/701: creat d0/d2/df/d16/d41/dba/fe4 x:0 0 0 2026-03-10T14:08:40.653 INFO:tasks.workunit.client.0.vm03.stdout:9/691: mkdir d2/d29/ddc 0 2026-03-10T14:08:40.659 INFO:tasks.workunit.client.0.vm03.stdout:3/782: read d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f85 [3322513,51397] 0 2026-03-10T14:08:40.659 INFO:tasks.workunit.client.0.vm03.stdout:3/783: chown d1d/d33/d65/d48 28200 1 2026-03-10T14:08:40.670 INFO:tasks.workunit.client.0.vm03.stdout:0/689: dread d3/d4d/d47/fcd [0,4194304] 0 2026-03-10T14:08:40.671 INFO:tasks.workunit.client.0.vm03.stdout:0/690: dread - d3/d46/dac/fc2 zero size 2026-03-10T14:08:40.677 INFO:tasks.workunit.client.0.vm03.stdout:1/702: rename 
d0/d2/df/d16/d20/lbe to d0/d2/df/dab/dc3/le5 0 2026-03-10T14:08:40.680 INFO:tasks.workunit.client.0.vm03.stdout:5/848: creat d4/d6/de/dd5/f10b x:0 0 0 2026-03-10T14:08:40.682 INFO:tasks.workunit.client.0.vm03.stdout:3/784: write d1d/d39/d51/fa0 [3574785,80656] 0 2026-03-10T14:08:40.685 INFO:tasks.workunit.client.0.vm03.stdout:1/703: mknod d0/d18/daa/ce6 0 2026-03-10T14:08:40.691 INFO:tasks.workunit.client.0.vm03.stdout:5/849: truncate d4/d16/f71 1441400 0 2026-03-10T14:08:40.696 INFO:tasks.workunit.client.0.vm03.stdout:7/607: dwrite d5/d9/d14/d26/d36/d51/d7b/f90 [0,4194304] 0 2026-03-10T14:08:40.709 INFO:tasks.workunit.client.0.vm03.stdout:7/608: write d5/d9/d14/d21/d6f/f9d [1036488,109776] 0 2026-03-10T14:08:40.709 INFO:tasks.workunit.client.0.vm03.stdout:6/702: dwrite d8/d11/d7a/fa4 [0,4194304] 0 2026-03-10T14:08:40.709 INFO:tasks.workunit.client.0.vm03.stdout:6/703: dwrite d8/db/d49/d6c/d32/f3e [4194304,4194304] 0 2026-03-10T14:08:40.711 INFO:tasks.workunit.client.0.vm03.stdout:2/675: write d5/d10/d1f/d4f/d76/da7/d40/d59/f63 [233623,121279] 0 2026-03-10T14:08:40.711 INFO:tasks.workunit.client.0.vm03.stdout:0/691: sync 2026-03-10T14:08:40.723 INFO:tasks.workunit.client.0.vm03.stdout:4/737: write d5/d47/d5b/d64/f82 [680566,30565] 0 2026-03-10T14:08:40.729 INFO:tasks.workunit.client.0.vm03.stdout:2/676: read d5/db4/d74/d83/f96 [156479,47829] 0 2026-03-10T14:08:40.736 INFO:tasks.workunit.client.0.vm03.stdout:5/850: link d4/d16/d19/d4a/fbd d4/d16/df4/f10c 0 2026-03-10T14:08:40.736 INFO:tasks.workunit.client.0.vm03.stdout:1/704: getdents d0/d18/d1d 0 2026-03-10T14:08:40.736 INFO:tasks.workunit.client.0.vm03.stdout:6/704: creat d8/d11/da0/dca/fd6 x:0 0 0 2026-03-10T14:08:40.743 INFO:tasks.workunit.client.0.vm03.stdout:2/677: mknod d5/d10/d1f/d4f/d76/da7/d40/d59/d85/cde 0 2026-03-10T14:08:40.747 INFO:tasks.workunit.client.0.vm03.stdout:2/678: readlink d5/d35/ld1 0 2026-03-10T14:08:40.753 INFO:tasks.workunit.client.0.vm03.stdout:1/705: rmdir d0/d2/d71 39 
2026-03-10T14:08:40.762 INFO:tasks.workunit.client.0.vm03.stdout:5/851: rmdir d4/d16/dfe 0 2026-03-10T14:08:40.765 INFO:tasks.workunit.client.0.vm03.stdout:5/852: fsync d4/d16/d19/d6e/d7f/dd1/de8/f107 0 2026-03-10T14:08:40.767 INFO:tasks.workunit.client.0.vm03.stdout:2/679: sync 2026-03-10T14:08:40.769 INFO:tasks.workunit.client.0.vm03.stdout:0/692: dread d3/d4d/f2a [0,4194304] 0 2026-03-10T14:08:40.770 INFO:tasks.workunit.client.0.vm03.stdout:4/738: getdents d5/d96 0 2026-03-10T14:08:40.774 INFO:tasks.workunit.client.0.vm03.stdout:1/706: write d0/d2/df/dab/f56 [4555838,56882] 0 2026-03-10T14:08:40.780 INFO:tasks.workunit.client.0.vm03.stdout:2/680: creat d5/d10/d1f/d4f/d76/da7/d40/d59/da2/fdf x:0 0 0 2026-03-10T14:08:40.782 INFO:tasks.workunit.client.0.vm03.stdout:4/739: stat d5/d9/db/da7/db9/cbc 0 2026-03-10T14:08:40.785 INFO:tasks.workunit.client.0.vm03.stdout:0/693: rename d3/d11/d2c/db7 to d3/d11/d76/db5/ddb 0 2026-03-10T14:08:40.788 INFO:tasks.workunit.client.0.vm03.stdout:6/705: getdents d8/db 0 2026-03-10T14:08:40.791 INFO:tasks.workunit.client.0.vm03.stdout:2/681: creat d5/d10/d1f/d4f/d76/da7/d40/d59/fe0 x:0 0 0 2026-03-10T14:08:40.792 INFO:tasks.workunit.client.0.vm03.stdout:2/682: chown d5/f2d 10867 1 2026-03-10T14:08:40.792 INFO:tasks.workunit.client.0.vm03.stdout:2/683: dread - d5/db4/d74/dcf/fd8 zero size 2026-03-10T14:08:40.794 INFO:tasks.workunit.client.0.vm03.stdout:4/740: rmdir d5/d9/d2b 39 2026-03-10T14:08:40.794 INFO:tasks.workunit.client.0.vm03.stdout:4/741: chown d5/d9/db/d19/d38/d7b/daa/daf/cb1 3045009 1 2026-03-10T14:08:40.798 INFO:tasks.workunit.client.0.vm03.stdout:2/684: sync 2026-03-10T14:08:40.802 INFO:tasks.workunit.client.0.vm03.stdout:8/784: write da/d3c/d51/d75/fa0 [782420,24727] 0 2026-03-10T14:08:40.805 INFO:tasks.workunit.client.0.vm03.stdout:9/692: truncate d2/d29/d33/d41/f50 612568 0 2026-03-10T14:08:40.808 INFO:tasks.workunit.client.0.vm03.stdout:3/785: truncate d1d/d39/d51/fa0 2310748 0 2026-03-10T14:08:40.808 
INFO:tasks.workunit.client.0.vm03.stdout:9/693: fdatasync d2/fd2 0 2026-03-10T14:08:40.809 INFO:tasks.workunit.client.0.vm03.stdout:9/694: write d2/fdb [71436,103970] 0 2026-03-10T14:08:40.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:40 vm04.local ceph-mon[55966]: pgmap v7: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 24 MiB/s rd, 55 MiB/s wr, 165 op/s 2026-03-10T14:08:40.822 INFO:tasks.workunit.client.0.vm03.stdout:0/694: unlink d3/d46/dac/c52 0 2026-03-10T14:08:40.824 INFO:tasks.workunit.client.0.vm03.stdout:7/609: write d5/d9/d35/f52 [2945939,10023] 0 2026-03-10T14:08:40.829 INFO:tasks.workunit.client.0.vm03.stdout:8/785: dread da/d58/fc4 [0,4194304] 0 2026-03-10T14:08:40.833 INFO:tasks.workunit.client.0.vm03.stdout:7/610: dread d5/d9/f22 [0,4194304] 0 2026-03-10T14:08:40.834 INFO:tasks.workunit.client.0.vm03.stdout:8/786: sync 2026-03-10T14:08:40.839 INFO:tasks.workunit.client.0.vm03.stdout:2/685: dread d5/f9 [0,4194304] 0 2026-03-10T14:08:40.839 INFO:tasks.workunit.client.0.vm03.stdout:1/707: link d0/d2/df/d27/fc5 d0/d2/fe7 0 2026-03-10T14:08:40.840 INFO:tasks.workunit.client.0.vm03.stdout:2/686: dread - d5/d35/f58 zero size 2026-03-10T14:08:40.842 INFO:tasks.workunit.client.0.vm03.stdout:3/786: mknod d1d/d39/d51/d72/cfc 0 2026-03-10T14:08:40.845 INFO:tasks.workunit.client.0.vm03.stdout:6/706: dwrite d8/d11/d18/d54/f8b [0,4194304] 0 2026-03-10T14:08:40.848 INFO:tasks.workunit.client.0.vm03.stdout:5/853: link d4/d16/d19/d23/db8/fc3 d4/d40/dbc/d105/f10d 0 2026-03-10T14:08:40.851 INFO:tasks.workunit.client.0.vm03.stdout:9/695: mkdir d2/d14/d2b/d43/ddd 0 2026-03-10T14:08:40.868 INFO:tasks.workunit.client.0.vm03.stdout:6/707: mknod d8/db/d49/d6c/d32/cd7 0 2026-03-10T14:08:40.874 INFO:tasks.workunit.client.0.vm03.stdout:4/742: rmdir d5/d6e/db6/dc2 0 2026-03-10T14:08:40.875 INFO:tasks.workunit.client.0.vm03.stdout:3/787: write d1d/d39/d51/fa0 [1491895,18705] 0 2026-03-10T14:08:40.876 
INFO:tasks.workunit.client.0.vm03.stdout:3/788: chown d1d/d33/d47/d53/d68/dcf/de7/c46 620 1 2026-03-10T14:08:40.878 INFO:tasks.workunit.client.0.vm03.stdout:3/789: write d1d/d33/d47/dac/dc1/fe2 [341628,104981] 0 2026-03-10T14:08:40.882 INFO:tasks.workunit.client.0.vm03.stdout:2/687: write d5/d10/d1f/f5e [906322,60888] 0 2026-03-10T14:08:40.887 INFO:tasks.workunit.client.0.vm03.stdout:0/695: mknod d3/d11/d76/db5/ddb/dd4/cdc 0 2026-03-10T14:08:40.890 INFO:tasks.workunit.client.0.vm03.stdout:9/696: symlink d2/d29/dcd/d72/lde 0 2026-03-10T14:08:40.895 INFO:tasks.workunit.client.0.vm03.stdout:7/611: creat d5/d9/d14/d26/d36/d51/dc8/fd2 x:0 0 0 2026-03-10T14:08:40.895 INFO:tasks.workunit.client.0.vm03.stdout:7/612: readlink d5/l5e 0 2026-03-10T14:08:40.896 INFO:tasks.workunit.client.0.vm03.stdout:8/787: getdents da/d36/d4d/da5/df1 0 2026-03-10T14:08:40.897 INFO:tasks.workunit.client.0.vm03.stdout:8/788: fsync da/d24/d49/f66 0 2026-03-10T14:08:40.899 INFO:tasks.workunit.client.0.vm03.stdout:4/743: unlink d5/d9/db/da7/cbb 0 2026-03-10T14:08:40.905 INFO:tasks.workunit.client.0.vm03.stdout:9/697: symlink d2/d14/d2b/d34/ldf 0 2026-03-10T14:08:40.910 INFO:tasks.workunit.client.0.vm03.stdout:1/708: creat d0/d18/fe8 x:0 0 0 2026-03-10T14:08:40.910 INFO:tasks.workunit.client.0.vm03.stdout:3/790: write d1d/d33/f3a [2606130,60777] 0 2026-03-10T14:08:40.912 INFO:tasks.workunit.client.0.vm03.stdout:5/854: rename d4/d35 to d4/d6/de/d10e 0 2026-03-10T14:08:40.913 INFO:tasks.workunit.client.0.vm03.stdout:6/708: mknod d8/d11/cd8 0 2026-03-10T14:08:40.913 INFO:tasks.workunit.client.0.vm03.stdout:6/709: chown d8/db/d49/d6c/d32/d3a 30349 1 2026-03-10T14:08:40.915 INFO:tasks.workunit.client.0.vm03.stdout:2/688: mkdir d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/de1 0 2026-03-10T14:08:40.916 INFO:tasks.workunit.client.0.vm03.stdout:2/689: read d5/d10/d1f/d4f/d76/da7/d40/f68 [461273,38645] 0 2026-03-10T14:08:40.919 INFO:tasks.workunit.client.0.vm03.stdout:3/791: creat d1d/d33/d47/d53/d68/dcf/de7/d41/ffd 
x:0 0 0 2026-03-10T14:08:40.920 INFO:tasks.workunit.client.0.vm03.stdout:3/792: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95 0 1 2026-03-10T14:08:40.922 INFO:tasks.workunit.client.0.vm03.stdout:6/710: creat d8/d1b/d1c/fd9 x:0 0 0 2026-03-10T14:08:40.929 INFO:tasks.workunit.client.0.vm03.stdout:4/744: symlink d5/d9/db/d19/d38/ded/lf8 0 2026-03-10T14:08:40.935 INFO:tasks.workunit.client.0.vm03.stdout:8/789: dwrite da/d36/d40/f47 [0,4194304] 0 2026-03-10T14:08:40.935 INFO:tasks.workunit.client.0.vm03.stdout:8/790: dread - da/d58/d5f/fdb zero size 2026-03-10T14:08:40.948 INFO:tasks.workunit.client.0.vm03.stdout:9/698: fdatasync d2/d14/d2b/d79/fb8 0 2026-03-10T14:08:40.955 INFO:tasks.workunit.client.0.vm03.stdout:0/696: rename d3/d4d/c62 to d3/d46/dac/d79/cdd 0 2026-03-10T14:08:40.955 INFO:tasks.workunit.client.0.vm03.stdout:7/613: rename d5/d9/d35 to d5/d9/d35/dab/dd3 22 2026-03-10T14:08:40.956 INFO:tasks.workunit.client.0.vm03.stdout:6/711: creat d8/db/d49/d58/fda x:0 0 0 2026-03-10T14:08:40.961 INFO:tasks.workunit.client.0.vm03.stdout:8/791: dwrite da/d3c/f48 [8388608,4194304] 0 2026-03-10T14:08:40.967 INFO:tasks.workunit.client.0.vm03.stdout:1/709: symlink d0/d2/db6/le9 0 2026-03-10T14:08:40.968 INFO:tasks.workunit.client.0.vm03.stdout:3/793: fdatasync d1d/d33/d65/d5d/dae/fee 0 2026-03-10T14:08:40.969 INFO:tasks.workunit.client.0.vm03.stdout:3/794: stat d1d/d33/d47/d53/d68/dcf/de7/d41/dc0/fca 0 2026-03-10T14:08:40.974 INFO:tasks.workunit.client.0.vm03.stdout:0/697: creat d3/d16/d21/d3c/fde x:0 0 0 2026-03-10T14:08:40.978 INFO:tasks.workunit.client.0.vm03.stdout:7/614: creat d5/d9/d14/d26/d39/fd4 x:0 0 0 2026-03-10T14:08:40.978 INFO:tasks.workunit.client.0.vm03.stdout:6/712: mknod d8/d11/d7a/cdb 0 2026-03-10T14:08:40.979 INFO:tasks.workunit.client.0.vm03.stdout:3/795: creat d1d/d33/d65/d48/ffe x:0 0 0 2026-03-10T14:08:40.980 INFO:tasks.workunit.client.0.vm03.stdout:3/796: read d1d/d33/d47/dac/dc1/fe2 [207007,15334] 0 2026-03-10T14:08:40.983 
INFO:tasks.workunit.client.0.vm03.stdout:9/699: unlink d2/d29/d33/d41/f50 0 2026-03-10T14:08:40.989 INFO:tasks.workunit.client.0.vm03.stdout:6/713: mkdir d8/db/d49/d58/ddc 0 2026-03-10T14:08:40.990 INFO:tasks.workunit.client.0.vm03.stdout:8/792: mknod da/d3a/dce/dfd/c101 0 2026-03-10T14:08:40.995 INFO:tasks.workunit.client.0.vm03.stdout:2/690: dwrite d5/f9 [0,4194304] 0 2026-03-10T14:08:40.995 INFO:tasks.workunit.client.0.vm03.stdout:1/710: fsync d0/d2/df/d27/f58 0 2026-03-10T14:08:40.996 INFO:tasks.workunit.client.0.vm03.stdout:1/711: chown d0/d2/df/d91/dda 2 1 2026-03-10T14:08:41.008 INFO:tasks.workunit.client.0.vm03.stdout:9/700: sync 2026-03-10T14:08:41.010 INFO:tasks.workunit.client.0.vm03.stdout:4/745: dwrite d5/d9/fb8 [4194304,4194304] 0 2026-03-10T14:08:41.028 INFO:tasks.workunit.client.0.vm03.stdout:0/698: symlink d3/d11/d2c/d4a/d4b/d89/db9/ldf 0 2026-03-10T14:08:41.029 INFO:tasks.workunit.client.0.vm03.stdout:7/615: write d5/d9/d14/d26/f8d [140160,114036] 0 2026-03-10T14:08:41.037 INFO:tasks.workunit.client.0.vm03.stdout:2/691: dread d5/f4d [0,4194304] 0 2026-03-10T14:08:41.040 INFO:tasks.workunit.client.0.vm03.stdout:9/701: rename d2/d29/l6f to d2/d14/d2b/d76/le0 0 2026-03-10T14:08:41.040 INFO:tasks.workunit.client.0.vm03.stdout:4/746: fsync d5/d47/d5b/d64/d85/f9d 0 2026-03-10T14:08:41.043 INFO:tasks.workunit.client.0.vm03.stdout:7/616: rmdir d5/d9/d14/d26/d39/d92 39 2026-03-10T14:08:41.043 INFO:tasks.workunit.client.0.vm03.stdout:6/714: getdents d8/d11/da0/dca 0 2026-03-10T14:08:41.048 INFO:tasks.workunit.client.0.vm03.stdout:9/702: dwrite d2/d29/d33/d41/daa/fcf [0,4194304] 0 2026-03-10T14:08:41.052 INFO:tasks.workunit.client.0.vm03.stdout:0/699: mkdir d3/d46/de0 0 2026-03-10T14:08:41.059 INFO:tasks.workunit.client.0.vm03.stdout:4/747: mkdir d5/d9/db/d19/d38/d53/d71/df9 0 2026-03-10T14:08:41.072 INFO:tasks.workunit.client.0.vm03.stdout:2/692: dread d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/f73 [0,4194304] 0 2026-03-10T14:08:41.073 
INFO:tasks.workunit.client.0.vm03.stdout:1/712: getdents d0/d42 0 2026-03-10T14:08:41.074 INFO:tasks.workunit.client.0.vm03.stdout:1/713: write d0/d2/df/dab/f73 [1212362,69537] 0 2026-03-10T14:08:41.079 INFO:tasks.workunit.client.0.vm03.stdout:5/855: write d4/d6/de/f14 [6835025,101765] 0 2026-03-10T14:08:41.080 INFO:tasks.workunit.client.0.vm03.stdout:1/714: dwrite d0/d2/df/d16/d41/f68 [0,4194304] 0 2026-03-10T14:08:41.093 INFO:tasks.workunit.client.0.vm03.stdout:6/715: creat d8/d11/da0/dbf/d8c/db6/fdd x:0 0 0 2026-03-10T14:08:41.102 INFO:tasks.workunit.client.0.vm03.stdout:3/797: dwrite d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/fbe [0,4194304] 0 2026-03-10T14:08:41.107 INFO:tasks.workunit.client.0.vm03.stdout:7/617: mkdir d5/d9/d14/d26/d36/d51/d7b/d9c/dd5 0 2026-03-10T14:08:41.113 INFO:tasks.workunit.client.0.vm03.stdout:8/793: dwrite da/f30 [0,4194304] 0 2026-03-10T14:08:41.141 INFO:tasks.workunit.client.0.vm03.stdout:6/716: unlink d8/db/d49/d6c/l45 0 2026-03-10T14:08:41.149 INFO:tasks.workunit.client.0.vm03.stdout:7/618: fsync d5/d9/f1f 0 2026-03-10T14:08:41.153 INFO:tasks.workunit.client.0.vm03.stdout:8/794: dread da/d24/f3d [0,4194304] 0 2026-03-10T14:08:41.158 INFO:tasks.workunit.client.0.vm03.stdout:9/703: rename d2/d29/da7 to d2/d29/d33/d60/d8c/de1 0 2026-03-10T14:08:41.163 INFO:tasks.workunit.client.0.vm03.stdout:0/700: creat d3/d11/d2c/d4a/d4b/d89/db6/fe1 x:0 0 0 2026-03-10T14:08:41.163 INFO:tasks.workunit.client.0.vm03.stdout:5/856: truncate d4/d6/f97 2527536 0 2026-03-10T14:08:41.164 INFO:tasks.workunit.client.0.vm03.stdout:8/795: dread da/d24/db4/fdf [0,4194304] 0 2026-03-10T14:08:41.195 INFO:tasks.workunit.client.0.vm03.stdout:4/748: rename d5/l2d to d5/d9/db/d19/d38/d53/d71/lfa 0 2026-03-10T14:08:41.200 INFO:tasks.workunit.client.0.vm03.stdout:9/704: creat d2/d29/d33/d60/d7f/fe2 x:0 0 0 2026-03-10T14:08:41.202 INFO:tasks.workunit.client.0.vm03.stdout:0/701: creat d3/d11/d66/da2/fe2 x:0 0 0 2026-03-10T14:08:41.209 
INFO:tasks.workunit.client.0.vm03.stdout:8/796: creat da/d58/d5f/f102 x:0 0 0 2026-03-10T14:08:41.210 INFO:tasks.workunit.client.0.vm03.stdout:8/797: fdatasync da/d3c/f48 0 2026-03-10T14:08:41.216 INFO:tasks.workunit.client.0.vm03.stdout:1/715: creat d0/d2/df/d27/fea x:0 0 0 2026-03-10T14:08:41.235 INFO:tasks.workunit.client.0.vm03.stdout:3/798: rmdir d1d/d33/d47/dac/dba/dc7 0 2026-03-10T14:08:41.245 INFO:tasks.workunit.client.0.vm03.stdout:3/799: dread d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/f9e [0,4194304] 0 2026-03-10T14:08:41.257 INFO:tasks.workunit.client.0.vm03.stdout:3/800: dwrite d1d/d39/d51/d72/fab [0,4194304] 0 2026-03-10T14:08:41.257 INFO:tasks.workunit.client.0.vm03.stdout:2/693: rename d5/d10/fa4 to d5/d10/d1f/d4f/d76/da7/d54/fe2 0 2026-03-10T14:08:41.260 INFO:tasks.workunit.client.0.vm03.stdout:9/705: truncate d2/f4c 1101759 0 2026-03-10T14:08:41.266 INFO:tasks.workunit.client.0.vm03.stdout:0/702: fsync d3/d11/d2c/d4a/d4b/d89/fbb 0 2026-03-10T14:08:41.267 INFO:tasks.workunit.client.0.vm03.stdout:5/857: rmdir d4/d13/d43 39 2026-03-10T14:08:41.270 INFO:tasks.workunit.client.0.vm03.stdout:1/716: creat d0/d2/df/d16/d41/feb x:0 0 0 2026-03-10T14:08:41.271 INFO:tasks.workunit.client.0.vm03.stdout:7/619: creat d5/d9/d14/d26/d39/fd6 x:0 0 0 2026-03-10T14:08:41.271 INFO:tasks.workunit.client.0.vm03.stdout:6/717: creat d8/d11/da0/dbf/fde x:0 0 0 2026-03-10T14:08:41.273 INFO:tasks.workunit.client.0.vm03.stdout:6/718: chown d8/d11/da0/dbf/l88 9 1 2026-03-10T14:08:41.273 INFO:tasks.workunit.client.0.vm03.stdout:0/703: write d3/d46/dac/fc2 [150973,79717] 0 2026-03-10T14:08:41.287 INFO:tasks.workunit.client.0.vm03.stdout:2/694: unlink d5/d10/da3/dab/fb3 0 2026-03-10T14:08:41.301 INFO:tasks.workunit.client.0.vm03.stdout:3/801: chown d1d/d33/d47/d53/d68/dcf/de7/l6d 299693 1 2026-03-10T14:08:41.314 INFO:tasks.workunit.client.0.vm03.stdout:8/798: rmdir da 39 2026-03-10T14:08:41.324 INFO:tasks.workunit.client.0.vm03.stdout:7/620: read - d5/d9/d14/d26/f9f zero size 
2026-03-10T14:08:41.325 INFO:tasks.workunit.client.0.vm03.stdout:5/858: read d4/d13/d1f/f83 [7644506,16840] 0
2026-03-10T14:08:41.327 INFO:tasks.workunit.client.0.vm03.stdout:9/706: dwrite d2/d29/d33/d60/f65 [0,4194304] 0
2026-03-10T14:08:41.327 INFO:tasks.workunit.client.0.vm03.stdout:1/717: fsync d0/d42/f66 0
2026-03-10T14:08:41.348 INFO:tasks.workunit.client.0.vm03.stdout:4/749: link d5/d9/db/d19/d38/f77 d5/d47/d62/d8a/da0/ffb 0
2026-03-10T14:08:41.351 INFO:tasks.workunit.client.0.vm03.stdout:7/621: fdatasync d5/d9/d14/d26/f9f 0
2026-03-10T14:08:41.357 INFO:tasks.workunit.client.0.vm03.stdout:8/799: chown da/df7/lfa 305176 1
2026-03-10T14:08:41.357 INFO:tasks.workunit.client.0.vm03.stdout:9/707: symlink d2/d29/d38/le3 0
2026-03-10T14:08:41.357 INFO:tasks.workunit.client.0.vm03.stdout:1/718: mkdir d0/d18/d3b/dec 0
2026-03-10T14:08:41.358 INFO:tasks.workunit.client.0.vm03.stdout:2/695: read d5/d10/d1f/f3f [3148009,70617] 0
2026-03-10T14:08:41.360 INFO:tasks.workunit.client.0.vm03.stdout:2/696: dread - d5/d35/f58 zero size
2026-03-10T14:08:41.370 INFO:tasks.workunit.client.0.vm03.stdout:5/859: sync
2026-03-10T14:08:41.373 INFO:tasks.workunit.client.0.vm03.stdout:4/750: creat d5/d47/d5b/dbe/ffc x:0 0 0
2026-03-10T14:08:41.377 INFO:tasks.workunit.client.0.vm03.stdout:0/704: rename d3/d4d/d30/f81 to d3/d11/d76/dd3/fe3 0
2026-03-10T14:08:41.379 INFO:tasks.workunit.client.0.vm03.stdout:5/860: sync
2026-03-10T14:08:41.382 INFO:tasks.workunit.client.0.vm03.stdout:1/719: fsync d0/d2/df/d27/f52 0
2026-03-10T14:08:41.383 INFO:tasks.workunit.client.0.vm03.stdout:8/800: dwrite da/d58/d5f/dbc/d70/d7d/fda [0,4194304] 0
2026-03-10T14:08:41.384 INFO:tasks.workunit.client.0.vm03.stdout:2/697: rmdir d5/d10/d1f/d4f/d76/da7/d54/d5f 39
2026-03-10T14:08:41.386 INFO:tasks.workunit.client.0.vm03.stdout:8/801: fdatasync da/d58/d6c/fd8 0
2026-03-10T14:08:41.386 INFO:tasks.workunit.client.0.vm03.stdout:7/622: mknod d5/d9/d14/d26/d36/cd7 0
2026-03-10T14:08:41.387 INFO:tasks.workunit.client.0.vm03.stdout:6/719: getdents d8/d11/d7a 0
2026-03-10T14:08:41.397 INFO:tasks.workunit.client.0.vm03.stdout:8/802: dread da/d3c/d4b/d4c/fe5 [0,4194304] 0
2026-03-10T14:08:41.417 INFO:tasks.workunit.client.0.vm03.stdout:0/705: creat d3/d11/d76/db5/fe4 x:0 0 0
2026-03-10T14:08:41.418 INFO:tasks.workunit.client.0.vm03.stdout:3/802: dwrite d1d/f32 [0,4194304] 0
2026-03-10T14:08:41.418 INFO:tasks.workunit.client.0.vm03.stdout:5/861: symlink d4/d13/d1f/d84/l10f 0
2026-03-10T14:08:41.422 INFO:tasks.workunit.client.0.vm03.stdout:3/803: chown d1d/d33/f3a 34217776 1
2026-03-10T14:08:41.425 INFO:tasks.workunit.client.0.vm03.stdout:7/623: dread d5/d9/d14/d26/d5f/f8e [4194304,4194304] 0
2026-03-10T14:08:41.439 INFO:tasks.workunit.client.0.vm03.stdout:1/720: write d0/d2/df/dab/fc7 [501121,99672] 0
2026-03-10T14:08:41.446 INFO:tasks.workunit.client.0.vm03.stdout:8/803: unlink da/d58/d5f/c74 0
2026-03-10T14:08:41.447 INFO:tasks.workunit.client.0.vm03.stdout:8/804: write da/d3c/d51/d75/dc2/fe1 [1267744,101306] 0
2026-03-10T14:08:41.448 INFO:tasks.workunit.client.0.vm03.stdout:0/706: fdatasync d3/d11/d2c/d4a/d4b/f7d 0
2026-03-10T14:08:41.460 INFO:tasks.workunit.client.0.vm03.stdout:1/721: mkdir d0/d2/df/d16/ded 0
2026-03-10T14:08:41.460 INFO:tasks.workunit.client.0.vm03.stdout:1/722: dread - d0/d2/df/d27/d7e/d81/fdb zero size
2026-03-10T14:08:41.461 INFO:tasks.workunit.client.0.vm03.stdout:3/804: dread d1d/d33/d47/d53/d68/dcf/de7/d41/dc0/fca [0,4194304] 0
2026-03-10T14:08:41.463 INFO:tasks.workunit.client.0.vm03.stdout:1/723: chown d0/d2/df/d27/l5c 543 1
2026-03-10T14:08:41.463 INFO:tasks.workunit.client.0.vm03.stdout:9/708: write d2/d29/d33/d41/daa/fb9 [302367,83201] 0
2026-03-10T14:08:41.465 INFO:tasks.workunit.client.0.vm03.stdout:3/805: fdatasync d1d/d33/d65/d48/fa5 0
2026-03-10T14:08:41.480 INFO:tasks.workunit.client.0.vm03.stdout:7/624: mkdir d5/d9/d35/da8/dd8 0
2026-03-10T14:08:41.482 INFO:tasks.workunit.client.0.vm03.stdout:8/805: write da/d58/d5f/d67/fc8 [831700,50973] 0
2026-03-10T14:08:41.486 INFO:tasks.workunit.client.0.vm03.stdout:2/698: creat d5/d10/d1f/d4f/d76/da7/fe3 x:0 0 0
2026-03-10T14:08:41.488 INFO:tasks.workunit.client.0.vm03.stdout:1/724: mknod d0/d18/daa/cee 0
2026-03-10T14:08:41.489 INFO:tasks.workunit.client.0.vm03.stdout:1/725: write d0/d2/df/d27/d7e/fa0 [791,101132] 0
2026-03-10T14:08:41.492 INFO:tasks.workunit.client.0.vm03.stdout:4/751: getdents d5/d9/db/d19/d34 0
2026-03-10T14:08:41.492 INFO:tasks.workunit.client.0.vm03.stdout:9/709: mknod d2/d29/d33/d60/d8c/ce4 0
2026-03-10T14:08:41.493 INFO:tasks.workunit.client.0.vm03.stdout:3/806: creat d1d/d39/da1/de0/fff x:0 0 0
2026-03-10T14:08:41.497 INFO:tasks.workunit.client.0.vm03.stdout:7/625: unlink d5/d9/d3e/c58 0
2026-03-10T14:08:41.497 INFO:tasks.workunit.client.0.vm03.stdout:8/806: symlink da/d58/d5f/l103 0
2026-03-10T14:08:41.498 INFO:tasks.workunit.client.0.vm03.stdout:6/720: getdents d8/d11/da0/dbf 0
2026-03-10T14:08:41.499 INFO:tasks.workunit.client.0.vm03.stdout:1/726: rename d0/d2/df/d27/fea to d0/d2/df/d16/d41/dba/fef 0
2026-03-10T14:08:41.501 INFO:tasks.workunit.client.0.vm03.stdout:2/699: sync
2026-03-10T14:08:41.501 INFO:tasks.workunit.client.0.vm03.stdout:8/807: sync
2026-03-10T14:08:41.501 INFO:tasks.workunit.client.0.vm03.stdout:1/727: truncate d0/d2/fe7 1007060 0
2026-03-10T14:08:41.501 INFO:tasks.workunit.client.0.vm03.stdout:8/808: readlink da/d3a/l98 0
2026-03-10T14:08:41.502 INFO:tasks.workunit.client.0.vm03.stdout:8/809: chown da/d3c/d51/d75/dc2/dc6 13671944 1
2026-03-10T14:08:41.506 INFO:tasks.workunit.client.0.vm03.stdout:4/752: creat d5/d47/d62/d8a/dd0/ffd x:0 0 0
2026-03-10T14:08:41.506 INFO:tasks.workunit.client.0.vm03.stdout:3/807: dread d1d/d39/d51/fa0 [0,4194304] 0
2026-03-10T14:08:41.515 INFO:tasks.workunit.client.0.vm03.stdout:8/810: dwrite da/f33 [0,4194304] 0
2026-03-10T14:08:41.535 INFO:tasks.workunit.client.0.vm03.stdout:9/710: creat d2/d29/dcd/d9c/fe5 x:0 0 0
2026-03-10T14:08:41.537 INFO:tasks.workunit.client.0.vm03.stdout:5/862: write d4/d16/f71 [1509189,64477] 0
2026-03-10T14:08:41.544 INFO:tasks.workunit.client.0.vm03.stdout:2/700: creat d5/d10/d1f/d4f/d76/da7/d40/d59/fe4 x:0 0 0
2026-03-10T14:08:41.545 INFO:tasks.workunit.client.0.vm03.stdout:2/701: chown d5/d10/l1b 223 1
2026-03-10T14:08:41.550 INFO:tasks.workunit.client.0.vm03.stdout:0/707: getdents d3/d11/d76 0
2026-03-10T14:08:41.550 INFO:tasks.workunit.client.0.vm03.stdout:6/721: getdents d8/db/d49/d76/db5 0
2026-03-10T14:08:41.552 INFO:tasks.workunit.client.0.vm03.stdout:3/808: mkdir d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/dde/d100 0
2026-03-10T14:08:41.557 INFO:tasks.workunit.client.0.vm03.stdout:8/811: dread da/d58/d5f/dbc/d70/d7d/fa1 [0,4194304] 0
2026-03-10T14:08:41.562 INFO:tasks.workunit.client.0.vm03.stdout:8/812: chown da/d24 14253 1
2026-03-10T14:08:41.564 INFO:tasks.workunit.client.0.vm03.stdout:4/753: dread d5/d9/f16 [0,4194304] 0
2026-03-10T14:08:41.565 INFO:tasks.workunit.client.0.vm03.stdout:6/722: dread d8/d11/d7a/fad [0,4194304] 0
2026-03-10T14:08:41.567 INFO:tasks.workunit.client.0.vm03.stdout:3/809: dread d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f81 [0,4194304] 0
2026-03-10T14:08:41.572 INFO:tasks.workunit.client.0.vm03.stdout:0/708: mknod d3/d11/d76/db5/ce5 0
2026-03-10T14:08:41.573 INFO:tasks.workunit.client.0.vm03.stdout:7/626: unlink d5/d9/d14/d26/d36/l3d 0
2026-03-10T14:08:41.576 INFO:tasks.workunit.client.0.vm03.stdout:1/728: creat d0/d18/d3b/dec/ff0 x:0 0 0
2026-03-10T14:08:41.581 INFO:tasks.workunit.client.0.vm03.stdout:8/813: rename da/d58/da8 to da/d3c/d51/d85/d104 0
2026-03-10T14:08:41.581 INFO:tasks.workunit.client.0.vm03.stdout:9/711: link d2/d29/d33/fa5 d2/d29/d33/d41/d46/dd0/fe6 0
2026-03-10T14:08:41.584 INFO:tasks.workunit.client.0.vm03.stdout:8/814: chown da/d3c/d4b/df9 213013 1
2026-03-10T14:08:41.586 INFO:tasks.workunit.client.0.vm03.stdout:6/723: symlink d8/d11/da0/ldf 0
2026-03-10T14:08:41.589 INFO:tasks.workunit.client.0.vm03.stdout:3/810: mknod d1d/d33/d65/d5d/dae/c101 0
2026-03-10T14:08:41.595 INFO:tasks.workunit.client.0.vm03.stdout:7/627: mkdir d5/d98/dd9 0
2026-03-10T14:08:41.598 INFO:tasks.workunit.client.0.vm03.stdout:0/709: mkdir d3/d11/d2c/d4a/d4b/d89/db9/de6 0
2026-03-10T14:08:41.599 INFO:tasks.workunit.client.0.vm03.stdout:0/710: chown d3/d11/d2c/d4a/l4f 9604164 1
2026-03-10T14:08:41.606 INFO:tasks.workunit.client.0.vm03.stdout:9/712: fsync d2/d14/f61 0
2026-03-10T14:08:41.606 INFO:tasks.workunit.client.0.vm03.stdout:4/754: link d5/d47/d62/d8a/dd0/ffd d5/d47/d5b/dbe/dc4/d8e/ffe 0
2026-03-10T14:08:41.607 INFO:tasks.workunit.client.0.vm03.stdout:8/815: symlink da/d24/db4/l105 0
2026-03-10T14:08:41.608 INFO:tasks.workunit.client.0.vm03.stdout:6/724: rename d8/d11/da0/dbf/d86 to d8/db/d12/d64/de0 0
2026-03-10T14:08:41.613 INFO:tasks.workunit.client.0.vm03.stdout:2/702: creat d5/d10/d31/fe5 x:0 0 0
2026-03-10T14:08:41.613 INFO:tasks.workunit.client.0.vm03.stdout:3/811: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/f102 x:0 0 0
2026-03-10T14:08:41.613 INFO:tasks.workunit.client.0.vm03.stdout:5/863: getdents d4/d16/d19/d6e/d7f/dd1 0
2026-03-10T14:08:41.614 INFO:tasks.workunit.client.0.vm03.stdout:7/628: creat d5/d9/d14/d26/d36/db4/fda x:0 0 0
2026-03-10T14:08:41.614 INFO:tasks.workunit.client.0.vm03.stdout:5/864: chown d4/d13/d1f/fb1 60254 1
2026-03-10T14:08:41.626 INFO:tasks.workunit.client.0.vm03.stdout:9/713: dwrite d2/f37 [0,4194304] 0
2026-03-10T14:08:41.633 INFO:tasks.workunit.client.0.vm03.stdout:5/865: dread d4/d13/d1f/de1/fe9 [0,4194304] 0
2026-03-10T14:08:41.634 INFO:tasks.workunit.client.0.vm03.stdout:0/711: mknod d3/d46/dac/ce7 0
2026-03-10T14:08:41.640 INFO:tasks.workunit.client.0.vm03.stdout:1/729: dwrite d0/d2/fe7 [0,4194304] 0
2026-03-10T14:08:41.641 INFO:tasks.workunit.client.0.vm03.stdout:6/725: creat d8/db/d49/fe1 x:0 0 0
2026-03-10T14:08:41.644 INFO:tasks.workunit.client.0.vm03.stdout:1/730: chown d0/d2/df/f43 674777896 1
2026-03-10T14:08:41.648 INFO:tasks.workunit.client.0.vm03.stdout:8/816: mkdir da/d3c/d51/d85/dbb/d106 0
2026-03-10T14:08:41.650 INFO:tasks.workunit.client.0.vm03.stdout:4/755: unlink d5/d9/db/da7/db9/cd4 0
2026-03-10T14:08:41.651 INFO:tasks.workunit.client.0.vm03.stdout:8/817: chown da/d3c/f9e 12252 1
2026-03-10T14:08:41.652 INFO:tasks.workunit.client.0.vm03.stdout:3/812: truncate d1d/d33/d47/d53/d68/dcf/de7/d41/f70 247880 0
2026-03-10T14:08:41.653 INFO:tasks.workunit.client.0.vm03.stdout:3/813: stat fb 0
2026-03-10T14:08:41.653 INFO:tasks.workunit.client.0.vm03.stdout:8/818: write da/d58/d5f/dbc/d70/d7d/f9c [4869252,78470] 0
2026-03-10T14:08:41.654 INFO:tasks.workunit.client.0.vm03.stdout:8/819: chown da/d3c/d51/d85/dbb/d106 1 1
2026-03-10T14:08:41.667 INFO:tasks.workunit.client.0.vm03.stdout:5/866: symlink d4/d16/d19/d23/db8/l110 0
2026-03-10T14:08:41.668 INFO:tasks.workunit.client.0.vm03.stdout:7/629: fsync d5/d9/d14/d26/d5f/f8c 0
2026-03-10T14:08:41.669 INFO:tasks.workunit.client.0.vm03.stdout:0/712: read d3/d4d/f49 [3318666,98895] 0
2026-03-10T14:08:41.671 INFO:tasks.workunit.client.0.vm03.stdout:0/713: write d3/d11/d2c/d4a/fcf [138748,50927] 0
2026-03-10T14:08:41.678 INFO:tasks.workunit.client.0.vm03.stdout:4/756: mkdir d5/db4/dd2/dff 0
2026-03-10T14:08:41.685 INFO:tasks.workunit.client.0.vm03.stdout:1/731: dread d0/d2/df/d16/d41/fa8 [4194304,4194304] 0
2026-03-10T14:08:41.688 INFO:tasks.workunit.client.0.vm03.stdout:1/732: chown d0/d2/db6/le9 11218138 1
2026-03-10T14:08:41.692 INFO:tasks.workunit.client.0.vm03.stdout:8/820: rename da/d3c/d4b/d4c/fae to da/d3c/d51/d85/d104/f107 0
2026-03-10T14:08:41.693 INFO:tasks.workunit.client.0.vm03.stdout:0/714: dwrite d3/d16/d21/d3c/fc0 [0,4194304] 0
2026-03-10T14:08:41.695 INFO:tasks.workunit.client.0.vm03.stdout:3/814: dread d1d/d33/d47/d53/d68/dcf/de7/f6b [0,4194304] 0
2026-03-10T14:08:41.701 INFO:tasks.workunit.client.0.vm03.stdout:9/714: dwrite d2/d29/d9a/fb7 [0,4194304] 0
2026-03-10T14:08:41.714 INFO:tasks.workunit.client.0.vm03.stdout:6/726: mkdir d8/db/dd0/de2 0
2026-03-10T14:08:41.714 INFO:tasks.workunit.client.0.vm03.stdout:6/727: fdatasync d8/d1b/d1c/f73 0
2026-03-10T14:08:41.727 INFO:tasks.workunit.client.0.vm03.stdout:7/630: unlink d5/l25 0
2026-03-10T14:08:41.733 INFO:tasks.workunit.client.0.vm03.stdout:4/757: rename d5/d47/d5b/dbe/dc4 to d5/d9/db/da7/dcc/d100 0
2026-03-10T14:08:41.736 INFO:tasks.workunit.client.0.vm03.stdout:3/815: rmdir d1d/d39 39
2026-03-10T14:08:41.751 INFO:tasks.workunit.client.0.vm03.stdout:8/821: truncate da/d3a/f9a 593940 0
2026-03-10T14:08:41.751 INFO:tasks.workunit.client.0.vm03.stdout:2/703: getdents d5/d10 0
2026-03-10T14:08:41.751 INFO:tasks.workunit.client.0.vm03.stdout:7/631: creat d5/d9/d14/d26/d5f/fdb x:0 0 0
2026-03-10T14:08:41.756 INFO:tasks.workunit.client.0.vm03.stdout:3/816: unlink d1d/d33/d65/d5d/fe9 0
2026-03-10T14:08:41.761 INFO:tasks.workunit.client.0.vm03.stdout:9/715: mknod d2/d29/ddc/ce7 0
2026-03-10T14:08:41.763 INFO:tasks.workunit.client.0.vm03.stdout:9/716: chown d2/d14/d2b/d43/ddd 1 1
2026-03-10T14:08:41.763 INFO:tasks.workunit.client.0.vm03.stdout:7/632: creat d5/d9/d35/dab/fdc x:0 0 0
2026-03-10T14:08:41.764 INFO:tasks.workunit.client.0.vm03.stdout:8/822: mknod da/d3c/d51/d85/c108 0
2026-03-10T14:08:41.768 INFO:tasks.workunit.client.0.vm03.stdout:5/867: dwrite d4/d16/fac [0,4194304] 0
2026-03-10T14:08:41.772 INFO:tasks.workunit.client.0.vm03.stdout:4/758: truncate d5/d9/db/f6d 27179 0
2026-03-10T14:08:41.775 INFO:tasks.workunit.client.0.vm03.stdout:9/717: mknod d2/d14/d2b/d76/ce8 0
2026-03-10T14:08:41.777 INFO:tasks.workunit.client.0.vm03.stdout:6/728: read d8/d11/da0/dbf/d5c/d60/f82 [2385847,96597] 0
2026-03-10T14:08:41.780 INFO:tasks.workunit.client.0.vm03.stdout:9/718: stat d2/d14/f96 0
2026-03-10T14:08:41.781 INFO:tasks.workunit.client.0.vm03.stdout:7/633: unlink d5/d9/d14/d26/d36/cd7 0
2026-03-10T14:08:41.786 INFO:tasks.workunit.client.0.vm03.stdout:1/733: dwrite d0/d18/f75 [0,4194304] 0
2026-03-10T14:08:41.790 INFO:tasks.workunit.client.0.vm03.stdout:2/704: truncate d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/f8c 1186559 0
2026-03-10T14:08:41.813 INFO:tasks.workunit.client.0.vm03.stdout:0/715: write d3/d16/f9e [3338504,68708] 0
2026-03-10T14:08:41.813 INFO:tasks.workunit.client.0.vm03.stdout:7/634: creat d5/d9/d14/d26/d36/d51/dc8/fdd x:0 0 0
2026-03-10T14:08:41.822 INFO:tasks.workunit.client.0.vm03.stdout:2/705: mknod d5/d10/da3/dab/ce6 0
2026-03-10T14:08:41.822 INFO:tasks.workunit.client.0.vm03.stdout:5/868: rmdir d4/d6/de 39
2026-03-10T14:08:41.822 INFO:tasks.workunit.client.0.vm03.stdout:4/759: dread d5/d9/db/d19/d34/f4c [0,4194304] 0
2026-03-10T14:08:41.828 INFO:tasks.workunit.client.0.vm03.stdout:6/729: rename d8/d11/c16 to d8/db/df/ce3 0
2026-03-10T14:08:41.830 INFO:tasks.workunit.client.0.vm03.stdout:5/869: dwrite d4/d13/d8f/fd2 [0,4194304] 0
2026-03-10T14:08:41.836 INFO:tasks.workunit.client.0.vm03.stdout:1/734: mkdir d0/d2/df/d16/ded/df1 0
2026-03-10T14:08:41.841 INFO:tasks.workunit.client.0.vm03.stdout:0/716: dwrite d3/d4d/f2a [0,4194304] 0
2026-03-10T14:08:41.846 INFO:tasks.workunit.client.0.vm03.stdout:0/717: write d3/d4d/d8f/fd1 [191410,89616] 0
2026-03-10T14:08:41.856 INFO:tasks.workunit.client.0.vm03.stdout:6/730: creat d8/d11/da0/dbf/d5c/fe4 x:0 0 0
2026-03-10T14:08:41.860 INFO:tasks.workunit.client.0.vm03.stdout:2/706: creat d5/d10/d1f/d4f/dbe/fe7 x:0 0 0
2026-03-10T14:08:41.864 INFO:tasks.workunit.client.0.vm03.stdout:0/718: mkdir d3/d11/d2c/d4a/d4b/d89/db9/de6/de8 0
2026-03-10T14:08:41.864 INFO:tasks.workunit.client.0.vm03.stdout:0/719: dread - d3/d4d/fcc zero size
2026-03-10T14:08:41.865 INFO:tasks.workunit.client.0.vm03.stdout:5/870: link d4/d16/d19/d23/db8/l110 d4/d13/d1f/d8c/da7/l111 0
2026-03-10T14:08:41.865 INFO:tasks.workunit.client.0.vm03.stdout:0/720: fsync d3/d46/da9/fc6 0
2026-03-10T14:08:41.865 INFO:tasks.workunit.client.0.vm03.stdout:5/871: readlink d4/d6/ddc/lfd 0
2026-03-10T14:08:41.866 INFO:tasks.workunit.client.0.vm03.stdout:5/872: chown d4/d16/df4/c104 1105 1
2026-03-10T14:08:41.867 INFO:tasks.workunit.client.0.vm03.stdout:5/873: chown d4/d16/d19/d23/d3f/df3/ff7 3848260 1
2026-03-10T14:08:41.870 INFO:tasks.workunit.client.0.vm03.stdout:0/721: mknod d3/d11/d2c/d4a/d7b/ce9 0
2026-03-10T14:08:41.872 INFO:tasks.workunit.client.0.vm03.stdout:5/874: truncate d4/d13/d8f/f91 4399479 0
2026-03-10T14:08:41.892 INFO:tasks.workunit.client.0.vm03.stdout:3/817: dwrite d1d/d33/d47/d53/d68/dcf/de7/f87 [4194304,4194304] 0
2026-03-10T14:08:41.894 INFO:tasks.workunit.client.0.vm03.stdout:3/818: stat d1d/d33/d65/d5d/dae 0
2026-03-10T14:08:41.909 INFO:tasks.workunit.client.0.vm03.stdout:2/707: dread d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 [0,4194304] 0
2026-03-10T14:08:41.915 INFO:tasks.workunit.client.0.vm03.stdout:3/819: dread d1d/d33/d47/d53/d68/dcf/de7/d41/f75 [0,4194304] 0
2026-03-10T14:08:41.926 INFO:tasks.workunit.client.0.vm03.stdout:2/708: unlink d5/d10/d17/f26 0
2026-03-10T14:08:41.932 INFO:tasks.workunit.client.0.vm03.stdout:5/875: dread d4/d13/d43/f58 [0,4194304] 0
2026-03-10T14:08:41.939 INFO:tasks.workunit.client.0.vm03.stdout:8/823: dwrite da/d3c/d4b/d69/fb5 [0,4194304] 0
2026-03-10T14:08:41.940 INFO:tasks.workunit.client.0.vm03.stdout:6/731: dread d8/db/d12/d64/fa8 [0,4194304] 0
2026-03-10T14:08:41.949 INFO:tasks.workunit.client.0.vm03.stdout:3/820: rename l9 to d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/da6/db7/l103 0
2026-03-10T14:08:41.958 INFO:tasks.workunit.client.0.vm03.stdout:3/821: fdatasync d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e/fc3 0
2026-03-10T14:08:41.958 INFO:tasks.workunit.client.0.vm03.stdout:9/719: write d2/f2c [286768,65401] 0
2026-03-10T14:08:41.969 INFO:tasks.workunit.client.0.vm03.stdout:8/824: chown da/d58/d5f/dbc/c55 6086815 1
2026-03-10T14:08:41.975 INFO:tasks.workunit.client.0.vm03.stdout:6/732: symlink d8/db/d49/d6c/d32/d3a/le5 0
2026-03-10T14:08:41.976 INFO:tasks.workunit.client.0.vm03.stdout:9/720: mknod d2/d29/d33/d41/d46/ce9 0
2026-03-10T14:08:41.976 INFO:tasks.workunit.client.0.vm03.stdout:2/709: getdents d5/dac 0
2026-03-10T14:08:41.977 INFO:tasks.workunit.client.0.vm03.stdout:2/710: readlink d5/d35/l47 0
2026-03-10T14:08:41.979 INFO:tasks.workunit.client.0.vm03.stdout:9/721: fdatasync d2/d14/d2b/d43/f7c 0
2026-03-10T14:08:41.982 INFO:tasks.workunit.client.0.vm03.stdout:2/711: truncate d5/d10/d1f/d4f/d76/da7/f7e 4296562 0
2026-03-10T14:08:41.986 INFO:tasks.workunit.client.0.vm03.stdout:6/733: rmdir d8/db/d49/d76/db5 0
2026-03-10T14:08:41.986 INFO:tasks.workunit.client.0.vm03.stdout:9/722: rename d2/d14/d2b/d34/ldf to d2/d29/d33/d60/d8c/de1/lea 0
2026-03-10T14:08:41.993 INFO:tasks.workunit.client.0.vm03.stdout:9/723: unlink d2/d29/d33/d41/f53 0
2026-03-10T14:08:41.996 INFO:tasks.workunit.client.0.vm03.stdout:6/734: mknod d8/db/ce6 0
2026-03-10T14:08:41.997 INFO:tasks.workunit.client.0.vm03.stdout:3/822: sync
2026-03-10T14:08:41.997 INFO:tasks.workunit.client.0.vm03.stdout:2/712: sync
2026-03-10T14:08:42.000 INFO:tasks.workunit.client.0.vm03.stdout:9/724: getdents d2/d14/d2b/d43 0
2026-03-10T14:08:42.002 INFO:tasks.workunit.client.0.vm03.stdout:7/635: dwrite d5/d9/d14/d26/d5f/f6b [0,4194304] 0
2026-03-10T14:08:42.017 INFO:tasks.workunit.client.0.vm03.stdout:6/735: fsync d8/db/d12/fa1 0
2026-03-10T14:08:42.019 INFO:tasks.workunit.client.0.vm03.stdout:1/735: write d0/d42/f80 [633338,41380] 0
2026-03-10T14:08:42.026 INFO:tasks.workunit.client.0.vm03.stdout:4/760: dwrite d5/d9/db/d19/d38/d53/d55/fa8 [0,4194304] 0
2026-03-10T14:08:42.029 INFO:tasks.workunit.client.0.vm03.stdout:4/761: chown d5/d9/db/d19/d34/f58 1 1
2026-03-10T14:08:42.043 INFO:tasks.workunit.client.0.vm03.stdout:9/725: fdatasync d2/d29/d38/fbf 0
2026-03-10T14:08:42.043 INFO:tasks.workunit.client.0.vm03.stdout:7/636: truncate d5/f8f 377233 0
2026-03-10T14:08:42.044 INFO:tasks.workunit.client.0.vm03.stdout:0/722: dwrite d3/d46/dac/d79/f8d [0,4194304] 0
2026-03-10T14:08:42.051 INFO:tasks.workunit.client.0.vm03.stdout:2/713: mknod d5/db4/d74/d83/dc1/ce8 0
2026-03-10T14:08:42.056 INFO:tasks.workunit.client.0.vm03.stdout:9/726: mkdir d2/d29/dcd/d72/d9d/deb 0
2026-03-10T14:08:42.059 INFO:tasks.workunit.client.0.vm03.stdout:3/823: dwrite d1d/d33/d47/d53/d68/dcf/de7/f9b [0,4194304] 0
2026-03-10T14:08:42.065 INFO:tasks.workunit.client.0.vm03.stdout:6/736: symlink d8/db/df/le7 0
2026-03-10T14:08:42.083 INFO:tasks.workunit.client.0.vm03.stdout:4/762: mknod d5/d47/d5b/d64/c101 0
2026-03-10T14:08:42.096 INFO:tasks.workunit.client.0.vm03.stdout:2/714: mkdir d5/d10/d1f/d4f/d76/da7/de9 0
2026-03-10T14:08:42.113 INFO:tasks.workunit.client.0.vm03.stdout:3/824: fsync d1d/d33/d47/d53/d68/dcf/de7/f9b 0
2026-03-10T14:08:42.113 INFO:tasks.workunit.client.0.vm03.stdout:3/825: fsync d1d/d33/d65/d48/fa5 0
2026-03-10T14:08:42.114 INFO:tasks.workunit.client.0.vm03.stdout:3/826: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/lc6 470 1
2026-03-10T14:08:42.119 INFO:tasks.workunit.client.0.vm03.stdout:4/763: rename d5/d9/d2b to d5/d47/d62/d8a/dd0/d102 0
2026-03-10T14:08:42.147 INFO:tasks.workunit.client.0.vm03.stdout:5/876: truncate d4/f9e 293562 0
2026-03-10T14:08:42.148 INFO:tasks.workunit.client.0.vm03.stdout:2/715: getdents d5/dac 0
2026-03-10T14:08:42.159 INFO:tasks.workunit.client.0.vm03.stdout:4/764: truncate d5/d47/d5b/f84 641576 0
2026-03-10T14:08:42.160 INFO:tasks.workunit.client.0.vm03.stdout:5/877: symlink d4/d6/ddc/l112 0
2026-03-10T14:08:42.163 INFO:tasks.workunit.client.0.vm03.stdout:2/716: dread d5/d10/d17/f18 [0,4194304] 0
2026-03-10T14:08:42.167 INFO:tasks.workunit.client.0.vm03.stdout:3/827: creat d1d/d33/d65/d5d/dae/db9/dec/f104 x:0 0 0
2026-03-10T14:08:42.172 INFO:tasks.workunit.client.0.vm03.stdout:3/828: dwrite d1d/d33/d65/d48/fa5 [0,4194304] 0
2026-03-10T14:08:42.175 INFO:tasks.workunit.client.0.vm03.stdout:5/878: chown d4/d40/d4e 183777950 1
2026-03-10T14:08:42.178 INFO:tasks.workunit.client.0.vm03.stdout:2/717: creat d5/d10/da3/dab/fea x:0 0 0
2026-03-10T14:08:42.182 INFO:tasks.workunit.client.0.vm03.stdout:4/765: creat d5/d47/d62/de9/f103 x:0 0 0
2026-03-10T14:08:42.182 INFO:tasks.workunit.client.0.vm03.stdout:5/879: unlink d4/d13/d1f/d8c/fa4 0
2026-03-10T14:08:42.187 INFO:tasks.workunit.client.0.vm03.stdout:3/829: dwrite d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/fb8 [0,4194304] 0
2026-03-10T14:08:42.191 INFO:tasks.workunit.client.0.vm03.stdout:5/880: stat d4/d16/d19/d23/db8/l110 0
2026-03-10T14:08:42.193 INFO:tasks.workunit.client.0.vm03.stdout:8/825: write da/d3c/d51/d85/d104/f107 [220334,54713] 0
2026-03-10T14:08:42.193 INFO:tasks.workunit.client.0.vm03.stdout:4/766: rename d5/d9/c74 to d5/d47/d5b/c104 0
2026-03-10T14:08:42.197 INFO:tasks.workunit.client.0.vm03.stdout:8/826: read da/f30 [1132585,81823] 0
2026-03-10T14:08:42.211 INFO:tasks.workunit.client.0.vm03.stdout:2/718: sync
2026-03-10T14:08:42.222 INFO:tasks.workunit.client.0.vm03.stdout:3/830: creat d1d/d33/d47/dac/dba/f105 x:0 0 0
2026-03-10T14:08:42.234 INFO:tasks.workunit.client.0.vm03.stdout:5/881: truncate d4/f82 1310258 0
2026-03-10T14:08:42.247 INFO:tasks.workunit.client.0.vm03.stdout:8/827: symlink da/d3a/d44/l109 0
2026-03-10T14:08:42.248 INFO:tasks.workunit.client.0.vm03.stdout:8/828: dread - da/d58/d5f/f102 zero size
2026-03-10T14:08:42.262 INFO:tasks.workunit.client.0.vm03.stdout:5/882: truncate d4/d13/d43/f94 761515 0
2026-03-10T14:08:42.266 INFO:tasks.workunit.client.0.vm03.stdout:8/829: creat da/d3a/dce/f10a x:0 0 0
2026-03-10T14:08:42.311 INFO:tasks.workunit.client.0.vm03.stdout:4/767: dread d5/d9/db/ff [0,4194304] 0
2026-03-10T14:08:42.311 INFO:tasks.workunit.client.0.vm03.stdout:8/830: readlink da/d58/d6c/lc5 0
2026-03-10T14:08:42.311 INFO:tasks.workunit.client.0.vm03.stdout:4/768: dread d5/d9/db/ff [0,4194304] 0
2026-03-10T14:08:42.311 INFO:tasks.workunit.client.0.vm03.stdout:9/727: rmdir d2 39
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:3/831: getdents d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/de4 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:5/883: write d4/d6/de/fe7 [2060298,39308] 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:5/884: chown d4/d13/d1f/fb1 58358983 1
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:7/637: dwrite d5/d9/d14/d26/d39/f63 [0,4194304] 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:9/728: truncate d2/d29/dcd/d9c/fe5 611584 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:2/719: rename d5/d10/d1f/d4f/d76/da7/lcb to d5/d10/d1f/d4f/d76/da7/d54/d5f/leb 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:5/885: dread d4/d16/d19/d23/db8/fc3 [0,4194304] 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:5/886: chown d4/d40/dbc/fbf 30549 1
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:7/638: rename d5/d98/dd9 to d5/d9/d14/d21/d28/dde 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:2/720: creat d5/d10/d1f/d4f/d76/da7/d40/d59/da2/fec x:0 0 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:7/639: symlink d5/d9/d35/dab/ldf 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:3/832: dwrite d1d/d33/d47/d53/d68/dcf/de7/d41/f75 [8388608,4194304] 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:7/640: creat d5/d9/d14/d26/d39/db1/fe0 x:0 0 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:2/721: unlink d5/d10/d1f/d4f/d76/da7/d54/f8e 0
2026-03-10T14:08:42.312 INFO:tasks.workunit.client.0.vm03.stdout:7/641: unlink d5/d9/d35/f85 0
2026-03-10T14:08:42.317 INFO:tasks.workunit.client.0.vm03.stdout:3/833: dwrite d1d/d59/f69 [0,4194304] 0
2026-03-10T14:08:42.322 INFO:tasks.workunit.client.0.vm03.stdout:2/722: creat d5/db4/d74/dcf/fed x:0 0 0
2026-03-10T14:08:42.323 INFO:tasks.workunit.client.0.vm03.stdout:2/723: dread - d5/d10/d1f/d4f/d76/da7/d40/d59/da2/fdf zero size
2026-03-10T14:08:42.324 INFO:tasks.workunit.client.0.vm03.stdout:2/724: chown d5/d35/c6a 27840 1
2026-03-10T14:08:42.324 INFO:tasks.workunit.client.0.vm03.stdout:2/725: chown d5/dac 1998 1
2026-03-10T14:08:42.336 INFO:tasks.workunit.client.0.vm03.stdout:3/834: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/dde/d100/f106 x:0 0 0
2026-03-10T14:08:42.338 INFO:tasks.workunit.client.0.vm03.stdout:3/835: read d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/fa2 [834231,52094] 0
2026-03-10T14:08:42.341 INFO:tasks.workunit.client.0.vm03.stdout:2/726: getdents d5/d10/d1f/d4f/d76/da7/d40/d59/d85 0
2026-03-10T14:08:42.342 INFO:tasks.workunit.client.0.vm03.stdout:3/836: fdatasync d1d/d33/d65/d48/ffe 0
2026-03-10T14:08:42.361 INFO:tasks.workunit.client.0.vm03.stdout:3/837: rmdir d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e 39
2026-03-10T14:08:42.366 INFO:tasks.workunit.client.0.vm03.stdout:2/727: truncate d5/d10/d1f/d4f/d76/da7/f44 453738 0
2026-03-10T14:08:42.374 INFO:tasks.workunit.client.0.vm03.stdout:3/838: fsync fb 0
2026-03-10T14:08:42.374 INFO:tasks.workunit.client.0.vm03.stdout:1/736: dwrite d0/d2/f9 [0,4194304] 0
2026-03-10T14:08:42.386 INFO:tasks.workunit.client.0.vm03.stdout:1/737: creat d0/d2/df/d16/d20/ff2 x:0 0 0
2026-03-10T14:08:42.388 INFO:tasks.workunit.client.0.vm03.stdout:1/738: fsync d0/d18/d3b/db7/fbd 0
2026-03-10T14:08:42.393 INFO:tasks.workunit.client.0.vm03.stdout:8/831: dread f6 [0,4194304] 0
2026-03-10T14:08:42.396 INFO:tasks.workunit.client.0.vm03.stdout:1/739: symlink d0/d18/d1d/dc4/lf3 0
2026-03-10T14:08:42.405 INFO:tasks.workunit.client.0.vm03.stdout:1/740: dread d0/d42/fcc [0,4194304] 0
2026-03-10T14:08:42.408 INFO:tasks.workunit.client.0.vm03.stdout:3/839: sync
2026-03-10T14:08:42.410 INFO:tasks.workunit.client.0.vm03.stdout:9/729: sync
2026-03-10T14:08:42.410 INFO:tasks.workunit.client.0.vm03.stdout:5/887: sync
2026-03-10T14:08:42.410 INFO:tasks.workunit.client.0.vm03.stdout:2/728: sync
2026-03-10T14:08:42.411 INFO:tasks.workunit.client.0.vm03.stdout:5/888: chown d4/d13/ccd 0 1
2026-03-10T14:08:42.420 INFO:tasks.workunit.client.0.vm03.stdout:0/723: write d3/d4d/d47/fcd [704798,23427] 0
2026-03-10T14:08:42.426 INFO:tasks.workunit.client.0.vm03.stdout:6/737: dwrite d8/d1b/d1c/f50 [0,4194304] 0
2026-03-10T14:08:42.426 INFO:tasks.workunit.client.0.vm03.stdout:5/889: dwrite d4/d16/d19/d23/db8/fba [8388608,4194304] 0
2026-03-10T14:08:42.467 INFO:tasks.workunit.client.0.vm03.stdout:9/730: mkdir d2/d29/d38/dec 0
2026-03-10T14:08:42.477 INFO:tasks.workunit.client.0.vm03.stdout:8/832: dread da/d3c/f48 [0,4194304] 0
2026-03-10T14:08:42.479 INFO:tasks.workunit.client.0.vm03.stdout:4/769: dwrite d5/d9/db/d19/d38/f86 [0,4194304] 0
2026-03-10T14:08:42.480 INFO:tasks.workunit.client.0.vm03.stdout:8/833: chown da/d58/d5f/f102 7 1
2026-03-10T14:08:42.481 INFO:tasks.workunit.client.0.vm03.stdout:4/770: readlink d5/d6e/ld9 0
2026-03-10T14:08:42.485 INFO:tasks.workunit.client.0.vm03.stdout:5/890: write d4/d16/d19/d23/d3f/fb9 [1034403,53813] 0
2026-03-10T14:08:42.486 INFO:tasks.workunit.client.0.vm03.stdout:3/840: fdatasync d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/fa9 0
2026-03-10T14:08:42.487 INFO:tasks.workunit.client.0.vm03.stdout:0/724: dread d3/f28 [0,4194304] 0
2026-03-10T14:08:42.504 INFO:tasks.workunit.client.0.vm03.stdout:0/725: dwrite d3/d11/d2c/d4a/d4b/d89/db6/fe1 [0,4194304] 0
2026-03-10T14:08:42.507 INFO:tasks.workunit.client.0.vm03.stdout:7/642: truncate d5/d9/d14/d26/d5f/f68 3827049 0
2026-03-10T14:08:42.507 INFO:tasks.workunit.client.0.vm03.stdout:9/731: creat d2/d29/dcd/dd8/fed x:0 0 0
2026-03-10T14:08:42.507 INFO:tasks.workunit.client.0.vm03.stdout:9/732: fsync d2/d29/d9a/fb7 0
2026-03-10T14:08:42.511 INFO:tasks.workunit.client.0.vm03.stdout:5/891: dread d4/d13/d1f/d84/f99 [0,4194304] 0
2026-03-10T14:08:42.511 INFO:tasks.workunit.client.0.vm03.stdout:5/892: fdatasync d4/d40/f96 0
2026-03-10T14:08:42.511 INFO:tasks.workunit.client.0.vm03.stdout:4/771: fdatasync d5/d47/d5b/f79 0
2026-03-10T14:08:42.516 INFO:tasks.workunit.client.0.vm03.stdout:8/834: rename da/d3c/d51/d8b to da/d3c/d4b/dd2/d10b 0
2026-03-10T14:08:42.516 INFO:tasks.workunit.client.0.vm03.stdout:6/738: mknod d8/d1b/ce8 0
2026-03-10T14:08:42.518 INFO:tasks.workunit.client.0.vm03.stdout:8/835: dread - da/d3a/d44/dfb/dd7/ff3 zero size
2026-03-10T14:08:42.523 INFO:tasks.workunit.client.0.vm03.stdout:9/733: dread d2/d29/dcd/fb6 [0,4194304] 0
2026-03-10T14:08:42.528 INFO:tasks.workunit.client.0.vm03.stdout:3/841: chown d1d/d33/d47/d53/d68/dcf/de7/d41/f70 5 1
2026-03-10T14:08:42.542 INFO:tasks.workunit.client.0.vm03.stdout:2/729: creat d5/d2a/fee x:0 0 0
2026-03-10T14:08:42.550 INFO:tasks.workunit.client.0.vm03.stdout:5/893: stat d4/d13/d43/f94 0
2026-03-10T14:08:42.561 INFO:tasks.workunit.client.0.vm03.stdout:4/772: fsync d5/d47/d5b/fba 0
2026-03-10T14:08:42.565 INFO:tasks.workunit.client.0.vm03.stdout:6/739: creat d8/d11/da0/dca/fe9 x:0 0 0
2026-03-10T14:08:42.570 INFO:tasks.workunit.client.0.vm03.stdout:8/836: creat da/d3c/d51/d85/f10c x:0 0 0
2026-03-10T14:08:42.570 INFO:tasks.workunit.client.0.vm03.stdout:8/837: readlink da/d24/db4/l105 0
2026-03-10T14:08:42.577 INFO:tasks.workunit.client.0.vm03.stdout:1/741: getdents d0/d2/df/d16/d20 0
2026-03-10T14:08:42.580 INFO:tasks.workunit.client.0.vm03.stdout:9/734: unlink d2/d29/ddc/ce7 0
2026-03-10T14:08:42.583 INFO:tasks.workunit.client.0.vm03.stdout:6/740: sync
2026-03-10T14:08:42.588 INFO:tasks.workunit.client.0.vm03.stdout:5/894: mknod d4/d16/d19/d6e/da3/c113 0
2026-03-10T14:08:42.588 INFO:tasks.workunit.client.0.vm03.stdout:5/895: stat d4/d6/cb7 0
2026-03-10T14:08:42.591 INFO:tasks.workunit.client.0.vm03.stdout:7/643: write d5/d9/d14/d26/f9f [622726,113518] 0
2026-03-10T14:08:42.596 INFO:tasks.workunit.client.0.vm03.stdout:1/742: dread d0/f24 [0,4194304] 0
2026-03-10T14:08:42.599 INFO:tasks.workunit.client.0.vm03.stdout:4/773: mknod d5/d47/d62/d8a/c105 0
2026-03-10T14:08:42.607 INFO:tasks.workunit.client.0.vm03.stdout:0/726: dwrite d3/d46/dac/fb0 [0,4194304] 0
2026-03-10T14:08:42.613 INFO:tasks.workunit.client.0.vm03.stdout:8/838: fsync da/d3c/d4b/d4c/f95 0
2026-03-10T14:08:42.615 INFO:tasks.workunit.client.0.vm03.stdout:2/730: dread d5/d10/d1f/d4f/d76/da7/d40/d59/f71 [4194304,4194304] 0
2026-03-10T14:08:42.620 INFO:tasks.workunit.client.0.vm03.stdout:9/735: creat d2/d29/d38/fee x:0 0 0
2026-03-10T14:08:42.623 INFO:tasks.workunit.client.0.vm03.stdout:3/842: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/de4/f107 x:0 0 0
2026-03-10T14:08:42.626 INFO:tasks.workunit.client.0.vm03.stdout:5/896: mknod d4/d16/df4/c114 0
2026-03-10T14:08:42.628 INFO:tasks.workunit.client.0.vm03.stdout:1/743: dread d0/d2/df/d16/f4f [0,4194304] 0
2026-03-10T14:08:42.629 INFO:tasks.workunit.client.0.vm03.stdout:4/774: symlink d5/d9/db/d19/d38/d7b/daa/daf/l106 0
2026-03-10T14:08:42.629 INFO:tasks.workunit.client.0.vm03.stdout:4/775: readlink d5/d9/db/d19/l30 0
2026-03-10T14:08:42.632 INFO:tasks.workunit.client.0.vm03.stdout:0/727: symlink d3/d11/d76/db5/ddb/lea 0
2026-03-10T14:08:42.633 INFO:tasks.workunit.client.0.vm03.stdout:0/728: chown d3/d16/c42 1243541185 1
2026-03-10T14:08:42.634 INFO:tasks.workunit.client.0.vm03.stdout:8/839: truncate da/d58/fc4 1303760 0
2026-03-10T14:08:42.635 INFO:tasks.workunit.client.0.vm03.stdout:8/840: chown da/d3c/d51/c6e 1372576 1
2026-03-10T14:08:42.635 INFO:tasks.workunit.client.0.vm03.stdout:9/736: truncate d2/d14/fb5 718174 0
2026-03-10T14:08:42.636 INFO:tasks.workunit.client.0.vm03.stdout:3/843: creat d1d/d33/d47/dac/dba/f108 x:0 0 0
2026-03-10T14:08:42.639 INFO:tasks.workunit.client.0.vm03.stdout:5/897: sync
2026-03-10T14:08:42.644 INFO:tasks.workunit.client.0.vm03.stdout:2/731: mknod d5/d10/d1f/d4f/d76/da7/cef 0
2026-03-10T14:08:42.645 INFO:tasks.workunit.client.0.vm03.stdout:0/729: unlink d3/d11/d76/c91 0
2026-03-10T14:08:42.648 INFO:tasks.workunit.client.0.vm03.stdout:8/841: chown da/d24/l2c 0 1
2026-03-10T14:08:42.656 INFO:tasks.workunit.client.0.vm03.stdout:9/737: mknod d2/d29/d33/d41/daa/cef 0
2026-03-10T14:08:42.660 INFO:tasks.workunit.client.0.vm03.stdout:2/732: dread d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/f73 [0,4194304] 0
2026-03-10T14:08:42.661 INFO:tasks.workunit.client.0.vm03.stdout:1/744: write d0/d42/f66 [3320733,49902] 0
2026-03-10T14:08:42.662 INFO:tasks.workunit.client.0.vm03.stdout:6/741: symlink d8/db/d49/d6c/d83/da3/lea 0
2026-03-10T14:08:42.667 INFO:tasks.workunit.client.0.vm03.stdout:7/644: rename d5/d9/d3e to d5/d9/d14/d26/d36/de1 0
2026-03-10T14:08:42.668 INFO:tasks.workunit.client.0.vm03.stdout:3/844: write d1d/d33/d47/d53/d68/f86 [33410,42828] 0
2026-03-10T14:08:42.672 INFO:tasks.workunit.client.0.vm03.stdout:5/898: mkdir d4/d40/d4e/d115 0
2026-03-10T14:08:42.673 INFO:tasks.workunit.client.0.vm03.stdout:5/899: dread d4/d40/dbc/d105/f10d [0,4194304] 0
2026-03-10T14:08:42.676 INFO:tasks.workunit.client.0.vm03.stdout:9/738: mknod d2/d29/dcd/d72/cf0 0
2026-03-10T14:08:42.677 INFO:tasks.workunit.client.0.vm03.stdout:0/730: dwrite d3/d4d/da0/fbe [0,4194304] 0
2026-03-10T14:08:42.679 INFO:tasks.workunit.client.0.vm03.stdout:1/745: rmdir d0/d2/df/d16 39
2026-03-10T14:08:42.680 INFO:tasks.workunit.client.0.vm03.stdout:6/742: stat d8/db/c9d 0
2026-03-10T14:08:42.686 INFO:tasks.workunit.client.0.vm03.stdout:4/776: rename d5/d9/l7f to d5/d47/d5b/dab/l107 0
2026-03-10T14:08:42.686 INFO:tasks.workunit.client.0.vm03.stdout:4/777: write d5/d9/db/f12 [777877,100656] 0
2026-03-10T14:08:42.689 INFO:tasks.workunit.client.0.vm03.stdout:7/645: mkdir d5/d9/d14/d26/d5f/de2 0
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.conf
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: Updating vm04:/etc/ceph/ceph.conf
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: Updating vm04:/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: Updating vm03:/etc/ceph/ceph.client.admin.keyring
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: pgmap v8: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 19 MiB/s rd, 43 MiB/s wr, 128 op/s
2026-03-10T14:08:42.691 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:42 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:42.695 INFO:tasks.workunit.client.0.vm03.stdout:3/845: write d1d/f7a [3596829,109995] 0
2026-03-10T14:08:42.696 INFO:tasks.workunit.client.0.vm03.stdout:3/846: dread - d1d/d33/d65/d5d/dae/db9/dec/f104 zero size
2026-03-10T14:08:42.699 INFO:tasks.workunit.client.0.vm03.stdout:8/842: mkdir da/d3c/d4b/df9/d10d 0
2026-03-10T14:08:42.699 INFO:tasks.workunit.client.0.vm03.stdout:1/746: rmdir d0/d42/d9f 39
2026-03-10T14:08:42.700 INFO:tasks.workunit.client.0.vm03.stdout:1/747: chown d0/d18/daa 0 1
2026-03-10T14:08:42.700 INFO:tasks.workunit.client.0.vm03.stdout:8/843: write da/d58/f72 [41767,63487] 0
2026-03-10T14:08:42.702 INFO:tasks.workunit.client.0.vm03.stdout:0/731: mknod d3/d11/d76/db5/ddb/dd4/ceb 0
2026-03-10T14:08:42.705 INFO:tasks.workunit.client.0.vm03.stdout:6/743: fsync d8/d11/f2a 0
2026-03-10T14:08:42.706 INFO:tasks.workunit.client.0.vm03.stdout:6/744: write d8/db/d49/d6c/fd2 [666279,86211] 0
2026-03-10T14:08:42.718 INFO:tasks.workunit.client.0.vm03.stdout:6/745: dread d8/db/df/f10 [0,4194304] 0
2026-03-10T14:08:42.719 INFO:tasks.workunit.client.0.vm03.stdout:7/646: mkdir d5/d9/d14/d26/d36/d51/dc8/de3 0
2026-03-10T14:08:42.719 INFO:tasks.workunit.client.0.vm03.stdout:4/778: mkdir d5/d9/db/d19/d38/d53/d55/d108 0
2026-03-10T14:08:42.719 INFO:tasks.workunit.client.0.vm03.stdout:5/900: chown d4/d13/d1f/d8c/da7/l111 254 1
2026-03-10T14:08:42.719 INFO:tasks.workunit.client.0.vm03.stdout:3/847: chown d1d/d39 4085262 1
2026-03-10T14:08:42.735 INFO:tasks.workunit.client.0.vm03.stdout:1/748: truncate d0/f24 5006529 0
2026-03-10T14:08:42.737 INFO:tasks.workunit.client.0.vm03.stdout:8/844: creat da/d36/d40/f10e x:0 0 0 2026-03-10T14:08:42.743 INFO:tasks.workunit.client.0.vm03.stdout:2/733: rename d5/d10/d31/c77 to d5/db4/d74/cf0 0 2026-03-10T14:08:42.745 INFO:tasks.workunit.client.0.vm03.stdout:9/739: creat d2/d14/ff1 x:0 0 0 2026-03-10T14:08:42.758 INFO:tasks.workunit.client.0.vm03.stdout:3/848: symlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/dce/l109 0 2026-03-10T14:08:42.758 INFO:tasks.workunit.client.0.vm03.stdout:1/749: rmdir d0/d42 39 2026-03-10T14:08:42.759 INFO:tasks.workunit.client.0.vm03.stdout:2/734: creat d5/dac/ff1 x:0 0 0 2026-03-10T14:08:42.760 INFO:tasks.workunit.client.0.vm03.stdout:7/647: read d5/d9/f1f [884719,125937] 0 2026-03-10T14:08:42.761 INFO:tasks.workunit.client.0.vm03.stdout:7/648: write d5/d9/d14/d26/d5f/d89/fbd [964991,7589] 0 2026-03-10T14:08:42.765 INFO:tasks.workunit.client.0.vm03.stdout:9/740: creat d2/d29/dcd/d72/ff2 x:0 0 0 2026-03-10T14:08:42.766 INFO:tasks.workunit.client.0.vm03.stdout:5/901: truncate d4/d13/d1f/fce 806864 0 2026-03-10T14:08:42.768 INFO:tasks.workunit.client.0.vm03.stdout:4/779: creat d5/d9/db/da7/dcc/d100/d8e/f109 x:0 0 0 2026-03-10T14:08:42.771 INFO:tasks.workunit.client.0.vm03.stdout:8/845: rename da/d24/lac to da/df7/l10f 0 2026-03-10T14:08:42.772 INFO:tasks.workunit.client.0.vm03.stdout:3/849: mknod d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/da6/c10a 0 2026-03-10T14:08:42.775 INFO:tasks.workunit.client.0.vm03.stdout:6/746: creat d8/db/feb x:0 0 0 2026-03-10T14:08:42.775 INFO:tasks.workunit.client.0.vm03.stdout:9/741: truncate d2/d14/d2b/d43/f5b 2391998 0 2026-03-10T14:08:42.780 INFO:tasks.workunit.client.0.vm03.stdout:5/902: dwrite d4/d13/d1f/d8c/fa6 [0,4194304] 0 2026-03-10T14:08:42.788 INFO:tasks.workunit.client.0.vm03.stdout:5/903: dwrite d4/d16/d19/d23/db8/fba [8388608,4194304] 0 2026-03-10T14:08:42.788 INFO:tasks.workunit.client.0.vm03.stdout:4/780: truncate d5/d9/db/d19/d38/d53/f83 4901172 0 
2026-03-10T14:08:42.790 INFO:tasks.workunit.client.0.vm03.stdout:0/732: getdents d3/d46/dac/d79 0 2026-03-10T14:08:42.790 INFO:tasks.workunit.client.0.vm03.stdout:8/846: mknod da/df7/c110 0 2026-03-10T14:08:42.802 INFO:tasks.workunit.client.0.vm03.stdout:6/747: dwrite d8/db/d12/f57 [0,4194304] 0 2026-03-10T14:08:42.803 INFO:tasks.workunit.client.0.vm03.stdout:6/748: chown d8/db/c9d 53 1 2026-03-10T14:08:42.807 INFO:tasks.workunit.client.0.vm03.stdout:3/850: unlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/de4/f107 0 2026-03-10T14:08:42.808 INFO:tasks.workunit.client.0.vm03.stdout:3/851: readlink d1d/l74 0 2026-03-10T14:08:42.812 INFO:tasks.workunit.client.0.vm03.stdout:6/749: truncate d8/d11/f35 1466535 0 2026-03-10T14:08:42.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": 
"client.admin"}]: dispatch 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: Updating vm03:/etc/ceph/ceph.conf 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: Updating vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: pgmap v8: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 19 MiB/s rd, 43 MiB/s wr, 128 op/s 2026-03-10T14:08:42.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:42 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:42.817 INFO:tasks.workunit.client.0.vm03.stdout:8/847: dread da/d24/d49/fbe [0,4194304] 0 2026-03-10T14:08:42.818 INFO:tasks.workunit.client.0.vm03.stdout:2/735: getdents d5/d10/d1f/d4f/d76/da7/d54/d5f 0 2026-03-10T14:08:42.818 INFO:tasks.workunit.client.0.vm03.stdout:8/848: fdatasync da/d3a/d44/fe8 0 2026-03-10T14:08:42.819 
INFO:tasks.workunit.client.0.vm03.stdout:8/849: write da/f33 [1408238,57047] 0 2026-03-10T14:08:42.820 INFO:tasks.workunit.client.0.vm03.stdout:8/850: dread - da/d58/d6c/feb zero size 2026-03-10T14:08:42.821 INFO:tasks.workunit.client.0.vm03.stdout:6/750: unlink d8/db/df/fbc 0 2026-03-10T14:08:42.824 INFO:tasks.workunit.client.0.vm03.stdout:6/751: dwrite d8/d1b/f7f [0,4194304] 0 2026-03-10T14:08:42.826 INFO:tasks.workunit.client.0.vm03.stdout:4/781: link d5/d47/c4d d5/db4/dd2/c10a 0 2026-03-10T14:08:42.826 INFO:tasks.workunit.client.0.vm03.stdout:6/752: readlink d8/db/d12/l6b 0 2026-03-10T14:08:42.837 INFO:tasks.workunit.client.0.vm03.stdout:0/733: creat d3/d4d/fec x:0 0 0 2026-03-10T14:08:42.849 INFO:tasks.workunit.client.0.vm03.stdout:5/904: getdents d4/d16/d19/d6e/da3 0 2026-03-10T14:08:42.855 INFO:tasks.workunit.client.0.vm03.stdout:1/750: write d0/d2/f46 [5608181,105727] 0 2026-03-10T14:08:42.857 INFO:tasks.workunit.client.0.vm03.stdout:7/649: truncate d5/d9/f30 1498206 0 2026-03-10T14:08:42.865 INFO:tasks.workunit.client.0.vm03.stdout:3/852: truncate d1d/f32 1505975 0 2026-03-10T14:08:42.875 INFO:tasks.workunit.client.0.vm03.stdout:1/751: read d0/d2/df/dab/f3e [3353530,90703] 0 2026-03-10T14:08:42.876 INFO:tasks.workunit.client.0.vm03.stdout:1/752: write d0/d2/df/d27/d7e/d81/fe2 [918074,109254] 0 2026-03-10T14:08:42.877 INFO:tasks.workunit.client.0.vm03.stdout:7/650: mknod d5/d9/d14/d26/d36/de1/dcc/ce4 0 2026-03-10T14:08:42.880 INFO:tasks.workunit.client.0.vm03.stdout:9/742: dwrite d2/d14/d2b/d43/f5b [0,4194304] 0 2026-03-10T14:08:42.884 INFO:tasks.workunit.client.0.vm03.stdout:2/736: creat d5/d10/d1f/d4f/d76/da7/ff2 x:0 0 0 2026-03-10T14:08:42.888 INFO:tasks.workunit.client.0.vm03.stdout:0/734: dwrite d3/d4d/d47/f48 [0,4194304] 0 2026-03-10T14:08:42.892 INFO:tasks.workunit.client.0.vm03.stdout:6/753: rename d8/db/d49/d6c/d32/d3a/le5 to d8/db/dd0/de2/lec 0 2026-03-10T14:08:42.892 INFO:tasks.workunit.client.0.vm03.stdout:6/754: dread - d8/db/feb zero size 
2026-03-10T14:08:42.897 INFO:tasks.workunit.client.0.vm03.stdout:6/755: dwrite d8/d1b/f7f [0,4194304] 0 2026-03-10T14:08:42.899 INFO:tasks.workunit.client.0.vm03.stdout:3/853: fsync d1d/d39/d51/fa0 0 2026-03-10T14:08:42.899 INFO:tasks.workunit.client.0.vm03.stdout:5/905: creat d4/d40/dbc/d105/f116 x:0 0 0 2026-03-10T14:08:42.902 INFO:tasks.workunit.client.0.vm03.stdout:3/854: truncate d1d/d33/d65/d48/ffe 831765 0 2026-03-10T14:08:42.911 INFO:tasks.workunit.client.0.vm03.stdout:7/651: dread d5/d9/d14/d26/d5f/f68 [0,4194304] 0 2026-03-10T14:08:42.912 INFO:tasks.workunit.client.0.vm03.stdout:8/851: getdents da/d24/d78 0 2026-03-10T14:08:42.915 INFO:tasks.workunit.client.0.vm03.stdout:5/906: creat d4/d16/d19/d6e/f117 x:0 0 0 2026-03-10T14:08:42.918 INFO:tasks.workunit.client.0.vm03.stdout:9/743: getdents d2/d14/d2b/d43/ddd 0 2026-03-10T14:08:42.918 INFO:tasks.workunit.client.0.vm03.stdout:9/744: stat d2/d29/dcd/d72/d9d/fa6 0 2026-03-10T14:08:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/745: chown d2/d29/d33/d60/d8c/dd3 1458037 1 2026-03-10T14:08:42.921 INFO:tasks.workunit.client.0.vm03.stdout:7/652: dread d5/d9/f19 [0,4194304] 0 2026-03-10T14:08:42.926 INFO:tasks.workunit.client.0.vm03.stdout:2/737: fsync d5/d10/d31/f4e 0 2026-03-10T14:08:42.929 INFO:tasks.workunit.client.0.vm03.stdout:2/738: dread d5/d10/d1f/d4f/d76/da7/d40/f68 [0,4194304] 0 2026-03-10T14:08:42.930 INFO:tasks.workunit.client.0.vm03.stdout:2/739: dread - d5/d10/d1f/d4f/d76/fca zero size 2026-03-10T14:08:42.931 INFO:tasks.workunit.client.0.vm03.stdout:4/782: link d5/d9/db/d19/d38/d53/f83 d5/d9/db/d19/f10b 0 2026-03-10T14:08:42.933 INFO:tasks.workunit.client.0.vm03.stdout:0/735: symlink d3/d11/d76/dbc/led 0 2026-03-10T14:08:42.933 INFO:tasks.workunit.client.0.vm03.stdout:0/736: readlink d3/lc7 0 2026-03-10T14:08:42.935 INFO:tasks.workunit.client.0.vm03.stdout:5/907: creat d4/d16/f118 x:0 0 0 2026-03-10T14:08:42.939 INFO:tasks.workunit.client.0.vm03.stdout:5/908: dwrite d4/d16/d19/d4a/f108 
[0,4194304] 0 2026-03-10T14:08:42.942 INFO:tasks.workunit.client.0.vm03.stdout:1/753: creat d0/d2/df/d27/ff4 x:0 0 0 2026-03-10T14:08:42.943 INFO:tasks.workunit.client.0.vm03.stdout:8/852: unlink da/c14 0 2026-03-10T14:08:42.944 INFO:tasks.workunit.client.0.vm03.stdout:9/746: truncate d2/d14/d2b/d43/f45 4732897 0 2026-03-10T14:08:42.945 INFO:tasks.workunit.client.0.vm03.stdout:7/653: symlink d5/d98/le5 0 2026-03-10T14:08:42.946 INFO:tasks.workunit.client.0.vm03.stdout:7/654: write d5/d9/d14/d21/fc5 [53305,89405] 0 2026-03-10T14:08:42.952 INFO:tasks.workunit.client.0.vm03.stdout:4/783: unlink d5/d9/db/d19/d99/ff1 0 2026-03-10T14:08:42.954 INFO:tasks.workunit.client.0.vm03.stdout:4/784: write d5/d9/db/d19/d38/d7b/daa/daf/fe1 [1016265,74312] 0 2026-03-10T14:08:42.958 INFO:tasks.workunit.client.0.vm03.stdout:6/756: creat d8/d11/da0/dbf/fed x:0 0 0 2026-03-10T14:08:42.962 INFO:tasks.workunit.client.0.vm03.stdout:1/754: read d0/d18/d3b/f53 [911571,111410] 0 2026-03-10T14:08:42.964 INFO:tasks.workunit.client.0.vm03.stdout:9/747: mkdir d2/d14/d2b/d79/d81/df3 0 2026-03-10T14:08:42.974 INFO:tasks.workunit.client.0.vm03.stdout:3/855: rename d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/f9d to d1d/d33/d47/d53/d68/dcf/de7/f10b 0 2026-03-10T14:08:42.975 INFO:tasks.workunit.client.0.vm03.stdout:6/757: unlink d8/d11/d7a/l89 0 2026-03-10T14:08:42.976 INFO:tasks.workunit.client.0.vm03.stdout:9/748: dwrite d2/d14/d2b/f2d [4194304,4194304] 0 2026-03-10T14:08:42.985 INFO:tasks.workunit.client.0.vm03.stdout:1/755: truncate d0/d2/df/d27/f58 3485156 0 2026-03-10T14:08:42.989 INFO:tasks.workunit.client.0.vm03.stdout:9/749: truncate d2/d14/d2b/d34/f7a 766507 0 2026-03-10T14:08:42.993 INFO:tasks.workunit.client.0.vm03.stdout:3/856: mkdir d1d/d33/d47/d10c 0 2026-03-10T14:08:42.995 INFO:tasks.workunit.client.0.vm03.stdout:3/857: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/db6/f10d x:0 0 0 2026-03-10T14:08:42.997 INFO:tasks.workunit.client.0.vm03.stdout:6/758: link d8/db/d49/d6c/f66 d8/d1b/d1c/fee 
0 2026-03-10T14:08:42.998 INFO:tasks.workunit.client.0.vm03.stdout:3/858: read - d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/fe3 zero size 2026-03-10T14:08:42.999 INFO:tasks.workunit.client.0.vm03.stdout:3/859: write d1d/d33/d47/d53/d68/dcf/de7/f2e [4706070,9768] 0 2026-03-10T14:08:43.001 INFO:tasks.workunit.client.0.vm03.stdout:3/860: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/dd5 180401175 1 2026-03-10T14:08:43.007 INFO:tasks.workunit.client.0.vm03.stdout:6/759: readlink d8/d11/da0/dbf/lc6 0 2026-03-10T14:08:43.011 INFO:tasks.workunit.client.0.vm03.stdout:1/756: getdents d0/d18/d3b/d50 0 2026-03-10T14:08:43.016 INFO:tasks.workunit.client.0.vm03.stdout:9/750: getdents d2/d29 0 2026-03-10T14:08:43.017 INFO:tasks.workunit.client.0.vm03.stdout:7/655: sync 2026-03-10T14:08:43.017 INFO:tasks.workunit.client.0.vm03.stdout:4/785: sync 2026-03-10T14:08:43.020 INFO:tasks.workunit.client.0.vm03.stdout:9/751: dwrite d2/d29/dcd/d72/d9d/fa0 [0,4194304] 0 2026-03-10T14:08:43.021 INFO:tasks.workunit.client.0.vm03.stdout:3/861: creat d1d/d33/d65/d5d/dae/db9/dec/f10e x:0 0 0 2026-03-10T14:08:43.022 INFO:tasks.workunit.client.0.vm03.stdout:6/760: fdatasync d8/d11/da0/dbf/fde 0 2026-03-10T14:08:43.026 INFO:tasks.workunit.client.0.vm03.stdout:6/761: read d8/db/d12/fa1 [6334309,106946] 0 2026-03-10T14:08:43.035 INFO:tasks.workunit.client.0.vm03.stdout:1/757: read - d0/d2/df/f76 zero size 2026-03-10T14:08:43.041 INFO:tasks.workunit.client.0.vm03.stdout:1/758: dread d0/d2/df/d27/faf [0,4194304] 0 2026-03-10T14:08:43.048 INFO:tasks.workunit.client.0.vm03.stdout:7/656: mknod d5/d9/d14/d26/d36/d51/d7b/d9c/ce6 0 2026-03-10T14:08:43.048 INFO:tasks.workunit.client.0.vm03.stdout:7/657: stat d5/l62 0 2026-03-10T14:08:43.051 INFO:tasks.workunit.client.0.vm03.stdout:9/752: fsync d2/d29/d38/fbf 0 2026-03-10T14:08:43.056 INFO:tasks.workunit.client.0.vm03.stdout:3/862: creat d1d/d39/da1/de0/f10f x:0 0 0 2026-03-10T14:08:43.056 INFO:tasks.workunit.client.0.vm03.stdout:1/759: sync 
2026-03-10T14:08:43.070 INFO:tasks.workunit.client.0.vm03.stdout:7/658: rename d5/d9/d35/dab to d5/d9/d14/d26/d5f/de7 0 2026-03-10T14:08:43.091 INFO:tasks.workunit.client.0.vm03.stdout:3/863: symlink d1d/d33/d65/l110 0 2026-03-10T14:08:43.092 INFO:tasks.workunit.client.0.vm03.stdout:3/864: write d1d/d33/d47/d53/d68/dcf/de7/f2e [2131739,63905] 0 2026-03-10T14:08:43.109 INFO:tasks.workunit.client.0.vm03.stdout:2/740: dwrite d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 [0,4194304] 0 2026-03-10T14:08:43.114 INFO:tasks.workunit.client.0.vm03.stdout:1/760: dread d0/d2/df/dab/f5f [0,4194304] 0 2026-03-10T14:08:43.123 INFO:tasks.workunit.client.0.vm03.stdout:0/737: dwrite d3/f9 [4194304,4194304] 0 2026-03-10T14:08:43.126 INFO:tasks.workunit.client.0.vm03.stdout:7/659: creat d5/d9/d14/d26/d5f/d89/fe8 x:0 0 0 2026-03-10T14:08:43.127 INFO:tasks.workunit.client.0.vm03.stdout:6/762: dread d8/d11/da0/dbf/d8c/f8a [0,4194304] 0 2026-03-10T14:08:43.132 INFO:tasks.workunit.client.0.vm03.stdout:5/909: dwrite d4/d13/d1f/fb1 [0,4194304] 0 2026-03-10T14:08:43.134 INFO:tasks.workunit.client.0.vm03.stdout:5/910: write d4/d40/f96 [3672255,96856] 0 2026-03-10T14:08:43.142 INFO:tasks.workunit.client.0.vm03.stdout:8/853: dwrite da/f31 [0,4194304] 0 2026-03-10T14:08:43.147 INFO:tasks.workunit.client.0.vm03.stdout:3/865: rmdir d1d/d33/d47/d53/d68/dcf 39 2026-03-10T14:08:43.152 INFO:tasks.workunit.client.0.vm03.stdout:2/741: mknod d5/d10/d1f/d4f/d76/cf3 0 2026-03-10T14:08:43.156 INFO:tasks.workunit.client.0.vm03.stdout:1/761: mknod d0/d2/db6/cf5 0 2026-03-10T14:08:43.156 INFO:tasks.workunit.client.0.vm03.stdout:8/854: sync 2026-03-10T14:08:43.168 INFO:tasks.workunit.client.0.vm03.stdout:9/753: rename d2/d14/d2b/c2e to d2/d29/d33/d60/d8c/dd3/cf4 0 2026-03-10T14:08:43.171 INFO:tasks.workunit.client.0.vm03.stdout:9/754: dwrite d2/fdb [0,4194304] 0 2026-03-10T14:08:43.177 INFO:tasks.workunit.client.0.vm03.stdout:0/738: mknod d3/d16/cee 0 2026-03-10T14:08:43.182 INFO:tasks.workunit.client.0.vm03.stdout:8/855: 
dread da/d58/d5f/dbc/d70/d7d/fda [0,4194304] 0 2026-03-10T14:08:43.184 INFO:tasks.workunit.client.0.vm03.stdout:6/763: creat d8/db/dd0/fef x:0 0 0 2026-03-10T14:08:43.188 INFO:tasks.workunit.client.0.vm03.stdout:6/764: dwrite d8/db/d49/d6c/d32/fc3 [0,4194304] 0 2026-03-10T14:08:43.195 INFO:tasks.workunit.client.0.vm03.stdout:3/866: creat d1d/d39/da1/de0/f111 x:0 0 0 2026-03-10T14:08:43.214 INFO:tasks.workunit.client.0.vm03.stdout:7/660: creat d5/d9/d14/d26/d5f/de2/fe9 x:0 0 0 2026-03-10T14:08:43.215 INFO:tasks.workunit.client.0.vm03.stdout:7/661: write d5/d9/d14/d26/d5f/d89/fbd [1943967,12421] 0 2026-03-10T14:08:43.215 INFO:tasks.workunit.client.0.vm03.stdout:7/662: chown d5/d9/d14/d26/d36/d51/d7b/d83/lbe 470816 1 2026-03-10T14:08:43.216 INFO:tasks.workunit.client.0.vm03.stdout:7/663: read - d5/d9/d14/d26/d36/d51/d7b/d9c/fc3 zero size 2026-03-10T14:08:43.228 INFO:tasks.workunit.client.0.vm03.stdout:4/786: dwrite d5/d9/db/f2a [0,4194304] 0 2026-03-10T14:08:43.231 INFO:tasks.workunit.client.0.vm03.stdout:6/765: symlink d8/db/d49/lf0 0 2026-03-10T14:08:43.250 INFO:tasks.workunit.client.0.vm03.stdout:5/911: dread d4/d40/d4e/f80 [0,4194304] 0 2026-03-10T14:08:43.267 INFO:tasks.workunit.client.0.vm03.stdout:4/787: readlink d5/d6e/lb0 0 2026-03-10T14:08:43.271 INFO:tasks.workunit.client.0.vm03.stdout:6/766: unlink d8/db/d12/d64/fba 0 2026-03-10T14:08:43.272 INFO:tasks.workunit.client.0.vm03.stdout:6/767: write d8/db/d12/fa1 [2124814,129556] 0 2026-03-10T14:08:43.288 INFO:tasks.workunit.client.0.vm03.stdout:3/867: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e/fc3 30226 1 2026-03-10T14:08:43.314 INFO:tasks.workunit.client.0.vm03.stdout:8/856: dwrite da/d3c/d51/d85/d104/fc1 [0,4194304] 0 2026-03-10T14:08:43.318 INFO:tasks.workunit.client.0.vm03.stdout:4/788: rmdir d5/d9/db/da7/dcc/d100 39 2026-03-10T14:08:43.318 INFO:tasks.workunit.client.0.vm03.stdout:8/857: chown da/d3a/d44/dfb/dd7/fdc 27236508 1 2026-03-10T14:08:43.319 INFO:tasks.workunit.client.0.vm03.stdout:8/858: 
write da/d24/d49/f66 [1919051,44776] 0 2026-03-10T14:08:43.322 INFO:tasks.workunit.client.0.vm03.stdout:6/768: mkdir d8/d11/d18/d79/d80/df1 0 2026-03-10T14:08:43.338 INFO:tasks.workunit.client.0.vm03.stdout:9/755: truncate d2/d29/d33/d41/daa/fab 425214 0 2026-03-10T14:08:43.354 INFO:tasks.workunit.client.0.vm03.stdout:0/739: truncate d3/d17/fb1 3149921 0 2026-03-10T14:08:43.354 INFO:tasks.workunit.client.0.vm03.stdout:2/742: link d5/f23 d5/d10/d1f/d4f/dbe/ff4 0 2026-03-10T14:08:43.355 INFO:tasks.workunit.client.0.vm03.stdout:1/762: creat d0/d42/ff6 x:0 0 0 2026-03-10T14:08:43.379 INFO:tasks.workunit.client.0.vm03.stdout:8/859: symlink da/d3a/d44/dfb/dd7/l111 0 2026-03-10T14:08:43.379 INFO:tasks.workunit.client.0.vm03.stdout:8/860: readlink da/d24/l2c 0 2026-03-10T14:08:43.387 INFO:tasks.workunit.client.0.vm03.stdout:6/769: unlink d8/db/d49/fe1 0 2026-03-10T14:08:43.388 INFO:tasks.workunit.client.0.vm03.stdout:6/770: write d8/d11/d7a/fa4 [5168783,92830] 0 2026-03-10T14:08:43.393 INFO:tasks.workunit.client.0.vm03.stdout:9/756: symlink d2/d29/d33/d60/d7f/lf5 0 2026-03-10T14:08:43.440 INFO:tasks.workunit.client.0.vm03.stdout:3/868: dwrite d1d/d33/d65/d5d/dae/db9/fd8 [0,4194304] 0 2026-03-10T14:08:43.465 INFO:tasks.workunit.client.0.vm03.stdout:5/912: dwrite d4/d13/d1f/d8c/f9c [0,4194304] 0 2026-03-10T14:08:43.515 INFO:tasks.workunit.client.0.vm03.stdout:0/740: mknod d3/d4d/d8f/cef 0 2026-03-10T14:08:43.515 INFO:tasks.workunit.client.0.vm03.stdout:1/763: chown d0/d2/f14 27270 1 2026-03-10T14:08:43.521 INFO:tasks.workunit.client.0.vm03.stdout:1/764: dwrite d0/d42/f7f [4194304,4194304] 0 2026-03-10T14:08:43.528 INFO:tasks.workunit.client.0.vm03.stdout:7/664: rename d5/d9/f30 to d5/d9/d14/d21/fea 0 2026-03-10T14:08:43.529 INFO:tasks.workunit.client.0.vm03.stdout:8/861: fsync da/d58/d5f/dbc/d70/d7d/fda 0 2026-03-10T14:08:43.531 INFO:tasks.workunit.client.0.vm03.stdout:7/665: chown d5/d9/d14/d26/d39/f45 461359409 1 2026-03-10T14:08:43.531 
INFO:tasks.workunit.client.0.vm03.stdout:6/771: unlink d8/db/d12/f5d 0 2026-03-10T14:08:43.536 INFO:tasks.workunit.client.0.vm03.stdout:3/869: unlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/fdc 0 2026-03-10T14:08:43.537 INFO:tasks.workunit.client.0.vm03.stdout:0/741: mkdir d3/d16/d21/df0 0 2026-03-10T14:08:43.537 INFO:tasks.workunit.client.0.vm03.stdout:3/870: readlink d1d/d33/d65/d48/l6f 0 2026-03-10T14:08:43.541 INFO:tasks.workunit.client.0.vm03.stdout:9/757: dread d2/d14/f96 [0,4194304] 0 2026-03-10T14:08:43.541 INFO:tasks.workunit.client.0.vm03.stdout:7/666: rmdir d5/d9/d14/d26/d5f/de2 39 2026-03-10T14:08:43.544 INFO:tasks.workunit.client.0.vm03.stdout:8/862: dread da/d3c/d4b/f63 [0,4194304] 0 2026-03-10T14:08:43.544 INFO:tasks.workunit.client.0.vm03.stdout:8/863: chown da/d3a/d44/dfb/dd7 25454 1 2026-03-10T14:08:43.547 INFO:tasks.workunit.client.0.vm03.stdout:9/758: unlink d2/d14/d2b/d34/l4f 0 2026-03-10T14:08:43.547 INFO:tasks.workunit.client.0.vm03.stdout:8/864: write da/f33 [1485158,95258] 0 2026-03-10T14:08:43.549 INFO:tasks.workunit.client.0.vm03.stdout:4/789: link d5/d9/db/ff d5/f10c 0 2026-03-10T14:08:43.556 INFO:tasks.workunit.client.0.vm03.stdout:2/743: getdents d5/d10/d31 0 2026-03-10T14:08:43.559 INFO:tasks.workunit.client.0.vm03.stdout:7/667: dread d5/d9/d14/d26/d36/de1/f4e [0,4194304] 0 2026-03-10T14:08:43.564 INFO:tasks.workunit.client.0.vm03.stdout:2/744: dread d5/d10/d1f/f5e [0,4194304] 0 2026-03-10T14:08:43.568 INFO:tasks.workunit.client.0.vm03.stdout:3/871: rename d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/fbe to d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e/f112 0 2026-03-10T14:08:43.569 INFO:tasks.workunit.client.0.vm03.stdout:2/745: stat d5/d10/d1f/d4f/d76/da7/d40/d59/f71 0 2026-03-10T14:08:43.572 INFO:tasks.workunit.client.0.vm03.stdout:5/913: link d4/d13/d8f/lae d4/d16/df4/l119 0 2026-03-10T14:08:43.572 INFO:tasks.workunit.client.0.vm03.stdout:6/772: creat d8/db/d49/d76/ff2 x:0 0 0 2026-03-10T14:08:43.576 
INFO:tasks.workunit.client.0.vm03.stdout:6/773: dread - d8/d11/da0/dbf/d5c/d60/f92 zero size 2026-03-10T14:08:43.587 INFO:tasks.workunit.client.0.vm03.stdout:7/668: dread d5/d9/f1f [0,4194304] 0 2026-03-10T14:08:43.588 INFO:tasks.workunit.client.0.vm03.stdout:6/774: dwrite d8/fd [0,4194304] 0 2026-03-10T14:08:43.600 INFO:tasks.workunit.client.0.vm03.stdout:3/872: sync 2026-03-10T14:08:43.615 INFO:tasks.workunit.client.0.vm03.stdout:5/914: mknod d4/d40/d4e/c11a 0 2026-03-10T14:08:43.646 INFO:tasks.workunit.client.0.vm03.stdout:8/865: link da/f62 da/d58/d5f/dbc/f112 0 2026-03-10T14:08:43.677 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:43 vm03.local ceph-mon[49718]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:08:43.677 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:43 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:43.677 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:43 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:43.677 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:43 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:43.677 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:43 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:43.677 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:43 vm03.local ceph-mon[49718]: Reconfiguring prometheus.vm03 (dependencies changed)... 
2026-03-10T14:08:43.678 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:43 vm03.local ceph-mon[49718]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-10T14:08:43.694 INFO:tasks.workunit.client.0.vm03.stdout:7/669: creat d5/d9/d14/d26/d36/db4/feb x:0 0 0 2026-03-10T14:08:43.723 INFO:tasks.workunit.client.0.vm03.stdout:5/915: chown d4/d6/c45 238 1 2026-03-10T14:08:43.742 INFO:tasks.workunit.client.0.vm03.stdout:1/765: write d0/d2/df/d16/d20/f5a [4881303,79708] 0 2026-03-10T14:08:43.742 INFO:tasks.workunit.client.0.vm03.stdout:1/766: stat d0/d18/d1d/dc4 0 2026-03-10T14:08:43.760 INFO:tasks.workunit.client.0.vm03.stdout:7/670: creat d5/d9/fec x:0 0 0 2026-03-10T14:08:43.760 INFO:tasks.workunit.client.0.vm03.stdout:4/790: dwrite d5/d47/d62/d8a/dd0/d102/f94 [0,4194304] 0 2026-03-10T14:08:43.763 INFO:tasks.workunit.client.0.vm03.stdout:9/759: truncate d2/d14/f30 870884 0 2026-03-10T14:08:43.765 INFO:tasks.workunit.client.0.vm03.stdout:2/746: write d5/d10/d1f/d4f/d76/da7/d40/d59/faa [111149,16649] 0 2026-03-10T14:08:43.771 INFO:tasks.workunit.client.0.vm03.stdout:7/671: rename d5/d9/d14/d26/c61 to d5/d9/d14/d26/d36/d51/d7b/ced 0 2026-03-10T14:08:43.772 INFO:tasks.workunit.client.0.vm03.stdout:7/672: write d5/d9/d14/d26/d5f/f6b [452129,83704] 0 2026-03-10T14:08:43.776 INFO:tasks.workunit.client.0.vm03.stdout:9/760: mkdir d2/d29/d33/d60/d8c/dd3/df6 0 2026-03-10T14:08:43.777 INFO:tasks.workunit.client.0.vm03.stdout:4/791: creat d5/d47/d62/d8a/dd0/d102/f10d x:0 0 0 2026-03-10T14:08:43.781 INFO:tasks.workunit.client.0.vm03.stdout:2/747: mkdir d5/dac/df5 0 2026-03-10T14:08:43.795 INFO:tasks.workunit.client.0.vm03.stdout:2/748: dread - d5/db4/d74/d83/fc4 zero size 2026-03-10T14:08:43.805 INFO:tasks.workunit.client.0.vm03.stdout:2/749: unlink d5/d10/d1f/d4f/d76/da7/d54/l64 0 2026-03-10T14:08:43.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:43 vm04.local ceph-mon[55966]: Updating 
vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:08:43.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:43 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:43.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:43 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:43.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:43 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:43.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:43 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:43.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:43 vm04.local ceph-mon[55966]: Reconfiguring prometheus.vm03 (dependencies changed)... 2026-03-10T14:08:43.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:43 vm04.local ceph-mon[55966]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-10T14:08:43.821 INFO:tasks.workunit.client.0.vm03.stdout:9/761: getdents d2/d29/d33/d41/d95 0 2026-03-10T14:08:43.822 INFO:tasks.workunit.client.0.vm03.stdout:9/762: chown d2/d29/d33/d6d/d87 0 1 2026-03-10T14:08:43.827 INFO:tasks.workunit.client.0.vm03.stdout:9/763: dread - d2/d14/d2b/d79/fd1 zero size 2026-03-10T14:08:43.849 INFO:tasks.workunit.client.0.vm03.stdout:9/764: dread d2/d29/d33/d41/f6e [0,4194304] 0 2026-03-10T14:08:43.861 INFO:tasks.workunit.client.0.vm03.stdout:5/916: getdents d4/d6/de 0 2026-03-10T14:08:43.892 INFO:tasks.workunit.client.0.vm03.stdout:9/765: link d2/d29/d38/f4b d2/d29/d33/d60/d8c/de1/ff7 0 2026-03-10T14:08:43.903 INFO:tasks.workunit.client.0.vm03.stdout:5/917: dread d4/d16/d19/d4a/fb2 [0,4194304] 0 2026-03-10T14:08:43.906 INFO:tasks.workunit.client.0.vm03.stdout:5/918: chown d4/d6/de/d10e/ced 454897151 1 2026-03-10T14:08:43.915 INFO:tasks.workunit.client.0.vm03.stdout:5/919: creat d4/d6/de/d10e/f11b x:0 0 0 
2026-03-10T14:08:43.916 INFO:tasks.workunit.client.0.vm03.stdout:5/920: chown d4/d6/ddc/ffc 153 1
2026-03-10T14:08:43.916 INFO:tasks.workunit.client.0.vm03.stdout:5/921: write d4/d13/d43/f72 [40329,44567] 0
2026-03-10T14:08:43.920 INFO:tasks.workunit.client.0.vm03.stdout:5/922: creat d4/d16/df4/f11c x:0 0 0
2026-03-10T14:08:43.923 INFO:tasks.workunit.client.0.vm03.stdout:7/673: dread d5/d9/d14/d26/d39/f45 [0,4194304] 0
2026-03-10T14:08:43.929 INFO:tasks.workunit.client.0.vm03.stdout:7/674: creat d5/d9/d14/d21/d9a/fee x:0 0 0
2026-03-10T14:08:43.954 INFO:tasks.workunit.client.0.vm03.stdout:5/923: sync
2026-03-10T14:08:43.954 INFO:tasks.workunit.client.0.vm03.stdout:7/675: sync
2026-03-10T14:08:43.956 INFO:tasks.workunit.client.0.vm03.stdout:7/676: rmdir d5/d9/d14/d26/d5f 39
2026-03-10T14:08:43.957 INFO:tasks.workunit.client.0.vm03.stdout:5/924: dread - d4/d13/d43/fcf zero size
2026-03-10T14:08:43.962 INFO:tasks.workunit.client.0.vm03.stdout:6/775: dwrite d8/d11/da0/dbf/d8c/f8a [4194304,4194304] 0
2026-03-10T14:08:43.964 INFO:tasks.workunit.client.0.vm03.stdout:6/776: mknod d8/d11/da0/dbf/d8c/db6/cf3 0
2026-03-10T14:08:43.965 INFO:tasks.workunit.client.0.vm03.stdout:6/777: fdatasync d8/d11/da0/dbf/d5c/f8d 0
2026-03-10T14:08:43.972 INFO:tasks.workunit.client.0.vm03.stdout:8/866: write f7 [1652450,100030] 0
2026-03-10T14:08:43.978 INFO:tasks.workunit.client.0.vm03.stdout:6/778: link d8/d11/da0/dbf/d8c/lc1 d8/db/d49/d6c/d83/da3/lf4 0
2026-03-10T14:08:43.986 INFO:tasks.workunit.client.0.vm03.stdout:6/779: truncate d8/d1b/f29 1986727 0
2026-03-10T14:08:43.990 INFO:tasks.workunit.client.0.vm03.stdout:6/780: dwrite d8/d11/da0/dbf/d8c/f8a [4194304,4194304] 0
2026-03-10T14:08:43.991 INFO:tasks.workunit.client.0.vm03.stdout:6/781: chown d8/db/df 1742 1
2026-03-10T14:08:43.991 INFO:tasks.workunit.client.0.vm03.stdout:6/782: write d8/db/d49/d76/fc4 [1674620,84006] 0
2026-03-10T14:08:43.993 INFO:tasks.workunit.client.0.vm03.stdout:6/783: write d8/db/d49/d76/fc4 [1620614,15463] 0
2026-03-10T14:08:44.008 INFO:tasks.workunit.client.0.vm03.stdout:6/784: fsync d8/d1b/f61 0
2026-03-10T14:08:44.012 INFO:tasks.workunit.client.0.vm03.stdout:6/785: mknod d8/d11/d7a/cf5 0
2026-03-10T14:08:44.016 INFO:tasks.workunit.client.0.vm03.stdout:6/786: dread d8/db/f3c [0,4194304] 0
2026-03-10T14:08:44.017 INFO:tasks.workunit.client.0.vm03.stdout:6/787: read d8/db/df/f10 [5253913,125914] 0
2026-03-10T14:08:44.017 INFO:tasks.workunit.client.0.vm03.stdout:6/788: dread - d8/db/d49/d76/ff2 zero size
2026-03-10T14:08:44.021 INFO:tasks.workunit.client.0.vm03.stdout:6/789: readlink d8/db/d49/d58/ld4 0
2026-03-10T14:08:44.021 INFO:tasks.workunit.client.0.vm03.stdout:6/790: chown d8/d1b/d1c/c30 101848392 1
2026-03-10T14:08:44.023 INFO:tasks.workunit.client.0.vm03.stdout:6/791: truncate d8/d11/f35 2262321 0
2026-03-10T14:08:44.030 INFO:tasks.workunit.client.0.vm03.stdout:8/867: dread da/d3a/d44/f46 [0,4194304] 0
2026-03-10T14:08:44.035 INFO:tasks.workunit.client.0.vm03.stdout:8/868: dwrite da/d3c/d51/d85/f10c [0,4194304] 0
2026-03-10T14:08:44.041 INFO:tasks.workunit.client.0.vm03.stdout:8/869: rmdir da/d3a/d44/dfb/dd7/de7 0
2026-03-10T14:08:44.057 INFO:tasks.workunit.client.0.vm03.stdout:0/742: write d3/d16/d21/f50 [114482,21552] 0
2026-03-10T14:08:44.058 INFO:tasks.workunit.client.0.vm03.stdout:8/870: getdents da/d3c/d4b/dd2 0
2026-03-10T14:08:44.063 INFO:tasks.workunit.client.0.vm03.stdout:8/871: dread da/d3c/d51/d85/fcb [0,4194304] 0
2026-03-10T14:08:44.070 INFO:tasks.workunit.client.0.vm03.stdout:0/743: rmdir d3/d46/d5e 39
2026-03-10T14:08:44.071 INFO:tasks.workunit.client.0.vm03.stdout:0/744: write d3/d17/f35 [1953890,64729] 0
2026-03-10T14:08:44.082 INFO:tasks.workunit.client.0.vm03.stdout:0/745: fsync d3/d46/f63 0
2026-03-10T14:08:44.121 INFO:tasks.workunit.client.0.vm03.stdout:0/746: link d3/d16/d21/fb8 d3/d11/d76/db5/ddb/ff1 0
2026-03-10T14:08:44.127 INFO:tasks.workunit.client.0.vm03.stdout:0/747: mknod d3/d11/d2c/d4a/d4b/d89/db9/cf2 0
2026-03-10T14:08:44.138 INFO:tasks.workunit.client.0.vm03.stdout:2/750: write d5/d10/d1f/d4f/d76/da7/d40/f68 [748559,83067] 0
2026-03-10T14:08:44.139 INFO:tasks.workunit.client.0.vm03.stdout:2/751: write d5/d10/d1f/d4f/d76/da7/f7e [5302671,117598] 0
2026-03-10T14:08:44.139 INFO:tasks.workunit.client.0.vm03.stdout:2/752: fsync d5/d2a/f34 0
2026-03-10T14:08:44.140 INFO:tasks.workunit.client.0.vm03.stdout:2/753: fdatasync d5/d10/d31/fa9 0
2026-03-10T14:08:44.140 INFO:tasks.workunit.client.0.vm03.stdout:2/754: chown d5/cd4 32283478 1
2026-03-10T14:08:44.144 INFO:tasks.workunit.client.0.vm03.stdout:0/748: dread d3/d46/dac/d79/f8d [0,4194304] 0
2026-03-10T14:08:44.157 INFO:tasks.workunit.client.0.vm03.stdout:1/767: mknod d0/d2/df/cf7 0
2026-03-10T14:08:44.159 INFO:tasks.workunit.client.0.vm03.stdout:3/873: creat d1d/d33/d65/f113 x:0 0 0
2026-03-10T14:08:44.164 INFO:tasks.workunit.client.0.vm03.stdout:3/874: mknod d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/c114 0
2026-03-10T14:08:44.169 INFO:tasks.workunit.client.0.vm03.stdout:9/766: write d2/d14/f39 [3455578,75098] 0
2026-03-10T14:08:44.170 INFO:tasks.workunit.client.0.vm03.stdout:3/875: dread d1d/d33/d47/d53/d68/dcf/de7/faf [0,4194304] 0
2026-03-10T14:08:44.176 INFO:tasks.workunit.client.0.vm03.stdout:3/876: dread d1d/d33/d47/d53/d68/dcf/de7/f6b [0,4194304] 0
2026-03-10T14:08:44.176 INFO:tasks.workunit.client.0.vm03.stdout:7/677: dwrite d5/d9/f22 [0,4194304] 0
2026-03-10T14:08:44.186 INFO:tasks.workunit.client.0.vm03.stdout:3/877: dread d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/fb5 [0,4194304] 0
2026-03-10T14:08:44.187 INFO:tasks.workunit.client.0.vm03.stdout:3/878: stat d1d/d33/d47/d53/d68/dcf/de7/f87 0
2026-03-10T14:08:44.189 INFO:tasks.workunit.client.0.vm03.stdout:3/879: truncate d1d/d33/d47/d53/d68/dcf/de7/f9b 4231355 0
2026-03-10T14:08:44.193 INFO:tasks.workunit.client.0.vm03.stdout:7/678: chown d5/d9/l99 1169414 1
2026-03-10T14:08:44.193 INFO:tasks.workunit.client.0.vm03.stdout:7/679: write d5/d9/d14/d26/d36/d51/d7b/fc9 [501117,123820] 0
2026-03-10T14:08:44.197 INFO:tasks.workunit.client.0.vm03.stdout:3/880: unlink d1d/d33/d65/d5d/dae/db9/fd8 0
2026-03-10T14:08:44.207 INFO:tasks.workunit.client.0.vm03.stdout:4/792: rename f2 to d5/d9/f10e 0
2026-03-10T14:08:44.208 INFO:tasks.workunit.client.0.vm03.stdout:3/881: mkdir d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/db6/d115 0
2026-03-10T14:08:44.209 INFO:tasks.workunit.client.0.vm03.stdout:3/882: dread d1d/d33/f5e [0,4194304] 0
2026-03-10T14:08:44.210 INFO:tasks.workunit.client.0.vm03.stdout:7/680: creat d5/d9/d14/d26/d5f/de7/fef x:0 0 0
2026-03-10T14:08:44.211 INFO:tasks.workunit.client.0.vm03.stdout:7/681: truncate d5/d9/d14/d26/d36/db4/fda 30679 0
2026-03-10T14:08:44.212 INFO:tasks.workunit.client.0.vm03.stdout:7/682: chown d5/d9/da7/lc2 0 1
2026-03-10T14:08:44.216 INFO:tasks.workunit.client.0.vm03.stdout:4/793: dread - d5/d9/fd6 zero size
2026-03-10T14:08:44.221 INFO:tasks.workunit.client.0.vm03.stdout:5/925: rename d4/d16/d19/d6e/da3 to d4/d13/d8f/d11d 0
2026-03-10T14:08:44.224 INFO:tasks.workunit.client.0.vm03.stdout:7/683: dread d5/d9/d14/d26/d5f/f6b [0,4194304] 0
2026-03-10T14:08:44.239 INFO:tasks.workunit.client.0.vm03.stdout:6/792: dwrite d8/d11/da0/dbf/d5c/d60/f82 [4194304,4194304] 0
2026-03-10T14:08:44.243 INFO:tasks.workunit.client.0.vm03.stdout:6/793: dwrite f3 [4194304,4194304] 0
2026-03-10T14:08:44.244 INFO:tasks.workunit.client.0.vm03.stdout:6/794: write d8/d11/da0/dbf/d5c/d60/f82 [599424,81933] 0
2026-03-10T14:08:44.244 INFO:tasks.workunit.client.0.vm03.stdout:6/795: fsync d8/db/d12/f72 0
2026-03-10T14:08:44.247 INFO:tasks.workunit.client.0.vm03.stdout:3/883: mkdir d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/db6/d116 0
2026-03-10T14:08:44.254 INFO:tasks.workunit.client.0.vm03.stdout:3/884: dwrite d1d/d39/da1/de0/f111 [0,4194304] 0
2026-03-10T14:08:44.268 INFO:tasks.workunit.client.0.vm03.stdout:5/926: sync
2026-03-10T14:08:44.276 INFO:tasks.workunit.client.0.vm03.stdout:7/684: dread d5/d9/d14/d26/d36/d51/d7b/d83/f9e [0,4194304] 0
2026-03-10T14:08:44.278 INFO:tasks.workunit.client.0.vm03.stdout:8/872: truncate da/d24/d49/f66 6211184 0
2026-03-10T14:08:44.278 INFO:tasks.workunit.client.0.vm03.stdout:8/873: chown da/d3c/d51/caf 964295 1
2026-03-10T14:08:44.283 INFO:tasks.workunit.client.0.vm03.stdout:3/885: fdatasync d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f81 0
2026-03-10T14:08:44.287 INFO:tasks.workunit.client.0.vm03.stdout:9/767: rename d2/c1f to d2/d14/d2b/d79/d81/cf8 0
2026-03-10T14:08:44.287 INFO:tasks.workunit.client.0.vm03.stdout:3/886: rename d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83 to d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d117 22
2026-03-10T14:08:44.291 INFO:tasks.workunit.client.0.vm03.stdout:5/927: mkdir d4/d16/d19/d23/d11e 0
2026-03-10T14:08:44.293 INFO:tasks.workunit.client.0.vm03.stdout:2/755: write f4 [2473027,39179] 0
2026-03-10T14:08:44.300 INFO:tasks.workunit.client.0.vm03.stdout:0/749: dwrite d3/d11/d76/dd3/fe3 [0,4194304] 0
2026-03-10T14:08:44.305 INFO:tasks.workunit.client.0.vm03.stdout:0/750: dread d3/d4d/f22 [0,4194304] 0
2026-03-10T14:08:44.311 INFO:tasks.workunit.client.0.vm03.stdout:9/768: creat d2/d29/d33/d41/d46/ff9 x:0 0 0
2026-03-10T14:08:44.315 INFO:tasks.workunit.client.0.vm03.stdout:1/768: dwrite d0/d2/df/d27/d7e/d81/f86 [0,4194304] 0
2026-03-10T14:08:44.322 INFO:tasks.workunit.client.0.vm03.stdout:8/874: mknod da/d3c/d51/d85/d104/dd1/c113 0
2026-03-10T14:08:44.327 INFO:tasks.workunit.client.0.vm03.stdout:0/751: symlink d3/d16/lf3 0
2026-03-10T14:08:44.333 INFO:tasks.workunit.client.0.vm03.stdout:9/769: symlink d2/d29/d33/d60/lfa 0
2026-03-10T14:08:44.337 INFO:tasks.workunit.client.0.vm03.stdout:9/770: write d2/d29/dcd/d72/d9d/fa0 [3234036,96538] 0
2026-03-10T14:08:44.341 INFO:tasks.workunit.client.0.vm03.stdout:5/928: symlink d4/d40/l11f 0
2026-03-10T14:08:44.342 INFO:tasks.workunit.client.0.vm03.stdout:2/756: mkdir d5/d10/da3/df6 0
2026-03-10T14:08:44.349 INFO:tasks.workunit.client.0.vm03.stdout:9/771: fdatasync d2/d29/d38/fbf 0
2026-03-10T14:08:44.350 INFO:tasks.workunit.client.0.vm03.stdout:9/772: write d2/d29/d33/d41/d95/fae [4079799,94831] 0
2026-03-10T14:08:44.354 INFO:tasks.workunit.client.0.vm03.stdout:3/887: rmdir d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/db6/d116 0
2026-03-10T14:08:44.356 INFO:tasks.workunit.client.0.vm03.stdout:4/794: write d5/d47/d5b/d64/f78 [1007914,44270] 0
2026-03-10T14:08:44.380 INFO:tasks.workunit.client.0.vm03.stdout:1/769: link d0/d2/f9 d0/d2/df/d91/ff8 0
2026-03-10T14:08:44.384 INFO:tasks.workunit.client.0.vm03.stdout:6/796: write d8/d1b/f31 [3090709,38736] 0
2026-03-10T14:08:44.385 INFO:tasks.workunit.client.0.vm03.stdout:7/685: write d5/d9/d14/f41 [1733066,45876] 0
2026-03-10T14:08:44.388 INFO:tasks.workunit.client.0.vm03.stdout:6/797: dwrite d8/db/d49/d76/fc4 [4194304,4194304] 0
2026-03-10T14:08:44.388 INFO:tasks.workunit.client.0.vm03.stdout:6/798: stat d8/d11/d18/f21 0
2026-03-10T14:08:44.408 INFO:tasks.workunit.client.0.vm03.stdout:8/875: dwrite da/d3c/d51/d75/dc2/f7b [0,4194304] 0
2026-03-10T14:08:44.442 INFO:tasks.workunit.client.0.vm03.stdout:5/929: mknod d4/d40/d4e/d115/c120 0
2026-03-10T14:08:44.443 INFO:tasks.workunit.client.0.vm03.stdout:0/752: write d3/d46/f55 [695349,79014] 0
2026-03-10T14:08:44.456 INFO:tasks.workunit.client.0.vm03.stdout:7/686: creat d5/d9/d14/d26/d36/de1/dcc/ff0 x:0 0 0
2026-03-10T14:08:44.461 INFO:tasks.workunit.client.0.vm03.stdout:8/876: rename da/d58/d5f to da/d3c/d4b/d4c/dba/d114 0
2026-03-10T14:08:44.464 INFO:tasks.workunit.client.0.vm03.stdout:3/888: unlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/da6/db7/l103 0
2026-03-10T14:08:44.468 INFO:tasks.workunit.client.0.vm03.stdout:3/889: dwrite d1d/d33/d47/d53/d68/dcf/de7/f2e [0,4194304] 0
2026-03-10T14:08:44.470 INFO:tasks.workunit.client.0.vm03.stdout:3/890: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e/l8d 17 1
2026-03-10T14:08:44.478 INFO:tasks.workunit.client.0.vm03.stdout:4/795: symlink d5/d9/db/da7/dcc/d100/l10f 0
2026-03-10T14:08:44.478 INFO:tasks.workunit.client.0.vm03.stdout:4/796: stat d5/d9/db/d19/d34/f58 0
2026-03-10T14:08:44.483 INFO:tasks.workunit.client.0.vm03.stdout:5/930: chown d4/d6/de/d10e/f54 26420345 1
2026-03-10T14:08:44.486 INFO:tasks.workunit.client.0.vm03.stdout:5/931: dwrite d4/d6/de/fe7 [0,4194304] 0
2026-03-10T14:08:44.497 INFO:tasks.workunit.client.0.vm03.stdout:6/799: mkdir d8/d11/d18/d79/d80/df1/df6 0
2026-03-10T14:08:44.497 INFO:tasks.workunit.client.0.vm03.stdout:6/800: chown d8/d11 123 1
2026-03-10T14:08:44.501 INFO:tasks.workunit.client.0.vm03.stdout:6/801: dwrite d8/db/d49/d76/fc4 [4194304,4194304] 0
2026-03-10T14:08:44.508 INFO:tasks.workunit.client.0.vm03.stdout:2/757: creat d5/d10/da3/ff7 x:0 0 0
2026-03-10T14:08:44.508 INFO:tasks.workunit.client.0.vm03.stdout:2/758: chown d5/db4/d74/d83/lb8 2873864 1
2026-03-10T14:08:44.516 INFO:tasks.workunit.client.0.vm03.stdout:0/753: rename d3/d11/d2c/d4a/d7b to d3/d11/d2c/d4a/d4b/d89/db6/df4 0
2026-03-10T14:08:44.517 INFO:tasks.workunit.client.0.vm03.stdout:8/877: creat da/d58/f115 x:0 0 0
2026-03-10T14:08:44.522 INFO:tasks.workunit.client.0.vm03.stdout:3/891: dread d1d/d33/d65/d48/f58 [0,4194304] 0
2026-03-10T14:08:44.524 INFO:tasks.workunit.client.0.vm03.stdout:5/932: rmdir d4/d40/dbc/d105 39
2026-03-10T14:08:44.526 INFO:tasks.workunit.client.0.vm03.stdout:3/892: dread d1d/d33/d47/d53/d68/dcf/de7/f44 [0,4194304] 0
2026-03-10T14:08:44.526 INFO:tasks.workunit.client.0.vm03.stdout:3/893: fsync d1d/f7a 0
2026-03-10T14:08:44.527 INFO:tasks.workunit.client.0.vm03.stdout:3/894: stat d1d/d33/d65/d5d/ldd 0
2026-03-10T14:08:44.527 INFO:tasks.workunit.client.0.vm03.stdout:3/895: fsync d1d/d33/f3a 0
2026-03-10T14:08:44.528 INFO:tasks.workunit.client.0.vm03.stdout:3/896: readlink d1d/d33/d47/d53/d68/l77 0
2026-03-10T14:08:44.530 INFO:tasks.workunit.client.0.vm03.stdout:7/687: symlink d5/d9/d14/d26/d39/lf1 0
2026-03-10T14:08:44.537 INFO:tasks.workunit.client.0.vm03.stdout:6/802: dwrite d8/d11/d18/d79/d80/fd5 [0,4194304] 0
2026-03-10T14:08:44.547 INFO:tasks.workunit.client.0.vm03.stdout:9/773: link d2/d14/d2b/d79/d81/cf8 d2/d29/d33/d60/cfb 0
2026-03-10T14:08:44.556 INFO:tasks.workunit.client.0.vm03.stdout:9/774: write d2/d29/d33/d41/d46/ff9 [730338,24504] 0
2026-03-10T14:08:44.556 INFO:tasks.workunit.client.0.vm03.stdout:4/797: fdatasync d5/d9/f10e 0
2026-03-10T14:08:44.556 INFO:tasks.workunit.client.0.vm03.stdout:4/798: stat d5/d47/d5b/d64 0
2026-03-10T14:08:44.565 INFO:tasks.workunit.client.0.vm03.stdout:6/803: rmdir d8/db/d49/d58 39
2026-03-10T14:08:44.566 INFO:tasks.workunit.client.0.vm03.stdout:6/804: truncate d8/d1b/d1c/fd9 1040384 0
2026-03-10T14:08:44.569 INFO:tasks.workunit.client.0.vm03.stdout:2/759: symlink d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/lf8 0
2026-03-10T14:08:44.570 INFO:tasks.workunit.client.0.vm03.stdout:2/760: readlink d5/d10/d1f/d4f/d76/da7/d54/d5f/leb 0
2026-03-10T14:08:44.572 INFO:tasks.workunit.client.0.vm03.stdout:0/754: truncate d3/d16/d21/f78 1378546 0
2026-03-10T14:08:44.575 INFO:tasks.workunit.client.0.vm03.stdout:9/775: creat d2/d29/d33/d41/ffc x:0 0 0
2026-03-10T14:08:44.581 INFO:tasks.workunit.client.0.vm03.stdout:6/805: dread d8/d1b/d1c/f50 [0,4194304] 0
2026-03-10T14:08:44.581 INFO:tasks.workunit.client.0.vm03.stdout:4/799: fsync d5/d47/d62/d8a/dd0/d102/f63 0
2026-03-10T14:08:44.583 INFO:tasks.workunit.client.0.vm03.stdout:3/897: sync
2026-03-10T14:08:44.586 INFO:tasks.workunit.client.0.vm03.stdout:5/933: symlink d4/d40/l121 0
2026-03-10T14:08:44.589 INFO:tasks.workunit.client.0.vm03.stdout:1/770: write d0/f24 [3323737,6057] 0
2026-03-10T14:08:44.596 INFO:tasks.workunit.client.0.vm03.stdout:2/761: creat d5/d10/d31/ff9 x:0 0 0
2026-03-10T14:08:44.597 INFO:tasks.workunit.client.0.vm03.stdout:0/755: chown d3/d16/d21/cc8 9275 1
2026-03-10T14:08:44.599 INFO:tasks.workunit.client.0.vm03.stdout:9/776: rmdir d2/d29/d33/d41/d46/dd0 39
2026-03-10T14:08:44.602 INFO:tasks.workunit.client.0.vm03.stdout:8/878: dwrite da/d24/f28 [0,4194304] 0
2026-03-10T14:08:44.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:44 vm03.local ceph-mon[49718]: pgmap v9: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 37 MiB/s rd, 76 MiB/s wr, 229 op/s
2026-03-10T14:08:44.612 INFO:tasks.workunit.client.0.vm03.stdout:8/879: dread da/d3c/d4b/d4c/dba/d114/dbc/d70/d7d/fa1 [0,4194304] 0
2026-03-10T14:08:44.613 INFO:tasks.workunit.client.0.vm03.stdout:8/880: truncate da/d36/d4d/da5/dd6/fff 512723 0
2026-03-10T14:08:44.615 INFO:tasks.workunit.client.0.vm03.stdout:6/806: unlink d8/d11/l74 0
2026-03-10T14:08:44.621 INFO:tasks.workunit.client.0.vm03.stdout:7/688: write d5/d9/f17 [3247444,59043] 0
2026-03-10T14:08:44.621 INFO:tasks.workunit.client.0.vm03.stdout:3/898: read d1d/f62 [2571611,73224] 0
2026-03-10T14:08:44.622 INFO:tasks.workunit.client.0.vm03.stdout:7/689: readlink d5/d9/d35/l57 0
2026-03-10T14:08:44.625 INFO:tasks.workunit.client.0.vm03.stdout:5/934: rename d4/d16/d19/d4a/faa to d4/d16/da0/f122 0
2026-03-10T14:08:44.625 INFO:tasks.workunit.client.0.vm03.stdout:5/935: readlink d4/d13/d43/l9d 0
2026-03-10T14:08:44.626 INFO:tasks.workunit.client.0.vm03.stdout:1/771: symlink d0/d2/df/d16/ded/lf9 0
2026-03-10T14:08:44.627 INFO:tasks.workunit.client.0.vm03.stdout:1/772: write d0/d2/df/d27/fc5 [1405227,129378] 0
2026-03-10T14:08:44.628 INFO:tasks.workunit.client.0.vm03.stdout:1/773: write d0/d2/df/d27/ff4 [618533,41625] 0
2026-03-10T14:08:44.628 INFO:tasks.workunit.client.0.vm03.stdout:1/774: chown d0/d2/df/d27/l4e 181 1
2026-03-10T14:08:44.657 INFO:tasks.workunit.client.0.vm03.stdout:4/800: dwrite d5/d9/db/d19/d38/f77 [0,4194304] 0
2026-03-10T14:08:44.658 INFO:tasks.workunit.client.0.vm03.stdout:4/801: readlink d5/db4/lf4 0
2026-03-10T14:08:44.670 INFO:tasks.workunit.client.0.vm03.stdout:9/777: mknod d2/d29/d33/d41/d95/cfd 0
2026-03-10T14:08:44.671 INFO:tasks.workunit.client.0.vm03.stdout:9/778: write d2/d29/d33/d60/f65 [4665476,24073] 0
2026-03-10T14:08:44.672 INFO:tasks.workunit.client.0.vm03.stdout:3/899: rename d1d/d39/da1/de0 to d1d/d33/d65/d48/d118 0
2026-03-10T14:08:44.672 INFO:tasks.workunit.client.0.vm03.stdout:5/936: truncate d4/d6/f6d 3094396 0
2026-03-10T14:08:44.674 INFO:tasks.workunit.client.0.vm03.stdout:1/775: fsync d0/d2/df/d27/faf 0
2026-03-10T14:08:44.676 INFO:tasks.workunit.client.0.vm03.stdout:2/762: symlink d5/d10/d1f/d4f/d76/da7/d54/lfa 0
2026-03-10T14:08:44.697 INFO:tasks.workunit.client.0.vm03.stdout:4/802: dread d5/d9/f11 [0,4194304] 0
2026-03-10T14:08:44.703 INFO:tasks.workunit.client.0.vm03.stdout:6/807: dwrite d8/db/df/f37 [0,4194304] 0
2026-03-10T14:08:44.723 INFO:tasks.workunit.client.0.vm03.stdout:7/690: rename d5/d9/d14/d26/d36/db4/fda to d5/d9/d14/d26/d36/d51/ff2 0
2026-03-10T14:08:44.723 INFO:tasks.workunit.client.0.vm03.stdout:5/937: fdatasync d4/d40/fcb 0
2026-03-10T14:08:44.730 INFO:tasks.workunit.client.0.vm03.stdout:0/756: link d3/d16/c3a d3/d46/dac/d79/cf5 0
2026-03-10T14:08:44.731 INFO:tasks.workunit.client.0.vm03.stdout:2/763: dread - d5/d35/fad zero size
2026-03-10T14:08:44.732 INFO:tasks.workunit.client.0.vm03.stdout:2/764: dread - d5/d10/d1f/d4f/d76/da7/d40/d59/f70 zero size
2026-03-10T14:08:44.735 INFO:tasks.workunit.client.0.vm03.stdout:4/803: creat d5/d9/db/d19/d38/ded/f110 x:0 0 0
2026-03-10T14:08:44.736 INFO:tasks.workunit.client.0.vm03.stdout:6/808: unlink d8/d11/da0/dbf/d8c/f8a 0
2026-03-10T14:08:44.737 INFO:tasks.workunit.client.0.vm03.stdout:6/809: write d8/d11/d18/d79/d80/fd5 [2142086,80948] 0
2026-03-10T14:08:44.740 INFO:tasks.workunit.client.0.vm03.stdout:8/881: symlink da/d36/d6a/l116 0
2026-03-10T14:08:44.740 INFO:tasks.workunit.client.0.vm03.stdout:5/938: mknod d4/d13/d1f/c123 0
2026-03-10T14:08:44.746 INFO:tasks.workunit.client.0.vm03.stdout:5/939: dwrite d4/d13/d1f/d8c/f9c [0,4194304] 0
2026-03-10T14:08:44.746 INFO:tasks.workunit.client.0.vm03.stdout:8/882: dwrite da/d3c/d4b/d4c/dba/d114/dbc/d70/d7d/f9c [0,4194304] 0
2026-03-10T14:08:44.762 INFO:tasks.workunit.client.0.vm03.stdout:4/804: symlink d5/d9/db/da7/dcc/l111 0
2026-03-10T14:08:44.763 INFO:tasks.workunit.client.0.vm03.stdout:4/805: fsync d5/d9/db/f12 0
2026-03-10T14:08:44.769 INFO:tasks.workunit.client.0.vm03.stdout:8/883: fdatasync da/f62 0
2026-03-10T14:08:44.780 INFO:tasks.workunit.client.0.vm03.stdout:8/884: stat da/d3c/f4f 0
2026-03-10T14:08:44.780 INFO:tasks.workunit.client.0.vm03.stdout:4/806: creat d5/d47/d62/de9/f112 x:0 0 0
2026-03-10T14:08:44.780 INFO:tasks.workunit.client.0.vm03.stdout:8/885: dwrite da/d3c/d51/d85/d104/fc1 [0,4194304] 0
2026-03-10T14:08:44.780 INFO:tasks.workunit.client.0.vm03.stdout:7/691: getdents d5/d9/d14/d26/d5f 0
2026-03-10T14:08:44.780 INFO:tasks.workunit.client.0.vm03.stdout:7/692: truncate d5/f1b 2315141 0
2026-03-10T14:08:44.780 INFO:tasks.workunit.client.0.vm03.stdout:7/693: unlink d5/d9/d14/d26/d36/de1/c75 0
2026-03-10T14:08:44.780 INFO:tasks.workunit.client.0.vm03.stdout:7/694: write d5/f1b [2404464,34746] 0
2026-03-10T14:08:44.788 INFO:tasks.workunit.client.0.vm03.stdout:8/886: symlink da/d3c/d4b/d4c/dba/d114/dbc/d70/d99/l117 0
2026-03-10T14:08:44.789 INFO:tasks.workunit.client.0.vm03.stdout:8/887: creat da/d3c/d51/d85/d104/f118 x:0 0 0
2026-03-10T14:08:44.790 INFO:tasks.workunit.client.0.vm03.stdout:8/888: chown da/d3c/d4b/d4c/dba/d114/dbc/d70/d99 4 1
2026-03-10T14:08:44.809 INFO:tasks.workunit.client.0.vm03.stdout:6/810: dread d8/d1b/f31 [0,4194304] 0
2026-03-10T14:08:44.809 INFO:tasks.workunit.client.0.vm03.stdout:6/811: fsync d8/fd 0
2026-03-10T14:08:44.812 INFO:tasks.workunit.client.0.vm03.stdout:8/889: dread da/d24/d49/f66 [0,4194304] 0
2026-03-10T14:08:44.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:44 vm04.local ceph-mon[55966]: pgmap v9: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 37 MiB/s rd, 76 MiB/s wr, 229 op/s
2026-03-10T14:08:44.816 INFO:tasks.workunit.client.0.vm03.stdout:8/890: link da/d3c/d51/d75/cb9 da/d3c/d51/d85/dbb/c119 0
2026-03-10T14:08:44.817 INFO:tasks.workunit.client.0.vm03.stdout:8/891: mknod da/d3c/d51/d75/dc2/dc6/c11a 0
2026-03-10T14:08:44.823 INFO:tasks.workunit.client.0.vm03.stdout:9/779: dwrite d2/d14/f61 [0,4194304] 0
2026-03-10T14:08:44.824 INFO:tasks.workunit.client.0.vm03.stdout:0/757: sync
2026-03-10T14:08:44.824 INFO:tasks.workunit.client.0.vm03.stdout:5/940: sync
2026-03-10T14:08:44.824 INFO:tasks.workunit.client.0.vm03.stdout:4/807: sync
2026-03-10T14:08:44.830 INFO:tasks.workunit.client.0.vm03.stdout:2/765: dread d5/d10/d1f/d4f/d76/da7/d40/f86 [0,4194304] 0
2026-03-10T14:08:44.833 INFO:tasks.workunit.client.0.vm03.stdout:0/758: dread d3/d16/f9e [0,4194304] 0
2026-03-10T14:08:44.839 INFO:tasks.workunit.client.0.vm03.stdout:3/900: write d1d/d33/d65/d5d/dae/fee [2220552,14094] 0
2026-03-10T14:08:44.842 INFO:tasks.workunit.client.0.vm03.stdout:4/808: rename d5/d47/d62/la9 to d5/d9/db/da7/db9/def/l113 0
2026-03-10T14:08:44.843 INFO:tasks.workunit.client.0.vm03.stdout:1/776: write d0/d2/df/d16/d20/fb8 [397453,4451] 0
2026-03-10T14:08:44.845 INFO:tasks.workunit.client.0.vm03.stdout:3/901: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/da6/f119 x:0 0 0
2026-03-10T14:08:44.849 INFO:tasks.workunit.client.0.vm03.stdout:9/780: dread d2/d14/f30 [0,4194304] 0
2026-03-10T14:08:44.854 INFO:tasks.workunit.client.0.vm03.stdout:5/941: creat d4/d13/f124 x:0 0 0
2026-03-10T14:08:44.858 INFO:tasks.workunit.client.0.vm03.stdout:0/759: symlink d3/d4d/d30/lf6 0
2026-03-10T14:08:44.863 INFO:tasks.workunit.client.0.vm03.stdout:0/760: chown d3/d46/de0 31762 1
2026-03-10T14:08:44.863 INFO:tasks.workunit.client.0.vm03.stdout:2/766: rmdir d5/dac/df5 0
2026-03-10T14:08:44.864 INFO:tasks.workunit.client.0.vm03.stdout:9/781: sync
2026-03-10T14:08:44.870 INFO:tasks.workunit.client.0.vm03.stdout:9/782: fdatasync d2/d29/d38/f47 0
2026-03-10T14:08:44.876 INFO:tasks.workunit.client.0.vm03.stdout:9/783: mkdir d2/d29/d33/d60/d8c/dd3/dfe 0
2026-03-10T14:08:44.877 INFO:tasks.workunit.client.0.vm03.stdout:1/777: link d0/d42/d9f/fa2 d0/d2/df/d27/ffa 0
2026-03-10T14:08:44.878 INFO:tasks.workunit.client.0.vm03.stdout:9/784: mkdir d2/d14/d2b/d79/d8a/dff 0
2026-03-10T14:08:44.879 INFO:tasks.workunit.client.0.vm03.stdout:9/785: readlink d2/d14/d2b/d79/d8a/lb2 0
2026-03-10T14:08:44.881 INFO:tasks.workunit.client.0.vm03.stdout:1/778: mknod d0/d18/d1d/dc4/cfb 0
2026-03-10T14:08:44.888 INFO:tasks.workunit.client.0.vm03.stdout:0/761: dread d3/f94 [0,4194304] 0
2026-03-10T14:08:44.896 INFO:tasks.workunit.client.0.vm03.stdout:2/767: dread d5/f2d [0,4194304] 0
2026-03-10T14:08:44.908 INFO:tasks.workunit.client.0.vm03.stdout:2/768: dwrite d5/d10/d1f/d4f/d76/da7/d54/fe2 [0,4194304] 0
2026-03-10T14:08:44.909 INFO:tasks.workunit.client.0.vm03.stdout:1/779: rmdir d0/d2 39
2026-03-10T14:08:44.920 INFO:tasks.workunit.client.0.vm03.stdout:0/762: dread d3/d11/d2c/d4a/d4b/d89/db6/fe1 [0,4194304] 0
2026-03-10T14:08:44.923 INFO:tasks.workunit.client.0.vm03.stdout:0/763: symlink d3/d11/d76/db5/ddb/lf7 0
2026-03-10T14:08:44.926 INFO:tasks.workunit.client.0.vm03.stdout:0/764: rmdir d3/d4d/d47 39
2026-03-10T14:08:44.929 INFO:tasks.workunit.client.0.vm03.stdout:0/765: dwrite d3/d4d/da0/fbe [4194304,4194304] 0
2026-03-10T14:08:44.940 INFO:tasks.workunit.client.0.vm03.stdout:0/766: fsync d3/d4d/d47/fcd 0
2026-03-10T14:08:44.943 INFO:tasks.workunit.client.0.vm03.stdout:0/767: mknod d3/d11/d2c/d4a/d4b/d89/db9/cf8 0
2026-03-10T14:08:44.944 INFO:tasks.workunit.client.0.vm03.stdout:0/768: readlink d3/d11/d2c/d4a/d4b/d89/db9/ldf 0
2026-03-10T14:08:45.139 INFO:tasks.workunit.client.0.vm03.stdout:7/695: write d5/d9/d14/d21/fa3 [491065,9396] 0
2026-03-10T14:08:45.140 INFO:tasks.workunit.client.0.vm03.stdout:7/696: write d5/d9/d14/d26/d5f/d89/fcd [397356,50224] 0
2026-03-10T14:08:45.140 INFO:tasks.workunit.client.0.vm03.stdout:7/697: readlink d5/d9/d14/d26/d36/de1/l7f 0
2026-03-10T14:08:45.142 INFO:tasks.workunit.client.0.vm03.stdout:7/698: creat d5/d9/d14/d26/d36/de1/ff3 x:0 0 0
2026-03-10T14:08:45.143 INFO:tasks.workunit.client.0.vm03.stdout:7/699: truncate d5/d9/d14/d21/d9a/fee 810910 0
2026-03-10T14:08:45.143 INFO:tasks.workunit.client.0.vm03.stdout:7/700: write d5/d9/d14/d26/f9f [1753446,92421] 0
2026-03-10T14:08:45.145 INFO:tasks.workunit.client.0.vm03.stdout:7/701: fdatasync d5/d9/d14/d26/d36/f3a 0
2026-03-10T14:08:45.147 INFO:tasks.workunit.client.0.vm03.stdout:6/812: write d8/d11/f35 [3261386,39070] 0
2026-03-10T14:08:45.149 INFO:tasks.workunit.client.0.vm03.stdout:7/702: rmdir d5/d9/d35/da8/dd8 0
2026-03-10T14:08:45.153 INFO:tasks.workunit.client.0.vm03.stdout:7/703: dwrite d5/f1b [0,4194304] 0
2026-03-10T14:08:45.162 INFO:tasks.workunit.client.0.vm03.stdout:8/892: write da/d58/d6c/fa4 [1052210,111182] 0
2026-03-10T14:08:45.165 INFO:tasks.workunit.client.0.vm03.stdout:3/902: rename d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/da6 to d1d/d33/d47/dac/dba/d11a 0
2026-03-10T14:08:45.170 INFO:tasks.workunit.client.0.vm03.stdout:3/903: read - d1d/d33/d65/f113 zero size
2026-03-10T14:08:45.170 INFO:tasks.workunit.client.0.vm03.stdout:4/809: dwrite d5/d9/f97 [4194304,4194304] 0
2026-03-10T14:08:45.172 INFO:tasks.workunit.client.0.vm03.stdout:5/942: dwrite d4/d6/f97 [0,4194304] 0
2026-03-10T14:08:45.176 INFO:tasks.workunit.client.0.vm03.stdout:5/943: fsync d4/d6/f78 0
2026-03-10T14:08:45.185 INFO:tasks.workunit.client.0.vm03.stdout:5/944: creat d4/d13/d8f/d11d/dc5/f125 x:0 0 0
2026-03-10T14:08:45.195 INFO:tasks.workunit.client.0.vm03.stdout:9/786: write d2/d29/d33/d60/d8c/de1/f93 [711760,123799] 0
2026-03-10T14:08:45.195 INFO:tasks.workunit.client.0.vm03.stdout:2/769: write d5/d2a/f6d [1913637,7731] 0
2026-03-10T14:08:45.195 INFO:tasks.workunit.client.0.vm03.stdout:2/770: dwrite d5/dac/ff1 [0,4194304] 0
2026-03-10T14:08:45.197 INFO:tasks.workunit.client.0.vm03.stdout:1/780: write d0/d18/d3b/f3d [576673,106746] 0
2026-03-10T14:08:45.198 INFO:tasks.workunit.client.0.vm03.stdout:1/781: write d0/d42/f80 [397103,105628] 0
2026-03-10T14:08:45.199 INFO:tasks.workunit.client.0.vm03.stdout:9/787: dread d2/d14/f1b [0,4194304] 0
2026-03-10T14:08:45.200 INFO:tasks.workunit.client.0.vm03.stdout:9/788: readlink d2/d29/d33/d41/d46/l8f 0
2026-03-10T14:08:45.210 INFO:tasks.workunit.client.0.vm03.stdout:5/945: rmdir d4/d16/d19/d23/d11e 0
2026-03-10T14:08:45.215 INFO:tasks.workunit.client.0.vm03.stdout:1/782: dread - d0/d2/df/d27/f94 zero size
2026-03-10T14:08:45.231 INFO:tasks.workunit.client.0.vm03.stdout:5/946: fsync d4/d13/fad 0
2026-03-10T14:08:45.231 INFO:tasks.workunit.client.0.vm03.stdout:5/947: symlink d4/l126 0
2026-03-10T14:08:45.231 INFO:tasks.workunit.client.0.vm03.stdout:5/948: chown d4/d13/d1f/fb1 113 1
2026-03-10T14:08:45.231 INFO:tasks.workunit.client.0.vm03.stdout:5/949: rename d4/d13/d1f to d4/d13/d8f/d11d/d127 0
2026-03-10T14:08:45.243 INFO:tasks.workunit.client.0.vm03.stdout:3/904: sync
2026-03-10T14:08:45.250 INFO:tasks.workunit.client.0.vm03.stdout:2/771: sync
2026-03-10T14:08:45.251 INFO:tasks.workunit.client.0.vm03.stdout:2/772: stat d5/d35/f9b 0
2026-03-10T14:08:45.254 INFO:tasks.workunit.client.0.vm03.stdout:3/905: mknod d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/dd4/df9/c11b 0
2026-03-10T14:08:45.262 INFO:tasks.workunit.client.0.vm03.stdout:2/773: rmdir d5/dac 39
2026-03-10T14:08:45.268 INFO:tasks.workunit.client.0.vm03.stdout:0/769: write d3/d46/f6c [566830,15591] 0
2026-03-10T14:08:45.272 INFO:tasks.workunit.client.0.vm03.stdout:6/813: dwrite d8/db/f3c [0,4194304] 0
2026-03-10T14:08:45.276 INFO:tasks.workunit.client.0.vm03.stdout:7/704: dwrite d5/d9/d14/d26/d39/d92/fba [0,4194304] 0
2026-03-10T14:08:45.279 INFO:tasks.workunit.client.0.vm03.stdout:8/893: write da/d3c/d4b/f63 [1960744,16913] 0
2026-03-10T14:08:45.280 INFO:tasks.workunit.client.0.vm03.stdout:8/894: dread - da/d58/d6c/feb zero size
2026-03-10T14:08:45.290 INFO:tasks.workunit.client.0.vm03.stdout:9/789: dread d2/d29/d33/d60/d8c/de1/f93 [0,4194304] 0
2026-03-10T14:08:45.304 INFO:tasks.workunit.client.0.vm03.stdout:3/906: mknod d1d/d33/d65/c11c 0
2026-03-10T14:08:45.305 INFO:tasks.workunit.client.0.vm03.stdout:0/770: mknod d3/d11/d2c/d4a/d4b/d89/cf9 0
2026-03-10T14:08:45.309 INFO:tasks.workunit.client.0.vm03.stdout:3/907: dwrite d1d/d33/d65/d48/ffe [0,4194304] 0
2026-03-10T14:08:45.318 INFO:tasks.workunit.client.0.vm03.stdout:8/895: creat da/d36/d4d/da5/f11b x:0 0 0
2026-03-10T14:08:45.320 INFO:tasks.workunit.client.0.vm03.stdout:4/810: write d5/d47/d5b/f84 [928849,93674] 0
2026-03-10T14:08:45.332 INFO:tasks.workunit.client.0.vm03.stdout:0/771: dread d3/d16/d21/fa4 [0,4194304] 0
2026-03-10T14:08:45.333 INFO:tasks.workunit.client.0.vm03.stdout:0/772: truncate d3/d46/dac/fc2 545055 0
2026-03-10T14:08:45.333 INFO:tasks.workunit.client.0.vm03.stdout:7/705: symlink d5/d9/d14/d26/d36/d51/d7b/d9c/dd5/lf4 0
2026-03-10T14:08:45.335 INFO:tasks.workunit.client.0.vm03.stdout:8/896: fsync f2 0
2026-03-10T14:08:45.336 INFO:tasks.workunit.client.0.vm03.stdout:8/897: write da/d3c/d4b/d69/fb5 [3709757,103836] 0
2026-03-10T14:08:45.340 INFO:tasks.workunit.client.0.vm03.stdout:9/790: mkdir d2/d14/d100 0
2026-03-10T14:08:45.341 INFO:tasks.workunit.client.0.vm03.stdout:1/783: write d0/d2/df/d16/f4f [1024885,63181] 0
2026-03-10T14:08:45.351 INFO:tasks.workunit.client.0.vm03.stdout:4/811: symlink d5/d6e/db6/l114 0
2026-03-10T14:08:45.379 INFO:tasks.workunit.client.0.vm03.stdout:0/773: symlink d3/d4d/d47/lfa 0
2026-03-10T14:08:45.380 INFO:tasks.workunit.client.0.vm03.stdout:8/898: mknod da/d3c/d4b/dd2/d10b/c11c 0
2026-03-10T14:08:45.382 INFO:tasks.workunit.client.0.vm03.stdout:9/791: symlink d2/d29/dcd/l101 0
2026-03-10T14:08:45.382 INFO:tasks.workunit.client.0.vm03.stdout:5/950: dwrite d4/f17 [4194304,4194304] 0
2026-03-10T14:08:45.387 INFO:tasks.workunit.client.0.vm03.stdout:1/784: creat d0/d2/df/d16/d41/ffc x:0 0 0
2026-03-10T14:08:45.387 INFO:tasks.workunit.client.0.vm03.stdout:2/774: dread d5/d10/d1f/f5e [0,4194304] 0
2026-03-10T14:08:45.394 INFO:tasks.workunit.client.0.vm03.stdout:9/792: read d2/d29/d33/d41/d46/ff9 [348556,88610] 0
2026-03-10T14:08:45.423 INFO:tasks.workunit.client.0.vm03.stdout:6/814: dwrite d8/d11/d18/d54/f5b [0,4194304] 0
2026-03-10T14:08:45.435 INFO:tasks.workunit.client.0.vm03.stdout:7/706: write d5/d9/d14/d26/f64 [3120711,121950] 0
2026-03-10T14:08:45.443 INFO:tasks.workunit.client.0.vm03.stdout:0/774: unlink d3/d4d/f49 0
2026-03-10T14:08:45.443 INFO:tasks.workunit.client.0.vm03.stdout:8/899: truncate da/d24/d49/f66 262957 0
2026-03-10T14:08:45.447 INFO:tasks.workunit.client.0.vm03.stdout:0/775: dread d3/d11/d2c/d4a/fcf [0,4194304] 0
2026-03-10T14:08:45.449 INFO:tasks.workunit.client.0.vm03.stdout:1/785: read - d0/d2/df/d91/fd3 zero size
2026-03-10T14:08:45.450 INFO:tasks.workunit.client.0.vm03.stdout:1/786: chown d0/d2/df/d27/l87 103 1
2026-03-10T14:08:45.451 INFO:tasks.workunit.client.0.vm03.stdout:6/815: truncate d8/d11/f2a 2468807 0
2026-03-10T14:08:45.470 INFO:tasks.workunit.client.0.vm03.stdout:3/908: link d1d/d33/d47/d53/d68/dcf/de7/d41/cd3 d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/db6/c11d 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:1/787: creat d0/d18/d3b/d50/ffd x:0 0 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:2/775: symlink d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/lfb 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:6/816: unlink d8/db/d12/f57 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:3/909: unlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/lc6 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:5/951: rename d4/f9f to d4/d16/d19/d6e/f128 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:8/900: rename da/d3c/d51 to da/d3c/d51/d75/dc2/d11d 22
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:1/788: truncate d0/d2/df/f76 1033114 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:6/817: dread d8/fd [4194304,4194304] 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:1/789: write d0/d2/df/f1b [1134063,48169] 0
2026-03-10T14:08:45.471 INFO:tasks.workunit.client.0.vm03.stdout:2/776: mkdir d5/d10/d1f/d4f/d76/dfc 0
2026-03-10T14:08:45.472 INFO:tasks.workunit.client.0.vm03.stdout:1/790: mknod d0/d42/d9f/dc9/cfe 0
2026-03-10T14:08:45.472 INFO:tasks.workunit.client.0.vm03.stdout:1/791: readlink d0/d2/df/l93 0
2026-03-10T14:08:45.475 INFO:tasks.workunit.client.0.vm03.stdout:7/707: dread d5/d9/d14/d21/d28/f6e [0,4194304] 0
2026-03-10T14:08:45.477 INFO:tasks.workunit.client.0.vm03.stdout:5/952: rename d4/d13/d8f/f91 to d4/d16/d19/d23/f129 0
2026-03-10T14:08:45.478 INFO:tasks.workunit.client.0.vm03.stdout:8/901: link da/f31 da/d3c/d4b/d4c/dba/d114/dbc/db1/f11e 0
2026-03-10T14:08:45.479 INFO:tasks.workunit.client.0.vm03.stdout:6/818: symlink d8/d11/d18/d79/d80/df1/df6/lf7 0
2026-03-10T14:08:45.479 INFO:tasks.workunit.client.0.vm03.stdout:6/819: dread - d8/d11/da0/dca/fd6 zero size
2026-03-10T14:08:45.482 INFO:tasks.workunit.client.0.vm03.stdout:6/820: dread d8/db/d12/d64/fa8 [0,4194304] 0
2026-03-10T14:08:45.482 INFO:tasks.workunit.client.0.vm03.stdout:1/792: unlink d0/d2/df/fb0 0
2026-03-10T14:08:45.484 INFO:tasks.workunit.client.0.vm03.stdout:3/910: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f11e x:0 0 0
2026-03-10T14:08:45.488 INFO:tasks.workunit.client.0.vm03.stdout:2/777: rename d5/d10/dba/f9a to d5/d10/ffd 0
2026-03-10T14:08:45.489 INFO:tasks.workunit.client.0.vm03.stdout:6/821: symlink d8/db/d49/d6c/d32/lf8 0
2026-03-10T14:08:45.492 INFO:tasks.workunit.client.0.vm03.stdout:3/911: rmdir d1d/d33/d65/d5d 39
2026-03-10T14:08:45.496 INFO:tasks.workunit.client.0.vm03.stdout:9/793: getdents d2/d29 0
2026-03-10T14:08:45.498 INFO:tasks.workunit.client.0.vm03.stdout:2/778: fdatasync d5/f23 0
2026-03-10T14:08:45.504 INFO:tasks.workunit.client.0.vm03.stdout:9/794: dread d2/d29/dcd/d72/d9d/fa0 [0,4194304] 0
2026-03-10T14:08:45.510 INFO:tasks.workunit.client.0.vm03.stdout:9/795: stat d2/d29/dcd/cb3 0
2026-03-10T14:08:45.510 INFO:tasks.workunit.client.0.vm03.stdout:9/796: write d2/fd2 [2797,104693] 0
2026-03-10T14:08:45.510 INFO:tasks.workunit.client.0.vm03.stdout:2/779: rename d5/d10/d1f/d4f/l87 to d5/d10/d1f/d4f/d76/da7/d40/d59/lfe 0
2026-03-10T14:08:45.510 INFO:tasks.workunit.client.0.vm03.stdout:2/780: write d5/d10/d1f/d4f/d76/da7/d40/d59/f63 [211998,58459] 0
2026-03-10T14:08:45.513 INFO:tasks.workunit.client.0.vm03.stdout:8/902: getdents da/d36/d4d/da5/dd6 0
2026-03-10T14:08:45.515 INFO:tasks.workunit.client.0.vm03.stdout:3/912: dread d1d/f66 [0,4194304] 0
2026-03-10T14:08:45.515 INFO:tasks.workunit.client.0.vm03.stdout:9/797: creat d2/d29/d33/d60/f102 x:0 0 0
2026-03-10T14:08:45.516 INFO:tasks.workunit.client.0.vm03.stdout:3/913: chown d1d/d33/f5e 173632487 1
2026-03-10T14:08:45.517 INFO:tasks.workunit.client.0.vm03.stdout:2/781: fsync d5/d35/f81 0
2026-03-10T14:08:45.519 INFO:tasks.workunit.client.0.vm03.stdout:8/903: dread - da/d3c/d51/d75/dc2/dc6/ffc zero size
2026-03-10T14:08:45.521 INFO:tasks.workunit.client.0.vm03.stdout:0/776: sync
2026-03-10T14:08:45.529 INFO:tasks.workunit.client.0.vm03.stdout:6/822: getdents d8/db/dd0 0
2026-03-10T14:08:45.547 INFO:tasks.workunit.client.0.vm03.stdout:4/812: dwrite d5/d9/db/da7/dc0/fea [0,4194304] 0
2026-03-10T14:08:45.566 INFO:tasks.workunit.client.0.vm03.stdout:9/798: fsync d2/d29/d33/fd4 0
2026-03-10T14:08:45.571 INFO:tasks.workunit.client.0.vm03.stdout:2/782: mkdir d5/d10/dba/dff 0
2026-03-10T14:08:45.572 INFO:tasks.workunit.client.0.vm03.stdout:2/783: dread d5/d10/d1f/f5e [0,4194304] 0
2026-03-10T14:08:45.572 INFO:tasks.workunit.client.0.vm03.stdout:2/784: fsync d5/d10/d17/f20 0
2026-03-10T14:08:45.581 INFO:tasks.workunit.client.0.vm03.stdout:6/823: dread - d8/db/d49/d6c/f90 zero size
2026-03-10T14:08:45.584 INFO:tasks.workunit.client.0.vm03.stdout:6/824: dwrite d8/d11/d18/d54/f8b [4194304,4194304] 0
2026-03-10T14:08:45.588 INFO:tasks.workunit.client.0.vm03.stdout:9/799: dread d2/d29/d33/fa9 [4194304,4194304]
0 2026-03-10T14:08:45.624 INFO:tasks.workunit.client.0.vm03.stdout:7/708: write d5/d9/d14/d26/d36/d51/d7b/f87 [109713,112541] 0 2026-03-10T14:08:45.628 INFO:tasks.workunit.client.0.vm03.stdout:9/800: read d2/d29/d33/fa5 [2890293,22681] 0 2026-03-10T14:08:45.629 INFO:tasks.workunit.client.0.vm03.stdout:9/801: write d2/d14/ff1 [326577,15997] 0 2026-03-10T14:08:45.638 INFO:tasks.workunit.client.0.vm03.stdout:0/777: link d3/d11/c83 d3/d11/d76/dd3/cfb 0 2026-03-10T14:08:45.642 INFO:tasks.workunit.client.0.vm03.stdout:5/953: write d4/d6/f78 [122074,130089] 0 2026-03-10T14:08:45.643 INFO:tasks.workunit.client.0.vm03.stdout:1/793: write d0/d2/d71/d90/f9e [3534238,11561] 0 2026-03-10T14:08:45.644 INFO:tasks.workunit.client.0.vm03.stdout:1/794: chown d0/d2/df/l5d 13272 1 2026-03-10T14:08:45.644 INFO:tasks.workunit.client.0.vm03.stdout:8/904: rename da/d3a/fa3 to da/f11f 0 2026-03-10T14:08:45.649 INFO:tasks.workunit.client.0.vm03.stdout:8/905: dwrite da/d3c/d4b/d69/fb5 [4194304,4194304] 0 2026-03-10T14:08:45.658 INFO:tasks.workunit.client.0.vm03.stdout:7/709: dwrite d5/d9/d14/f41 [0,4194304] 0 2026-03-10T14:08:45.666 INFO:tasks.workunit.client.0.vm03.stdout:9/802: creat d2/d29/d33/d41/d95/f103 x:0 0 0 2026-03-10T14:08:45.667 INFO:tasks.workunit.client.0.vm03.stdout:9/803: chown d2/d29/d38/f51 49 1 2026-03-10T14:08:45.669 INFO:tasks.workunit.client.0.vm03.stdout:3/914: getdents d1d/d33/d65/d48 0 2026-03-10T14:08:45.677 INFO:tasks.workunit.client.0.vm03.stdout:5/954: mkdir d4/d13/d8f/d11d/dc5/d12a 0 2026-03-10T14:08:45.685 INFO:tasks.workunit.client.0.vm03.stdout:4/813: rename d5/d9/f31 to d5/db4/dd2/f115 0 2026-03-10T14:08:45.689 INFO:tasks.workunit.client.0.vm03.stdout:4/814: dwrite d5/d47/d5b/f84 [0,4194304] 0 2026-03-10T14:08:45.701 INFO:tasks.workunit.client.0.vm03.stdout:8/906: read - da/d3a/d44/f79 zero size 2026-03-10T14:08:45.707 INFO:tasks.workunit.client.0.vm03.stdout:7/710: readlink d5/d9/d14/d26/d36/de1/d84/lb9 0 2026-03-10T14:08:45.710 
INFO:tasks.workunit.client.0.vm03.stdout:6/825: rename f3 to d8/db/d49/d6c/d83/ff9 0 2026-03-10T14:08:45.711 INFO:tasks.workunit.client.0.vm03.stdout:6/826: stat d8/l36 0 2026-03-10T14:08:45.712 INFO:tasks.workunit.client.0.vm03.stdout:4/815: symlink d5/d9/db/d19/d38/d7b/l116 0 2026-03-10T14:08:45.713 INFO:tasks.workunit.client.0.vm03.stdout:8/907: fdatasync da/d24/ff8 0 2026-03-10T14:08:45.715 INFO:tasks.workunit.client.0.vm03.stdout:9/804: creat d2/d29/d33/d60/d8c/dd3/df6/f104 x:0 0 0 2026-03-10T14:08:45.718 INFO:tasks.workunit.client.0.vm03.stdout:0/778: creat d3/d46/ffc x:0 0 0 2026-03-10T14:08:45.723 INFO:tasks.workunit.client.0.vm03.stdout:6/827: mkdir d8/d11/d7a/dfa 0 2026-03-10T14:08:45.726 INFO:tasks.workunit.client.0.vm03.stdout:7/711: fsync d5/d9/d14/d26/f38 0 2026-03-10T14:08:45.727 INFO:tasks.workunit.client.0.vm03.stdout:7/712: read d5/d9/d14/d26/d39/f63 [3912425,94032] 0 2026-03-10T14:08:45.730 INFO:tasks.workunit.client.0.vm03.stdout:1/795: write d0/d2/df/d27/d7e/d81/f8d [2377880,21328] 0 2026-03-10T14:08:45.733 INFO:tasks.workunit.client.0.vm03.stdout:1/796: dwrite d0/d2/df/f1b [0,4194304] 0 2026-03-10T14:08:45.739 INFO:tasks.workunit.client.0.vm03.stdout:1/797: dwrite d0/d2/df/d16/d20/fb8 [0,4194304] 0 2026-03-10T14:08:45.745 INFO:tasks.workunit.client.0.vm03.stdout:0/779: creat d3/d11/d2c/d4a/d4b/d89/db6/df4/ffd x:0 0 0 2026-03-10T14:08:45.748 INFO:tasks.workunit.client.0.vm03.stdout:5/955: dwrite d4/d13/d43/fbb [0,4194304] 0 2026-03-10T14:08:45.749 INFO:tasks.workunit.client.0.vm03.stdout:5/956: stat d4/d16/df4 0 2026-03-10T14:08:45.751 INFO:tasks.workunit.client.0.vm03.stdout:9/805: dread d2/d14/d2b/f2d [4194304,4194304] 0 2026-03-10T14:08:45.760 INFO:tasks.workunit.client.0.vm03.stdout:2/785: rename d5/d10/d31/f4e to d5/d10/d1f/d4f/d76/da7/d40/db0/f100 0 2026-03-10T14:08:45.762 INFO:tasks.workunit.client.0.vm03.stdout:2/786: truncate d5/d10/d1f/d4f/d76/da7/d40/f68 1020616 0 2026-03-10T14:08:45.763 
INFO:tasks.workunit.client.0.vm03.stdout:8/908: mkdir da/d3c/d4b/d4c/dba/d114/dbc/d70/d120 0 2026-03-10T14:08:45.764 INFO:tasks.workunit.client.0.vm03.stdout:7/713: symlink d5/d9/d14/d26/d36/d51/d7b/d9c/lf5 0 2026-03-10T14:08:45.768 INFO:tasks.workunit.client.0.vm03.stdout:2/787: dwrite d5/d10/d1f/d4f/d76/da7/d40/d59/faa [0,4194304] 0 2026-03-10T14:08:45.780 INFO:tasks.workunit.client.0.vm03.stdout:2/788: readlink d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/lf8 0 2026-03-10T14:08:45.780 INFO:tasks.workunit.client.0.vm03.stdout:5/957: symlink d4/d16/d19/d4a/l12b 0 2026-03-10T14:08:45.780 INFO:tasks.workunit.client.0.vm03.stdout:9/806: mknod d2/d14/d2b/d79/d81/c105 0 2026-03-10T14:08:45.780 INFO:tasks.workunit.client.0.vm03.stdout:3/915: rename d1d/d59/f69 to d1d/d33/d47/dac/dba/f11f 0 2026-03-10T14:08:45.780 INFO:tasks.workunit.client.0.vm03.stdout:8/909: unlink da/fe 0 2026-03-10T14:08:45.780 INFO:tasks.workunit.client.0.vm03.stdout:2/789: truncate d5/d10/da3/dab/fdb 467534 0 2026-03-10T14:08:45.786 INFO:tasks.workunit.client.0.vm03.stdout:9/807: creat d2/d14/d2b/d43/ddd/f106 x:0 0 0 2026-03-10T14:08:45.788 INFO:tasks.workunit.client.0.vm03.stdout:0/780: dread d3/d46/d5e/f75 [0,4194304] 0 2026-03-10T14:08:45.791 INFO:tasks.workunit.client.0.vm03.stdout:4/816: rename d5/d9/db/da7/db9/def/l113 to d5/d9/db/d19/d38/d7b/daa/daf/l117 0 2026-03-10T14:08:45.792 INFO:tasks.workunit.client.0.vm03.stdout:9/808: dwrite d2/f83 [0,4194304] 0 2026-03-10T14:08:45.794 INFO:tasks.workunit.client.0.vm03.stdout:0/781: truncate d3/d11/d2c/d4a/fcf 1625035 0 2026-03-10T14:08:45.794 INFO:tasks.workunit.client.0.vm03.stdout:0/782: dread - d3/d46/ffc zero size 2026-03-10T14:08:45.796 INFO:tasks.workunit.client.0.vm03.stdout:7/714: rename d5/d9/d14/d26/d39/f45 to d5/d9/d35/da8/ff6 0 2026-03-10T14:08:45.798 INFO:tasks.workunit.client.0.vm03.stdout:4/817: dread - d5/d9/db/d19/d38/d53/d55/fad zero size 2026-03-10T14:08:45.802 INFO:tasks.workunit.client.0.vm03.stdout:2/790: rename d5/d2a/f6d to 
d5/d10/d1f/d4f/d76/da7/d40/db0/f101 0 2026-03-10T14:08:45.811 INFO:tasks.workunit.client.0.vm03.stdout:3/916: rename fb to d1d/d33/d65/d5d/dae/db9/dec/f120 0 2026-03-10T14:08:45.812 INFO:tasks.workunit.client.0.vm03.stdout:3/917: write d1d/d33/d65/d5d/dae/fee [323341,125356] 0 2026-03-10T14:08:45.814 INFO:tasks.workunit.client.0.vm03.stdout:2/791: creat d5/d10/d1f/d4f/d76/da7/d40/d92/f102 x:0 0 0 2026-03-10T14:08:45.816 INFO:tasks.workunit.client.0.vm03.stdout:3/918: truncate d1d/d33/f5e 1326411 0 2026-03-10T14:08:45.817 INFO:tasks.workunit.client.0.vm03.stdout:3/919: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f63 1781 1 2026-03-10T14:08:45.819 INFO:tasks.workunit.client.0.vm03.stdout:6/828: write d8/db/d49/fd3 [907814,89254] 0 2026-03-10T14:08:45.822 INFO:tasks.workunit.client.0.vm03.stdout:5/958: getdents d4/d16/d19/d4a 0 2026-03-10T14:08:45.823 INFO:tasks.workunit.client.0.vm03.stdout:1/798: truncate d0/d2/df/d27/fc5 3137107 0 2026-03-10T14:08:45.824 INFO:tasks.workunit.client.0.vm03.stdout:5/959: fsync d4/d13/d43/fbb 0 2026-03-10T14:08:45.824 INFO:tasks.workunit.client.0.vm03.stdout:7/715: getdents d5/d9/d14/d26/d36/d51/d7b/d9c/dd5 0 2026-03-10T14:08:45.827 INFO:tasks.workunit.client.0.vm03.stdout:8/910: dwrite da/d36/d6a/f6b [0,4194304] 0 2026-03-10T14:08:45.830 INFO:tasks.workunit.client.0.vm03.stdout:2/792: rename d5/d10/d1f/d4f/d76/da7/d40/d59/f63 to d5/d10/d1f/d4f/d76/da7/d54/d5f/f103 0 2026-03-10T14:08:45.830 INFO:tasks.workunit.client.0.vm03.stdout:8/911: write da/d3c/d4b/f63 [4064078,82702] 0 2026-03-10T14:08:45.830 INFO:tasks.workunit.client.0.vm03.stdout:3/920: mkdir d1d/d33/d65/d48/d118/d121 0 2026-03-10T14:08:45.830 INFO:tasks.workunit.client.0.vm03.stdout:6/829: mknod d8/db/d12/cfb 0 2026-03-10T14:08:45.835 INFO:tasks.workunit.client.0.vm03.stdout:7/716: mkdir d5/d9/d14/d26/d36/d51/d7b/d9c/dd5/df7 0 2026-03-10T14:08:45.836 INFO:tasks.workunit.client.0.vm03.stdout:7/717: readlink d5/d9/d35/laf 0 2026-03-10T14:08:45.840 
INFO:tasks.workunit.client.0.vm03.stdout:8/912: mkdir da/d3a/dce/d121 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:8/913: chown da/fd 4030696 1 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:5/960: truncate d4/f82 2273745 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:3/921: fsync d1d/d33/d47/d53/d68/dcf/de7/d41/d45/f6a 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:1/799: creat d0/d42/fff x:0 0 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:9/809: write d2/d14/f1a [2341321,14951] 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:5/961: mkdir d4/d16/d19/d23/d12c 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:6/830: dread d8/d11/f2a [0,4194304] 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:2/793: rename d5/d10/d1f/d4f/d76/da7/d40/d59/da2 to d5/d10/d1f/d4f/d76/da7/d104 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:3/922: symlink d1d/d33/d47/dac/l122 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:1/800: symlink d0/d2/df/d16/d41/l100 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:1/801: fsync d0/d42/f80 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:3/923: dwrite d1d/d33/d65/d48/fa5 [0,4194304] 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:1/802: write d0/d2/df/d27/d7e/d81/fe2 [1221858,100906] 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:1/803: chown d0/d42/c8b 1009 1 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:8/914: symlink da/d3c/d51/d85/dbb/d106/l122 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:9/810: mknod d2/d29/d33/d60/d8c/c107 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:2/794: creat d5/d10/d1f/d4f/d76/da7/d40/f105 x:0 0 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:9/811: rename 
d2/d29/d33/d60/d8c/f98 to d2/d29/dcd/dd8/f108 0 2026-03-10T14:08:45.870 INFO:tasks.workunit.client.0.vm03.stdout:9/812: readlink d2/d29/d38/le3 0 2026-03-10T14:08:45.877 INFO:tasks.workunit.client.0.vm03.stdout:6/831: unlink d8/db/d49/d6c/l4c 0 2026-03-10T14:08:45.885 INFO:tasks.workunit.client.0.vm03.stdout:1/804: mknod d0/d2/df/d27/c101 0 2026-03-10T14:08:45.885 INFO:tasks.workunit.client.0.vm03.stdout:6/832: dwrite d8/d1b/d1c/fd9 [0,4194304] 0 2026-03-10T14:08:45.885 INFO:tasks.workunit.client.0.vm03.stdout:6/833: dread d8/d1b/f31 [0,4194304] 0 2026-03-10T14:08:45.893 INFO:tasks.workunit.client.0.vm03.stdout:4/818: sync 2026-03-10T14:08:45.898 INFO:tasks.workunit.client.0.vm03.stdout:9/813: rename d2/d29/l6b to d2/d29/dcd/d72/d9d/l109 0 2026-03-10T14:08:45.899 INFO:tasks.workunit.client.0.vm03.stdout:6/834: rmdir d8/db/dd0 39 2026-03-10T14:08:45.901 INFO:tasks.workunit.client.0.vm03.stdout:3/924: link d1d/d33/d47/f98 d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/f123 0 2026-03-10T14:08:45.903 INFO:tasks.workunit.client.0.vm03.stdout:4/819: truncate d5/d9/db/f6d 1025485 0 2026-03-10T14:08:45.903 INFO:tasks.workunit.client.0.vm03.stdout:4/820: chown d5/d9/db/d19/d38/d53/d55/fad 726 1 2026-03-10T14:08:45.905 INFO:tasks.workunit.client.0.vm03.stdout:3/925: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e/f124 x:0 0 0 2026-03-10T14:08:45.906 INFO:tasks.workunit.client.0.vm03.stdout:3/926: truncate d1d/d33/d47/dac/dba/f108 158104 0 2026-03-10T14:08:45.908 INFO:tasks.workunit.client.0.vm03.stdout:6/835: dread d8/d1b/f61 [0,4194304] 0 2026-03-10T14:08:45.910 INFO:tasks.workunit.client.0.vm03.stdout:5/962: dread d4/d6/de/d10e/f8e [0,4194304] 0 2026-03-10T14:08:45.910 INFO:tasks.workunit.client.0.vm03.stdout:5/963: readlink d4/d16/d19/d23/db8/l110 0 2026-03-10T14:08:45.912 INFO:tasks.workunit.client.0.vm03.stdout:4/821: creat d5/d47/d62/d8a/da0/f118 x:0 0 0 2026-03-10T14:08:45.913 INFO:tasks.workunit.client.0.vm03.stdout:6/836: read d8/d11/da0/dbf/d5c/d60/f67 [49795,43758] 0 
2026-03-10T14:08:45.913 INFO:tasks.workunit.client.0.vm03.stdout:0/783: write d3/d17/f6d [3106392,92559] 0 2026-03-10T14:08:45.923 INFO:tasks.workunit.client.0.vm03.stdout:5/964: symlink d4/d13/d8f/l12d 0 2026-03-10T14:08:45.927 INFO:tasks.workunit.client.0.vm03.stdout:7/718: dwrite d5/d9/d14/d21/d28/f5b [0,4194304] 0 2026-03-10T14:08:45.938 INFO:tasks.workunit.client.0.vm03.stdout:3/927: rename d1d/d33/d47/d53/d68/dcf/de7/d41/cd3 to d1d/d33/d65/d5d/dae/db9/dec/c125 0 2026-03-10T14:08:45.944 INFO:tasks.workunit.client.0.vm03.stdout:8/915: dwrite f5 [0,4194304] 0 2026-03-10T14:08:45.946 INFO:tasks.workunit.client.0.vm03.stdout:3/928: read d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/fa9 [3777872,99251] 0 2026-03-10T14:08:45.956 INFO:tasks.workunit.client.0.vm03.stdout:4/822: getdents d5/d96 0 2026-03-10T14:08:45.957 INFO:tasks.workunit.client.0.vm03.stdout:2/795: write d5/f23 [1143101,25209] 0 2026-03-10T14:08:45.959 INFO:tasks.workunit.client.0.vm03.stdout:0/784: truncate d3/d4d/d30/f69 937122 0 2026-03-10T14:08:45.962 INFO:tasks.workunit.client.0.vm03.stdout:1/805: dwrite d0/d2/df/d27/faf [0,4194304] 0 2026-03-10T14:08:45.968 INFO:tasks.workunit.client.0.vm03.stdout:8/916: mknod da/d3c/d4b/d4c/c123 0 2026-03-10T14:08:45.968 INFO:tasks.workunit.client.0.vm03.stdout:8/917: truncate da/d58/f115 833633 0 2026-03-10T14:08:45.968 INFO:tasks.workunit.client.0.vm03.stdout:3/929: mknod d1d/d33/d47/dac/dba/c126 0 2026-03-10T14:08:45.968 INFO:tasks.workunit.client.0.vm03.stdout:8/918: dread - da/d36/d4d/da5/f11b zero size 2026-03-10T14:08:45.971 INFO:tasks.workunit.client.0.vm03.stdout:2/796: stat d5/d2a/d8a 0 2026-03-10T14:08:45.973 INFO:tasks.workunit.client.0.vm03.stdout:0/785: creat d3/d11/d2c/d4a/d4b/d89/db9/ffe x:0 0 0 2026-03-10T14:08:45.976 INFO:tasks.workunit.client.0.vm03.stdout:0/786: dwrite d3/d4d/f2a [0,4194304] 0 2026-03-10T14:08:45.983 INFO:tasks.workunit.client.0.vm03.stdout:1/806: chown d0/d42/d9f/fa2 159618042 1 2026-03-10T14:08:45.988 
INFO:tasks.workunit.client.0.vm03.stdout:9/814: stat d2/d29/d33/fa9 0 2026-03-10T14:08:45.988 INFO:tasks.workunit.client.0.vm03.stdout:8/919: unlink da/d3c/d51/d85/d104/f107 0 2026-03-10T14:08:45.990 INFO:tasks.workunit.client.0.vm03.stdout:7/719: creat d5/d9/ff8 x:0 0 0 2026-03-10T14:08:45.992 INFO:tasks.workunit.client.0.vm03.stdout:6/837: write d8/db/d49/d6c/f90 [527672,23746] 0 2026-03-10T14:08:45.994 INFO:tasks.workunit.client.0.vm03.stdout:2/797: mkdir d5/db4/d74/d106 0 2026-03-10T14:08:45.994 INFO:tasks.workunit.client.0.vm03.stdout:2/798: chown d5/d10/d1f/d4f/d76/da7/d40/db0 272 1 2026-03-10T14:08:45.996 INFO:tasks.workunit.client.0.vm03.stdout:5/965: creat d4/d40/f12e x:0 0 0 2026-03-10T14:08:45.998 INFO:tasks.workunit.client.0.vm03.stdout:8/920: chown da/d3c/c86 1 1 2026-03-10T14:08:45.999 INFO:tasks.workunit.client.0.vm03.stdout:8/921: write da/d3c/d4b/d69/fb5 [6905842,59533] 0 2026-03-10T14:08:46.003 INFO:tasks.workunit.client.0.vm03.stdout:0/787: mknod d3/d16/d21/cff 0 2026-03-10T14:08:46.003 INFO:tasks.workunit.client.0.vm03.stdout:1/807: symlink d0/d2/df/d27/dd7/l102 0 2026-03-10T14:08:46.005 INFO:tasks.workunit.client.0.vm03.stdout:9/815: symlink d2/d29/d9a/l10a 0 2026-03-10T14:08:46.006 INFO:tasks.workunit.client.0.vm03.stdout:9/816: dread - d2/d29/d33/d60/d7f/fe2 zero size 2026-03-10T14:08:46.006 INFO:tasks.workunit.client.0.vm03.stdout:2/799: dwrite d5/d10/d1f/d4f/d76/da7/d40/db0/f101 [0,4194304] 0 2026-03-10T14:08:46.008 INFO:tasks.workunit.client.0.vm03.stdout:9/817: fsync d2/d29/d33/d60/d7f/fe2 0 2026-03-10T14:08:46.008 INFO:tasks.workunit.client.0.vm03.stdout:2/800: chown d5/f4d 1876498 1 2026-03-10T14:08:46.011 INFO:tasks.workunit.client.0.vm03.stdout:9/818: write d2/d14/d2b/d79/fb8 [1203177,25706] 0 2026-03-10T14:08:46.014 INFO:tasks.workunit.client.0.vm03.stdout:3/930: rename d1d/d33/d47/d53/lad to d1d/d33/d65/l127 0 2026-03-10T14:08:46.019 INFO:tasks.workunit.client.0.vm03.stdout:7/720: mknod d5/d9/d14/d26/d36/d51/dc8/de3/cf9 0 
2026-03-10T14:08:46.022 INFO:tasks.workunit.client.0.vm03.stdout:0/788: mknod d3/d11/d76/db5/ddb/dd4/c100 0 2026-03-10T14:08:46.023 INFO:tasks.workunit.client.0.vm03.stdout:5/966: unlink d4/c4c 0 2026-03-10T14:08:46.024 INFO:tasks.workunit.client.0.vm03.stdout:5/967: stat d4/d16/d19/d23/d12c 0 2026-03-10T14:08:46.024 INFO:tasks.workunit.client.0.vm03.stdout:5/968: chown d4/d6/de/dd5/f10b 11225 1 2026-03-10T14:08:46.025 INFO:tasks.workunit.client.0.vm03.stdout:2/801: readlink d5/d10/d1f/d4f/d76/da7/la0 0 2026-03-10T14:08:46.033 INFO:tasks.workunit.client.0.vm03.stdout:4/823: dwrite d5/d47/d5b/d64/d85/f9d [0,4194304] 0 2026-03-10T14:08:46.047 INFO:tasks.workunit.client.0.vm03.stdout:3/931: unlink d1d/d33/d65/d48/ccc 0 2026-03-10T14:08:46.053 INFO:tasks.workunit.client.0.vm03.stdout:7/721: creat d5/d9/da7/ffa x:0 0 0 2026-03-10T14:08:46.053 INFO:tasks.workunit.client.0.vm03.stdout:7/722: write d5/d9/d14/d26/d36/d51/d7b/f90 [1612510,100955] 0 2026-03-10T14:08:46.059 INFO:tasks.workunit.client.0.vm03.stdout:6/838: symlink d8/db/df/lfc 0 2026-03-10T14:08:46.059 INFO:tasks.workunit.client.0.vm03.stdout:6/839: fdatasync d8/d1b/f7f 0 2026-03-10T14:08:46.060 INFO:tasks.workunit.client.0.vm03.stdout:0/789: read - d3/fd7 zero size 2026-03-10T14:08:46.064 INFO:tasks.workunit.client.0.vm03.stdout:1/808: write d0/d2/df/f43 [505581,18807] 0 2026-03-10T14:08:46.065 INFO:tasks.workunit.client.0.vm03.stdout:1/809: stat d0/d2/df/d16/f61 0 2026-03-10T14:08:46.075 INFO:tasks.workunit.client.0.vm03.stdout:4/824: mkdir d5/d9/db/da7/d119 0 2026-03-10T14:08:46.075 INFO:tasks.workunit.client.0.vm03.stdout:3/932: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/de4/f128 x:0 0 0 2026-03-10T14:08:46.076 INFO:tasks.workunit.client.0.vm03.stdout:3/933: stat d1d/d33/d65/d48/fa5 0 2026-03-10T14:08:46.078 INFO:tasks.workunit.client.0.vm03.stdout:3/934: read d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e/f112 [867794,104726] 0 2026-03-10T14:08:46.078 INFO:tasks.workunit.client.0.vm03.stdout:3/935: fsync 
d1d/d33/d65/d5d/dae/db9/dec/f104 0 2026-03-10T14:08:46.082 INFO:tasks.workunit.client.0.vm03.stdout:4/825: chown d5/d47/c4d 75758 1 2026-03-10T14:08:46.083 INFO:tasks.workunit.client.0.vm03.stdout:8/922: getdents da/d3c/d51/d75/dc2/dc6 0 2026-03-10T14:08:46.085 INFO:tasks.workunit.client.0.vm03.stdout:0/790: fdatasync d3/d16/d21/fb8 0 2026-03-10T14:08:46.086 INFO:tasks.workunit.client.0.vm03.stdout:7/723: sync 2026-03-10T14:08:46.089 INFO:tasks.workunit.client.0.vm03.stdout:3/936: mkdir d1d/d33/d47/d53/d68/dcf/de7/db2/d129 0 2026-03-10T14:08:46.090 INFO:tasks.workunit.client.0.vm03.stdout:3/937: stat d1d/d33/d47/d53/d68/dcf/de7/d41/f75 0 2026-03-10T14:08:46.091 INFO:tasks.workunit.client.0.vm03.stdout:4/826: dread d5/d9/db/d19/d38/d7b/daa/daf/fe1 [0,4194304] 0 2026-03-10T14:08:46.092 INFO:tasks.workunit.client.0.vm03.stdout:1/810: rename d0/d2/df/dab/cd4 to d0/d2/df/d16/c103 0 2026-03-10T14:08:46.092 INFO:tasks.workunit.client.0.vm03.stdout:7/724: rename d5/d9/d14/d26/d36 to d5/d9/d14/d26/d36/d51/d7b/d9c/dfb 22 2026-03-10T14:08:46.097 INFO:tasks.workunit.client.0.vm03.stdout:2/802: dwrite d5/d10/d1f/d4f/d76/da7/d54/d5f/f103 [0,4194304] 0 2026-03-10T14:08:46.098 INFO:tasks.workunit.client.0.vm03.stdout:6/840: truncate d8/fd 8350561 0 2026-03-10T14:08:46.100 INFO:tasks.workunit.client.0.vm03.stdout:0/791: symlink d3/d46/dac/d79/l101 0 2026-03-10T14:08:46.106 INFO:tasks.workunit.client.0.vm03.stdout:3/938: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/f12a x:0 0 0 2026-03-10T14:08:46.106 INFO:tasks.workunit.client.0.vm03.stdout:9/819: dwrite d2/d14/d2b/d43/f45 [4194304,4194304] 0 2026-03-10T14:08:46.114 INFO:tasks.workunit.client.0.vm03.stdout:3/939: sync 2026-03-10T14:08:46.116 INFO:tasks.workunit.client.0.vm03.stdout:7/725: dread d5/d9/d14/d21/d6f/faa [0,4194304] 0 2026-03-10T14:08:46.126 INFO:tasks.workunit.client.0.vm03.stdout:4/827: creat d5/d9/db/d19/d38/d7b/daa/f11a x:0 0 0 2026-03-10T14:08:46.128 INFO:tasks.workunit.client.0.vm03.stdout:5/969: write 
d4/d13/d8f/d11d/d127/fce [1098438,73557] 0 2026-03-10T14:08:46.130 INFO:tasks.workunit.client.0.vm03.stdout:1/811: symlink d0/d18/d3b/dec/l104 0 2026-03-10T14:08:46.136 INFO:tasks.workunit.client.0.vm03.stdout:2/803: rmdir d5/d10/d1f/d4f/d76/da7/d40 39 2026-03-10T14:08:46.139 INFO:tasks.workunit.client.0.vm03.stdout:3/940: fsync d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/fa9 0 2026-03-10T14:08:46.140 INFO:tasks.workunit.client.0.vm03.stdout:7/726: fsync d5/d9/d14/d26/d5f/f8c 0 2026-03-10T14:08:46.142 INFO:tasks.workunit.client.0.vm03.stdout:1/812: mknod d0/d2/df/d16/d20/c105 0 2026-03-10T14:08:46.144 INFO:tasks.workunit.client.0.vm03.stdout:2/804: truncate d5/d10/d1f/d4f/d76/da7/d40/f68 1808751 0 2026-03-10T14:08:46.144 INFO:tasks.workunit.client.0.vm03.stdout:2/805: write d5/f4d [215327,111047] 0 2026-03-10T14:08:46.147 INFO:tasks.workunit.client.0.vm03.stdout:3/941: creat d1d/d33/d47/dac/dba/d11a/f12b x:0 0 0 2026-03-10T14:08:46.148 INFO:tasks.workunit.client.0.vm03.stdout:5/970: mknod d4/d13/c12f 0 2026-03-10T14:08:46.148 INFO:tasks.workunit.client.0.vm03.stdout:5/971: chown d4/d16/ffb 5 1 2026-03-10T14:08:46.151 INFO:tasks.workunit.client.0.vm03.stdout:2/806: creat d5/db4/d74/d83/dc1/f107 x:0 0 0 2026-03-10T14:08:46.152 INFO:tasks.workunit.client.0.vm03.stdout:3/942: creat d1d/d39/d51/d72/d99/f12c x:0 0 0 2026-03-10T14:08:46.153 INFO:tasks.workunit.client.0.vm03.stdout:3/943: chown d1d/d33/d47/d53/d68/dcf/de7/d41/cd0 2395853 1 2026-03-10T14:08:46.156 INFO:tasks.workunit.client.0.vm03.stdout:5/972: mknod d4/d40/d4e/c130 0 2026-03-10T14:08:46.156 INFO:tasks.workunit.client.0.vm03.stdout:5/973: fsync d4/d13/d43/f72 0 2026-03-10T14:08:46.158 INFO:tasks.workunit.client.0.vm03.stdout:6/841: getdents d8/d11/d18/d79/d80 0 2026-03-10T14:08:46.159 INFO:tasks.workunit.client.0.vm03.stdout:2/807: write d5/d10/d31/ff9 [637143,100955] 0 2026-03-10T14:08:46.162 INFO:tasks.workunit.client.0.vm03.stdout:3/944: fdatasync d1d/f62 0 2026-03-10T14:08:46.162 
INFO:tasks.workunit.client.0.vm03.stdout:7/727: link d5/d9/d14/d26/d39/lb3 d5/d9/d14/d26/d36/de1/lfc 0 2026-03-10T14:08:46.163 INFO:tasks.workunit.client.0.vm03.stdout:4/828: link d5/d9/db/d19/l30 d5/d47/l11b 0 2026-03-10T14:08:46.165 INFO:tasks.workunit.client.0.vm03.stdout:7/728: symlink d5/d9/d14/d21/d6f/lfd 0 2026-03-10T14:08:46.166 INFO:tasks.workunit.client.0.vm03.stdout:7/729: stat d5/d9/d14/d26/d36/de1/d84/lb9 0 2026-03-10T14:08:46.168 INFO:tasks.workunit.client.0.vm03.stdout:2/808: symlink d5/d2a/l108 0 2026-03-10T14:08:46.169 INFO:tasks.workunit.client.0.vm03.stdout:7/730: dread d5/d9/d14/f2f [0,4194304] 0 2026-03-10T14:08:46.171 INFO:tasks.workunit.client.0.vm03.stdout:6/842: sync 2026-03-10T14:08:46.172 INFO:tasks.workunit.client.0.vm03.stdout:4/829: sync 2026-03-10T14:08:46.172 INFO:tasks.workunit.client.0.vm03.stdout:4/830: chown d5/d47/d5b/dbe/ffc 9551300 1 2026-03-10T14:08:46.173 INFO:tasks.workunit.client.0.vm03.stdout:4/831: stat d5/d9/db/d19/d38/d7b/daa/daf 0 2026-03-10T14:08:46.184 INFO:tasks.workunit.client.0.vm03.stdout:7/731: truncate d5/d9/d14/d26/d36/d51/d7b/d83/f9e 2832477 0 2026-03-10T14:08:46.192 INFO:tasks.workunit.client.0.vm03.stdout:6/843: dread d8/db/d49/d6c/f8e [0,4194304] 0 2026-03-10T14:08:46.193 INFO:tasks.workunit.client.0.vm03.stdout:4/832: creat d5/d9/db/d19/d38/d7b/daa/f11c x:0 0 0 2026-03-10T14:08:46.193 INFO:tasks.workunit.client.0.vm03.stdout:6/844: truncate d8/d11/d7a/fa6 165514 0 2026-03-10T14:08:46.196 INFO:tasks.workunit.client.0.vm03.stdout:8/923: write da/d58/fc4 [1379333,54157] 0 2026-03-10T14:08:46.199 INFO:tasks.workunit.client.0.vm03.stdout:4/833: mknod d5/d47/d5b/dab/c11d 0 2026-03-10T14:08:46.202 INFO:tasks.workunit.client.0.vm03.stdout:9/820: write d2/fdb [5047140,31543] 0 2026-03-10T14:08:46.205 INFO:tasks.workunit.client.0.vm03.stdout:6/845: write d8/db/dd0/fef [682864,101671] 0 2026-03-10T14:08:46.207 INFO:tasks.workunit.client.0.vm03.stdout:1/813: write d0/f48 [3129037,98340] 0 2026-03-10T14:08:46.210 
INFO:tasks.workunit.client.0.vm03.stdout:8/924: creat da/d24/db4/f124 x:0 0 0 2026-03-10T14:08:46.212 INFO:tasks.workunit.client.0.vm03.stdout:7/732: symlink d5/d9/d14/d26/d39/lfe 0 2026-03-10T14:08:46.213 INFO:tasks.workunit.client.0.vm03.stdout:4/834: unlink d5/d47/d5b/dbe/ffc 0 2026-03-10T14:08:46.214 INFO:tasks.workunit.client.0.vm03.stdout:9/821: mkdir d2/d29/d33/d60/d7f/d10b 0 2026-03-10T14:08:46.215 INFO:tasks.workunit.client.0.vm03.stdout:9/822: stat d2/d29/d33/d60/d8c 0 2026-03-10T14:08:46.216 INFO:tasks.workunit.client.0.vm03.stdout:6/846: fdatasync d8/d11/da0/dbf/d5c/f68 0 2026-03-10T14:08:46.219 INFO:tasks.workunit.client.0.vm03.stdout:7/733: sync 2026-03-10T14:08:46.225 INFO:tasks.workunit.client.0.vm03.stdout:1/814: rename d0/f70 to d0/d2/df/d16/f106 0 2026-03-10T14:08:46.226 INFO:tasks.workunit.client.0.vm03.stdout:1/815: dread - d0/d2/df/d16/d41/feb zero size 2026-03-10T14:08:46.228 INFO:tasks.workunit.client.0.vm03.stdout:6/847: dread d8/d11/d7a/fa4 [0,4194304] 0 2026-03-10T14:08:46.229 INFO:tasks.workunit.client.0.vm03.stdout:6/848: write d8/d11/da0/dbf/d8c/db6/fdd [437845,51739] 0 2026-03-10T14:08:46.232 INFO:tasks.workunit.client.0.vm03.stdout:6/849: sync 2026-03-10T14:08:46.233 INFO:tasks.workunit.client.0.vm03.stdout:8/925: creat da/d3c/d4b/d4c/dba/d114/f125 x:0 0 0 2026-03-10T14:08:46.233 INFO:tasks.workunit.client.0.vm03.stdout:4/835: mknod d5/d9/db/d19/d38/d53/d71/c11e 0 2026-03-10T14:08:46.237 INFO:tasks.workunit.client.0.vm03.stdout:8/926: mknod da/d3c/d51/c126 0 2026-03-10T14:08:46.239 INFO:tasks.workunit.client.0.vm03.stdout:4/836: stat d5/c42 0 2026-03-10T14:08:46.250 INFO:tasks.workunit.client.0.vm03.stdout:4/837: rename d5/d6e/db6 to d5/d9/db/da7/db9/def/d11f 0 2026-03-10T14:08:46.251 INFO:tasks.workunit.client.0.vm03.stdout:0/792: truncate d3/f10 4120694 0 2026-03-10T14:08:46.254 INFO:tasks.workunit.client.0.vm03.stdout:0/793: dwrite d3/d46/f55 [0,4194304] 0 2026-03-10T14:08:46.261 INFO:tasks.workunit.client.0.vm03.stdout:5/974: 
write d4/d6/fbe [1355378,34548] 0 2026-03-10T14:08:46.262 INFO:tasks.workunit.client.0.vm03.stdout:3/945: write d1d/d33/d65/d5d/dae/db9/dec/f120 [3710498,18378] 0 2026-03-10T14:08:46.262 INFO:tasks.workunit.client.0.vm03.stdout:5/975: write d4/d16/fac [4935187,98502] 0 2026-03-10T14:08:46.266 INFO:tasks.workunit.client.0.vm03.stdout:9/823: rmdir d2/d29/d33/d60/d7f/d10b 0 2026-03-10T14:08:46.268 INFO:tasks.workunit.client.0.vm03.stdout:7/734: link d5/d9/d35/laf d5/d9/d14/d26/d5f/lff 0 2026-03-10T14:08:46.271 INFO:tasks.workunit.client.0.vm03.stdout:8/927: mkdir da/d3c/d4b/df9/d10d/d127 0 2026-03-10T14:08:46.276 INFO:tasks.workunit.client.0.vm03.stdout:4/838: read d5/d9/db/f3d [635653,9573] 0 2026-03-10T14:08:46.286 INFO:tasks.workunit.client.0.vm03.stdout:2/809: write d5/d10/da3/dab/fdb [801608,126661] 0 2026-03-10T14:08:46.289 INFO:tasks.workunit.client.0.vm03.stdout:5/976: symlink d4/d16/d19/d23/l131 0 2026-03-10T14:08:46.290 INFO:tasks.workunit.client.0.vm03.stdout:5/977: chown d4/d16/d19/d6e 466602517 1 2026-03-10T14:08:46.293 INFO:tasks.workunit.client.0.vm03.stdout:5/978: dread d4/d13/d8f/d11d/d127/fce [0,4194304] 0 2026-03-10T14:08:46.299 INFO:tasks.workunit.client.0.vm03.stdout:7/735: fsync d5/d9/d14/d26/d39/d92/fb5 0 2026-03-10T14:08:46.303 INFO:tasks.workunit.client.0.vm03.stdout:1/816: dread d0/d2/df/f1f [4194304,4194304] 0 2026-03-10T14:08:46.304 INFO:tasks.workunit.client.0.vm03.stdout:8/928: creat da/d3c/d4b/d4c/dba/d114/dbc/d70/d7d/f128 x:0 0 0 2026-03-10T14:08:46.313 INFO:tasks.workunit.client.0.vm03.stdout:5/979: mknod d4/d13/d43/c132 0 2026-03-10T14:08:46.314 INFO:tasks.workunit.client.0.vm03.stdout:5/980: chown d4/d13/l38 23545 1 2026-03-10T14:08:46.316 INFO:tasks.workunit.client.0.vm03.stdout:7/736: fsync d5/f6 0 2026-03-10T14:08:46.318 INFO:tasks.workunit.client.0.vm03.stdout:6/850: rmdir d8/db/df/dc8 0 2026-03-10T14:08:46.319 INFO:tasks.workunit.client.0.vm03.stdout:6/851: write d8/d11/da0/dca/fd6 [561813,107805] 0 2026-03-10T14:08:46.327 
INFO:tasks.workunit.client.0.vm03.stdout:8/929: unlink da/d36/d40/l8d 0
2026-03-10T14:08:46.330 INFO:tasks.workunit.client.0.vm03.stdout:2/810: mkdir d5/d10/d1f/d4f/d76/da7/de9/d109 0
2026-03-10T14:08:46.332 INFO:tasks.workunit.client.0.vm03.stdout:5/981: creat d4/d16/d19/d23/db8/f133 x:0 0 0
2026-03-10T14:08:46.333 INFO:tasks.workunit.client.0.vm03.stdout:7/737: symlink d5/d9/d14/d26/d36/de1/l100 0
2026-03-10T14:08:46.336 INFO:tasks.workunit.client.0.vm03.stdout:6/852: read d8/fd [4562349,113449] 0
2026-03-10T14:08:46.336 INFO:tasks.workunit.client.0.vm03.stdout:2/811: rmdir d5/d10/d31 39
2026-03-10T14:08:46.338 INFO:tasks.workunit.client.0.vm03.stdout:5/982: rename d4/d13/d43/fcf to d4/d6/de/f134 0
2026-03-10T14:08:46.338 INFO:tasks.workunit.client.0.vm03.stdout:7/738: mkdir d5/d9/d35/d101 0
2026-03-10T14:08:46.339 INFO:tasks.workunit.client.0.vm03.stdout:7/739: stat d5/d9/d14/d26/d36/d51/d7b/d83/c88 0
2026-03-10T14:08:46.343 INFO:tasks.workunit.client.0.vm03.stdout:8/930: read da/d58/f115 [156440,25699] 0
2026-03-10T14:08:46.343 INFO:tasks.workunit.client.0.vm03.stdout:6/853: stat d8/db/d49/d6c/d83/da3/lf4 0
2026-03-10T14:08:46.345 INFO:tasks.workunit.client.0.vm03.stdout:8/931: rmdir da/d3c/d4b/d69 39
2026-03-10T14:08:46.346 INFO:tasks.workunit.client.0.vm03.stdout:2/812: mknod d5/d10/d1f/c10a 0
2026-03-10T14:08:46.351 INFO:tasks.workunit.client.0.vm03.stdout:5/983: fsync d4/d16/f2d 0
2026-03-10T14:08:46.352 INFO:tasks.workunit.client.0.vm03.stdout:8/932: rmdir da/d3a/d44/dfb 39
2026-03-10T14:08:46.352 INFO:tasks.workunit.client.0.vm03.stdout:7/740: sync
2026-03-10T14:08:46.353 INFO:tasks.workunit.client.0.vm03.stdout:7/741: chown d5/d9/d14/d26/d36/de1/c4c 23 1
2026-03-10T14:08:46.358 INFO:tasks.workunit.client.0.vm03.stdout:6/854: dread d8/d1b/d1c/f50 [0,4194304] 0
2026-03-10T14:08:46.359 INFO:tasks.workunit.client.0.vm03.stdout:6/855: stat d8/d1b/d1c/c1d 0
2026-03-10T14:08:46.362 INFO:tasks.workunit.client.0.vm03.stdout:6/856: sync
2026-03-10T14:08:46.380 INFO:tasks.workunit.client.0.vm03.stdout:0/794: dwrite d3/d11/d2c/d4a/d4b/f7d [0,4194304] 0
2026-03-10T14:08:46.385 INFO:tasks.workunit.client.0.vm03.stdout:3/946: dwrite d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/f9e [0,4194304] 0
2026-03-10T14:08:46.397 INFO:tasks.workunit.client.0.vm03.stdout:9/824: dwrite d2/d29/d33/d41/d46/ff9 [0,4194304] 0
2026-03-10T14:08:46.410 INFO:tasks.workunit.client.0.vm03.stdout:4/839: write d5/d47/d5b/d64/fda [437510,8673] 0
2026-03-10T14:08:46.412 INFO:tasks.workunit.client.0.vm03.stdout:6/857: creat d8/db/d49/d6c/d83/ffd x:0 0 0
2026-03-10T14:08:46.414 INFO:tasks.workunit.client.0.vm03.stdout:0/795: readlink d3/d4d/l71 0
2026-03-10T14:08:46.415 INFO:tasks.workunit.client.0.vm03.stdout:0/796: chown d3/d4d/d47/lfa 139124 1
2026-03-10T14:08:46.418 INFO:tasks.workunit.client.0.vm03.stdout:3/947: creat d1d/d33/d47/d53/d68/dcf/f12d x:0 0 0
2026-03-10T14:08:46.419 INFO:tasks.workunit.client.0.vm03.stdout:3/948: stat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95 0
2026-03-10T14:08:46.423 INFO:tasks.workunit.client.0.vm03.stdout:8/933: symlink da/d3a/dce/d121/l129 0
2026-03-10T14:08:46.424 INFO:tasks.workunit.client.0.vm03.stdout:4/840: mknod d5/db4/c120 0
2026-03-10T14:08:46.426 INFO:tasks.workunit.client.0.vm03.stdout:1/817: truncate d0/d2/df/d27/faf 1516598 0
2026-03-10T14:08:46.428 INFO:tasks.workunit.client.0.vm03.stdout:2/813: fdatasync d5/d10/d31/f3d 0
2026-03-10T14:08:46.431 INFO:tasks.workunit.client.0.vm03.stdout:1/818: dwrite d0/d42/f7f [0,4194304] 0
2026-03-10T14:08:46.431 INFO:tasks.workunit.client.0.vm03.stdout:1/819: fsync d0/d2/df/f1b 0
2026-03-10T14:08:46.441 INFO:tasks.workunit.client.0.vm03.stdout:3/949: symlink d1d/d33/d47/d53/d68/dcf/l12e 0
2026-03-10T14:08:46.444 INFO:tasks.workunit.client.0.vm03.stdout:8/934: dread - da/d3c/d51/fd9 zero size
2026-03-10T14:08:46.445 INFO:tasks.workunit.client.0.vm03.stdout:4/841: mkdir d5/d9/db/da7/db9/d121 0
2026-03-10T14:08:46.447 INFO:tasks.workunit.client.0.vm03.stdout:2/814: unlink d5/d10/d17/cd2 0
2026-03-10T14:08:46.453 INFO:tasks.workunit.client.0.vm03.stdout:9/825: rename d2/d29/d33/d60/d8c/dd3/df6 to d2/d29/d33/d10c 0
2026-03-10T14:08:46.455 INFO:tasks.workunit.client.0.vm03.stdout:4/842: mknod d5/d9/db/d19/d99/c122 0
2026-03-10T14:08:46.456 INFO:tasks.workunit.client.0.vm03.stdout:2/815: fsync d5/d10/d17/fb6 0
2026-03-10T14:08:46.458 INFO:tasks.workunit.client.0.vm03.stdout:1/820: link d0/d2/df/d16/d41/ffc d0/d2/d71/d90/f107 0
2026-03-10T14:08:46.458 INFO:tasks.workunit.client.0.vm03.stdout:1/821: chown d0/d2/df/dab/f73 252 1
2026-03-10T14:08:46.459 INFO:tasks.workunit.client.0.vm03.stdout:6/858: creat d8/ffe x:0 0 0
2026-03-10T14:08:46.460 INFO:tasks.workunit.client.0.vm03.stdout:6/859: stat d8/db/d49/d6c/d32/f4b 0
2026-03-10T14:08:46.460 INFO:tasks.workunit.client.0.vm03.stdout:6/860: write d8/ffe [669070,31590] 0
2026-03-10T14:08:46.463 INFO:tasks.workunit.client.0.vm03.stdout:8/935: rename da/d3c/f9e to da/d3c/d51/d85/dbb/d106/f12a 0
2026-03-10T14:08:46.463 INFO:tasks.workunit.client.0.vm03.stdout:8/936: readlink da/d3a/dce/d121/l129 0
2026-03-10T14:08:46.465 INFO:tasks.workunit.client.0.vm03.stdout:4/843: readlink d5/d9/db/d19/l30 0
2026-03-10T14:08:46.466 INFO:tasks.workunit.client.0.vm03.stdout:8/937: dread da/d36/d6a/f6b [0,4194304] 0
2026-03-10T14:08:46.466 INFO:tasks.workunit.client.0.vm03.stdout:9/826: sync
2026-03-10T14:08:46.467 INFO:tasks.workunit.client.0.vm03.stdout:2/816: creat d5/d2a/d8a/f10b x:0 0 0
2026-03-10T14:08:46.470 INFO:tasks.workunit.client.0.vm03.stdout:1/822: rename d0/d42/fd9 to d0/d42/d9f/dc9/f108 0
2026-03-10T14:08:46.472 INFO:tasks.workunit.client.0.vm03.stdout:6/861: dwrite f0 [0,4194304] 0
2026-03-10T14:08:46.479 INFO:tasks.workunit.client.0.vm03.stdout:5/984: write d4/d16/d19/d6e/d7f/dd1/fda [555014,20231] 0
2026-03-10T14:08:46.480 INFO:tasks.workunit.client.0.vm03.stdout:5/985: readlink d4/d6/de/lec 0
2026-03-10T14:08:46.480 INFO:tasks.workunit.client.0.vm03.stdout:5/986: write d4/d16/df4/f10c [9059216,22330] 0
2026-03-10T14:08:46.481 INFO:tasks.workunit.client.0.vm03.stdout:5/987: stat d4/d13/d8f/d11d/dc5 0
2026-03-10T14:08:46.488 INFO:tasks.workunit.client.0.vm03.stdout:1/823: creat d0/d2/df/d27/d7e/d81/f109 x:0 0 0
2026-03-10T14:08:46.490 INFO:tasks.workunit.client.0.vm03.stdout:1/824: write d0/d2/df/f43 [3728905,87768] 0
2026-03-10T14:08:46.492 INFO:tasks.workunit.client.0.vm03.stdout:5/988: creat d4/d16/d19/d6e/d7f/dd1/f135 x:0 0 0
2026-03-10T14:08:46.498 INFO:tasks.workunit.client.0.vm03.stdout:0/797: dwrite d3/f94 [0,4194304] 0
2026-03-10T14:08:46.500 INFO:tasks.workunit.client.0.vm03.stdout:0/798: write d3/d46/dac/fc2 [9866,17759] 0
2026-03-10T14:08:46.503 INFO:tasks.workunit.client.0.vm03.stdout:0/799: dwrite d3/d4d/f2a [0,4194304] 0
2026-03-10T14:08:46.512 INFO:tasks.workunit.client.0.vm03.stdout:0/800: dwrite d3/d11/d2c/d4a/d4b/d89/db6/df4/ffd [0,4194304] 0
2026-03-10T14:08:46.517 INFO:tasks.workunit.client.0.vm03.stdout:1/825: creat d0/d18/d1d/dc4/f10a x:0 0 0
2026-03-10T14:08:46.528 INFO:tasks.workunit.client.0.vm03.stdout:3/950: dread d1d/d33/d47/d53/d68/dcf/de7/f10b [0,4194304] 0
2026-03-10T14:08:46.529 INFO:tasks.workunit.client.0.vm03.stdout:3/951: readlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/lcb 0
2026-03-10T14:08:46.532 INFO:tasks.workunit.client.0.vm03.stdout:4/844: creat d5/d47/f123 x:0 0 0
2026-03-10T14:08:46.533 INFO:tasks.workunit.client.0.vm03.stdout:4/845: fdatasync d5/d47/d5b/d64/fda 0
2026-03-10T14:08:46.537 INFO:tasks.workunit.client.0.vm03.stdout:2/817: creat d5/f10c x:0 0 0
2026-03-10T14:08:46.539 INFO:tasks.workunit.client.0.vm03.stdout:1/826: unlink d0/d18/daa/ce6 0
2026-03-10T14:08:46.540 INFO:tasks.workunit.client.0.vm03.stdout:3/952: mknod d1d/d33/d47/d53/c12f 0
2026-03-10T14:08:46.541 INFO:tasks.workunit.client.0.vm03.stdout:6/862: link d8/db/d49/d6c/d32/d3a/f5e d8/db/d49/d6c/d32/dcb/fff 0
2026-03-10T14:08:46.545 INFO:tasks.workunit.client.0.vm03.stdout:0/801: dread d3/d46/f63 [4194304,4194304] 0
2026-03-10T14:08:46.546 INFO:tasks.workunit.client.0.vm03.stdout:2/818: dwrite d5/d10/d1f/d4f/d76/da7/d54/d5f/f82 [0,4194304] 0
2026-03-10T14:08:46.548 INFO:tasks.workunit.client.0.vm03.stdout:6/863: symlink d8/db/d49/d6c/d32/l100 0
2026-03-10T14:08:46.567 INFO:tasks.workunit.client.0.vm03.stdout:9/827: write d2/d14/d2b/d34/f59 [2659058,67844] 0
2026-03-10T14:08:46.576 INFO:tasks.workunit.client.0.vm03.stdout:7/742: dwrite d5/d9/d14/d26/d5f/f78 [0,4194304] 0
2026-03-10T14:08:46.581 INFO:tasks.workunit.client.0.vm03.stdout:2/819: rename d5/d10/da3/dab/ce6 to d5/d10/d1f/d4f/c10d 0
2026-03-10T14:08:46.581 INFO:tasks.workunit.client.0.vm03.stdout:2/820: readlink d5/db4/d74/d83/lb8 0
2026-03-10T14:08:46.581 INFO:tasks.workunit.client.0.vm03.stdout:1/827: link d0/d42/d9f/fd5 d0/d42/d9f/dc9/f10b 0
2026-03-10T14:08:46.581 INFO:tasks.workunit.client.0.vm03.stdout:3/953: creat d1d/d33/f130 x:0 0 0
2026-03-10T14:08:46.591 INFO:tasks.workunit.client.0.vm03.stdout:9/828: dread d2/f15 [0,4194304] 0
2026-03-10T14:08:46.593 INFO:tasks.workunit.client.0.vm03.stdout:6/864: rename d8/c48 to d8/d11/d18/d79/d80/c101 0
2026-03-10T14:08:46.595 INFO:tasks.workunit.client.0.vm03.stdout:1/828: symlink d0/d2/df/d16/ded/l10c 0
2026-03-10T14:08:46.597 INFO:tasks.workunit.client.0.vm03.stdout:9/829: dwrite d2/d29/d9a/fb7 [0,4194304] 0
2026-03-10T14:08:46.600 INFO:tasks.workunit.client.0.vm03.stdout:1/829: mknod d0/d2/df/d27/d7e/c10d 0
2026-03-10T14:08:46.607 INFO:tasks.workunit.client.0.vm03.stdout:3/954: getdents d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/d7e 0
2026-03-10T14:08:46.616 INFO:tasks.workunit.client.0.vm03.stdout:7/743: dread d5/d9/f10 [0,4194304] 0
2026-03-10T14:08:46.628 INFO:tasks.workunit.client.0.vm03.stdout:3/955: symlink d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/l131 0
2026-03-10T14:08:46.628 INFO:tasks.workunit.client.0.vm03.stdout:3/956: rename d1d/d33/d47/d53/d68 to d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/dbf/d132 22
2026-03-10T14:08:46.628 INFO:tasks.workunit.client.0.vm03.stdout:3/957: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f85 3734 1
2026-03-10T14:08:46.628 INFO:tasks.workunit.client.0.vm03.stdout:8/938: dwrite da/d36/f77 [0,4194304] 0
2026-03-10T14:08:46.628 INFO:tasks.workunit.client.0.vm03.stdout:8/939: fdatasync da/d3c/d51/d85/d104/fc1 0
2026-03-10T14:08:46.628 INFO:tasks.workunit.client.0.vm03.stdout:8/940: truncate da/d24/db4/f124 109013 0
2026-03-10T14:08:46.639 INFO:tasks.workunit.client.0.vm03.stdout:1/830: dread d0/d2/df/dab/f73 [0,4194304] 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:1/831: creat d0/d2/df/d27/d7e/d81/f10e x:0 0 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:5/989: write d4/d40/da5/feb [554941,87968] 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:1/832: creat d0/d2/df/d16/ded/f10f x:0 0 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:6/865: dread d8/d11/da0/dbf/d5c/d60/f82 [4194304,4194304] 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:6/866: stat d8/d11/cd8 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:5/990: fsync d4/d13/fad 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:1/833: creat d0/d2/d71/f110 x:0 0 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:6/867: dwrite d8/d11/da0/dbf/d5c/fe4 [0,4194304] 0
2026-03-10T14:08:46.657 INFO:tasks.workunit.client.0.vm03.stdout:1/834: stat d0/d18/d3b/d50/lb2 0
2026-03-10T14:08:46.664 INFO:tasks.workunit.client.0.vm03.stdout:4/846: dwrite d5/d9/db/f29 [4194304,4194304] 0
2026-03-10T14:08:46.666 INFO:tasks.workunit.client.0.vm03.stdout:1/835: mkdir d0/d2/df/d91/d111 0
2026-03-10T14:08:46.667 INFO:tasks.workunit.client.0.vm03.stdout:4/847: chown d5/d47/d62/d8a/dd0/d102/f10d 1 1
2026-03-10T14:08:46.673 INFO:tasks.workunit.client.0.vm03.stdout:6/868: truncate d8/db/d49/d6c/d32/faa 886934 0
2026-03-10T14:08:46.673 INFO:tasks.workunit.client.0.vm03.stdout:1/836: dwrite d0/d18/f75 [0,4194304] 0
2026-03-10T14:08:46.685 INFO:tasks.workunit.client.0.vm03.stdout:6/869: rmdir d8/db/d49/d58 39
2026-03-10T14:08:46.688 INFO:tasks.workunit.client.0.vm03.stdout:6/870: creat d8/db/d49/d6c/d83/f102 x:0 0 0
2026-03-10T14:08:46.689 INFO:tasks.workunit.client.0.vm03.stdout:6/871: chown d8/d11/d18/d79/d80/df1/df6/lf7 20120 1
2026-03-10T14:08:46.691 INFO:tasks.workunit.client.0.vm03.stdout:1/837: getdents d0/d2/df 0
2026-03-10T14:08:46.693 INFO:tasks.workunit.client.0.vm03.stdout:1/838: mknod d0/d18/d3b/dec/c112 0
2026-03-10T14:08:46.693 INFO:tasks.workunit.client.0.vm03.stdout:9/830: sync
2026-03-10T14:08:46.693 INFO:tasks.workunit.client.0.vm03.stdout:7/744: sync
2026-03-10T14:08:46.693 INFO:tasks.workunit.client.0.vm03.stdout:1/839: stat d0/d2/df/d27/f99 0
2026-03-10T14:08:46.698 INFO:tasks.workunit.client.0.vm03.stdout:7/745: dwrite d5/d9/d14/d26/d36/d51/dc8/fdd [0,4194304] 0
2026-03-10T14:08:46.707 INFO:tasks.workunit.client.0.vm03.stdout:6/872: readlink d8/db/d49/d58/lce 0
2026-03-10T14:08:46.707 INFO:tasks.workunit.client.0.vm03.stdout:9/831: truncate d2/d29/d33/fd4 817724 0
2026-03-10T14:08:46.707 INFO:tasks.workunit.client.0.vm03.stdout:1/840: rename d0/d2/df/f6c to d0/d2/df/d91/d111/f113 0
2026-03-10T14:08:46.707 INFO:tasks.workunit.client.0.vm03.stdout:7/746: mkdir d5/d9/d14/d26/d36/db4/d102 0
2026-03-10T14:08:46.716 INFO:tasks.workunit.client.0.vm03.stdout:0/802: dwrite d3/d11/d2c/d4a/fcf [0,4194304] 0
2026-03-10T14:08:46.721 INFO:tasks.workunit.client.0.vm03.stdout:2/821: write d5/d10/d17/f8d [560407,8237] 0
2026-03-10T14:08:46.725 INFO:tasks.workunit.client.0.vm03.stdout:6/873: dread d8/db/d49/d6c/d32/dcb/fff [0,4194304] 0
2026-03-10T14:08:46.729 INFO:tasks.workunit.client.0.vm03.stdout:6/874: dwrite d8/ffe [0,4194304] 0
2026-03-10T14:08:46.734 INFO:tasks.workunit.client.0.vm03.stdout:2/822: creat d5/d10/da3/df6/f10e x:0 0 0
2026-03-10T14:08:46.742 INFO:tasks.workunit.client.0.vm03.stdout:6/875: getdents d8/db/d12 0
2026-03-10T14:08:46.745 INFO:tasks.workunit.client.0.vm03.stdout:6/876: rename d8/d11/d18/d54/caf to d8/db/d49/c103 0
2026-03-10T14:08:46.752 INFO:tasks.workunit.client.0.vm03.stdout:7/747: dread d5/d9/d14/f9b [0,4194304] 0
2026-03-10T14:08:46.753 INFO:tasks.workunit.client.0.vm03.stdout:7/748: write d5/d9/f17 [4464756,88657] 0
2026-03-10T14:08:46.756 INFO:tasks.workunit.client.0.vm03.stdout:3/958: write d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/fa9 [2864446,57725] 0
2026-03-10T14:08:46.756 INFO:tasks.workunit.client.0.vm03.stdout:8/941: write da/f1f [3844476,28527] 0
2026-03-10T14:08:46.757 INFO:tasks.workunit.client.0.vm03.stdout:8/942: chown da/d3a/dce/f10a 257453870 1
2026-03-10T14:08:46.760 INFO:tasks.workunit.client.0.vm03.stdout:6/877: link d8/db/d49/d6c/d83/ff9 d8/db/dd0/de2/f104 0
2026-03-10T14:08:46.763 INFO:tasks.workunit.client.0.vm03.stdout:8/943: dread da/d24/d49/f66 [0,4194304] 0
2026-03-10T14:08:46.764 INFO:tasks.workunit.client.0.vm03.stdout:3/959: dread d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d95/fa2 [0,4194304] 0
2026-03-10T14:08:46.765 INFO:tasks.workunit.client.0.vm03.stdout:6/878: creat d8/d11/d7a/dfa/f105 x:0 0 0
2026-03-10T14:08:46.766 INFO:tasks.workunit.client.0.vm03.stdout:3/960: dread - d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/dde/d100/f106 zero size
2026-03-10T14:08:46.768 INFO:tasks.workunit.client.0.vm03.stdout:3/961: mknod d1d/d59/c133 0
2026-03-10T14:08:46.770 INFO:tasks.workunit.client.0.vm03.stdout:3/962: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f63 0 1
2026-03-10T14:08:46.771 INFO:tasks.workunit.client.0.vm03.stdout:8/944: rename da/d3a/d44/dfb/dd7/l111 to da/d36/d4d/da5/dd6/l12b 0
2026-03-10T14:08:46.771 INFO:tasks.workunit.client.0.vm03.stdout:8/945: readlink da/d36/d6a/lb3 0
2026-03-10T14:08:46.772 INFO:tasks.workunit.client.0.vm03.stdout:6/879: mkdir d8/db/d49/d6c/d32/dcb/d106 0
2026-03-10T14:08:46.773 INFO:tasks.workunit.client.0.vm03.stdout:8/946: mkdir da/d3c/d4b/dd2/d12c 0
2026-03-10T14:08:46.774 INFO:tasks.workunit.client.0.vm03.stdout:8/947: write da/d3c/d51/d75/dc2/fe1 [1030705,26788] 0
2026-03-10T14:08:46.777 INFO:tasks.workunit.client.0.vm03.stdout:8/948: mkdir da/d3c/d51/d85/dbb/d106/d12d 0
2026-03-10T14:08:46.782 INFO:tasks.workunit.client.0.vm03.stdout:3/963: getdents d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55 0
2026-03-10T14:08:46.784 INFO:tasks.workunit.client.0.vm03.stdout:5/991: write d4/d16/d19/d23/db8/fb0 [1870233,73681] 0
2026-03-10T14:08:46.787 INFO:tasks.workunit.client.0.vm03.stdout:5/992: mknod d4/d16/df4/c136 0
2026-03-10T14:08:46.787 INFO:tasks.workunit.client.0.vm03.stdout:5/993: dread - d4/d16/df4/f11c zero size
2026-03-10T14:08:46.788 INFO:tasks.workunit.client.0.vm03.stdout:5/994: creat d4/d16/d19/d23/d3f/df3/f137 x:0 0 0
2026-03-10T14:08:46.789 INFO:tasks.workunit.client.0.vm03.stdout:5/995: write d4/d16/d19/d23/db8/f133 [984051,56526] 0
2026-03-10T14:08:46.789 INFO:tasks.workunit.client.0.vm03.stdout:3/964: link d1d/d39/d51/d72/d99/fbc d1d/d33/d47/d53/d68/dcf/f134 0
2026-03-10T14:08:46.790 INFO:tasks.workunit.client.0.vm03.stdout:3/965: readlink d1d/d33/d47/d53/d68/la3 0
2026-03-10T14:08:46.793 INFO:tasks.workunit.client.0.vm03.stdout:5/996: mknod d4/c138 0
2026-03-10T14:08:46.794 INFO:tasks.workunit.client.0.vm03.stdout:3/966: symlink d1d/d33/d47/d53/d68/dcf/de7/d41/dc0/dfb/l135 0
2026-03-10T14:08:46.795 INFO:tasks.workunit.client.0.vm03.stdout:5/997: link d4/d6/f97 d4/d13/d8f/d11d/dc5/f139 0
2026-03-10T14:08:46.797 INFO:tasks.workunit.client.0.vm03.stdout:3/967: creat d1d/d33/d65/d48/d118/d121/f136 x:0 0 0
2026-03-10T14:08:46.800 INFO:tasks.workunit.client.0.vm03.stdout:3/968: rename d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/f63 to d1d/d33/d47/d53/d68/dcf/de7/d41/dc0/f137 0
2026-03-10T14:08:46.803 INFO:tasks.workunit.client.0.vm03.stdout:7/749: sync
2026-03-10T14:08:46.804 INFO:tasks.workunit.client.0.vm03.stdout:7/750: write d5/d9/d14/d26/d39/d92/fba [419265,84386] 0
2026-03-10T14:08:46.807 INFO:tasks.workunit.client.0.vm03.stdout:3/969: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/f138 x:0 0 0
2026-03-10T14:08:46.811 INFO:tasks.workunit.client.0.vm03.stdout:7/751: dwrite d5/d9/fec [0,4194304] 0
2026-03-10T14:08:46.814 INFO:tasks.workunit.client.0.vm03.stdout:3/970: mkdir d1d/d39/da1/d139 0
2026-03-10T14:08:46.815 INFO:tasks.workunit.client.0.vm03.stdout:3/971: chown d1d/d39/d51/d72/d99 858737 1
2026-03-10T14:08:46.818 INFO:tasks.workunit.client.0.vm03.stdout:3/972: mkdir d1d/d33/d47/d13a 0
2026-03-10T14:08:46.819 INFO:tasks.workunit.client.0.vm03.stdout:3/973: mknod d1d/d33/d47/d53/d68/dcf/de7/d41/d45/c13b 0
2026-03-10T14:08:46.821 INFO:tasks.workunit.client.0.vm03.stdout:7/752: sync
2026-03-10T14:08:46.827 INFO:tasks.workunit.client.0.vm03.stdout:4/848: dwrite d5/d47/d62/d8a/dd0/d102/fd1 [0,4194304] 0
2026-03-10T14:08:46.830 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: pgmap v10: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 34 MiB/s rd, 69 MiB/s wr, 209 op/s
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: Upgrade: Updating mgr.vm04.ywwcto
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:08:46.834 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:46 vm04.local ceph-mon[55966]: Deploying daemon mgr.vm04.ywwcto on vm04
2026-03-10T14:08:46.834 INFO:tasks.workunit.client.0.vm03.stdout:4/849: symlink d5/d9/db/d19/d38/d53/d55/d108/l124 0
2026-03-10T14:08:46.834 INFO:tasks.workunit.client.0.vm03.stdout:4/850: readlink d5/db4/lf4 0
2026-03-10T14:08:46.847 INFO:tasks.workunit.client.0.vm03.stdout:6/880: getdents d8/db/d49/d6c/d83 0
2026-03-10T14:08:46.849 INFO:tasks.workunit.client.0.vm03.stdout:6/881: dread d8/db/d49/d6c/d83/f87 [0,4194304] 0
2026-03-10T14:08:46.852 INFO:tasks.workunit.client.0.vm03.stdout:6/882: dread - d8/d3b/f9c zero size
2026-03-10T14:08:46.853 INFO:tasks.workunit.client.0.vm03.stdout:6/883: fsync d8/d11/f2a 0
2026-03-10T14:08:46.853 INFO:tasks.workunit.client.0.vm03.stdout:6/884: readlink d8/db/d49/d6c/lbb 0
2026-03-10T14:08:46.854 INFO:tasks.workunit.client.0.vm03.stdout:6/885: creat d8/d11/da0/dca/f107 x:0 0 0
2026-03-10T14:08:46.856 INFO:tasks.workunit.client.0.vm03.stdout:4/851: sync
2026-03-10T14:08:46.856 INFO:tasks.workunit.client.0.vm03.stdout:4/852: chown d5/d9/db/f20 18530097 1
2026-03-10T14:08:46.857 INFO:tasks.workunit.client.0.vm03.stdout:6/886: mkdir d8/db/df/dcd/d108 0
2026-03-10T14:08:46.857 INFO:tasks.workunit.client.0.vm03.stdout:6/887: dread - d8/d11/da0/dca/f107 zero size
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep'
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: pgmap v10: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 34 MiB/s rd, 69 MiB/s wr, 209 op/s
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: Upgrade: Updating mgr.vm04.ywwcto
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:08:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:46 vm03.local ceph-mon[49718]: Deploying daemon mgr.vm04.ywwcto on vm04
2026-03-10T14:08:46.868 INFO:tasks.workunit.client.0.vm03.stdout:6/888: dread d8/fd [0,4194304] 0
2026-03-10T14:08:46.868 INFO:tasks.workunit.client.0.vm03.stdout:6/889: write d8/db/d49/d76/fc4 [5768507,107431] 0
2026-03-10T14:08:46.872 INFO:tasks.workunit.client.0.vm03.stdout:6/890: creat d8/db/d12/d64/f109 x:0 0 0
2026-03-10T14:08:46.877 INFO:tasks.workunit.client.0.vm03.stdout:6/891: dwrite d8/d11/f35 [0,4194304] 0
2026-03-10T14:08:46.883 INFO:tasks.workunit.client.0.vm03.stdout:9/832: dwrite d2/d29/d33/d41/daa/fab [0,4194304] 0
2026-03-10T14:08:46.888 INFO:tasks.workunit.client.0.vm03.stdout:6/892: mkdir d8/db/df/dcd/d10a 0
2026-03-10T14:08:46.889 INFO:tasks.workunit.client.0.vm03.stdout:6/893: chown d8/d1b/d1c/c1d 7 1
2026-03-10T14:08:46.892 INFO:tasks.workunit.client.0.vm03.stdout:1/841: write d0/d42/d9f/fd5 [144714,63125] 0
2026-03-10T14:08:46.892 INFO:tasks.workunit.client.0.vm03.stdout:9/833: creat d2/d14/d100/f10d x:0 0 0
2026-03-10T14:08:46.897 INFO:tasks.workunit.client.0.vm03.stdout:9/834: mkdir d2/d14/d10e 0
2026-03-10T14:08:46.903 INFO:tasks.workunit.client.0.vm03.stdout:6/894: creat d8/db/d49/d76/f10b x:0 0 0
2026-03-10T14:08:46.903 INFO:tasks.workunit.client.0.vm03.stdout:6/895: chown d8/d3b/cbe 32 1
2026-03-10T14:08:46.903 INFO:tasks.workunit.client.0.vm03.stdout:1/842: unlink d0/d2/df/d16/d20/c105 0
2026-03-10T14:08:46.905 INFO:tasks.workunit.client.0.vm03.stdout:6/896: dwrite d8/d11/da0/dca/fd6 [0,4194304] 0
2026-03-10T14:08:46.910 INFO:tasks.workunit.client.0.vm03.stdout:6/897: mkdir d8/d11/d10c 0
2026-03-10T14:08:46.916 INFO:tasks.workunit.client.0.vm03.stdout:6/898: creat d8/d11/da0/f10d x:0 0 0
2026-03-10T14:08:46.932 INFO:tasks.workunit.client.0.vm03.stdout:1/843: sync
2026-03-10T14:08:46.938 INFO:tasks.workunit.client.0.vm03.stdout:1/844: rename d0/d2/fe7 to d0/d2/df/d91/d111/f114 0
2026-03-10T14:08:46.941 INFO:tasks.workunit.client.0.vm03.stdout:1/845: mkdir d0/d18/d3b/db7/d115 0
2026-03-10T14:08:46.943 INFO:tasks.workunit.client.0.vm03.stdout:1/846: symlink d0/d2/df/d16/ded/l116 0
2026-03-10T14:08:46.948 INFO:tasks.workunit.client.0.vm03.stdout:1/847: sync
2026-03-10T14:08:46.949 INFO:tasks.workunit.client.0.vm03.stdout:0/803: write d3/d11/f8e [730198,30056] 0
2026-03-10T14:08:46.954 INFO:tasks.workunit.client.0.vm03.stdout:0/804: dread - d3/d4d/d8f/fb3 zero size
2026-03-10T14:08:46.955 INFO:tasks.workunit.client.0.vm03.stdout:1/848: readlink d0/d2/df/dab/dc3/le5 0
2026-03-10T14:08:46.960 INFO:tasks.workunit.client.0.vm03.stdout:2/823: dwrite d5/d10/d1f/f5e [0,4194304] 0
2026-03-10T14:08:46.961 INFO:tasks.workunit.client.0.vm03.stdout:2/824: dread - d5/d10/d1f/d4f/d76/da7/d104/fdf zero size
2026-03-10T14:08:46.974 INFO:tasks.workunit.client.0.vm03.stdout:2/825: creat d5/dac/f10f x:0 0 0
2026-03-10T14:08:46.976 INFO:tasks.workunit.client.0.vm03.stdout:1/849: getdents d0/d2/df/d27/d7e/d81 0
2026-03-10T14:08:46.978 INFO:tasks.workunit.client.0.vm03.stdout:2/826: rename d5/d2a/d8a to d5/d2a/d110 0
2026-03-10T14:08:46.978 INFO:tasks.workunit.client.0.vm03.stdout:2/827: stat d5/d2a/l2f 0
2026-03-10T14:08:46.980 INFO:tasks.workunit.client.0.vm03.stdout:2/828: getdents d5/d2a/d110 0
2026-03-10T14:08:46.981 INFO:tasks.workunit.client.0.vm03.stdout:2/829: creat d5/d10/dba/f111 x:0 0 0
2026-03-10T14:08:46.984 INFO:tasks.workunit.client.0.vm03.stdout:2/830: link d5/d10/d1f/d4f/d76/da7/d40/d92/f102 d5/d10/d17/f112 0
2026-03-10T14:08:46.988 INFO:tasks.workunit.client.0.vm03.stdout:2/831: fsync d5/d10/d17/f28 0
2026-03-10T14:08:46.993 INFO:tasks.workunit.client.0.vm03.stdout:2/832: sync
2026-03-10T14:08:46.993 INFO:tasks.workunit.client.0.vm03.stdout:2/833: chown d5/d2a/f45 124893 1
2026-03-10T14:08:46.995 INFO:tasks.workunit.client.0.vm03.stdout:8/949: write da/d24/ff8 [99187,109679] 0
2026-03-10T14:08:47.005 INFO:tasks.workunit.client.0.vm03.stdout:2/834: truncate d5/d10/d31/fa9 3585372 0
2026-03-10T14:08:47.006 INFO:tasks.workunit.client.0.vm03.stdout:8/950: dread da/d3c/d4b/d69/fb5 [4194304,4194304] 0
2026-03-10T14:08:47.006 INFO:tasks.workunit.client.0.vm03.stdout:2/835: chown d5/d10/d17/fb6 1 1
2026-03-10T14:08:47.007 INFO:tasks.workunit.client.0.vm03.stdout:8/951: fdatasync da/d24/d49/fbe 0
2026-03-10T14:08:47.011 INFO:tasks.workunit.client.0.vm03.stdout:5/998: write d4/d40/d4e/f5c [531272,89299] 0
2026-03-10T14:08:47.013 INFO:tasks.workunit.client.0.vm03.stdout:8/952: mknod da/c12e 0
2026-03-10T14:08:47.013 INFO:tasks.workunit.client.0.vm03.stdout:5/999: symlink d4/d16/d19/d6e/d7f/dd1/de8/l13a 0
2026-03-10T14:08:47.014 INFO:tasks.workunit.client.0.vm03.stdout:8/953: stat da/d3c/c41 0
2026-03-10T14:08:47.014 INFO:tasks.workunit.client.0.vm03.stdout:8/954: fdatasync da/d3a/d44/fe8 0
2026-03-10T14:08:47.015 INFO:tasks.workunit.client.0.vm03.stdout:8/955: chown da/d3a/dce/dfd/c101 51843794 1
2026-03-10T14:08:47.016 INFO:tasks.workunit.client.0.vm03.stdout:8/956: fsync da/d36/d4d/f8f 0
2026-03-10T14:08:47.019 INFO:tasks.workunit.client.0.vm03.stdout:8/957: mkdir da/d3c/d51/d85/dbb/d106/d12d/d12f 0
2026-03-10T14:08:47.034 INFO:tasks.workunit.client.0.vm03.stdout:3/974: write d1d/d33/d47/d53/d68/dcf/de7/faf [706363,73165] 0
2026-03-10T14:08:47.036 INFO:tasks.workunit.client.0.vm03.stdout:3/975: creat d1d/d33/d47/d53/d68/dcf/de7/d41/d45/db4/de4/f13c x:0 0 0
2026-03-10T14:08:47.045 INFO:tasks.workunit.client.0.vm03.stdout:3/976: link d1d/d33/d65/d5d/dae/db9/dec/f10e d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/f13d 0
2026-03-10T14:08:47.052 INFO:tasks.workunit.client.0.vm03.stdout:3/977: sync
2026-03-10T14:08:47.062 INFO:tasks.workunit.client.0.vm03.stdout:3/978: dread d1d/d33/d47/d53/d68/dcf/de7/d41/dc0/fca [0,4194304] 0
2026-03-10T14:08:47.064 INFO:tasks.workunit.client.0.vm03.stdout:3/979: rename d1d/d33/d65/d48/f8b to d1d/d39/d51/d72/d99/f13e 0
2026-03-10T14:08:47.065 INFO:tasks.workunit.client.0.vm03.stdout:3/980: symlink d1d/d39/d51/d72/d99/l13f 0
2026-03-10T14:08:47.065 INFO:tasks.workunit.client.0.vm03.stdout:3/981: mkdir d1d/d59/d140 0
2026-03-10T14:08:47.068 INFO:tasks.workunit.client.0.vm03.stdout:3/982: creat d1d/d33/d47/d53/d68/dcf/de7/f141 x:0 0 0
2026-03-10T14:08:47.069 INFO:tasks.workunit.client.0.vm03.stdout:3/983: symlink d1d/d33/d47/d53/d68/dcf/de7/l142 0
2026-03-10T14:08:47.071 INFO:tasks.workunit.client.0.vm03.stdout:3/984: truncate d1d/d39/ff2 96140 0
2026-03-10T14:08:47.072 INFO:tasks.workunit.client.0.vm03.stdout:3/985: truncate d1d/d33/d65/d48/d118/fff 62764 0
2026-03-10T14:08:47.109 INFO:tasks.workunit.client.0.vm03.stdout:7/753: write d5/d9/d14/d26/d36/f3a [2567966,75852] 0
2026-03-10T14:08:47.112 INFO:tasks.workunit.client.0.vm03.stdout:7/754: rename d5/d9/d14/c3b to d5/d9/d14/d26/d36/d51/dc8/c103 0
2026-03-10T14:08:47.114 INFO:tasks.workunit.client.0.vm03.stdout:7/755: unlink d5/d9/d14/d26/d5f/c74 0
2026-03-10T14:08:47.120 INFO:tasks.workunit.client.0.vm03.stdout:7/756: sync
2026-03-10T14:08:47.149 INFO:tasks.workunit.client.0.vm03.stdout:4/853: dwrite f3 [0,4194304] 0
2026-03-10T14:08:47.152 INFO:tasks.workunit.client.0.vm03.stdout:4/854: symlink d5/d9/db/da7/dcc/d100/d8e/l125 0
2026-03-10T14:08:47.154 INFO:tasks.workunit.client.0.vm03.stdout:4/855: mkdir d5/d9/db/d19/d38/d7b/dbd/d126 0
2026-03-10T14:08:47.155 INFO:tasks.workunit.client.0.vm03.stdout:4/856: mknod d5/d9/db/d19/d38/ded/c127 0
2026-03-10T14:08:47.157 INFO:tasks.workunit.client.0.vm03.stdout:4/857: mknod d5/d9/db/d19/d38/d7b/daa/daf/c128 0
2026-03-10T14:08:47.158 INFO:tasks.workunit.client.0.vm03.stdout:4/858: mkdir d5/d9/db/da7/dcc/d100/dd5/d129 0
2026-03-10T14:08:47.206 INFO:tasks.workunit.client.0.vm03.stdout:9/835: truncate d2/d14/d2b/d43/fa2 1006753 0
2026-03-10T14:08:47.208 INFO:tasks.workunit.client.0.vm03.stdout:9/836: getdents d2/d14/d2b/d79/d81/df3 0
2026-03-10T14:08:47.210 INFO:tasks.workunit.client.0.vm03.stdout:9/837: fdatasync d2/d29/d33/d60/d8c/de1/ff7 0
2026-03-10T14:08:47.212 INFO:tasks.workunit.client.0.vm03.stdout:9/838: creat d2/d29/d38/f10f x:0 0 0
2026-03-10T14:08:47.214 INFO:tasks.workunit.client.0.vm03.stdout:9/839: creat d2/d14/d2b/d43/f110 x:0 0 0
2026-03-10T14:08:47.216 INFO:tasks.workunit.client.0.vm03.stdout:9/840: symlink d2/d29/dcd/d72/d9d/l111 0
2026-03-10T14:08:47.217 INFO:tasks.workunit.client.0.vm03.stdout:9/841: creat d2/d29/d38/dec/f112 x:0 0 0
2026-03-10T14:08:47.222 INFO:tasks.workunit.client.0.vm03.stdout:6/899: dwrite d8/db/d49/d6c/d32/dcb/fff [0,4194304] 0
2026-03-10T14:08:47.224 INFO:tasks.workunit.client.0.vm03.stdout:9/842: symlink d2/d29/d33/d60/d8c/dd3/l113 0
2026-03-10T14:08:47.228 INFO:tasks.workunit.client.0.vm03.stdout:6/900: creat d8/d11/da0/dbf/f10e x:0 0 0
2026-03-10T14:08:47.242 INFO:tasks.workunit.client.0.vm03.stdout:0/805: write d3/d16/d21/d3c/f5b [729963,118340] 0
2026-03-10T14:08:47.244 INFO:tasks.workunit.client.0.vm03.stdout:0/806: read d3/d11/d76/dd3/fe3 [3523201,69159] 0
2026-03-10T14:08:47.247 INFO:tasks.workunit.client.0.vm03.stdout:0/807: creat d3/d17/f102 x:0 0 0
2026-03-10T14:08:47.259 INFO:tasks.workunit.client.0.vm03.stdout:1/850: write d0/d2/df/d27/f58 [1171295,89193] 0
2026-03-10T14:08:47.266 INFO:tasks.workunit.client.0.vm03.stdout:1/851: creat d0/d2/df/d91/dda/f117 x:0 0 0
2026-03-10T14:08:47.268 INFO:tasks.workunit.client.0.vm03.stdout:1/852: symlink d0/l118 0
2026-03-10T14:08:47.272 INFO:tasks.workunit.client.0.vm03.stdout:1/853: fdatasync d0/d18/fe8 0
2026-03-10T14:08:47.277 INFO:tasks.workunit.client.0.vm03.stdout:2/836: dwrite d5/db4/d74/d83/fc4 [0,4194304] 0
2026-03-10T14:08:47.284 INFO:tasks.workunit.client.0.vm03.stdout:2/837: mkdir d5/db4/d74/d113 0
2026-03-10T14:08:47.290 INFO:tasks.workunit.client.0.vm03.stdout:8/958: write da/d3c/d4b/d4c/dba/d114/dbc/f112 [760771,117223] 0
2026-03-10T14:08:47.294 INFO:tasks.workunit.client.0.vm03.stdout:8/959: dread da/d24/d49/fbe [0,4194304] 0
2026-03-10T14:08:47.298 INFO:tasks.workunit.client.0.vm03.stdout:8/960: rename da/d24/f3d to da/d36/f130 0
2026-03-10T14:08:47.299 INFO:tasks.workunit.client.0.vm03.stdout:8/961: readlink da/d24/l2f 0
2026-03-10T14:08:47.310 INFO:tasks.workunit.client.0.vm03.stdout:2/838: sync
2026-03-10T14:08:47.310 INFO:tasks.workunit.client.0.vm03.stdout:2/839: fdatasync d5/dac/f10f 0
2026-03-10T14:08:47.311 INFO:tasks.workunit.client.0.vm03.stdout:2/840: read d5/dac/ff1 [3023207,90581] 0
2026-03-10T14:08:47.312 INFO:tasks.workunit.client.0.vm03.stdout:2/841: stat d5/d10/d1f/d4f/d76/da7/d54/fe2 0
2026-03-10T14:08:47.314 INFO:tasks.workunit.client.0.vm03.stdout:2/842: symlink d5/d10/d1f/d4f/dbe/l114 0
2026-03-10T14:08:47.325 INFO:tasks.workunit.client.0.vm03.stdout:7/757: write d5/d9/d14/d26/d39/f63 [4408289,101390] 0
2026-03-10T14:08:47.328 INFO:tasks.workunit.client.0.vm03.stdout:3/986: dread d1d/d39/ff2 [0,4194304] 0
2026-03-10T14:08:47.332 INFO:tasks.workunit.client.0.vm03.stdout:4/859: dwrite d5/d47/d62/d8a/dd0/d102/f63 [0,4194304] 0
2026-03-10T14:08:47.340 INFO:tasks.workunit.client.0.vm03.stdout:7/758: truncate d5/d9/d14/d26/d36/de1/d84/f95 874242 0
2026-03-10T14:08:47.346 INFO:tasks.workunit.client.0.vm03.stdout:3/987: creat d1d/d33/d47/d53/d68/dcf/f143 x:0 0 0
2026-03-10T14:08:47.348 INFO:tasks.workunit.client.0.vm03.stdout:2/843: symlink d5/d10/d1f/d4f/d76/da7/d54/d5f/l115 0
2026-03-10T14:08:47.356 INFO:tasks.workunit.client.0.vm03.stdout:4/860: mknod d5/d47/c12a 0
2026-03-10T14:08:47.359 INFO:tasks.workunit.client.0.vm03.stdout:9/843: truncate d2/d29/dcd/d9c/fe5 35389 0
2026-03-10T14:08:47.359 INFO:tasks.workunit.client.0.vm03.stdout:9/844: stat d2/d14/f39 0
2026-03-10T14:08:47.360 INFO:tasks.workunit.client.0.vm03.stdout:9/845: write d2/d14/d2b/d79/fb8 [2020091,30675] 0
2026-03-10T14:08:47.368 INFO:tasks.workunit.client.0.vm03.stdout:4/861: dread d5/d9/db/f2a [0,4194304] 0
2026-03-10T14:08:47.369 INFO:tasks.workunit.client.0.vm03.stdout:4/862: chown d5/d6e/f9c 554261 1
2026-03-10T14:08:47.374 INFO:tasks.workunit.client.0.vm03.stdout:6/901: dwrite d8/d11/d18/f21 [4194304,4194304] 0
2026-03-10T14:08:47.377 INFO:tasks.workunit.client.0.vm03.stdout:6/902: read d8/d11/da0/dbf/d8c/db6/fdd [267505,95057] 0
2026-03-10T14:08:47.377 INFO:tasks.workunit.client.0.vm03.stdout:6/903: fsync d8/db/d49/d6c/f90 0
2026-03-10T14:08:47.378 INFO:tasks.workunit.client.0.vm03.stdout:6/904: chown d8/d11/da0/dbf 909 1
2026-03-10T14:08:47.380 INFO:tasks.workunit.client.0.vm03.stdout:0/808: write d3/d46/f8c [4970462,55015] 0
2026-03-10T14:08:47.397 INFO:tasks.workunit.client.0.vm03.stdout:7/759: unlink d5/d9/c12 0
2026-03-10T14:08:47.399 INFO:tasks.workunit.client.0.vm03.stdout:2/844: rename d5/d10/d1f/d4f/d76/cf3 to d5/db4/d74/c116 0
2026-03-10T14:08:47.401 INFO:tasks.workunit.client.0.vm03.stdout:4/863: unlink d5/d9/db/d19/d38/d7b/daa/f11c 0
2026-03-10T14:08:47.406 INFO:tasks.workunit.client.0.vm03.stdout:6/905: fsync d8/db/d49/d6c/d32/f94 0
2026-03-10T14:08:47.406 INFO:tasks.workunit.client.0.vm03.stdout:1/854: write d0/d2/df/d27/f32 [3509790,64009] 0
2026-03-10T14:08:47.407 INFO:tasks.workunit.client.0.vm03.stdout:0/809: truncate d3/d4d/d47/fd9 669992 0
2026-03-10T14:08:47.410 INFO:tasks.workunit.client.0.vm03.stdout:9/846: rename d2/d29/d33/c36 to d2/d29/d33/d60/d8c/dd3/c114 0
2026-03-10T14:08:47.413 INFO:tasks.workunit.client.0.vm03.stdout:4/864: mkdir d5/d9/db/da7/db9/def/d11f/d12b 0
2026-03-10T14:08:47.417 INFO:tasks.workunit.client.0.vm03.stdout:6/906: unlink d8/db/feb 0
2026-03-10T14:08:47.420 INFO:tasks.workunit.client.0.vm03.stdout:8/962: dwrite da/f15 [8388608,4194304] 0
2026-03-10T14:08:47.422 INFO:tasks.workunit.client.0.vm03.stdout:1/855: rmdir d0/d2/df/d27/dd7 39
2026-03-10T14:08:47.427 INFO:tasks.workunit.client.0.vm03.stdout:1/856: dread d0/d2/df/d27/ff4 [0,4194304] 0
2026-03-10T14:08:47.430 INFO:tasks.workunit.client.0.vm03.stdout:9/847: truncate d2/d14/d2b/d43/fcc 1046228 0
2026-03-10T14:08:47.430 INFO:tasks.workunit.client.0.vm03.stdout:4/865: symlink d5/d9/db/d19/d38/d7b/daa/l12c 0
2026-03-10T14:08:47.431 INFO:tasks.workunit.client.0.vm03.stdout:4/866: dread - d5/d47/d62/d8a/dd0/ffd zero size
2026-03-10T14:08:47.443 INFO:tasks.workunit.client.0.vm03.stdout:8/963: symlink da/d3c/d4b/d4c/dba/d114/dbc/db1/l131 0
2026-03-10T14:08:47.446 INFO:tasks.workunit.client.0.vm03.stdout:6/907: symlink d8/d11/d10c/l10f 0
2026-03-10T14:08:47.448 INFO:tasks.workunit.client.0.vm03.stdout:0/810: mknod d3/d11/d2c/c103 0
2026-03-10T14:08:47.449 INFO:tasks.workunit.client.0.vm03.stdout:8/964: symlink da/d3c/d51/d75/l132 0
2026-03-10T14:08:47.453 INFO:tasks.workunit.client.0.vm03.stdout:1/857: dread d0/f24 [0,4194304] 0
2026-03-10T14:08:47.455 INFO:tasks.workunit.client.0.vm03.stdout:6/908: dwrite d8/db/d49/d6c/d83/ffd [0,4194304] 0
2026-03-10T14:08:47.458 INFO:tasks.workunit.client.0.vm03.stdout:9/848: mknod d2/d14/d2b/d79/d8a/dff/c115 0
2026-03-10T14:08:47.460 INFO:tasks.workunit.client.0.vm03.stdout:4/867: symlink d5/d9/db/d19/d38/l12d 0
2026-03-10T14:08:47.466 INFO:tasks.workunit.client.0.vm03.stdout:1/858: rename d0/d42/f80 to d0/d2/df/d16/d41/f119 0
2026-03-10T14:08:47.473 INFO:tasks.workunit.client.0.vm03.stdout:1/859: fdatasync d0/d42/ff6 0
2026-03-10T14:08:47.473 INFO:tasks.workunit.client.0.vm03.stdout:9/849: fsync d2/d29/dcd/d72/d9d/fa6 0
2026-03-10T14:08:47.473 INFO:tasks.workunit.client.0.vm03.stdout:4/868: chown d5/d9/db/l54 3379808 1
2026-03-10T14:08:47.473 INFO:tasks.workunit.client.0.vm03.stdout:4/869: truncate d5/d9/db/d19/d38/d7b/daa/daf/ff5 754470 0
2026-03-10T14:08:47.475 INFO:tasks.workunit.client.0.vm03.stdout:9/850: creat d2/d29/d33/d60/d8c/de1/dbd/f116 x:0 0 0
2026-03-10T14:08:47.476 INFO:tasks.workunit.client.0.vm03.stdout:4/870: rmdir d5/d47/d5b/dbe 39
2026-03-10T14:08:47.479 INFO:tasks.workunit.client.0.vm03.stdout:4/871: mknod d5/d9/db/da7/db9/def/c12e 0
2026-03-10T14:08:47.482 INFO:tasks.workunit.client.0.vm03.stdout:4/872: mkdir d5/d47/d5b/d64/d12f 0
2026-03-10T14:08:47.486 INFO:tasks.workunit.client.0.vm03.stdout:6/909: dread f0 [4194304,4194304] 0
2026-03-10T14:08:47.486 INFO:tasks.workunit.client.0.vm03.stdout:1/860: rename d0/d2/f1a to d0/d2/df/d91/f11a 0
2026-03-10T14:08:47.486 
INFO:tasks.workunit.client.0.vm03.stdout:0/811: sync 2026-03-10T14:08:47.490 INFO:tasks.workunit.client.0.vm03.stdout:6/910: creat d8/d11/da0/dbf/d5c/f110 x:0 0 0 2026-03-10T14:08:47.492 INFO:tasks.workunit.client.0.vm03.stdout:0/812: sync 2026-03-10T14:08:47.496 INFO:tasks.workunit.client.0.vm03.stdout:3/988: write d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/fb5 [2843661,11617] 0 2026-03-10T14:08:47.508 INFO:tasks.workunit.client.0.vm03.stdout:4/873: truncate d5/d9/db/d19/d38/d53/d55/fa8 2751601 0 2026-03-10T14:08:47.509 INFO:tasks.workunit.client.0.vm03.stdout:4/874: readlink d5/d9/db/da7/dcc/l111 0 2026-03-10T14:08:47.510 INFO:tasks.workunit.client.0.vm03.stdout:3/989: fsync d1d/d33/d47/d53/d68/dcf/de7/d41/f70 0 2026-03-10T14:08:47.517 INFO:tasks.workunit.client.0.vm03.stdout:3/990: read fe [678004,19603] 0 2026-03-10T14:08:47.517 INFO:tasks.workunit.client.0.vm03.stdout:3/991: stat d1d/d33/d47/d53/d68/dcf/l12e 0 2026-03-10T14:08:47.518 INFO:tasks.workunit.client.0.vm03.stdout:4/875: creat d5/d9/db/da7/d119/f130 x:0 0 0 2026-03-10T14:08:47.519 INFO:tasks.workunit.client.0.vm03.stdout:3/992: write d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d5b/f9e [4260244,62579] 0 2026-03-10T14:08:47.521 INFO:tasks.workunit.client.0.vm03.stdout:4/876: rmdir d5 39 2026-03-10T14:08:47.524 INFO:tasks.workunit.client.0.vm03.stdout:3/993: symlink d1d/d33/d47/d53/d68/dcf/de7/db2/d129/l144 0 2026-03-10T14:08:47.527 INFO:tasks.workunit.client.0.vm03.stdout:4/877: sync 2026-03-10T14:08:47.530 INFO:tasks.workunit.client.0.vm03.stdout:2/845: dwrite d5/d10/f4c [0,4194304] 0 2026-03-10T14:08:47.538 INFO:tasks.workunit.client.0.vm03.stdout:3/994: mkdir d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/dd5/d145 0 2026-03-10T14:08:47.540 INFO:tasks.workunit.client.0.vm03.stdout:3/995: chown d1d/d33/d47/d53/d68/dcf/de7/d41/d45/d55/d6e/d83/d91/dd5/d145 361 1 2026-03-10T14:08:47.541 INFO:tasks.workunit.client.0.vm03.stdout:7/760: truncate d5/d9/d14/d26/f64 2618396 0 2026-03-10T14:08:47.549 
INFO:tasks.workunit.client.0.vm03.stdout:7/761: fsync d5/d9/d14/d26/d36/fa4 0 2026-03-10T14:08:47.550 INFO:tasks.workunit.client.0.vm03.stdout:7/762: readlink d5/d9/d14/d21/d6f/lfd 0 2026-03-10T14:08:47.555 INFO:tasks.workunit.client.0.vm03.stdout:8/965: dwrite da/d3c/d51/d75/dc2/f91 [0,4194304] 0 2026-03-10T14:08:47.558 INFO:tasks.workunit.client.0.vm03.stdout:3/996: mkdir d1d/d33/d47/d10c/d146 0 2026-03-10T14:08:47.559 INFO:tasks.workunit.client.0.vm03.stdout:3/997: dread d1d/d33/d47/d53/d68/fc4 [0,4194304] 0 2026-03-10T14:08:47.560 INFO:tasks.workunit.client.0.vm03.stdout:3/998: read - d1d/d33/d47/d53/d68/dcf/f143 zero size 2026-03-10T14:08:47.562 INFO:tasks.workunit.client.0.vm03.stdout:7/763: mknod d5/d9/d14/d26/d36/d51/dc8/c104 0 2026-03-10T14:08:47.565 INFO:tasks.workunit.client.0.vm03.stdout:7/764: dwrite d5/d9/fec [0,4194304] 0 2026-03-10T14:08:47.566 INFO:tasks.workunit.client.0.vm03.stdout:7/765: write d5/d9/d14/d26/d36/d51/d7b/fc9 [424956,97935] 0 2026-03-10T14:08:47.567 INFO:tasks.workunit.client.0.vm03.stdout:7/766: stat d5/d9/d14/f4d 0 2026-03-10T14:08:47.569 INFO:tasks.workunit.client.0.vm03.stdout:3/999: rmdir d1d/d59 39 2026-03-10T14:08:47.573 INFO:tasks.workunit.client.0.vm03.stdout:8/966: creat da/d3c/d51/d85/dbb/d106/d12d/d12f/f133 x:0 0 0 2026-03-10T14:08:47.574 INFO:tasks.workunit.client.0.vm03.stdout:7/767: creat d5/d9/d14/d26/d5f/de2/f105 x:0 0 0 2026-03-10T14:08:47.574 INFO:tasks.workunit.client.0.vm03.stdout:7/768: chown d5/f32 590732783 1 2026-03-10T14:08:47.575 INFO:tasks.workunit.client.0.vm03.stdout:7/769: chown d5/d9/d14/d26/f8d 87 1 2026-03-10T14:08:47.575 INFO:tasks.workunit.client.0.vm03.stdout:8/967: write da/d24/d49/fd0 [875515,34460] 0 2026-03-10T14:08:47.576 INFO:tasks.workunit.client.0.vm03.stdout:8/968: dread - da/d3a/d44/dfb/dd7/ff3 zero size 2026-03-10T14:08:47.579 INFO:tasks.workunit.client.0.vm03.stdout:8/969: creat da/d3c/d51/f134 x:0 0 0 2026-03-10T14:08:47.581 INFO:tasks.workunit.client.0.vm03.stdout:8/970: creat 
da/d3a/f135 x:0 0 0 2026-03-10T14:08:47.590 INFO:tasks.workunit.client.0.vm03.stdout:8/971: dread da/d3a/d44/f97 [0,4194304] 0 2026-03-10T14:08:47.608 INFO:tasks.workunit.client.0.vm03.stdout:8/972: sync 2026-03-10T14:08:47.617 INFO:tasks.workunit.client.0.vm03.stdout:9/851: write d2/d29/d38/f4b [496898,124730] 0 2026-03-10T14:08:47.618 INFO:tasks.workunit.client.0.vm03.stdout:9/852: stat d2/d29/d33/d41/ca4 0 2026-03-10T14:08:47.624 INFO:tasks.workunit.client.0.vm03.stdout:1/861: write d0/d2/df/d27/d7e/d81/fd2 [36842,60178] 0 2026-03-10T14:08:47.624 INFO:tasks.workunit.client.0.vm03.stdout:0/813: write d3/d16/d21/d9a/fc3 [1739648,33887] 0 2026-03-10T14:08:47.627 INFO:tasks.workunit.client.0.vm03.stdout:6/911: dwrite d8/db/d49/d6c/f8e [0,4194304] 0 2026-03-10T14:08:47.627 INFO:tasks.workunit.client.0.vm03.stdout:1/862: sync 2026-03-10T14:08:47.628 INFO:tasks.workunit.client.0.vm03.stdout:1/863: readlink d0/d2/df/dab/ld0 0 2026-03-10T14:08:47.628 INFO:tasks.workunit.client.0.vm03.stdout:6/912: chown d8/d11/da0/dbf/d5c/f68 28546 1 2026-03-10T14:08:47.630 INFO:tasks.workunit.client.0.vm03.stdout:6/913: truncate d8/d11/da0/f10d 329523 0 2026-03-10T14:08:47.630 INFO:tasks.workunit.client.0.vm03.stdout:2/846: write d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/f8c [1126656,53590] 0 2026-03-10T14:08:47.637 INFO:tasks.workunit.client.0.vm03.stdout:0/814: mkdir d3/d11/d66/da2/d104 0 2026-03-10T14:08:47.638 INFO:tasks.workunit.client.0.vm03.stdout:4/878: truncate d5/d47/d62/d8a/dd0/d102/fd1 620818 0 2026-03-10T14:08:47.642 INFO:tasks.workunit.client.0.vm03.stdout:4/879: dwrite d5/d47/d62/d8a/da0/f118 [0,4194304] 0 2026-03-10T14:08:47.644 INFO:tasks.workunit.client.0.vm03.stdout:6/914: mknod d8/db/d49/d6c/d32/d3a/c111 0 2026-03-10T14:08:47.644 INFO:tasks.workunit.client.0.vm03.stdout:2/847: mknod d5/d2a/d110/c117 0 2026-03-10T14:08:47.645 INFO:tasks.workunit.client.0.vm03.stdout:0/815: symlink d3/d11/d2c/d4a/d4b/d89/db9/l105 0 2026-03-10T14:08:47.650 
INFO:tasks.workunit.client.0.vm03.stdout:2/848: creat d5/d10/da3/f118 x:0 0 0 2026-03-10T14:08:47.650 INFO:tasks.workunit.client.0.vm03.stdout:2/849: write f4 [5699964,128342] 0 2026-03-10T14:08:47.661 INFO:tasks.workunit.client.0.vm03.stdout:6/915: read d8/d11/d18/f34 [3979022,51767] 0 2026-03-10T14:08:47.666 INFO:tasks.workunit.client.0.vm03.stdout:6/916: dread d8/db/d49/d6c/d83/ff9 [0,4194304] 0 2026-03-10T14:08:47.667 INFO:tasks.workunit.client.0.vm03.stdout:6/917: write d8/d11/da0/dbf/f10e [377825,91180] 0 2026-03-10T14:08:47.695 INFO:tasks.workunit.client.0.vm03.stdout:1/864: dread d0/d18/d3b/f4c [0,4194304] 0 2026-03-10T14:08:47.703 INFO:tasks.workunit.client.0.vm03.stdout:1/865: dread d0/d2/df/d27/f99 [0,4194304] 0 2026-03-10T14:08:47.705 INFO:tasks.workunit.client.0.vm03.stdout:1/866: creat d0/d2/df/d91/dda/f11b x:0 0 0 2026-03-10T14:08:47.707 INFO:tasks.workunit.client.0.vm03.stdout:1/867: truncate d0/d2/df/dab/f9a 3741254 0 2026-03-10T14:08:47.707 INFO:tasks.workunit.client.0.vm03.stdout:1/868: stat d0/d18/d1d 0 2026-03-10T14:08:47.708 INFO:tasks.workunit.client.0.vm03.stdout:1/869: mknod d0/d2/df/dab/c11c 0 2026-03-10T14:08:47.710 INFO:tasks.workunit.client.0.vm03.stdout:1/870: mknod d0/d2/d71/c11d 0 2026-03-10T14:08:47.711 INFO:tasks.workunit.client.0.vm03.stdout:1/871: chown d0/d2/df/d16/d41/f119 42478951 1 2026-03-10T14:08:47.711 INFO:tasks.workunit.client.0.vm03.stdout:1/872: fsync d0/d18/d3b/d50/ffd 0 2026-03-10T14:08:47.713 INFO:tasks.workunit.client.0.vm03.stdout:1/873: getdents d0/d42/d9f/dc9 0 2026-03-10T14:08:47.715 INFO:tasks.workunit.client.0.vm03.stdout:1/874: creat d0/d2/df/d91/f11e x:0 0 0 2026-03-10T14:08:47.715 INFO:tasks.workunit.client.0.vm03.stdout:1/875: chown d0/d2/df/dab/f56 20 1 2026-03-10T14:08:47.721 INFO:tasks.workunit.client.0.vm03.stdout:9/853: dread d2/f4c [0,4194304] 0 2026-03-10T14:08:47.725 INFO:tasks.workunit.client.0.vm03.stdout:9/854: dread - d2/d29/d38/fbf zero size 2026-03-10T14:08:47.726 
INFO:tasks.workunit.client.0.vm03.stdout:9/855: fsync d2/d14/d2b/d79/fb8 0 2026-03-10T14:08:47.729 INFO:tasks.workunit.client.0.vm03.stdout:0/816: dread d3/d46/f8c [4194304,4194304] 0 2026-03-10T14:08:47.731 INFO:tasks.workunit.client.0.vm03.stdout:7/770: dwrite d5/d9/d14/d21/fea [0,4194304] 0 2026-03-10T14:08:47.734 INFO:tasks.workunit.client.0.vm03.stdout:7/771: chown d5/d9/d14/d26/d36/d51/d7b/d83/c91 1429 1 2026-03-10T14:08:47.737 INFO:tasks.workunit.client.0.vm03.stdout:9/856: rename d2/d14/f1b to d2/d29/d33/d41/d95/f117 0 2026-03-10T14:08:47.739 INFO:tasks.workunit.client.0.vm03.stdout:7/772: fsync d5/d9/d14/f4d 0 2026-03-10T14:08:47.743 INFO:tasks.workunit.client.0.vm03.stdout:7/773: stat d5/d9/d14/d26/d5f/de7/lca 0 2026-03-10T14:08:47.744 INFO:tasks.workunit.client.0.vm03.stdout:7/774: stat d5/d9/d14/d26 0 2026-03-10T14:08:47.745 INFO:tasks.workunit.client.0.vm03.stdout:8/973: dwrite da/d3a/d44/f97 [0,4194304] 0 2026-03-10T14:08:47.756 INFO:tasks.workunit.client.0.vm03.stdout:8/974: dread da/f31 [0,4194304] 0 2026-03-10T14:08:47.761 INFO:tasks.workunit.client.0.vm03.stdout:7/775: sync 2026-03-10T14:08:47.765 INFO:tasks.workunit.client.0.vm03.stdout:8/975: rename da/d3c/d4b/d4c/dba/d114/dbc/db1/cfe to da/d24/d49/dee/c136 0 2026-03-10T14:08:47.785 INFO:tasks.workunit.client.0.vm03.stdout:4/880: dwrite d5/d96/fc1 [0,4194304] 0 2026-03-10T14:08:47.786 INFO:tasks.workunit.client.0.vm03.stdout:4/881: chown d5/d9/db/ff0 38573 1 2026-03-10T14:08:47.795 INFO:tasks.workunit.client.0.vm03.stdout:8/976: mknod da/d3c/d51/d85/dbb/d106/d12d/d12f/c137 0 2026-03-10T14:08:47.797 INFO:tasks.workunit.client.0.vm03.stdout:4/882: rename d5/d9/db/da7/db9/def/c12e to d5/d9/db/da7/d119/c131 0 2026-03-10T14:08:47.798 INFO:tasks.workunit.client.0.vm03.stdout:4/883: readlink d5/d47/la2 0 2026-03-10T14:08:47.799 INFO:tasks.workunit.client.0.vm03.stdout:2/850: write d5/f9f [945321,50859] 0 2026-03-10T14:08:47.799 INFO:tasks.workunit.client.0.vm03.stdout:2/851: chown d5/d10/da3/dab/dc8 
11624665 1 2026-03-10T14:08:47.800 INFO:tasks.workunit.client.0.vm03.stdout:2/852: truncate d5/db4/d74/d83/dc1/f107 619982 0 2026-03-10T14:08:47.801 INFO:tasks.workunit.client.0.vm03.stdout:7/776: symlink d5/d9/d14/d21/l106 0 2026-03-10T14:08:47.802 INFO:tasks.workunit.client.0.vm03.stdout:6/918: write d8/d11/da0/dbf/fc0 [831141,38600] 0 2026-03-10T14:08:47.803 INFO:tasks.workunit.client.0.vm03.stdout:6/919: chown d8/db/d12/d64/fa8 23 1 2026-03-10T14:08:47.804 INFO:tasks.workunit.client.0.vm03.stdout:4/884: creat d5/d9/db/da7/f132 x:0 0 0 2026-03-10T14:08:47.815 INFO:tasks.workunit.client.0.vm03.stdout:1/876: dwrite d0/d18/d1d/f6f [0,4194304] 0 2026-03-10T14:08:47.817 INFO:tasks.workunit.client.0.vm03.stdout:4/885: sync 2026-03-10T14:08:47.824 INFO:tasks.workunit.client.0.vm03.stdout:0/817: write d3/d17/fb1 [2878328,128366] 0 2026-03-10T14:08:47.841 INFO:tasks.workunit.client.0.vm03.stdout:1/877: unlink d0/d42/fcc 0 2026-03-10T14:08:47.852 INFO:tasks.workunit.client.0.vm03.stdout:4/886: creat d5/d9/db/da7/db9/def/d11f/d12b/f133 x:0 0 0 2026-03-10T14:08:47.853 INFO:tasks.workunit.client.0.vm03.stdout:1/878: symlink d0/d18/d3b/db7/d115/l11f 0 2026-03-10T14:08:47.854 INFO:tasks.workunit.client.0.vm03.stdout:4/887: symlink d5/d9/db/da7/db9/l134 0 2026-03-10T14:08:47.855 INFO:tasks.workunit.client.0.vm03.stdout:4/888: chown d5/d47/d5b/f79 1 1 2026-03-10T14:08:47.863 INFO:tasks.workunit.client.0.vm03.stdout:4/889: creat d5/d9/db/da7/d119/f135 x:0 0 0 2026-03-10T14:08:47.866 INFO:tasks.workunit.client.0.vm03.stdout:2/853: dwrite f1 [0,4194304] 0 2026-03-10T14:08:47.868 INFO:tasks.workunit.client.0.vm03.stdout:8/977: truncate da/d3a/d44/f97 3564153 0 2026-03-10T14:08:47.869 INFO:tasks.workunit.client.0.vm03.stdout:7/777: truncate d5/d9/d14/f2f 2555934 0 2026-03-10T14:08:47.871 INFO:tasks.workunit.client.0.vm03.stdout:6/920: write d8/db/f1f [2019060,101743] 0 2026-03-10T14:08:47.881 INFO:tasks.workunit.client.0.vm03.stdout:7/778: dread d5/d9/d14/d26/d36/d51/d7b/f87 
[0,4194304] 0 2026-03-10T14:08:47.886 INFO:tasks.workunit.client.0.vm03.stdout:9/857: dwrite d2/d29/d33/d41/d95/f117 [4194304,4194304] 0 2026-03-10T14:08:47.895 INFO:tasks.workunit.client.0.vm03.stdout:0/818: getdents d3/d11/d2c/d4a 0 2026-03-10T14:08:47.896 INFO:tasks.workunit.client.0.vm03.stdout:4/890: dread - d5/d9/db/d19/d34/ff6 zero size 2026-03-10T14:08:47.900 INFO:tasks.workunit.client.0.vm03.stdout:1/879: dwrite d0/fbf [0,4194304] 0 2026-03-10T14:08:47.911 INFO:tasks.workunit.client.0.vm03.stdout:7/779: truncate d5/d9/d14/d21/fb0 373178 0 2026-03-10T14:08:47.916 INFO:tasks.workunit.client.0.vm03.stdout:9/858: fsync d2/d29/d33/d41/daa/fcf 0 2026-03-10T14:08:47.916 INFO:tasks.workunit.client.0.vm03.stdout:9/859: chown d2/d29/d33/d41/d95/cac 31506 1 2026-03-10T14:08:47.916 INFO:tasks.workunit.client.0.vm03.stdout:0/819: mknod d3/d11/d76/dbc/c106 0 2026-03-10T14:08:47.917 INFO:tasks.workunit.client.0.vm03.stdout:1/880: truncate d0/d18/d3b/f4c 4555300 0 2026-03-10T14:08:47.919 INFO:tasks.workunit.client.0.vm03.stdout:6/921: fdatasync d8/d11/da0/dbf/d5c/d60/f92 0 2026-03-10T14:08:47.922 INFO:tasks.workunit.client.0.vm03.stdout:7/780: write d5/d9/d14/d26/d36/d51/d7b/f87 [727035,115540] 0 2026-03-10T14:08:47.925 INFO:tasks.workunit.client.0.vm03.stdout:9/860: symlink d2/d29/d33/d60/d8c/de1/dbd/l118 0 2026-03-10T14:08:47.926 INFO:tasks.workunit.client.0.vm03.stdout:0/820: rmdir d3/d46/da9 39 2026-03-10T14:08:47.928 INFO:tasks.workunit.client.0.vm03.stdout:0/821: read d3/d11/f13 [324361,111316] 0 2026-03-10T14:08:47.930 INFO:tasks.workunit.client.0.vm03.stdout:0/822: dwrite d3/d46/ffc [0,4194304] 0 2026-03-10T14:08:47.931 INFO:tasks.workunit.client.0.vm03.stdout:0/823: write d3/f94 [3101963,81687] 0 2026-03-10T14:08:47.940 INFO:tasks.workunit.client.0.vm03.stdout:0/824: truncate d3/d4d/f22 2290155 0 2026-03-10T14:08:47.945 INFO:tasks.workunit.client.0.vm03.stdout:7/781: creat d5/d9/d35/d101/f107 x:0 0 0 2026-03-10T14:08:47.947 
INFO:tasks.workunit.client.0.vm03.stdout:1/881: link d0/d18/d1d/l40 d0/d2/df/d16/l120 0 2026-03-10T14:08:47.947 INFO:tasks.workunit.client.0.vm03.stdout:1/882: stat d0/fbf 0 2026-03-10T14:08:47.952 INFO:tasks.workunit.client.0.vm03.stdout:1/883: chown d0/d2/df/d27/l4e 39 1 2026-03-10T14:08:47.953 INFO:tasks.workunit.client.0.vm03.stdout:1/884: readlink d0/d2/df/d27/d7e/d81/l82 0 2026-03-10T14:08:47.954 INFO:tasks.workunit.client.0.vm03.stdout:6/922: link d8/d1b/d1c/c1d d8/db/d49/d6c/d32/d3a/c112 0 2026-03-10T14:08:47.954 INFO:tasks.workunit.client.0.vm03.stdout:2/854: write d5/d10/d1f/d4f/d76/da7/f44 [449134,73794] 0 2026-03-10T14:08:47.954 INFO:tasks.workunit.client.0.vm03.stdout:4/891: write d5/d9/db/f2a [484275,67248] 0 2026-03-10T14:08:47.960 INFO:tasks.workunit.client.0.vm03.stdout:9/861: mkdir d2/d29/d119 0 2026-03-10T14:08:47.962 INFO:tasks.workunit.client.0.vm03.stdout:6/923: dwrite d8/db/d49/d76/ff2 [0,4194304] 0 2026-03-10T14:08:47.964 INFO:tasks.workunit.client.0.vm03.stdout:8/978: dwrite da/d58/d6c/fd8 [0,4194304] 0 2026-03-10T14:08:47.965 INFO:tasks.workunit.client.0.vm03.stdout:7/782: fsync d5/d9/d14/f9b 0 2026-03-10T14:08:47.965 INFO:tasks.workunit.client.0.vm03.stdout:6/924: chown d8/db/df/lfc 7245349 1 2026-03-10T14:08:47.966 INFO:tasks.workunit.client.0.vm03.stdout:6/925: chown d8/db/d12/d64/fa8 118052 1 2026-03-10T14:08:47.966 INFO:tasks.workunit.client.0.vm03.stdout:8/979: dread - da/d3a/d44/dfb/dd7/ff3 zero size 2026-03-10T14:08:47.968 INFO:tasks.workunit.client.0.vm03.stdout:6/926: dread d8/db/d49/d6c/d83/f87 [0,4194304] 0 2026-03-10T14:08:47.980 INFO:tasks.workunit.client.0.vm03.stdout:4/892: rmdir d5/d47/d62/d8a/dd0 39 2026-03-10T14:08:47.981 INFO:tasks.workunit.client.0.vm03.stdout:2/855: dread - d5/db4/fd9 zero size 2026-03-10T14:08:47.982 INFO:tasks.workunit.client.0.vm03.stdout:9/862: dread - d2/d29/dcd/dd8/fed zero size 2026-03-10T14:08:47.982 INFO:tasks.workunit.client.0.vm03.stdout:9/863: read - d2/d14/d2b/d79/fd1 zero size 
2026-03-10T14:08:47.984 INFO:tasks.workunit.client.0.vm03.stdout:7/783: creat d5/d9/d14/d26/d36/d51/dc8/de3/f108 x:0 0 0 2026-03-10T14:08:47.985 INFO:tasks.workunit.client.0.vm03.stdout:7/784: chown d5/d9/d14/d21/d9a/fee 33789 1 2026-03-10T14:08:47.985 INFO:tasks.workunit.client.0.vm03.stdout:6/927: chown d8/d1b/f29 183763 1 2026-03-10T14:08:47.994 INFO:tasks.workunit.client.0.vm03.stdout:8/980: creat da/d36/d4d/da5/dd6/f138 x:0 0 0 2026-03-10T14:08:47.996 INFO:tasks.workunit.client.0.vm03.stdout:7/785: unlink d5/d9/d14/d26/f38 0 2026-03-10T14:08:47.996 INFO:tasks.workunit.client.0.vm03.stdout:7/786: write d5/d9/f22 [87121,52192] 0 2026-03-10T14:08:48.000 INFO:tasks.workunit.client.0.vm03.stdout:9/864: rmdir d2/d29/d33/d6d/d87 39 2026-03-10T14:08:48.004 INFO:tasks.workunit.client.0.vm03.stdout:6/928: mknod d8/db/d49/d6c/d32/dcb/d106/c113 0 2026-03-10T14:08:48.004 INFO:tasks.workunit.client.0.vm03.stdout:7/787: creat d5/d9/d14/d26/d36/d51/d7b/d9c/dd5/df7/f109 x:0 0 0 2026-03-10T14:08:48.004 INFO:tasks.workunit.client.0.vm03.stdout:4/893: truncate d5/d9/db/f48 113510 0 2026-03-10T14:08:48.008 INFO:tasks.workunit.client.0.vm03.stdout:6/929: mknod d8/db/df/c114 0 2026-03-10T14:08:48.011 INFO:tasks.workunit.client.0.vm03.stdout:4/894: rmdir d5/d9/db/da7/db9 39 2026-03-10T14:08:48.013 INFO:tasks.workunit.client.0.vm03.stdout:4/895: symlink d5/db4/dd2/l136 0 2026-03-10T14:08:48.016 INFO:tasks.workunit.client.0.vm03.stdout:4/896: dread d5/d47/d5b/d64/f78 [0,4194304] 0 2026-03-10T14:08:48.021 INFO:tasks.workunit.client.0.vm03.stdout:7/788: getdents d5/d9/da7 0 2026-03-10T14:08:48.024 INFO:tasks.workunit.client.0.vm03.stdout:7/789: dwrite d5/d9/d35/d101/f107 [0,4194304] 0 2026-03-10T14:08:48.026 INFO:tasks.workunit.client.0.vm03.stdout:6/930: rmdir d8/d11/da0/dbf/d5c/d60/dc5 0 2026-03-10T14:08:48.027 INFO:tasks.workunit.client.0.vm03.stdout:6/931: write d8/db/d49/d6c/d83/ffd [2885021,34955] 0 2026-03-10T14:08:48.028 INFO:tasks.workunit.client.0.vm03.stdout:7/790: mkdir 
d5/d9/da7/d10a 0 2026-03-10T14:08:48.029 INFO:tasks.workunit.client.0.vm03.stdout:7/791: write d5/d9/d35/d101/f107 [2121281,11975] 0 2026-03-10T14:08:48.033 INFO:tasks.workunit.client.0.vm03.stdout:8/981: sync 2026-03-10T14:08:48.034 INFO:tasks.workunit.client.0.vm03.stdout:9/865: sync 2026-03-10T14:08:48.039 INFO:tasks.workunit.client.0.vm03.stdout:9/866: dwrite d2/d29/d33/d41/ffc [0,4194304] 0 2026-03-10T14:08:48.047 INFO:tasks.workunit.client.0.vm03.stdout:7/792: fdatasync d5/d9/d14/d26/d36/de1/d84/fc1 0 2026-03-10T14:08:48.050 INFO:tasks.workunit.client.0.vm03.stdout:8/982: mkdir da/d3c/d4b/d69/d139 0 2026-03-10T14:08:48.056 INFO:tasks.workunit.client.0.vm03.stdout:1/885: write d0/d18/d3b/f53 [1798576,128698] 0 2026-03-10T14:08:48.057 INFO:tasks.workunit.client.0.vm03.stdout:1/886: truncate d0/d2/df/d16/ded/f10f 558978 0 2026-03-10T14:08:48.061 INFO:tasks.workunit.client.0.vm03.stdout:8/983: truncate da/d3c/d4b/d4c/dba/d114/f102 386735 0 2026-03-10T14:08:48.065 INFO:tasks.workunit.client.0.vm03.stdout:0/825: dwrite d3/f10 [4194304,4194304] 0 2026-03-10T14:08:48.071 INFO:tasks.workunit.client.0.vm03.stdout:2/856: truncate d5/d10/d1f/d4f/dbe/ff4 323896 0 2026-03-10T14:08:48.071 INFO:tasks.workunit.client.0.vm03.stdout:2/857: fsync f1 0 2026-03-10T14:08:48.071 INFO:tasks.workunit.client.0.vm03.stdout:2/858: stat d5/d35/cdd 0 2026-03-10T14:08:48.075 INFO:tasks.workunit.client.0.vm03.stdout:4/897: write d5/d9/db/d19/d38/d53/d55/f76 [1110746,109413] 0 2026-03-10T14:08:48.084 INFO:tasks.workunit.client.0.vm03.stdout:8/984: rename da/d3c/d51/d75/l132 to da/d3c/d51/d85/dbb/d106/d12d/l13a 0 2026-03-10T14:08:48.087 INFO:tasks.workunit.client.0.vm03.stdout:0/826: chown d3/d46/da9/cda 0 1 2026-03-10T14:08:48.089 INFO:tasks.workunit.client.0.vm03.stdout:1/887: dread d0/d2/df/d27/d7e/d81/f86 [0,4194304] 0 2026-03-10T14:08:48.090 INFO:tasks.workunit.client.0.vm03.stdout:2/859: symlink d5/d10/d1f/d4f/d76/l119 0 2026-03-10T14:08:48.092 
INFO:tasks.workunit.client.0.vm03.stdout:4/898: fsync d5/d9/db/da7/dcc/d100/fca 0 2026-03-10T14:08:48.093 INFO:tasks.workunit.client.0.vm03.stdout:2/860: truncate d5/d10/d1f/d4f/d76/da7/d40/f105 1032664 0 2026-03-10T14:08:48.095 INFO:tasks.workunit.client.0.vm03.stdout:0/827: creat d3/d4d/d47/f107 x:0 0 0 2026-03-10T14:08:48.103 INFO:tasks.workunit.client.0.vm03.stdout:2/861: creat d5/d10/d1f/d4f/d76/da7/d40/d59/d85/f11a x:0 0 0 2026-03-10T14:08:48.103 INFO:tasks.workunit.client.0.vm03.stdout:0/828: fsync d3/d11/d66/f86 0 2026-03-10T14:08:48.103 INFO:tasks.workunit.client.0.vm03.stdout:4/899: creat d5/d9/db/da7/db9/def/d11f/d12b/f137 x:0 0 0 2026-03-10T14:08:48.105 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:47 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:08:48.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:47 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:08:48.110 INFO:tasks.workunit.client.0.vm03.stdout:1/888: dread d0/d2/df/d27/fc5 [0,4194304] 0 2026-03-10T14:08:48.112 INFO:tasks.workunit.client.0.vm03.stdout:8/985: sync 2026-03-10T14:08:48.115 INFO:tasks.workunit.client.0.vm03.stdout:2/862: dread d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/f91 [0,4194304] 0 2026-03-10T14:08:48.116 INFO:tasks.workunit.client.0.vm03.stdout:6/932: truncate d8/d11/d18/d54/f5b 1777980 0 2026-03-10T14:08:48.116 INFO:tasks.workunit.client.0.vm03.stdout:1/889: dread d0/d2/df/d27/f32 [4194304,4194304] 0 2026-03-10T14:08:48.122 INFO:tasks.workunit.client.0.vm03.stdout:1/890: dwrite d0/d42/d9f/fd5 [0,4194304] 0 2026-03-10T14:08:48.134 INFO:tasks.workunit.client.0.vm03.stdout:7/793: write d5/d9/d14/d26/f5c [1505247,49787] 0 2026-03-10T14:08:48.140 INFO:tasks.workunit.client.0.vm03.stdout:7/794: dwrite 
d5/d9/d14/d26/d36/d51/d7b/fc9 [0,4194304] 0 2026-03-10T14:08:48.144 INFO:tasks.workunit.client.0.vm03.stdout:9/867: dwrite d2/d29/d38/f51 [0,4194304] 0 2026-03-10T14:08:48.150 INFO:tasks.workunit.client.0.vm03.stdout:6/933: rmdir d8/d11/da0/dbf/d5c/d60 39 2026-03-10T14:08:48.150 INFO:tasks.workunit.client.0.vm03.stdout:4/900: mknod d5/d9/db/d19/d38/d53/d71/df9/c138 0 2026-03-10T14:08:48.161 INFO:tasks.workunit.client.0.vm03.stdout:7/795: readlink d5/d9/d14/d26/d5f/lff 0 2026-03-10T14:08:48.164 INFO:tasks.workunit.client.0.vm03.stdout:1/891: symlink d0/d2/df/d16/ded/df1/l121 0 2026-03-10T14:08:48.165 INFO:tasks.workunit.client.0.vm03.stdout:1/892: fdatasync d0/f48 0 2026-03-10T14:08:48.166 INFO:tasks.workunit.client.0.vm03.stdout:4/901: creat d5/d9/db/da7/db9/def/f139 x:0 0 0 2026-03-10T14:08:48.167 INFO:tasks.workunit.client.0.vm03.stdout:0/829: getdents d3/d46/dac/d79 0 2026-03-10T14:08:48.170 INFO:tasks.workunit.client.0.vm03.stdout:7/796: mknod d5/c10b 0 2026-03-10T14:08:48.172 INFO:tasks.workunit.client.0.vm03.stdout:0/830: mkdir d3/d11/d2c/d4a/d4b/d89/d108 0 2026-03-10T14:08:48.173 INFO:tasks.workunit.client.0.vm03.stdout:6/934: link d8/d1b/f61 d8/db/d49/d6c/d32/f115 0 2026-03-10T14:08:48.178 INFO:tasks.workunit.client.0.vm03.stdout:6/935: mknod d8/d11/d18/d79/c116 0 2026-03-10T14:08:48.180 INFO:tasks.workunit.client.0.vm03.stdout:0/831: mknod d3/d11/c109 0 2026-03-10T14:08:48.181 INFO:tasks.workunit.client.0.vm03.stdout:1/893: sync 2026-03-10T14:08:48.181 INFO:tasks.workunit.client.0.vm03.stdout:4/902: sync 2026-03-10T14:08:48.184 INFO:tasks.workunit.client.0.vm03.stdout:7/797: getdents d5/d98 0 2026-03-10T14:08:48.187 INFO:tasks.workunit.client.0.vm03.stdout:0/832: creat d3/d16/f10a x:0 0 0 2026-03-10T14:08:48.192 INFO:tasks.workunit.client.0.vm03.stdout:7/798: sync 2026-03-10T14:08:48.203 INFO:tasks.workunit.client.0.vm03.stdout:1/894: dread d0/d2/f9 [0,4194304] 0 2026-03-10T14:08:48.207 INFO:tasks.workunit.client.0.vm03.stdout:4/903: rename d5/d96/fc1 to 
d5/d9/db/d19/d38/d53/d71/df9/f13a 0 2026-03-10T14:08:48.222 INFO:tasks.workunit.client.0.vm03.stdout:0/833: read d3/d11/d2c/d4a/d4b/d89/db6/df4/f87 [2407955,72384] 0 2026-03-10T14:08:48.274 INFO:tasks.workunit.client.0.vm03.stdout:8/986: dwrite da/d3c/d4b/d4c/dba/d114/dbc/db1/f11e [0,4194304] 0 2026-03-10T14:08:48.281 INFO:tasks.workunit.client.0.vm03.stdout:2/863: write d5/d10/da3/dab/fea [361257,100372] 0 2026-03-10T14:08:48.299 INFO:tasks.workunit.client.0.vm03.stdout:6/936: getdents d8/d11/d7a/dfa 0 2026-03-10T14:08:48.300 INFO:tasks.workunit.client.0.vm03.stdout:2/864: dread - d5/d10/d1f/d4f/d76/da7/d40/db0/fd5 zero size 2026-03-10T14:08:48.303 INFO:tasks.workunit.client.0.vm03.stdout:9/868: dwrite d2/d14/f96 [0,4194304] 0 2026-03-10T14:08:48.307 INFO:tasks.workunit.client.0.vm03.stdout:9/869: chown d2/d29/d33/d41/daa/fcf 4867 1 2026-03-10T14:08:48.309 INFO:tasks.workunit.client.0.vm03.stdout:8/987: dwrite da/d3c/d51/d75/dc2/dc6/ffc [0,4194304] 0 2026-03-10T14:08:48.316 INFO:tasks.workunit.client.0.vm03.stdout:1/895: write d0/fa [620850,49672] 0 2026-03-10T14:08:48.325 INFO:tasks.workunit.client.0.vm03.stdout:8/988: rmdir da/d3c/d51/d75/dc2/dc6 39 2026-03-10T14:08:48.327 INFO:tasks.workunit.client.0.vm03.stdout:2/865: rename d5/d10/d1f/d4f/d76/da7/d54/d5f/f6c to d5/d10/dba/dff/f11b 0 2026-03-10T14:08:48.331 INFO:tasks.workunit.client.0.vm03.stdout:1/896: creat d0/d18/d3b/d50/f122 x:0 0 0 2026-03-10T14:08:48.331 INFO:tasks.workunit.client.0.vm03.stdout:1/897: chown d0/d42/d9f 1 1 2026-03-10T14:08:48.333 INFO:tasks.workunit.client.0.vm03.stdout:2/866: dwrite d5/f4d [0,4194304] 0 2026-03-10T14:08:48.346 INFO:tasks.workunit.client.0.vm03.stdout:4/904: dwrite d5/d6e/f93 [0,4194304] 0 2026-03-10T14:08:48.350 INFO:tasks.workunit.client.0.vm03.stdout:9/870: rmdir d2/d14/d2b 39 2026-03-10T14:08:48.357 INFO:tasks.workunit.client.0.vm03.stdout:0/834: creat d3/d11/d76/db5/ddb/dd4/f10b x:0 0 0 2026-03-10T14:08:48.358 INFO:tasks.workunit.client.0.vm03.stdout:8/989: write 
da/d3a/d44/f97 [691977,52864] 0 2026-03-10T14:08:48.361 INFO:tasks.workunit.client.0.vm03.stdout:2/867: creat d5/d10/d1f/d4f/f11c x:0 0 0 2026-03-10T14:08:48.367 INFO:tasks.workunit.client.0.vm03.stdout:8/990: read da/d3c/d51/d75/dc2/f7b [2833749,64511] 0 2026-03-10T14:08:48.369 INFO:tasks.workunit.client.0.vm03.stdout:8/991: chown da/d58/fc4 4688 1 2026-03-10T14:08:48.371 INFO:tasks.workunit.client.0.vm03.stdout:4/905: rename d5/d9/db/d19/d99/fac to d5/d47/d5b/d64/d85/f13b 0 2026-03-10T14:08:48.377 INFO:tasks.workunit.client.0.vm03.stdout:9/871: chown d2/d14/d2b/d34/l3f 62076 1 2026-03-10T14:08:48.384 INFO:tasks.workunit.client.0.vm03.stdout:6/937: getdents d8/db 0 2026-03-10T14:08:48.394 INFO:tasks.workunit.client.0.vm03.stdout:7/799: write d5/d9/d14/d21/d28/f6e [2119088,73209] 0 2026-03-10T14:08:48.426 INFO:tasks.workunit.client.0.vm03.stdout:0/835: stat d3/d16/d21/f96 0 2026-03-10T14:08:48.427 INFO:tasks.workunit.client.0.vm03.stdout:2/868: dread - d5/d10/d1f/fa5 zero size 2026-03-10T14:08:48.431 INFO:tasks.workunit.client.0.vm03.stdout:8/992: read - da/d3a/d44/dfb/dd7/fdc zero size 2026-03-10T14:08:48.437 INFO:tasks.workunit.client.0.vm03.stdout:0/836: dread d3/d46/f63 [4194304,4194304] 0 2026-03-10T14:08:48.438 INFO:tasks.workunit.client.0.vm03.stdout:1/898: creat d0/d18/d3b/f123 x:0 0 0 2026-03-10T14:08:48.440 INFO:tasks.workunit.client.0.vm03.stdout:9/872: fdatasync d2/d14/fb5 0 2026-03-10T14:08:48.441 INFO:tasks.workunit.client.0.vm03.stdout:2/869: mkdir d5/d10/d1f/d11d 0 2026-03-10T14:08:48.443 INFO:tasks.workunit.client.0.vm03.stdout:1/899: mkdir d0/d18/d3b/db7/d115/d124 0 2026-03-10T14:08:48.446 INFO:tasks.workunit.client.0.vm03.stdout:6/938: rename d8/db/d12/f7c to d8/db/d49/f117 0 2026-03-10T14:08:48.446 INFO:tasks.workunit.client.0.vm03.stdout:7/800: rename d5/d9/d35 to d5/d9/d35/d10c 22 2026-03-10T14:08:48.446 INFO:tasks.workunit.client.0.vm03.stdout:9/873: creat d2/d29/d38/f11a x:0 0 0 2026-03-10T14:08:48.451 
INFO:tasks.workunit.client.0.vm03.stdout:0/837: link d3/d46/dac/d79/f8d d3/d11/d2c/d4a/d4b/d89/db9/de6/f10c 0 2026-03-10T14:08:48.451 INFO:tasks.workunit.client.0.vm03.stdout:2/870: read - d5/db4/d74/dcf/fed zero size 2026-03-10T14:08:48.452 INFO:tasks.workunit.client.0.vm03.stdout:1/900: mknod d0/d2/db6/c125 0 2026-03-10T14:08:48.454 INFO:tasks.workunit.client.0.vm03.stdout:8/993: rename da/d3c/d4b/dd2/d10b/c11c to da/d3c/d4b/c13b 0 2026-03-10T14:08:48.454 INFO:tasks.workunit.client.0.vm03.stdout:2/871: chown d5/db4/lda 5228 1 2026-03-10T14:08:48.454 INFO:tasks.workunit.client.0.vm03.stdout:0/838: chown d3/d11/d2c/lc9 2 1 2026-03-10T14:08:48.459 INFO:tasks.workunit.client.0.vm03.stdout:1/901: dread d0/d2/df/d27/d7e/d81/fd2 [0,4194304] 0 2026-03-10T14:08:48.460 INFO:tasks.workunit.client.0.vm03.stdout:6/939: creat d8/d11/f118 x:0 0 0 2026-03-10T14:08:48.460 INFO:tasks.workunit.client.0.vm03.stdout:7/801: rename d5/d9/d14/d26/d36/d51/d7b/l81 to d5/d9/d14/d26/d5f/de2/l10d 0 2026-03-10T14:08:48.461 INFO:tasks.workunit.client.0.vm03.stdout:0/839: symlink d3/d4d/da0/l10d 0 2026-03-10T14:08:48.476 INFO:tasks.workunit.client.0.vm03.stdout:9/874: sync 2026-03-10T14:08:48.476 INFO:tasks.workunit.client.0.vm03.stdout:7/802: symlink d5/d9/d14/d26/d36/db4/d102/l10e 0 2026-03-10T14:08:48.476 INFO:tasks.workunit.client.0.vm03.stdout:9/875: mknod d2/d14/d2b/d76/c11b 0 2026-03-10T14:08:48.476 INFO:tasks.workunit.client.0.vm03.stdout:9/876: stat d2/d29/d33 0 2026-03-10T14:08:48.476 INFO:tasks.workunit.client.0.vm03.stdout:9/877: stat d2/d29/d33/d60/cfb 0 2026-03-10T14:08:48.485 INFO:tasks.workunit.client.0.vm03.stdout:9/878: getdents d2/d29/d33/d60/d8c/de1 0 2026-03-10T14:08:48.485 INFO:tasks.workunit.client.0.vm03.stdout:7/803: dread d5/d9/d14/d26/d5f/f68 [0,4194304] 0 2026-03-10T14:08:48.487 INFO:tasks.workunit.client.0.vm03.stdout:4/906: write d5/d9/db/d19/d34/f4c [5151980,27506] 0 2026-03-10T14:08:48.490 INFO:tasks.workunit.client.0.vm03.stdout:4/907: chown d5/d9/l7a 6 1 
2026-03-10T14:08:48.492 INFO:tasks.workunit.client.0.vm03.stdout:4/908: dread - d5/d47/f65 zero size 2026-03-10T14:08:48.492 INFO:tasks.workunit.client.0.vm03.stdout:7/804: dwrite d5/d9/da7/ffa [0,4194304] 0 2026-03-10T14:08:48.496 INFO:tasks.workunit.client.0.vm03.stdout:9/879: mkdir d2/d29/d33/d60/d7f/d11c 0 2026-03-10T14:08:48.503 INFO:tasks.workunit.client.0.vm03.stdout:4/909: chown d5/c22 18576 1 2026-03-10T14:08:48.503 INFO:tasks.workunit.client.0.vm03.stdout:7/805: dwrite d5/d9/d14/d26/f5c [0,4194304] 0 2026-03-10T14:08:48.505 INFO:tasks.workunit.client.0.vm03.stdout:8/994: write da/d3c/d4b/d4c/dba/d114/dbc/d70/d7d/fa1 [88089,71990] 0 2026-03-10T14:08:48.505 INFO:tasks.workunit.client.0.vm03.stdout:2/872: write d5/d2a/f45 [6977545,40206] 0 2026-03-10T14:08:48.528 INFO:tasks.workunit.client.0.vm03.stdout:6/940: truncate d8/db/d49/d6c/f8e 888317 0 2026-03-10T14:08:48.530 INFO:tasks.workunit.client.0.vm03.stdout:4/910: creat d5/d9/db/d19/d38/d7b/daa/daf/f13c x:0 0 0 2026-03-10T14:08:48.531 INFO:tasks.workunit.client.0.vm03.stdout:0/840: dwrite d3/d16/d21/d9a/fab [0,4194304] 0 2026-03-10T14:08:48.538 INFO:tasks.workunit.client.0.vm03.stdout:1/902: dread d0/d2/df/f31 [0,4194304] 0 2026-03-10T14:08:48.547 INFO:tasks.workunit.client.0.vm03.stdout:7/806: rename d5/d9/d14/d21/d6f/lfd to d5/d9/d14/d26/d36/de1/d84/l10f 0 2026-03-10T14:08:48.552 INFO:tasks.workunit.client.0.vm03.stdout:4/911: mkdir d5/d9/db/d19/d99/d13d 0 2026-03-10T14:08:48.552 INFO:tasks.workunit.client.0.vm03.stdout:6/941: creat d8/d3b/da7/f119 x:0 0 0 2026-03-10T14:08:48.552 INFO:tasks.workunit.client.0.vm03.stdout:0/841: dread - d3/d46/dac/fce zero size 2026-03-10T14:08:48.554 INFO:tasks.workunit.client.0.vm03.stdout:9/880: dread d2/f37 [0,4194304] 0 2026-03-10T14:08:48.556 INFO:tasks.workunit.client.0.vm03.stdout:0/842: dread - d3/d11/d2c/d4a/d4b/d89/db9/ffe zero size 2026-03-10T14:08:48.556 INFO:tasks.workunit.client.0.vm03.stdout:9/881: chown d2/f2f 11854 1 2026-03-10T14:08:48.557 
INFO:tasks.workunit.client.0.vm03.stdout:1/903: dwrite d0/d2/df/d27/d7e/d81/f8d [4194304,4194304] 0 2026-03-10T14:08:48.561 INFO:tasks.workunit.client.0.vm03.stdout:6/942: write d8/db/d12/d64/f109 [703714,32076] 0 2026-03-10T14:08:48.568 INFO:tasks.workunit.client.0.vm03.stdout:8/995: dwrite da/d3c/d4b/d69/fb5 [0,4194304] 0 2026-03-10T14:08:48.574 INFO:tasks.workunit.client.0.vm03.stdout:7/807: symlink d5/d9/d14/d26/d36/d51/l110 0 2026-03-10T14:08:48.574 INFO:tasks.workunit.client.0.vm03.stdout:9/882: mknod d2/d29/d33/d60/d8c/de1/dbd/c11d 0 2026-03-10T14:08:48.574 INFO:tasks.workunit.client.0.vm03.stdout:8/996: write da/d24/d49/fd0 [1563725,104552] 0 2026-03-10T14:08:48.575 INFO:tasks.workunit.client.0.vm03.stdout:2/873: link d5/d10/f60 d5/d10/d1f/d4f/d76/da7/d54/d5f/f11e 0 2026-03-10T14:08:48.575 INFO:tasks.workunit.client.0.vm03.stdout:6/943: fdatasync f0 0 2026-03-10T14:08:48.578 INFO:tasks.workunit.client.0.vm03.stdout:4/912: symlink d5/d9/db/d19/d38/l13e 0 2026-03-10T14:08:48.585 INFO:tasks.workunit.client.0.vm03.stdout:1/904: mkdir d0/d2/df/d126 0 2026-03-10T14:08:48.592 INFO:tasks.workunit.client.0.vm03.stdout:6/944: mknod d8/db/df/dcd/c11a 0 2026-03-10T14:08:48.594 INFO:tasks.workunit.client.0.vm03.stdout:8/997: truncate da/d3c/f48 7783354 0 2026-03-10T14:08:48.598 INFO:tasks.workunit.client.0.vm03.stdout:0/843: mkdir d3/d11/d2c/d10e 0 2026-03-10T14:08:48.600 INFO:tasks.workunit.client.0.vm03.stdout:7/808: dwrite d5/d9/d14/f2f [0,4194304] 0 2026-03-10T14:08:48.607 INFO:tasks.workunit.client.0.vm03.stdout:1/905: symlink d0/d2/df/l127 0 2026-03-10T14:08:48.607 INFO:tasks.workunit.client.0.vm03.stdout:7/809: dread - d5/d9/d14/d26/d39/d92/fb5 zero size 2026-03-10T14:08:48.608 INFO:tasks.workunit.client.0.vm03.stdout:1/906: truncate d0/d2/df/d16/d41/f119 893153 0 2026-03-10T14:08:48.613 INFO:tasks.workunit.client.0.vm03.stdout:0/844: symlink d3/d11/d76/dd3/l10f 0 2026-03-10T14:08:48.615 INFO:tasks.workunit.client.0.vm03.stdout:9/883: dread d2/d14/d2b/d43/fcc 
[0,4194304] 0 2026-03-10T14:08:48.624 INFO:tasks.workunit.client.0.vm03.stdout:6/945: symlink d8/db/d49/d58/ddc/l11b 0 2026-03-10T14:08:48.627 INFO:tasks.workunit.client.0.vm03.stdout:8/998: sync 2026-03-10T14:08:48.647 INFO:tasks.workunit.client.0.vm03.stdout:8/999: sync 2026-03-10T14:08:48.668 INFO:tasks.workunit.client.0.vm03.stdout:2/874: write d5/d10/d1f/d4f/d76/da7/d104/fbf [427422,128406] 0 2026-03-10T14:08:48.677 INFO:tasks.workunit.client.0.vm03.stdout:4/913: dwrite d5/d47/d5b/f84 [4194304,4194304] 0 2026-03-10T14:08:48.682 INFO:tasks.workunit.client.0.vm03.stdout:7/810: write d5/d9/f19 [3548102,24375] 0 2026-03-10T14:08:48.682 INFO:tasks.workunit.client.0.vm03.stdout:4/914: chown d5/d9/db/f29 1440 1 2026-03-10T14:08:48.684 INFO:tasks.workunit.client.0.vm03.stdout:7/811: stat d5/d9/d14/d26/d5f/de2 0 2026-03-10T14:08:48.687 INFO:tasks.workunit.client.0.vm03.stdout:1/907: dwrite d0/d18/d1d/f5e [0,4194304] 0 2026-03-10T14:08:48.696 INFO:tasks.workunit.client.0.vm03.stdout:9/884: mkdir d2/d14/d2b/d11e 0 2026-03-10T14:08:48.696 INFO:tasks.workunit.client.0.vm03.stdout:6/946: getdents d8/d11/d7a/dfa 0 2026-03-10T14:08:48.696 INFO:tasks.workunit.client.0.vm03.stdout:6/947: stat d8/d3b/da7/ccc 0 2026-03-10T14:08:48.702 INFO:tasks.workunit.client.0.vm03.stdout:1/908: creat d0/d2/df/d91/dda/f128 x:0 0 0 2026-03-10T14:08:48.705 INFO:tasks.workunit.client.0.vm03.stdout:2/875: dread d5/d10/d17/f18 [0,4194304] 0 2026-03-10T14:08:48.744 INFO:tasks.workunit.client.0.vm03.stdout:6/948: getdents d8/db/df/dcd 0 2026-03-10T14:08:48.759 INFO:tasks.workunit.client.0.vm03.stdout:4/915: dread d5/d9/db/f24 [4194304,4194304] 0 2026-03-10T14:08:48.763 INFO:tasks.workunit.client.0.vm03.stdout:1/909: rename d0/d2/df/d91/dda/f117 to d0/d18/d3b/f129 0 2026-03-10T14:08:48.786 INFO:tasks.workunit.client.0.vm03.stdout:4/916: dread d5/d9/db/d19/d38/d53/d55/f76 [0,4194304] 0 2026-03-10T14:08:48.787 INFO:tasks.workunit.client.0.vm03.stdout:4/917: dread - d5/d9/fd6 zero size 
2026-03-10T14:08:48.797 INFO:tasks.workunit.client.0.vm03.stdout:2/876: rename d5/d10/d1f/d4f/cd0 to d5/d2a/c11f 0 2026-03-10T14:08:48.798 INFO:tasks.workunit.client.0.vm03.stdout:4/918: chown d5/d9/db/c57 1575761384 1 2026-03-10T14:08:48.803 INFO:tasks.workunit.client.0.vm03.stdout:2/877: mknod d5/d10/d1f/d4f/d76/da7/d40/d59/d7b/de1/c120 0 2026-03-10T14:08:48.807 INFO:tasks.workunit.client.0.vm03.stdout:2/878: read d5/d10/d1f/d4f/d76/da7/d40/f68 [411114,91333] 0 2026-03-10T14:08:48.810 INFO:tasks.workunit.client.0.vm03.stdout:7/812: rmdir d5/d9/d14/d26/d36/de1/d84 39 2026-03-10T14:08:48.812 INFO:tasks.workunit.client.0.vm03.stdout:2/879: dwrite d5/f9f [0,4194304] 0 2026-03-10T14:08:48.813 INFO:tasks.workunit.client.0.vm03.stdout:2/880: stat d5/db4/d74 0 2026-03-10T14:08:48.816 INFO:tasks.workunit.client.0.vm03.stdout:2/881: chown d5/d10/d1f/d4f/d76/dfc 293255 1 2026-03-10T14:08:48.824 INFO:tasks.workunit.client.0.vm03.stdout:2/882: truncate d5/db4/fd9 78187 0 2026-03-10T14:08:48.824 INFO:tasks.workunit.client.0.vm03.stdout:7/813: dwrite d5/d9/d14/d26/d5f/de2/f105 [0,4194304] 0 2026-03-10T14:08:48.835 INFO:tasks.workunit.client.0.vm03.stdout:7/814: creat d5/d9/d14/d26/d39/f111 x:0 0 0 2026-03-10T14:08:48.838 INFO:tasks.workunit.client.0.vm03.stdout:7/815: mkdir d5/d9/d14/d26/d5f/d89/d112 0 2026-03-10T14:08:48.838 INFO:tasks.workunit.client.0.vm03.stdout:7/816: chown d5/d9/d14/d26/d36/d51/dc8/de3 9263154 1 2026-03-10T14:08:48.910 INFO:tasks.workunit.client.0.vm03.stdout:9/885: rmdir d2/d14 39 2026-03-10T14:08:48.913 INFO:tasks.workunit.client.0.vm03.stdout:9/886: rmdir d2/d29/dcd/dd8 39 2026-03-10T14:08:48.915 INFO:tasks.workunit.client.0.vm03.stdout:9/887: read d2/d29/d33/fa9 [959351,29430] 0 2026-03-10T14:08:48.917 INFO:tasks.workunit.client.0.vm03.stdout:9/888: fsync d2/d29/d33/d41/d95/f117 0 2026-03-10T14:08:48.918 INFO:tasks.workunit.client.0.vm03.stdout:9/889: fsync d2/d29/d33/d41/d95/fae 0 2026-03-10T14:08:48.971 
INFO:tasks.workunit.client.0.vm03.stdout:0/845: mkdir d3/d11/d2c/d4a/d4b/d110 0 2026-03-10T14:08:48.971 INFO:tasks.workunit.client.0.vm03.stdout:9/890: dread d2/d14/d2b/d43/f7c [0,4194304] 0 2026-03-10T14:08:48.975 INFO:tasks.workunit.client.0.vm03.stdout:9/891: unlink d2/d29/d33/d41/d95/f103 0 2026-03-10T14:08:48.981 INFO:tasks.workunit.client.0.vm03.stdout:9/892: rmdir d2/d14/d2b/d79/d8a/dff 39 2026-03-10T14:08:48.989 INFO:tasks.workunit.client.0.vm03.stdout:9/893: link d2/fd2 d2/d29/d33/d60/d8c/dd3/f11f 0 2026-03-10T14:08:48.993 INFO:tasks.workunit.client.0.vm03.stdout:9/894: mknod d2/d29/dcd/c120 0 2026-03-10T14:08:48.993 INFO:tasks.workunit.client.0.vm03.stdout:9/895: readlink d2/l12 0 2026-03-10T14:08:48.997 INFO:tasks.workunit.client.0.vm03.stdout:9/896: creat d2/d29/d33/d41/daa/f121 x:0 0 0 2026-03-10T14:08:48.997 INFO:tasks.workunit.client.0.vm03.stdout:6/949: write d8/d11/da0/dca/fe9 [698710,52211] 0 2026-03-10T14:08:49.005 INFO:tasks.workunit.client.0.vm03.stdout:1/910: dwrite d0/d2/df/fce [0,4194304] 0 2026-03-10T14:08:49.012 INFO:tasks.workunit.client.0.vm03.stdout:1/911: mknod d0/d42/d9f/c12a 0 2026-03-10T14:08:49.022 INFO:tasks.workunit.client.0.vm03.stdout:9/897: getdents d2/d29/d33/d41 0 2026-03-10T14:08:49.023 INFO:tasks.workunit.client.0.vm03.stdout:9/898: chown d2/d29/dcd/dd8/fed 146632004 1 2026-03-10T14:08:49.024 INFO:tasks.workunit.client.0.vm03.stdout:9/899: fdatasync d2/d14/f61 0 2026-03-10T14:08:49.026 INFO:tasks.workunit.client.0.vm03.stdout:9/900: symlink d2/d14/d2b/d76/l122 0 2026-03-10T14:08:49.026 INFO:tasks.workunit.client.0.vm03.stdout:9/901: dread - d2/d29/d38/dec/f112 zero size 2026-03-10T14:08:49.030 INFO:tasks.workunit.client.0.vm03.stdout:9/902: dwrite d2/d29/d9a/fb7 [0,4194304] 0 2026-03-10T14:08:49.032 INFO:tasks.workunit.client.0.vm03.stdout:9/903: dread - d2/d14/d2b/d43/f110 zero size 2026-03-10T14:08:49.033 INFO:tasks.workunit.client.0.vm03.stdout:9/904: chown d2/d29/d33/d60/cfb 8897699 1 2026-03-10T14:08:49.033 
INFO:tasks.workunit.client.0.vm03.stdout:9/905: chown d2/d29/d38/l3b 925537 1 2026-03-10T14:08:49.035 INFO:tasks.workunit.client.0.vm03.stdout:9/906: mknod d2/d14/d2b/d79/d81/df3/c123 0 2026-03-10T14:08:49.047 INFO:tasks.workunit.client.0.vm03.stdout:4/919: write d5/d9/db/f3d [4145415,40388] 0 2026-03-10T14:08:49.047 INFO:tasks.workunit.client.0.vm03.stdout:7/817: rename d5/d9/d14/d26/d36/d51/l77 to d5/d9/d14/d26/d5f/de7/l113 0 2026-03-10T14:08:49.054 INFO:tasks.workunit.client.0.vm03.stdout:4/920: creat d5/d9/db/da7/db9/def/d11f/f13f x:0 0 0 2026-03-10T14:08:49.056 INFO:tasks.workunit.client.0.vm03.stdout:7/818: link d5/d9/d14/d26/d5f/de7/cac d5/d9/d14/d26/d5f/dce/c114 0 2026-03-10T14:08:49.056 INFO:tasks.workunit.client.0.vm03.stdout:4/921: dread - d5/d9/db/da7/dcc/d100/d8e/ffe zero size 2026-03-10T14:08:49.062 INFO:tasks.workunit.client.0.vm03.stdout:2/883: write d5/d10/d1f/d4f/d76/fca [949258,25266] 0 2026-03-10T14:08:49.068 INFO:tasks.workunit.client.0.vm03.stdout:4/922: readlink d5/d9/db/d19/d38/d53/d71/lfa 0 2026-03-10T14:08:49.070 INFO:tasks.workunit.client.0.vm03.stdout:4/923: getdents d5/db4/dd2 0 2026-03-10T14:08:49.076 INFO:tasks.workunit.client.0.vm03.stdout:4/924: fdatasync d5/d9/db/d19/d38/d53/f83 0 2026-03-10T14:08:49.086 INFO:tasks.workunit.client.0.vm03.stdout:4/925: unlink d5/d9/db/d19/d38/ded/c127 0 2026-03-10T14:08:49.103 INFO:tasks.workunit.client.0.vm03.stdout:0/846: truncate d3/d4d/d30/f45 675583 0 2026-03-10T14:08:49.111 INFO:tasks.workunit.client.0.vm03.stdout:0/847: chown d3/d11/d76/dd3/cfb 1305 1 2026-03-10T14:08:49.114 INFO:tasks.workunit.client.0.vm03.stdout:6/950: write d8/d11/da0/dbf/d5c/f68 [5628903,19102] 0 2026-03-10T14:08:49.114 INFO:tasks.workunit.client.0.vm03.stdout:1/912: write d0/f10 [454314,39035] 0 2026-03-10T14:08:49.115 INFO:tasks.workunit.client.0.vm03.stdout:6/951: write d8/d11/d18/d79/d80/fd5 [4297656,113424] 0 2026-03-10T14:08:49.120 INFO:tasks.workunit.client.0.vm03.stdout:9/907: write d2/d14/f30 [405706,2530] 0 
2026-03-10T14:08:49.127 INFO:tasks.workunit.client.0.vm03.stdout:2/884: dwrite d5/d2a/f6e [4194304,4194304] 0 2026-03-10T14:08:49.134 INFO:tasks.workunit.client.0.vm03.stdout:6/952: creat d8/d11/d18/d79/d80/f11c x:0 0 0 2026-03-10T14:08:49.134 INFO:tasks.workunit.client.0.vm03.stdout:2/885: rename d5/d10/d1f/d4f/d76/da7/d40/d59 to d5/db4/d74/d121 0 2026-03-10T14:08:49.134 INFO:tasks.workunit.client.0.vm03.stdout:9/908: mknod d2/d29/d38/c124 0 2026-03-10T14:08:49.142 INFO:tasks.workunit.client.0.vm03.stdout:1/913: getdents d0/d18/d3b 0 2026-03-10T14:08:49.143 INFO:tasks.workunit.client.0.vm03.stdout:2/886: mknod d5/d10/d1f/d4f/d76/da7/d40/db0/c122 0 2026-03-10T14:08:49.144 INFO:tasks.workunit.client.0.vm03.stdout:1/914: mkdir d0/d2/db6/d12b 0 2026-03-10T14:08:49.153 INFO:tasks.workunit.client.0.vm03.stdout:1/915: mkdir d0/d12c 0 2026-03-10T14:08:49.160 INFO:tasks.workunit.client.0.vm03.stdout:1/916: creat d0/d18/d3b/d50/f12d x:0 0 0 2026-03-10T14:08:49.165 INFO:tasks.workunit.client.0.vm03.stdout:2/887: fsync d5/d10/d1f/d4f/d76/da7/d40/db0/f100 0 2026-03-10T14:08:49.165 INFO:tasks.workunit.client.0.vm03.stdout:4/926: dwrite d5/d9/db/da7/dcc/d100/fca [0,4194304] 0 2026-03-10T14:08:49.172 INFO:tasks.workunit.client.0.vm03.stdout:2/888: dread d5/f2d [0,4194304] 0 2026-03-10T14:08:49.177 INFO:tasks.workunit.client.0.vm03.stdout:1/917: rename d0/d18/d3b/f129 to d0/d2/d71/d90/f12e 0 2026-03-10T14:08:49.179 INFO:tasks.workunit.client.0.vm03.stdout:4/927: rename d5/db4/fb5 to d5/d9/db/d19/d38/d53/d71/df9/f140 0 2026-03-10T14:08:49.181 INFO:tasks.workunit.client.0.vm03.stdout:9/909: dread d2/d29/d33/fa9 [0,4194304] 0 2026-03-10T14:08:49.187 INFO:tasks.workunit.client.0.vm03.stdout:2/889: mknod d5/d10/d31/c123 0 2026-03-10T14:08:49.187 INFO:tasks.workunit.client.0.vm03.stdout:0/848: write d3/d11/d76/db5/ddb/ff1 [239755,16826] 0 2026-03-10T14:08:49.187 INFO:tasks.workunit.client.0.vm03.stdout:4/928: unlink d5/l6a 0 2026-03-10T14:08:49.188 
INFO:tasks.workunit.client.0.vm03.stdout:1/918: mkdir d0/d42/d12f 0 2026-03-10T14:08:49.188 INFO:tasks.workunit.client.0.vm03.stdout:0/849: readlink d3/d4d/d30/lf6 0 2026-03-10T14:08:49.195 INFO:tasks.workunit.client.0.vm03.stdout:1/919: creat d0/d2/df/dab/dc3/f130 x:0 0 0 2026-03-10T14:08:49.196 INFO:tasks.workunit.client.0.vm03.stdout:1/920: readlink d0/d2/df/d27/l4e 0 2026-03-10T14:08:49.196 INFO:tasks.workunit.client.0.vm03.stdout:9/910: dread d2/d14/d2b/d34/f7a [0,4194304] 0 2026-03-10T14:08:49.197 INFO:tasks.workunit.client.0.vm03.stdout:1/921: truncate d0/d2/d71/d90/f12e 1019190 0 2026-03-10T14:08:49.200 INFO:tasks.workunit.client.0.vm03.stdout:9/911: unlink d2/c24 0 2026-03-10T14:08:49.216 INFO:tasks.workunit.client.0.vm03.stdout:9/912: write d2/d14/f61 [867259,100670] 0 2026-03-10T14:08:49.217 INFO:tasks.workunit.client.0.vm03.stdout:4/929: dwrite d5/d9/db/da7/dcc/d100/fca [0,4194304] 0 2026-03-10T14:08:49.217 INFO:tasks.workunit.client.0.vm03.stdout:1/922: fdatasync d0/d18/fa6 0 2026-03-10T14:08:49.217 INFO:tasks.workunit.client.0.vm03.stdout:1/923: stat d0/d12c 0 2026-03-10T14:08:49.217 INFO:tasks.workunit.client.0.vm03.stdout:0/850: dwrite d3/d4d/f2a [0,4194304] 0 2026-03-10T14:08:49.217 INFO:tasks.workunit.client.0.vm03.stdout:4/930: fsync d5/d47/d5b/d64/d85/f9d 0 2026-03-10T14:08:49.217 INFO:tasks.workunit.client.0.vm03.stdout:4/931: chown d5/d47/d5b/lc5 11 1 2026-03-10T14:08:49.217 INFO:tasks.workunit.client.0.vm03.stdout:9/913: symlink d2/d14/d2b/d43/ddd/l125 0 2026-03-10T14:08:49.218 INFO:tasks.workunit.client.0.vm03.stdout:9/914: dread d2/d14/d2b/d43/f7c [0,4194304] 0 2026-03-10T14:08:49.229 INFO:tasks.workunit.client.0.vm03.stdout:1/924: creat d0/d18/f131 x:0 0 0 2026-03-10T14:08:49.231 INFO:tasks.workunit.client.0.vm03.stdout:4/932: dwrite d5/d9/db/d19/d38/f77 [0,4194304] 0 2026-03-10T14:08:49.245 INFO:tasks.workunit.client.0.vm03.stdout:1/925: stat d0/d18/d1d/f5e 0 2026-03-10T14:08:49.245 INFO:tasks.workunit.client.0.vm03.stdout:4/933: dread - 
d5/d9/db/d19/d38/ded/f110 zero size 2026-03-10T14:08:49.245 INFO:tasks.workunit.client.0.vm03.stdout:1/926: symlink d0/d2/df/l132 0 2026-03-10T14:08:49.245 INFO:tasks.workunit.client.0.vm03.stdout:4/934: symlink d5/d47/d5b/dbe/l141 0 2026-03-10T14:08:49.247 INFO:tasks.workunit.client.0.vm03.stdout:2/890: sync 2026-03-10T14:08:49.248 INFO:tasks.workunit.client.0.vm03.stdout:4/935: truncate d5/d9/db/d19/d34/f6b 3645147 0 2026-03-10T14:08:49.251 INFO:tasks.workunit.client.0.vm03.stdout:2/891: dread d5/d10/d1f/d4f/d76/da7/d40/f105 [0,4194304] 0 2026-03-10T14:08:49.252 INFO:tasks.workunit.client.0.vm03.stdout:1/927: dwrite d0/d18/f131 [0,4194304] 0 2026-03-10T14:08:49.255 INFO:tasks.workunit.client.0.vm03.stdout:1/928: chown d0/d2/d71/d90/fc1 1811 1 2026-03-10T14:08:49.261 INFO:tasks.workunit.client.0.vm03.stdout:1/929: chown d0/d2/df/d16 873126 1 2026-03-10T14:08:49.263 INFO:tasks.workunit.client.0.vm03.stdout:9/915: rmdir d2/d29/d33 39 2026-03-10T14:08:49.273 INFO:tasks.workunit.client.0.vm03.stdout:4/936: unlink d5/d9/db/f48 0 2026-03-10T14:08:49.276 INFO:tasks.workunit.client.0.vm03.stdout:1/930: symlink d0/d12c/l133 0 2026-03-10T14:08:49.278 INFO:tasks.workunit.client.0.vm03.stdout:1/931: read - d0/d2/df/d16/d41/ffc zero size 2026-03-10T14:08:49.279 INFO:tasks.workunit.client.0.vm03.stdout:6/953: dread d8/db/dd0/fef [0,4194304] 0 2026-03-10T14:08:49.285 INFO:tasks.workunit.client.0.vm03.stdout:2/892: rename d5/d10/d1f/f3f to d5/d10/d1f/d4f/d76/da7/f124 0 2026-03-10T14:08:49.285 INFO:tasks.workunit.client.0.vm03.stdout:6/954: mkdir d8/d1b/d1c/d11d 0 2026-03-10T14:08:49.289 INFO:tasks.workunit.client.0.vm03.stdout:6/955: fdatasync d8/d3b/da7/fc9 0 2026-03-10T14:08:49.298 INFO:tasks.workunit.client.0.vm03.stdout:2/893: truncate d5/d10/d1f/d4f/d76/da7/fe3 702582 0 2026-03-10T14:08:49.298 INFO:tasks.workunit.client.0.vm03.stdout:1/932: dread d0/d2/df/d27/f52 [0,4194304] 0 2026-03-10T14:08:49.298 INFO:tasks.workunit.client.0.vm03.stdout:6/956: truncate d8/d11/da0/dbf/fde 
800733 0 2026-03-10T14:08:49.299 INFO:tasks.workunit.client.0.vm03.stdout:6/957: stat d8/db/l6f 0 2026-03-10T14:08:49.300 INFO:tasks.workunit.client.0.vm03.stdout:1/933: readlink d0/d2/df/d27/dd7/l102 0 2026-03-10T14:08:49.303 INFO:tasks.workunit.client.0.vm03.stdout:2/894: creat d5/d10/d1f/d4f/d76/da7/d54/d5f/f125 x:0 0 0 2026-03-10T14:08:49.304 INFO:tasks.workunit.client.0.vm03.stdout:2/895: write d5/db4/d74/d83/fc4 [2599997,38465] 0 2026-03-10T14:08:49.305 INFO:tasks.workunit.client.0.vm03.stdout:1/934: mknod d0/d42/d9f/dc9/c134 0 2026-03-10T14:08:49.305 INFO:tasks.workunit.client.0.vm03.stdout:1/935: readlink d0/d2/db6/le9 0 2026-03-10T14:08:49.307 INFO:tasks.workunit.client.0.vm03.stdout:6/958: rename d8/db/d49/d58/lce to d8/d11/l11e 0 2026-03-10T14:08:49.314 INFO:tasks.workunit.client.0.vm03.stdout:1/936: dwrite d0/d2/df/d16/d41/feb [0,4194304] 0 2026-03-10T14:08:49.316 INFO:tasks.workunit.client.0.vm03.stdout:6/959: mkdir d8/d11/d18/d79/d80/df1/df6/d11f 0 2026-03-10T14:08:49.317 INFO:tasks.workunit.client.0.vm03.stdout:1/937: truncate d0/d2/df/d16/ded/f10f 809603 0 2026-03-10T14:08:49.322 INFO:tasks.workunit.client.0.vm03.stdout:1/938: mkdir d0/d2/df/d27/d7e/d81/d135 0 2026-03-10T14:08:49.355 INFO:tasks.workunit.client.0.vm03.stdout:2/896: sync 2026-03-10T14:08:49.357 INFO:tasks.workunit.client.0.vm03.stdout:2/897: chown d5/d2a/d110/f10b 862382 1 2026-03-10T14:08:49.366 INFO:tasks.workunit.client.0.vm03.stdout:2/898: fsync d5/d10/d1f/d4f/d76/da7/d104/fec 0 2026-03-10T14:08:49.375 INFO:tasks.workunit.client.0.vm03.stdout:2/899: mkdir d5/d2a/d110/d126 0 2026-03-10T14:08:49.380 INFO:tasks.workunit.client.0.vm03.stdout:2/900: dwrite d5/d10/d17/fb6 [4194304,4194304] 0 2026-03-10T14:08:49.392 INFO:tasks.workunit.client.0.vm03.stdout:2/901: unlink d5/d10/d1f/d4f/d76/da7/d54/d5f/f125 0 2026-03-10T14:08:49.395 INFO:tasks.workunit.client.0.vm03.stdout:2/902: unlink d5/d35/f49 0 2026-03-10T14:08:49.448 INFO:tasks.workunit.client.0.vm03.stdout:0/851: dwrite 
d3/d46/d5e/f7c [0,4194304] 0 2026-03-10T14:08:49.457 INFO:tasks.workunit.client.0.vm03.stdout:9/916: write d2/d14/d2b/d43/f7c [991554,14754] 0 2026-03-10T14:08:49.460 INFO:tasks.workunit.client.0.vm03.stdout:4/937: write d5/d9/db/da7/dcc/d100/fe6 [4431492,99099] 0 2026-03-10T14:08:49.460 INFO:tasks.workunit.client.0.vm03.stdout:9/917: fdatasync d2/d14/d2b/d43/f110 0 2026-03-10T14:08:49.464 INFO:tasks.workunit.client.0.vm03.stdout:4/938: write d5/d9/db/da7/dcc/d100/fe6 [4616033,52124] 0 2026-03-10T14:08:49.476 INFO:tasks.workunit.client.0.vm03.stdout:1/939: dread d0/d18/d3b/d50/fd6 [0,4194304] 0 2026-03-10T14:08:49.486 INFO:tasks.workunit.client.0.vm03.stdout:4/939: unlink d5/d9/db/ldf 0 2026-03-10T14:08:49.492 INFO:tasks.workunit.client.0.vm03.stdout:4/940: dread - d5/d9/db/d19/d38/ded/f110 zero size 2026-03-10T14:08:49.492 INFO:tasks.workunit.client.0.vm03.stdout:4/941: chown d5/d9/db 1 1 2026-03-10T14:08:49.492 INFO:tasks.workunit.client.0.vm03.stdout:6/960: dwrite d8/d3b/da7/fc9 [0,4194304] 0 2026-03-10T14:08:49.492 INFO:tasks.workunit.client.0.vm03.stdout:9/918: sync 2026-03-10T14:08:49.507 INFO:tasks.workunit.client.0.vm03.stdout:4/942: rename d5/d9/db/d19/d38/d53/ldd to d5/d9/db/da7/dcc/d100/l142 0 2026-03-10T14:08:49.530 INFO:tasks.workunit.client.0.vm03.stdout:1/940: dread d0/f11 [0,4194304] 0 2026-03-10T14:08:49.535 INFO:tasks.workunit.client.0.vm03.stdout:9/919: link d2/d29/d33/d60/d8c/de1/f93 d2/d29/dcd/dd8/f126 0 2026-03-10T14:08:49.535 INFO:tasks.workunit.client.0.vm03.stdout:2/903: write d5/d2a/fcc [123577,9812] 0 2026-03-10T14:08:49.546 INFO:tasks.workunit.client.0.vm03.stdout:1/941: mkdir d0/d2/df/d27/d136 0 2026-03-10T14:08:49.548 INFO:tasks.workunit.client.0.vm03.stdout:1/942: readlink d0/l63 0 2026-03-10T14:08:49.549 INFO:tasks.workunit.client.0.vm03.stdout:9/920: dwrite d2/d14/d2b/d79/fb8 [0,4194304] 0 2026-03-10T14:08:49.553 INFO:tasks.workunit.client.0.vm03.stdout:2/904: fsync d5/d35/fb2 0 2026-03-10T14:08:49.556 
INFO:tasks.workunit.client.0.vm03.stdout:1/943: sync 2026-03-10T14:08:49.564 INFO:tasks.workunit.client.0.vm03.stdout:9/921: symlink d2/d29/dcd/d72/l127 0 2026-03-10T14:08:49.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:49 vm04.local ceph-mon[55966]: pgmap v11: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 46 MiB/s rd, 95 MiB/s wr, 290 op/s 2026-03-10T14:08:49.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:49 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:49.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:49 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:49.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:49 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:49.568 INFO:tasks.workunit.client.0.vm03.stdout:0/852: write d3/d11/d2c/d4a/d4b/d89/fb4 [394344,2859] 0 2026-03-10T14:08:49.569 INFO:tasks.workunit.client.0.vm03.stdout:0/853: chown d3/d16/c42 31844656 1 2026-03-10T14:08:49.572 INFO:tasks.workunit.client.0.vm03.stdout:2/905: mkdir d5/db4/d74/d127 0 2026-03-10T14:08:49.574 INFO:tasks.workunit.client.0.vm03.stdout:0/854: chown d3/d11/d2c/d4a/l4f 21 1 2026-03-10T14:08:49.576 INFO:tasks.workunit.client.0.vm03.stdout:9/922: rename d2/d14/d2b/d43/f110 to d2/d29/ddc/f128 0 2026-03-10T14:08:49.577 INFO:tasks.workunit.client.0.vm03.stdout:2/906: unlink f4 0 2026-03-10T14:08:49.579 INFO:tasks.workunit.client.0.vm03.stdout:2/907: chown d5/db4/d74/d121/d85 7507920 1 2026-03-10T14:08:49.582 INFO:tasks.workunit.client.0.vm03.stdout:2/908: creat d5/d10/d1f/d4f/d76/da7/d54/d5f/f128 x:0 0 0 2026-03-10T14:08:49.585 INFO:tasks.workunit.client.0.vm03.stdout:9/923: sync 2026-03-10T14:08:49.586 INFO:tasks.workunit.client.0.vm03.stdout:9/924: chown d2/d14/d2b/d34/l89 30390 1 2026-03-10T14:08:49.596 
INFO:tasks.workunit.client.0.vm03.stdout:0/855: dread d3/d11/f13 [0,4194304] 0 2026-03-10T14:08:49.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:49 vm03.local ceph-mon[49718]: pgmap v11: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 46 MiB/s rd, 95 MiB/s wr, 290 op/s 2026-03-10T14:08:49.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:49 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:49.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:49 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:49.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:49 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:49.614 INFO:tasks.workunit.client.0.vm03.stdout:4/943: write d5/d9/db/d19/d38/d7b/daa/daf/fe1 [370850,49679] 0 2026-03-10T14:08:49.615 INFO:tasks.workunit.client.0.vm03.stdout:4/944: chown d5/d47/d62/d8a/da0 24809 1 2026-03-10T14:08:49.621 INFO:tasks.workunit.client.0.vm03.stdout:9/925: dread d2/d29/d33/d41/d46/ff9 [0,4194304] 0 2026-03-10T14:08:49.625 INFO:tasks.workunit.client.0.vm03.stdout:4/945: mknod d5/d9/db/d19/d38/ded/c143 0 2026-03-10T14:08:49.625 INFO:tasks.workunit.client.0.vm03.stdout:9/926: unlink d2/lb0 0 2026-03-10T14:08:49.626 INFO:tasks.workunit.client.0.vm03.stdout:9/927: truncate d2/d14/f30 1242090 0 2026-03-10T14:08:49.628 INFO:tasks.workunit.client.0.vm03.stdout:4/946: creat d5/d9/db/da7/db9/d121/f144 x:0 0 0 2026-03-10T14:08:49.631 INFO:tasks.workunit.client.0.vm03.stdout:4/947: rmdir d5/d9/db/d19/d99 39 2026-03-10T14:08:49.640 INFO:tasks.workunit.client.0.vm03.stdout:9/928: dwrite d2/d29/f35 [0,4194304] 0 2026-03-10T14:08:49.653 INFO:tasks.workunit.client.0.vm03.stdout:1/944: dwrite d0/fe0 [0,4194304] 0 2026-03-10T14:08:49.657 INFO:tasks.workunit.client.0.vm03.stdout:2/909: dwrite 
d5/fa [0,4194304] 0 2026-03-10T14:08:49.660 INFO:tasks.workunit.client.0.vm03.stdout:9/929: mkdir d2/d29/d38/d129 0 2026-03-10T14:08:49.665 INFO:tasks.workunit.client.0.vm03.stdout:4/948: creat d5/d9/f145 x:0 0 0 2026-03-10T14:08:49.665 INFO:tasks.workunit.client.0.vm03.stdout:2/910: chown d5/d10/d1f/d4f/d76/da7/d40/c65 1339889 1 2026-03-10T14:08:49.665 INFO:tasks.workunit.client.0.vm03.stdout:9/930: write d2/d14/d100/f10d [389877,69011] 0 2026-03-10T14:08:49.665 INFO:tasks.workunit.client.0.vm03.stdout:4/949: chown d5/db4/dd2/dff 15867 1 2026-03-10T14:08:49.669 INFO:tasks.workunit.client.0.vm03.stdout:4/950: stat d5/d9/db/da7/d119/f135 0 2026-03-10T14:08:49.674 INFO:tasks.workunit.client.0.vm03.stdout:0/856: dwrite d3/d11/d2c/d4a/d4b/d89/db6/df4/f87 [4194304,4194304] 0 2026-03-10T14:08:49.681 INFO:tasks.workunit.client.0.vm03.stdout:4/951: sync 2026-03-10T14:08:49.690 INFO:tasks.workunit.client.0.vm03.stdout:1/945: rename d0/d2/d71/l85 to d0/d18/l137 0 2026-03-10T14:08:49.694 INFO:tasks.workunit.client.0.vm03.stdout:9/931: creat d2/d29/d33/d60/d7f/d11c/f12a x:0 0 0 2026-03-10T14:08:49.694 INFO:tasks.workunit.client.0.vm03.stdout:1/946: fdatasync d0/d2/df/d27/d7e/d81/fe2 0 2026-03-10T14:08:49.694 INFO:tasks.workunit.client.0.vm03.stdout:9/932: fsync d2/d29/d33/d60/f65 0 2026-03-10T14:08:49.698 INFO:tasks.workunit.client.0.vm03.stdout:0/857: mknod d3/d11/d2c/d4a/d4b/d89/d108/c111 0 2026-03-10T14:08:49.704 INFO:tasks.workunit.client.0.vm03.stdout:4/952: rename d5/f10c to d5/d47/d5b/dbe/f146 0 2026-03-10T14:08:49.711 INFO:tasks.workunit.client.0.vm03.stdout:0/858: creat d3/d11/d76/db5/ddb/dd4/f112 x:0 0 0 2026-03-10T14:08:49.711 INFO:tasks.workunit.client.0.vm03.stdout:9/933: link d2/d29/d9a/ca8 d2/d14/d2b/d34/c12b 0 2026-03-10T14:08:49.714 INFO:tasks.workunit.client.0.vm03.stdout:9/934: mkdir d2/d29/d33/d60/d7f/d11c/d12c 0 2026-03-10T14:08:49.714 INFO:tasks.workunit.client.0.vm03.stdout:4/953: chown d5/d9/db/ff 325 1 2026-03-10T14:08:49.716 
INFO:tasks.workunit.client.0.vm03.stdout:4/954: creat d5/d9/db/da7/db9/def/f147 x:0 0 0 2026-03-10T14:08:49.717 INFO:tasks.workunit.client.0.vm03.stdout:9/935: creat d2/d29/d33/d60/d8c/dd3/dfe/f12d x:0 0 0 2026-03-10T14:08:49.717 INFO:tasks.workunit.client.0.vm03.stdout:9/936: readlink d2/d29/d33/d41/d46/l8f 0 2026-03-10T14:08:49.724 INFO:tasks.workunit.client.0.vm03.stdout:9/937: symlink d2/d29/d33/d60/d8c/de1/dbd/l12e 0 2026-03-10T14:08:49.726 INFO:tasks.workunit.client.0.vm03.stdout:4/955: write d5/d9/db/da7/db9/d121/f144 [695259,15382] 0 2026-03-10T14:08:49.727 INFO:tasks.workunit.client.0.vm03.stdout:9/938: readlink d2/d29/d33/d60/d8c/de1/dbd/l118 0 2026-03-10T14:08:49.730 INFO:tasks.workunit.client.0.vm03.stdout:4/956: symlink d5/d9/db/d19/d38/d7b/daa/l148 0 2026-03-10T14:08:49.730 INFO:tasks.workunit.client.0.vm03.stdout:9/939: chown d2/d14/d2b/d34/c5a 10624 1 2026-03-10T14:08:49.737 INFO:tasks.workunit.client.0.vm03.stdout:4/957: read d5/d47/d5b/d64/d85/f9d [1239021,76065] 0 2026-03-10T14:08:49.740 INFO:tasks.workunit.client.0.vm03.stdout:4/958: truncate d5/fe 7539150 0 2026-03-10T14:08:49.741 INFO:tasks.workunit.client.0.vm03.stdout:4/959: readlink d5/d9/db/d19/d38/d53/d55/l6f 0 2026-03-10T14:08:49.764 INFO:tasks.workunit.client.0.vm03.stdout:2/911: dwrite d5/d10/dba/dff/f11b [0,4194304] 0 2026-03-10T14:08:49.769 INFO:tasks.workunit.client.0.vm03.stdout:1/947: dread d0/d2/f14 [0,4194304] 0 2026-03-10T14:08:49.795 INFO:tasks.workunit.client.0.vm03.stdout:1/948: rename d0/d2/df/d27/d7e/d81/f86 to d0/d18/d1d/dc4/f138 0 2026-03-10T14:08:49.807 INFO:tasks.workunit.client.0.vm03.stdout:0/859: write d3/d16/d21/d3c/f99 [2974520,27885] 0 2026-03-10T14:08:49.812 INFO:tasks.workunit.client.0.vm03.stdout:2/912: dread d5/d10/d1f/d4f/d76/da7/d54/fe2 [0,4194304] 0 2026-03-10T14:08:49.831 INFO:tasks.workunit.client.0.vm03.stdout:4/960: write d5/d9/db/d19/d34/f5d [3246886,75581] 0 2026-03-10T14:08:49.835 INFO:tasks.workunit.client.0.vm03.stdout:9/940: dwrite 
d2/d29/dcd/dd8/f108 [0,4194304] 0 2026-03-10T14:08:49.852 INFO:tasks.workunit.client.0.vm03.stdout:0/860: write d3/d16/d21/d3c/fde [506966,52223] 0 2026-03-10T14:08:49.854 INFO:tasks.workunit.client.0.vm03.stdout:4/961: fdatasync d5/d47/f65 0 2026-03-10T14:08:49.855 INFO:tasks.workunit.client.0.vm03.stdout:0/861: chown d3/d16/d21/f78 7514 1 2026-03-10T14:08:49.857 INFO:tasks.workunit.client.0.vm03.stdout:4/962: stat d5/d9/db/d19/d38/d53/d55/d108/l124 0 2026-03-10T14:08:49.861 INFO:tasks.workunit.client.0.vm03.stdout:0/862: mknod d3/d16/c113 0 2026-03-10T14:08:49.863 INFO:tasks.workunit.client.0.vm03.stdout:4/963: rmdir d5/d9/db/d19/d34 39 2026-03-10T14:08:49.870 INFO:tasks.workunit.client.0.vm03.stdout:2/913: dwrite d5/d10/d17/f112 [0,4194304] 0 2026-03-10T14:08:49.884 INFO:tasks.workunit.client.0.vm03.stdout:4/964: dread d5/d9/db/d19/d34/f9b [0,4194304] 0 2026-03-10T14:08:49.888 INFO:tasks.workunit.client.0.vm03.stdout:2/914: getdents d5/d2a 0 2026-03-10T14:08:49.889 INFO:tasks.workunit.client.0.vm03.stdout:0/863: truncate d3/d11/d66/da2/fe2 556718 0 2026-03-10T14:08:49.890 INFO:tasks.workunit.client.0.vm03.stdout:9/941: dwrite d2/d14/d2b/d79/d8a/f8d [0,4194304] 0 2026-03-10T14:08:49.921 INFO:tasks.workunit.client.0.vm03.stdout:0/864: dread d3/d11/d2c/d4a/d4b/faa [0,4194304] 0 2026-03-10T14:08:49.926 INFO:tasks.workunit.client.0.vm03.stdout:0/865: fsync d3/d11/d76/db5/ddb/ff1 0 2026-03-10T14:08:49.931 INFO:tasks.workunit.client.0.vm03.stdout:2/915: sync 2026-03-10T14:08:49.932 INFO:tasks.workunit.client.0.vm03.stdout:2/916: stat d5/d10/d1f/d4f/d76/da7/d54/l5a 0 2026-03-10T14:08:49.944 INFO:tasks.workunit.client.0.vm03.stdout:0/866: write d3/d17/fb1 [2825335,77799] 0 2026-03-10T14:08:49.944 INFO:tasks.workunit.client.0.vm03.stdout:4/965: rename d5/d9/db/d19/d38/d53/d55/d108 to d5/d9/db/da7/d149 0 2026-03-10T14:08:49.952 INFO:tasks.workunit.client.0.vm03.stdout:9/942: rename d2/d29/dcd/d72/d9d/l111 to d2/d14/d2b/d76/l12f 0 2026-03-10T14:08:49.954 
INFO:tasks.workunit.client.0.vm03.stdout:0/867: link d3/d16/f10a d3/d11/d2c/d4a/d4b/d89/db6/f114 0 2026-03-10T14:08:49.955 INFO:tasks.workunit.client.0.vm03.stdout:2/917: rename d5/db4/d74/d121/d7b/c7c to d5/db4/d74/d121/d85/c129 0 2026-03-10T14:08:49.960 INFO:tasks.workunit.client.0.vm03.stdout:2/918: dwrite d5/db4/d74/d121/d7b/f8c [0,4194304] 0 2026-03-10T14:08:49.977 INFO:tasks.workunit.client.0.vm03.stdout:4/966: rename d5/d9/db/da7/d149/l124 to d5/d9/db/d19/d99/d13d/l14a 0 2026-03-10T14:08:49.982 INFO:tasks.workunit.client.0.vm03.stdout:9/943: creat d2/d29/d33/f130 x:0 0 0 2026-03-10T14:08:49.984 INFO:tasks.workunit.client.0.vm03.stdout:4/967: fsync d5/d9/db/d19/d38/d53/d55/f76 0 2026-03-10T14:08:49.984 INFO:tasks.workunit.client.0.vm03.stdout:0/868: chown d3/d11/c1f 27 1 2026-03-10T14:08:49.985 INFO:tasks.workunit.client.0.vm03.stdout:4/968: chown d5/d9/db/da7/dcc/d100/d8e 9 1 2026-03-10T14:08:49.986 INFO:tasks.workunit.client.0.vm03.stdout:9/944: write d2/d29/ddc/f128 [695379,107330] 0 2026-03-10T14:08:49.989 INFO:tasks.workunit.client.0.vm03.stdout:9/945: read d2/d14/d2b/d34/f59 [75844,21406] 0 2026-03-10T14:08:49.992 INFO:tasks.workunit.client.0.vm03.stdout:0/869: truncate d3/d16/d21/d9a/fb2 1022051 0 2026-03-10T14:08:49.992 INFO:tasks.workunit.client.0.vm03.stdout:4/969: symlink d5/d47/d5b/d64/d85/l14b 0 2026-03-10T14:08:50.003 INFO:tasks.workunit.client.0.vm03.stdout:9/946: creat d2/d29/dcd/d72/d9d/f131 x:0 0 0 2026-03-10T14:08:50.007 INFO:tasks.workunit.client.0.vm03.stdout:9/947: symlink d2/d29/dcd/d9c/l132 0 2026-03-10T14:08:50.008 INFO:tasks.workunit.client.0.vm03.stdout:9/948: readlink d2/d14/d2b/l97 0 2026-03-10T14:08:50.020 INFO:tasks.workunit.client.0.vm03.stdout:9/949: read d2/d29/d33/d41/ffc [2723170,58574] 0 2026-03-10T14:08:50.031 INFO:tasks.workunit.client.0.vm03.stdout:4/970: getdents d5/d9/db/d19/d99 0 2026-03-10T14:08:50.031 INFO:tasks.workunit.client.0.vm03.stdout:2/919: write d5/d10/d31/f38 [2189,30779] 0 2026-03-10T14:08:50.032 
INFO:tasks.workunit.client.0.vm03.stdout:0/870: write d3/d46/f82 [225251,121386] 0 2026-03-10T14:08:50.039 INFO:tasks.workunit.client.0.vm03.stdout:0/871: unlink d3/d46/f6c 0 2026-03-10T14:08:50.040 INFO:tasks.workunit.client.0.vm03.stdout:4/971: dread d5/d47/d5b/fba [0,4194304] 0 2026-03-10T14:08:50.041 INFO:tasks.workunit.client.0.vm03.stdout:9/950: getdents d2/d14/d2b/d79/d81/df3 0 2026-03-10T14:08:50.042 INFO:tasks.workunit.client.0.vm03.stdout:9/951: read - d2/d29/d33/d41/f7b zero size 2026-03-10T14:08:50.042 INFO:tasks.workunit.client.0.vm03.stdout:4/972: truncate d5/d9/db/da7/db9/def/d11f/d12b/f137 435950 0 2026-03-10T14:08:50.045 INFO:tasks.workunit.client.0.vm03.stdout:2/920: dwrite d5/d10/d1f/f5e [0,4194304] 0 2026-03-10T14:08:50.071 INFO:tasks.workunit.client.0.vm03.stdout:4/973: rename d5/d9/db/d19/d38/d53/d55/fa8 to d5/d9/db/da7/dcc/f14c 0 2026-03-10T14:08:50.077 INFO:tasks.workunit.client.0.vm03.stdout:4/974: rename d5/d47/la2 to d5/d9/db/d19/d38/l14d 0 2026-03-10T14:08:50.082 INFO:tasks.workunit.client.0.vm03.stdout:0/872: fdatasync d3/d4d/d30/f7a 0 2026-03-10T14:08:50.088 INFO:tasks.workunit.client.0.vm03.stdout:9/952: truncate d2/d29/d38/f4b 4773360 0 2026-03-10T14:08:50.093 INFO:tasks.workunit.client.0.vm03.stdout:6/961: stat d8/db/d49/d6c/d32/d3a/f5e 0 2026-03-10T14:08:50.098 INFO:tasks.workunit.client.0.vm03.stdout:2/921: dwrite d5/d10/d1f/d4f/d76/da7/d40/db0/fd5 [0,4194304] 0 2026-03-10T14:08:50.124 INFO:tasks.workunit.client.0.vm03.stdout:0/873: link d3/d11/d76/db5/ce5 d3/d11/d76/c115 0 2026-03-10T14:08:50.126 INFO:tasks.workunit.client.0.vm03.stdout:4/975: link d5/db4/lf4 d5/l14e 0 2026-03-10T14:08:50.138 INFO:tasks.workunit.client.0.vm03.stdout:9/953: getdents d2/d14/d2b/d43 0 2026-03-10T14:08:50.141 INFO:tasks.workunit.client.0.vm03.stdout:2/922: creat d5/d10/f12a x:0 0 0 2026-03-10T14:08:50.143 INFO:tasks.workunit.client.0.vm03.stdout:4/976: truncate d5/d9/db/da7/dcc/d100/fae 6855230 0 2026-03-10T14:08:50.155 
INFO:tasks.workunit.client.0.vm03.stdout:4/977: dwrite d5/d9/db/da7/db9/def/d11f/d12b/f137 [0,4194304] 0 2026-03-10T14:08:50.169 INFO:tasks.workunit.client.0.vm03.stdout:4/978: dwrite f3 [0,4194304] 0 2026-03-10T14:08:50.175 INFO:tasks.workunit.client.0.vm03.stdout:4/979: creat d5/d9/db/d19/d38/ded/f14f x:0 0 0 2026-03-10T14:08:50.176 INFO:tasks.workunit.client.0.vm03.stdout:4/980: dread - d5/d9/db/da7/dcc/d100/fe3 zero size 2026-03-10T14:08:50.176 INFO:tasks.workunit.client.0.vm03.stdout:4/981: dread - d5/d47/d62/f80 zero size 2026-03-10T14:08:50.190 INFO:tasks.workunit.client.0.vm03.stdout:4/982: mknod d5/d9/db/d19/d38/d7b/daa/daf/c150 0 2026-03-10T14:08:50.220 INFO:tasks.workunit.client.0.vm03.stdout:2/923: dread d5/d10/d31/fa9 [0,4194304] 0 2026-03-10T14:08:50.278 INFO:tasks.workunit.client.0.vm03.stdout:0/874: truncate d3/d4d/da0/fbe 3282523 0 2026-03-10T14:08:50.279 INFO:tasks.workunit.client.0.vm03.stdout:0/875: write d3/f10 [9199054,35015] 0 2026-03-10T14:08:50.295 INFO:tasks.workunit.client.0.vm03.stdout:2/924: creat d5/db4/d74/d106/f12b x:0 0 0 2026-03-10T14:08:50.301 INFO:tasks.workunit.client.0.vm03.stdout:0/876: mkdir d3/d4d/d116 0 2026-03-10T14:08:50.307 INFO:tasks.workunit.client.0.vm03.stdout:0/877: rename d3/d16/f9e to d3/d11/d76/dbc/f117 0 2026-03-10T14:08:50.315 INFO:tasks.workunit.client.0.vm03.stdout:1/949: symlink d0/d2/db6/d12b/l139 0 2026-03-10T14:08:50.321 INFO:tasks.workunit.client.0.vm03.stdout:4/983: dwrite d5/d47/d5b/d64/feb [0,4194304] 0 2026-03-10T14:08:50.336 INFO:tasks.workunit.client.0.vm03.stdout:4/984: mknod d5/c151 0 2026-03-10T14:08:50.344 INFO:tasks.workunit.client.0.vm03.stdout:4/985: mknod d5/d47/d62/d8a/dd0/d102/c152 0 2026-03-10T14:08:50.350 INFO:tasks.workunit.client.0.vm03.stdout:4/986: chown d5/d9/db/da7/dcc/d100/le0 1882953 1 2026-03-10T14:08:50.351 INFO:tasks.workunit.client.0.vm03.stdout:4/987: creat d5/d9/db/da7/d149/f153 x:0 0 0 2026-03-10T14:08:50.390 INFO:tasks.workunit.client.0.vm03.stdout:2/925: write 
d5/d10/f22 [5390726,84442] 0 2026-03-10T14:08:50.405 INFO:tasks.workunit.client.0.vm03.stdout:2/926: read d5/d10/d1f/d4f/d76/da7/d40/f52 [456369,13998] 0 2026-03-10T14:08:50.410 INFO:tasks.workunit.client.0.vm03.stdout:0/878: dwrite d3/d4d/d30/f97 [0,4194304] 0 2026-03-10T14:08:50.419 INFO:tasks.workunit.client.0.vm03.stdout:2/927: creat d5/d10/da3/dab/dc8/f12c x:0 0 0 2026-03-10T14:08:50.424 INFO:tasks.workunit.client.0.vm03.stdout:2/928: mknod d5/d10/d1f/d4f/d76/da7/d40/dc2/c12d 0 2026-03-10T14:08:50.426 INFO:tasks.workunit.client.0.vm03.stdout:2/929: readlink d5/le 0 2026-03-10T14:08:50.429 INFO:tasks.workunit.client.0.vm03.stdout:2/930: mkdir d5/db4/d74/d127/d12e 0 2026-03-10T14:08:50.429 INFO:tasks.workunit.client.0.vm03.stdout:2/931: dread - d5/db4/d74/d121/d85/f11a zero size 2026-03-10T14:08:50.430 INFO:tasks.workunit.client.0.vm03.stdout:1/950: dwrite d0/d2/d71/d90/f107 [0,4194304] 0 2026-03-10T14:08:50.442 INFO:tasks.workunit.client.0.vm03.stdout:2/932: mknod d5/d2a/d110/d126/c12f 0 2026-03-10T14:08:50.451 INFO:tasks.workunit.client.0.vm03.stdout:2/933: mknod d5/d10/d17/c130 0 2026-03-10T14:08:50.490 INFO:tasks.workunit.client.0.vm03.stdout:0/879: sync 2026-03-10T14:08:50.491 INFO:tasks.workunit.client.0.vm03.stdout:1/951: creat d0/d18/d3b/db7/d115/f13a x:0 0 0 2026-03-10T14:08:50.492 INFO:tasks.workunit.client.0.vm03.stdout:1/952: chown d0/d2/df/d27/f99 145354165 1 2026-03-10T14:08:50.493 INFO:tasks.workunit.client.0.vm03.stdout:0/880: creat d3/d46/dac/d79/f118 x:0 0 0 2026-03-10T14:08:50.498 INFO:tasks.workunit.client.0.vm03.stdout:4/988: dwrite d5/d9/db/d19/d99/fd7 [0,4194304] 0 2026-03-10T14:08:50.504 INFO:tasks.workunit.client.0.vm03.stdout:4/989: dread - d5/d47/d62/de9/f103 zero size 2026-03-10T14:08:50.534 INFO:tasks.workunit.client.0.vm03.stdout:2/934: dwrite d5/d10/d17/f18 [0,4194304] 0 2026-03-10T14:08:50.544 INFO:tasks.workunit.client.0.vm03.stdout:6/962: link d8/d11/da0/ldf d8/d11/d18/d79/d80/df1/df6/d11f/l120 0 2026-03-10T14:08:50.551 
INFO:tasks.workunit.client.0.vm03.stdout:2/935: creat d5/db4/d74/f131 x:0 0 0 2026-03-10T14:08:50.565 INFO:tasks.workunit.client.0.vm03.stdout:0/881: dwrite d3/f28 [0,4194304] 0 2026-03-10T14:08:50.567 INFO:tasks.workunit.client.0.vm03.stdout:0/882: fdatasync d3/d16/f10a 0 2026-03-10T14:08:50.576 INFO:tasks.workunit.client.0.vm03.stdout:0/883: dwrite d3/d46/d5e/f7c [0,4194304] 0 2026-03-10T14:08:50.577 INFO:tasks.workunit.client.0.vm03.stdout:7/819: dwrite d5/d9/d14/d21/d6f/faa [0,4194304] 0 2026-03-10T14:08:50.583 INFO:tasks.workunit.client.0.vm03.stdout:0/884: fsync d3/f9 0 2026-03-10T14:08:50.586 INFO:tasks.workunit.client.0.vm03.stdout:1/953: link d0/d18/d3b/d50/cde d0/d2/df/dab/dc3/c13b 0 2026-03-10T14:08:50.611 INFO:tasks.workunit.client.0.vm03.stdout:4/990: truncate d5/d9/db/da7/dc0/fea 2124392 0 2026-03-10T14:08:50.632 INFO:tasks.workunit.client.0.vm03.stdout:2/936: write d5/d10/d1f/d4f/d76/da7/d40/f52 [912184,112166] 0 2026-03-10T14:08:50.633 INFO:tasks.workunit.client.0.vm03.stdout:2/937: dread - d5/d10/d31/fe5 zero size 2026-03-10T14:08:50.633 INFO:tasks.workunit.client.0.vm03.stdout:2/938: chown d5/db4/cbc 1891791068 1 2026-03-10T14:08:50.634 INFO:tasks.workunit.client.0.vm03.stdout:2/939: stat d5/d35/fad 0 2026-03-10T14:08:50.640 INFO:tasks.workunit.client.0.vm03.stdout:4/991: getdents d5/d9/db/da7/db9/def/d11f 0 2026-03-10T14:08:50.657 INFO:tasks.workunit.client.0.vm03.stdout:6/963: write d8/d11/da0/dbf/d5c/f69 [1099971,47629] 0 2026-03-10T14:08:50.664 INFO:tasks.workunit.client.0.vm03.stdout:2/940: rename d5/d10/d1f/d4f/d76/da7/d40/f105 to d5/db4/d74/d121/d7b/f132 0 2026-03-10T14:08:50.664 INFO:tasks.workunit.client.0.vm03.stdout:4/992: link d5/db4/dd2/l136 d5/d9/db/d19/d99/l154 0 2026-03-10T14:08:50.666 INFO:tasks.workunit.client.0.vm03.stdout:4/993: symlink d5/d9/db/da7/dcc/d100/dd5/l155 0 2026-03-10T14:08:50.668 INFO:tasks.workunit.client.0.vm03.stdout:4/994: dread - d5/d47/d5b/dbe/fcd zero size 2026-03-10T14:08:50.673 
INFO:tasks.workunit.client.0.vm03.stdout:4/995: fdatasync d5/d47/d62/fc3 0 2026-03-10T14:08:50.686 INFO:tasks.workunit.client.0.vm03.stdout:4/996: dread d5/d47/d62/d8a/dd0/d102/f63 [0,4194304] 0 2026-03-10T14:08:50.687 INFO:tasks.workunit.client.0.vm03.stdout:4/997: write d5/d47/f123 [76427,101554] 0 2026-03-10T14:08:50.708 INFO:tasks.workunit.client.0.vm03.stdout:1/954: dwrite d0/d2/df/d27/f52 [0,4194304] 0 2026-03-10T14:08:50.721 INFO:tasks.workunit.client.0.vm03.stdout:4/998: fdatasync d5/d47/d5b/fba 0 2026-03-10T14:08:50.722 INFO:tasks.workunit.client.0.vm03.stdout:4/999: write d5/d9/db/da7/d119/f135 [1046579,83683] 0 2026-03-10T14:08:50.724 INFO:tasks.workunit.client.0.vm03.stdout:2/941: write d5/d10/d1f/d4f/d76/da7/f3c [1920450,126876] 0 2026-03-10T14:08:50.732 INFO:tasks.workunit.client.0.vm03.stdout:2/942: truncate d5/db4/f98 3644205 0 2026-03-10T14:08:50.735 INFO:tasks.workunit.client.0.vm03.stdout:2/943: dread d5/d10/d31/fa9 [0,4194304] 0 2026-03-10T14:08:50.737 INFO:tasks.workunit.client.0.vm03.stdout:2/944: creat d5/d10/da3/dab/dc8/f133 x:0 0 0 2026-03-10T14:08:50.752 INFO:tasks.workunit.client.0.vm03.stdout:2/945: dwrite d5/db4/d74/d121/d7b/f91 [0,4194304] 0 2026-03-10T14:08:50.756 INFO:tasks.workunit.client.0.vm03.stdout:0/885: creat d3/d11/d2c/f119 x:0 0 0 2026-03-10T14:08:50.759 INFO:tasks.workunit.client.0.vm03.stdout:2/946: dwrite d5/f10c [0,4194304] 0 2026-03-10T14:08:50.763 INFO:tasks.workunit.client.0.vm03.stdout:2/947: creat d5/d2a/d110/f134 x:0 0 0 2026-03-10T14:08:50.764 INFO:tasks.workunit.client.0.vm03.stdout:2/948: symlink d5/d35/l135 0 2026-03-10T14:08:50.766 INFO:tasks.workunit.client.0.vm03.stdout:2/949: rename d5/d10/dba to d5/d10/da3/d136 0 2026-03-10T14:08:50.814 INFO:tasks.workunit.client.0.vm03.stdout:2/950: write d5/db4/f98 [2149698,124224] 0 2026-03-10T14:08:50.814 INFO:tasks.workunit.client.0.vm03.stdout:2/951: readlink d5/le 0 2026-03-10T14:08:50.823 INFO:tasks.workunit.client.0.vm03.stdout:0/886: dwrite 
d3/d11/d2c/d4a/d4b/d89/db6/df4/f85 [0,4194304] 0 2026-03-10T14:08:50.843 INFO:tasks.workunit.client.0.vm03.stdout:2/952: dread d5/d35/f81 [0,4194304] 0 2026-03-10T14:08:50.844 INFO:tasks.workunit.client.0.vm03.stdout:2/953: mknod d5/d2a/d110/d126/c137 0 2026-03-10T14:08:50.848 INFO:tasks.workunit.client.0.vm03.stdout:2/954: symlink d5/d10/d1f/d11d/l138 0 2026-03-10T14:08:50.849 INFO:tasks.workunit.client.0.vm03.stdout:2/955: truncate d5/d35/fad 655337 0 2026-03-10T14:08:50.851 INFO:tasks.workunit.client.0.vm03.stdout:2/956: fsync d5/db4/d74/d121/d7b/f132 0 2026-03-10T14:08:50.854 INFO:tasks.workunit.client.0.vm03.stdout:2/957: creat d5/db4/d74/d83/dc1/f139 x:0 0 0 2026-03-10T14:08:50.856 INFO:tasks.workunit.client.0.vm03.stdout:2/958: creat d5/d10/d1f/d4f/d76/da7/de9/d109/f13a x:0 0 0 2026-03-10T14:08:50.857 INFO:tasks.workunit.client.0.vm03.stdout:2/959: rmdir d5/d10/d1f/d4f/d76/da7/d40/dc2 39 2026-03-10T14:08:50.861 INFO:tasks.workunit.client.0.vm03.stdout:2/960: rename d5/d10/d1f/d4f/d76/da7/d54/d5f/caf to d5/db4/d74/d127/d12e/c13b 0 2026-03-10T14:08:50.867 INFO:tasks.workunit.client.0.vm03.stdout:0/887: dwrite d3/d16/f34 [4194304,4194304] 0 2026-03-10T14:08:50.868 INFO:tasks.workunit.client.0.vm03.stdout:0/888: chown d3/d11/d76/db5/fe4 438 1 2026-03-10T14:08:50.869 INFO:tasks.workunit.client.0.vm03.stdout:0/889: read d3/d4d/d30/f7a [3017845,54100] 0 2026-03-10T14:08:50.870 INFO:tasks.workunit.client.0.vm03.stdout:0/890: write d3/d16/d21/d9a/fc3 [2705862,29705] 0 2026-03-10T14:08:50.876 INFO:tasks.workunit.client.0.vm03.stdout:0/891: rename d3/d16/c42 to d3/d46/c11a 0 2026-03-10T14:08:50.885 INFO:tasks.workunit.client.0.vm03.stdout:6/964: symlink d8/l121 0 2026-03-10T14:08:50.919 INFO:tasks.workunit.client.0.vm03.stdout:1/955: rmdir d0/d2/df/d126 0 2026-03-10T14:08:50.919 INFO:tasks.workunit.client.0.vm03.stdout:7/820: stat d5/d9/d14/d26/d36/d51/dc8/c103 0 2026-03-10T14:08:50.919 INFO:tasks.workunit.client.0.vm03.stdout:7/821: stat d5/d9/d14/d26/d5f/de2/fe9 0 
2026-03-10T14:08:50.919 INFO:tasks.workunit.client.0.vm03.stdout:7/822: dread d5/d9/d14/f23 [0,4194304] 0 2026-03-10T14:08:50.919 INFO:tasks.workunit.client.0.vm03.stdout:6/965: unlink d8/d11/da0/dbf/fde 0 2026-03-10T14:08:50.919 INFO:tasks.workunit.client.0.vm03.stdout:7/823: dwrite d5/d9/d14/d21/d6f/faa [0,4194304] 0 2026-03-10T14:08:50.937 INFO:tasks.workunit.client.0.vm03.stdout:9/954: creat d2/d29/d33/f133 x:0 0 0 2026-03-10T14:08:50.942 INFO:tasks.workunit.client.0.vm03.stdout:6/966: mknod d8/db/d49/d58/ddc/c122 0 2026-03-10T14:08:50.947 INFO:tasks.workunit.client.0.vm03.stdout:9/955: mknod d2/d14/d2b/d79/d81/c134 0 2026-03-10T14:08:50.953 INFO:tasks.workunit.client.0.vm03.stdout:9/956: dread - d2/d29/d33/d10c/f104 zero size 2026-03-10T14:08:50.953 INFO:tasks.workunit.client.0.vm03.stdout:7/824: getdents d5/d9/d14/d21 0 2026-03-10T14:08:50.953 INFO:tasks.workunit.client.0.vm03.stdout:9/957: truncate d2/d29/d38/f4e 582227 0 2026-03-10T14:08:50.956 INFO:tasks.workunit.client.0.vm03.stdout:6/967: rename d8/db/df/f10 to d8/d11/da0/dbf/d5c/d60/f123 0 2026-03-10T14:08:50.961 INFO:tasks.workunit.client.0.vm03.stdout:9/958: rename d2/d29/d33/f130 to d2/d14/d2b/d11e/f135 0 2026-03-10T14:08:50.962 INFO:tasks.workunit.client.0.vm03.stdout:9/959: chown d2/d29/dcd/d72/l127 118093 1 2026-03-10T14:08:50.962 INFO:tasks.workunit.client.0.vm03.stdout:6/968: chown d8/d11/d18/f34 2801 1 2026-03-10T14:08:50.963 INFO:tasks.workunit.client.0.vm03.stdout:6/969: write d8/d11/da0/dbf/d5c/f110 [752788,78350] 0 2026-03-10T14:08:50.969 INFO:tasks.workunit.client.0.vm03.stdout:9/960: rename d2/d14/c19 to d2/d14/d2b/d34/c136 0 2026-03-10T14:08:50.971 INFO:tasks.workunit.client.0.vm03.stdout:9/961: readlink d2/d14/d2b/lc1 0 2026-03-10T14:08:50.971 INFO:tasks.workunit.client.0.vm03.stdout:9/962: fdatasync d2/d14/ff1 0 2026-03-10T14:08:50.976 INFO:tasks.workunit.client.0.vm03.stdout:6/970: creat d8/d1b/f124 x:0 0 0 2026-03-10T14:08:50.978 INFO:tasks.workunit.client.0.vm03.stdout:9/963: creat 
d2/d29/d33/d60/d8c/de1/dbd/f137 x:0 0 0 2026-03-10T14:08:50.992 INFO:tasks.workunit.client.0.vm03.stdout:1/956: write d0/d18/d3b/db7/fbd [1093300,20487] 0 2026-03-10T14:08:50.998 INFO:tasks.workunit.client.0.vm03.stdout:1/957: mknod d0/d2/d71/c13c 0 2026-03-10T14:08:51.002 INFO:tasks.workunit.client.0.vm03.stdout:2/961: dwrite d5/db4/d74/d121/fe4 [0,4194304] 0 2026-03-10T14:08:51.016 INFO:tasks.workunit.client.0.vm03.stdout:7/825: sync 2026-03-10T14:08:51.016 INFO:tasks.workunit.client.0.vm03.stdout:0/892: sync 2026-03-10T14:08:51.025 INFO:tasks.workunit.client.0.vm03.stdout:1/958: dread d0/d18/d3b/db7/fbd [0,4194304] 0 2026-03-10T14:08:51.026 INFO:tasks.workunit.client.0.vm03.stdout:7/826: write d5/d9/d14/d26/d36/d51/dc8/de3/f108 [651102,97131] 0 2026-03-10T14:08:51.026 INFO:tasks.workunit.client.0.vm03.stdout:2/962: dwrite d5/d10/f12a [0,4194304] 0 2026-03-10T14:08:51.030 INFO:tasks.workunit.client.0.vm03.stdout:1/959: write d0/f24 [2330316,70957] 0 2026-03-10T14:08:51.031 INFO:tasks.workunit.client.0.vm03.stdout:6/971: write d8/d11/d18/d54/f8b [7930,91134] 0 2026-03-10T14:08:51.031 INFO:tasks.workunit.client.0.vm03.stdout:9/964: write d2/d14/f64 [1028191,75608] 0 2026-03-10T14:08:51.036 INFO:tasks.workunit.client.0.vm03.stdout:0/893: dwrite d3/f10 [4194304,4194304] 0 2026-03-10T14:08:51.044 INFO:tasks.workunit.client.0.vm03.stdout:0/894: truncate d3/d16/f31 4146193 0 2026-03-10T14:08:51.050 INFO:tasks.workunit.client.0.vm03.stdout:0/895: rename d3/d46/f8c to d3/d11/d76/db5/ddb/f11b 0 2026-03-10T14:08:51.059 INFO:tasks.workunit.client.0.vm03.stdout:0/896: fdatasync d3/f94 0 2026-03-10T14:08:51.059 INFO:tasks.workunit.client.0.vm03.stdout:1/960: dwrite d0/d2/df/d27/f99 [0,4194304] 0 2026-03-10T14:08:51.064 INFO:tasks.workunit.client.0.vm03.stdout:0/897: fdatasync d3/d46/d5e/f7c 0 2026-03-10T14:08:51.067 INFO:tasks.workunit.client.0.vm03.stdout:7/827: mkdir d5/d9/d35/d101/d115 0 2026-03-10T14:08:51.067 INFO:tasks.workunit.client.0.vm03.stdout:7/828: chown d5/d9/l99 
122341278 1 2026-03-10T14:08:51.068 INFO:tasks.workunit.client.0.vm03.stdout:7/829: write d5/d9/d14/d21/fea [603948,121856] 0 2026-03-10T14:08:51.074 INFO:tasks.workunit.client.0.vm03.stdout:9/965: getdents d2/d29/d33/d60/d8c/de1 0 2026-03-10T14:08:51.077 INFO:tasks.workunit.client.0.vm03.stdout:0/898: fdatasync d3/d4d/da0/fbe 0 2026-03-10T14:08:51.077 INFO:tasks.workunit.client.0.vm03.stdout:2/963: dwrite d5/db4/fd9 [0,4194304] 0 2026-03-10T14:08:51.080 INFO:tasks.workunit.client.0.vm03.stdout:6/972: creat d8/db/d49/f125 x:0 0 0 2026-03-10T14:08:51.095 INFO:tasks.workunit.client.0.vm03.stdout:9/966: mkdir d2/d29/d33/d60/d8c/dd3/dfe/d138 0 2026-03-10T14:08:51.097 INFO:tasks.workunit.client.0.vm03.stdout:6/973: truncate d8/d3b/da7/f119 8101 0 2026-03-10T14:08:51.099 INFO:tasks.workunit.client.0.vm03.stdout:6/974: chown d8/db/d49/d58 4992 1 2026-03-10T14:08:51.105 INFO:tasks.workunit.client.0.vm03.stdout:6/975: creat d8/d11/da0/dbf/f126 x:0 0 0 2026-03-10T14:08:51.114 INFO:tasks.workunit.client.0.vm03.stdout:0/899: fdatasync d3/d16/d21/d3c/f9b 0 2026-03-10T14:08:51.114 INFO:tasks.workunit.client.0.vm03.stdout:6/976: creat d8/db/d12/d64/de0/f127 x:0 0 0 2026-03-10T14:08:51.114 INFO:tasks.workunit.client.0.vm03.stdout:6/977: mkdir d8/d11/da0/dbf/d8c/d128 0 2026-03-10T14:08:51.116 INFO:tasks.workunit.client.0.vm03.stdout:9/967: sync 2026-03-10T14:08:51.117 INFO:tasks.workunit.client.0.vm03.stdout:6/978: symlink d8/db/d12/l129 0 2026-03-10T14:08:51.124 INFO:tasks.workunit.client.0.vm03.stdout:0/900: rename d3/d4d/f2a to d3/d46/de0/f11c 0 2026-03-10T14:08:51.126 INFO:tasks.workunit.client.0.vm03.stdout:9/968: dwrite d2/d29/d9a/fb7 [0,4194304] 0 2026-03-10T14:08:51.130 INFO:tasks.workunit.client.0.vm03.stdout:9/969: chown d2/d29/d33/d41/d46/l58 0 1 2026-03-10T14:08:51.131 INFO:tasks.workunit.client.0.vm03.stdout:0/901: unlink d3/d16/d21/d9a/fab 0 2026-03-10T14:08:51.134 INFO:tasks.workunit.client.0.vm03.stdout:9/970: dread d2/f83 [0,4194304] 0 2026-03-10T14:08:51.134 
INFO:tasks.workunit.client.0.vm03.stdout:9/971: chown d2/d29/d33/d6d/c71 3019074 1 2026-03-10T14:08:51.139 INFO:tasks.workunit.client.0.vm03.stdout:1/961: dread d0/d2/d71/d90/f9e [0,4194304] 0 2026-03-10T14:08:51.160 INFO:tasks.workunit.client.0.vm03.stdout:2/964: truncate d5/f10c 671950 0 2026-03-10T14:08:51.164 INFO:tasks.workunit.client.0.vm03.stdout:2/965: truncate d5/d10/d1f/d4f/d76/da7/d104/fbf 700222 0 2026-03-10T14:08:51.164 INFO:tasks.workunit.client.0.vm03.stdout:7/830: dwrite d5/d9/d14/d26/d39/db1/fe0 [0,4194304] 0 2026-03-10T14:08:51.167 INFO:tasks.workunit.client.0.vm03.stdout:2/966: chown d5/d10/d1f/d4f/d76/fa1 12830 1 2026-03-10T14:08:51.169 INFO:tasks.workunit.client.0.vm03.stdout:2/967: fdatasync d5/db4/d74/d121/f70 0 2026-03-10T14:08:51.176 INFO:tasks.workunit.client.0.vm03.stdout:2/968: dwrite d5/d35/f81 [0,4194304] 0 2026-03-10T14:08:51.183 INFO:tasks.workunit.client.0.vm03.stdout:2/969: dwrite d5/d10/d1f/d4f/d76/fca [0,4194304] 0 2026-03-10T14:08:51.186 INFO:tasks.workunit.client.0.vm03.stdout:2/970: readlink d5/d35/ld1 0 2026-03-10T14:08:51.189 INFO:tasks.workunit.client.0.vm03.stdout:2/971: mkdir d5/db4/d74/d83/d13c 0 2026-03-10T14:08:51.189 INFO:tasks.workunit.client.0.vm03.stdout:0/902: chown d3/d11/f8e 3813871 1 2026-03-10T14:08:51.191 INFO:tasks.workunit.client.0.vm03.stdout:0/903: chown d3/d11/d2c/d4a/d4b/d89/db9/ffe 3 1 2026-03-10T14:08:51.202 INFO:tasks.workunit.client.0.vm03.stdout:9/972: unlink d2/d14/d2b/lc1 0 2026-03-10T14:08:51.207 INFO:tasks.workunit.client.0.vm03.stdout:0/904: mkdir d3/d4d/da0/d11d 0 2026-03-10T14:08:51.210 INFO:tasks.workunit.client.0.vm03.stdout:9/973: fdatasync d2/d14/d2b/f2d 0 2026-03-10T14:08:51.211 INFO:tasks.workunit.client.0.vm03.stdout:9/974: chown d2/d29/d33/c49 727493141 1 2026-03-10T14:08:51.211 INFO:tasks.workunit.client.0.vm03.stdout:2/972: creat d5/d10/d1f/d4f/d76/da7/de9/d109/f13d x:0 0 0 2026-03-10T14:08:51.212 INFO:tasks.workunit.client.0.vm03.stdout:0/905: readlink d3/d16/d21/l43 0 
2026-03-10T14:08:51.215 INFO:tasks.workunit.client.0.vm03.stdout:6/979: rename d8/db/d49/d6c/d32/lf8 to d8/db/d49/d58/l12a 0 2026-03-10T14:08:51.217 INFO:tasks.workunit.client.0.vm03.stdout:2/973: unlink d5/d10/d1f/d4f/d76/da7/d40/dc2/c12d 0 2026-03-10T14:08:51.217 INFO:tasks.workunit.client.0.vm03.stdout:2/974: write d5/d2a/d110/f134 [953419,129837] 0 2026-03-10T14:08:51.223 INFO:tasks.workunit.client.0.vm03.stdout:2/975: mknod d5/db4/d74/d127/d12e/c13e 0 2026-03-10T14:08:51.226 INFO:tasks.workunit.client.0.vm03.stdout:9/975: creat d2/d29/d33/d60/d7f/d11c/d12c/f139 x:0 0 0 2026-03-10T14:08:51.227 INFO:tasks.workunit.client.0.vm03.stdout:9/976: chown d2/d29/l32 40158 1 2026-03-10T14:08:51.228 INFO:tasks.workunit.client.0.vm03.stdout:9/977: read - d2/d29/d33/d60/d8c/de1/dbd/f116 zero size 2026-03-10T14:08:51.232 INFO:tasks.workunit.client.0.vm03.stdout:6/980: mkdir d8/d12b 0 2026-03-10T14:08:51.246 INFO:tasks.workunit.client.0.vm03.stdout:6/981: fdatasync d8/ffe 0 2026-03-10T14:08:51.253 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:51 vm04.local ceph-mon[55966]: pgmap v12: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 44 MiB/s rd, 84 MiB/s wr, 254 op/s 2026-03-10T14:08:51.253 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:51 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:51.254 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:51 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:51.254 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:51 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:51.254 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:51 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:51.257 INFO:tasks.workunit.client.0.vm03.stdout:6/982: mknod d8/db/d12/c12c 0 2026-03-10T14:08:51.263 INFO:tasks.workunit.client.0.vm03.stdout:6/983: creat 
d8/d11/da0/dca/f12d x:0 0 0 2026-03-10T14:08:51.285 INFO:tasks.workunit.client.0.vm03.stdout:6/984: rename d8/db/d49/d6c/d32/dcb/d106 to d8/d11/d10c/d12e 0 2026-03-10T14:08:51.286 INFO:tasks.workunit.client.0.vm03.stdout:6/985: dread - d8/d11/da0/dca/f107 zero size 2026-03-10T14:08:51.287 INFO:tasks.workunit.client.0.vm03.stdout:6/986: chown d8/db/d49/d6c/c59 502513 1 2026-03-10T14:08:51.314 INFO:tasks.workunit.client.0.vm03.stdout:7/831: dread d5/d9/fec [0,4194304] 0 2026-03-10T14:08:51.340 INFO:tasks.workunit.client.0.vm03.stdout:0/906: dread d3/d4d/d47/f74 [0,4194304] 0 2026-03-10T14:08:51.343 INFO:tasks.workunit.client.0.vm03.stdout:1/962: truncate d0/d42/f7f 370266 0 2026-03-10T14:08:51.350 INFO:tasks.workunit.client.0.vm03.stdout:2/976: dwrite d5/d2a/fb9 [0,4194304] 0 2026-03-10T14:08:51.350 INFO:tasks.workunit.client.0.vm03.stdout:9/978: dwrite d2/d29/d33/f70 [0,4194304] 0 2026-03-10T14:08:51.356 INFO:tasks.workunit.client.0.vm03.stdout:0/907: rename d3/d4d/d47 to d3/d11/d2c/d4a/d11e 0 2026-03-10T14:08:51.357 INFO:tasks.workunit.client.0.vm03.stdout:2/977: rmdir d5/d10/d1f/d4f/d76/da7/d40/db0 39 2026-03-10T14:08:51.359 INFO:tasks.workunit.client.0.vm03.stdout:0/908: symlink d3/d46/dac/d79/l11f 0 2026-03-10T14:08:51.366 INFO:tasks.workunit.client.0.vm03.stdout:6/987: rmdir d8/db 39 2026-03-10T14:08:51.367 INFO:tasks.workunit.client.0.vm03.stdout:0/909: mknod d3/d16/d21/d3c/c120 0 2026-03-10T14:08:51.368 INFO:tasks.workunit.client.0.vm03.stdout:1/963: rmdir d0/d2/df/d16/ded 39 2026-03-10T14:08:51.369 INFO:tasks.workunit.client.0.vm03.stdout:0/910: write d3/d17/f102 [308600,82861] 0 2026-03-10T14:08:51.373 INFO:tasks.workunit.client.0.vm03.stdout:1/964: write d0/d18/d3b/d50/ffd [888438,115718] 0 2026-03-10T14:08:51.400 INFO:tasks.workunit.client.0.vm03.stdout:1/965: dread d0/d18/d1d/dc4/f138 [0,4194304] 0 2026-03-10T14:08:51.408 INFO:tasks.workunit.client.0.vm03.stdout:0/911: getdents d3/d11/d76/db5/ddb 0 2026-03-10T14:08:51.410 
INFO:tasks.workunit.client.0.vm03.stdout:0/912: symlink d3/d16/d21/dd6/l121 0 2026-03-10T14:08:51.411 INFO:tasks.workunit.client.0.vm03.stdout:0/913: truncate d3/d16/f3e 8840480 0 2026-03-10T14:08:51.445 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:51 vm03.local ceph-mon[49718]: pgmap v12: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 44 MiB/s rd, 84 MiB/s wr, 254 op/s 2026-03-10T14:08:51.445 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:51 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:51.445 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:51 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:51.445 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:51 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:51.445 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:51 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:51.518 INFO:tasks.workunit.client.0.vm03.stdout:2/978: truncate d5/d10/da3/d136/dff/f11b 2894415 0 2026-03-10T14:08:51.525 INFO:tasks.workunit.client.0.vm03.stdout:2/979: creat d5/db4/d74/f13f x:0 0 0 2026-03-10T14:08:51.527 INFO:tasks.workunit.client.0.vm03.stdout:0/914: write d3/d11/d76/db5/fe4 [800357,69525] 0 2026-03-10T14:08:51.536 INFO:tasks.workunit.client.0.vm03.stdout:2/980: creat d5/d10/d1f/d4f/d76/f140 x:0 0 0 2026-03-10T14:08:51.540 INFO:tasks.workunit.client.0.vm03.stdout:7/832: dwrite d5/d9/d14/d26/d36/d51/dc8/fd2 [0,4194304] 0 2026-03-10T14:08:51.542 INFO:tasks.workunit.client.0.vm03.stdout:2/981: creat d5/d10/d1f/d4f/d76/da7/d104/f141 x:0 0 0 2026-03-10T14:08:51.542 INFO:tasks.workunit.client.0.vm03.stdout:2/982: chown d5/d2a/d110 42 1 2026-03-10T14:08:51.618 INFO:tasks.workunit.client.0.vm03.stdout:0/915: truncate d3/d46/ffc 694882 0 2026-03-10T14:08:51.619 INFO:tasks.workunit.client.0.vm03.stdout:2/983: write d5/d10/d31/f3d 
[1537014,123972] 0 2026-03-10T14:08:51.621 INFO:tasks.workunit.client.0.vm03.stdout:0/916: mknod d3/d46/dac/c122 0 2026-03-10T14:08:51.645 INFO:tasks.workunit.client.0.vm03.stdout:9/979: unlink d2/d14/c48 0 2026-03-10T14:08:51.655 INFO:tasks.workunit.client.0.vm03.stdout:2/984: dread d5/d10/d1f/d4f/d76/da7/d54/d5f/f11e [0,4194304] 0 2026-03-10T14:08:51.657 INFO:tasks.workunit.client.0.vm03.stdout:2/985: symlink d5/d10/d1f/d4f/d76/da7/d40/dc2/l142 0 2026-03-10T14:08:51.658 INFO:tasks.workunit.client.0.vm03.stdout:2/986: read d5/dac/ff1 [1696865,121957] 0 2026-03-10T14:08:51.660 INFO:tasks.workunit.client.0.vm03.stdout:2/987: mkdir d5/db4/d74/d83/d143 0 2026-03-10T14:08:51.661 INFO:tasks.workunit.client.0.vm03.stdout:2/988: truncate d5/db4/d74/d121/f70 317311 0 2026-03-10T14:08:51.663 INFO:tasks.workunit.client.0.vm03.stdout:2/989: mknod d5/d10/d1f/d4f/d76/da7/d54/c144 0 2026-03-10T14:08:51.664 INFO:tasks.workunit.client.0.vm03.stdout:2/990: dread d5/d10/d1f/d4f/d76/da7/d54/d5f/f11e [0,4194304] 0 2026-03-10T14:08:51.665 INFO:tasks.workunit.client.0.vm03.stdout:2/991: chown d5/d10/d31/ca6 7905998 1 2026-03-10T14:08:51.666 INFO:tasks.workunit.client.0.vm03.stdout:2/992: symlink d5/db4/d74/d83/d143/l145 0 2026-03-10T14:08:51.668 INFO:tasks.workunit.client.0.vm03.stdout:2/993: creat d5/db4/d74/d121/d85/f146 x:0 0 0 2026-03-10T14:08:51.668 INFO:tasks.workunit.client.0.vm03.stdout:2/994: chown d5/db4/d74/d83/dc1/f139 74930 1 2026-03-10T14:08:51.669 INFO:tasks.workunit.client.0.vm03.stdout:2/995: mkdir d5/d10/d1f/d4f/d76/dfc/d147 0 2026-03-10T14:08:51.672 INFO:tasks.workunit.client.0.vm03.stdout:1/966: symlink d0/d2/df/d27/d7e/d81/l13d 0 2026-03-10T14:08:51.672 INFO:tasks.workunit.client.0.vm03.stdout:6/988: rename d8/db/d12/l6b to d8/db/df/dcd/d108/l12f 0 2026-03-10T14:08:51.674 INFO:tasks.workunit.client.0.vm03.stdout:7/833: getdents d5/d9/d14/d26/d39 0 2026-03-10T14:08:51.675 INFO:tasks.workunit.client.0.vm03.stdout:2/996: link d5/db4/d74/d127/d12e/c13b 
d5/d10/d1f/d4f/d76/c148 0 2026-03-10T14:08:51.680 INFO:tasks.workunit.client.0.vm03.stdout:9/980: sync 2026-03-10T14:08:51.683 INFO:tasks.workunit.client.0.vm03.stdout:1/967: mkdir d0/d2/df/d91/d111/d13e 0 2026-03-10T14:08:51.684 INFO:tasks.workunit.client.0.vm03.stdout:0/917: mknod d3/d4d/d30/c123 0 2026-03-10T14:08:51.684 INFO:tasks.workunit.client.0.vm03.stdout:1/968: readlink d0/l57 0 2026-03-10T14:08:51.685 INFO:tasks.workunit.client.0.vm03.stdout:1/969: write d0/d18/d3b/db7/fbd [430471,62247] 0 2026-03-10T14:08:51.688 INFO:tasks.workunit.client.0.vm03.stdout:2/997: rename d5/d10/d1f/d4f/d76/da7/d54/lfa to d5/db4/d74/d106/l149 0 2026-03-10T14:08:51.688 INFO:tasks.workunit.client.0.vm03.stdout:2/998: write d5/d10/d17/fb6 [3482015,26153] 0 2026-03-10T14:08:51.690 INFO:tasks.workunit.client.0.vm03.stdout:0/918: creat d3/d11/d2c/d4a/d4b/d89/db9/f124 x:0 0 0 2026-03-10T14:08:51.692 INFO:tasks.workunit.client.0.vm03.stdout:2/999: read - d5/db4/d74/d121/d85/f11a zero size 2026-03-10T14:08:51.701 INFO:tasks.workunit.client.0.vm03.stdout:0/919: truncate d3/d11/f13 1095354 0 2026-03-10T14:08:51.705 INFO:tasks.workunit.client.0.vm03.stdout:7/834: creat d5/d9/d14/d26/d39/db1/f116 x:0 0 0 2026-03-10T14:08:51.708 INFO:tasks.workunit.client.0.vm03.stdout:6/989: symlink d8/db/d12/l130 0 2026-03-10T14:08:51.708 INFO:tasks.workunit.client.0.vm03.stdout:7/835: chown d5/d9/d14/d26/d39/lbf 405779028 1 2026-03-10T14:08:51.709 INFO:tasks.workunit.client.0.vm03.stdout:6/990: dread - d8/d1b/f124 zero size 2026-03-10T14:08:51.710 INFO:tasks.workunit.client.0.vm03.stdout:7/836: dread - d5/d9/d14/d26/d39/f111 zero size 2026-03-10T14:08:51.722 INFO:tasks.workunit.client.0.vm03.stdout:0/920: getdents d3/d4d/da0/d11d 0 2026-03-10T14:08:51.723 INFO:tasks.workunit.client.0.vm03.stdout:6/991: symlink d8/d3b/da7/l131 0 2026-03-10T14:08:51.726 INFO:tasks.workunit.client.0.vm03.stdout:6/992: chown d8/db/d49/d76/ff2 24 1 2026-03-10T14:08:51.727 INFO:tasks.workunit.client.0.vm03.stdout:6/993: chown 
d8/d11/da0/dbf/fed 45306374 1 2026-03-10T14:08:51.733 INFO:tasks.workunit.client.0.vm03.stdout:0/921: dwrite d3/f10 [0,4194304] 0 2026-03-10T14:08:51.738 INFO:tasks.workunit.client.0.vm03.stdout:0/922: stat d3/d46/d5e/ca5 0 2026-03-10T14:08:51.747 INFO:tasks.workunit.client.0.vm03.stdout:0/923: creat d3/d16/d21/d3c/f125 x:0 0 0 2026-03-10T14:08:51.749 INFO:tasks.workunit.client.0.vm03.stdout:7/837: mknod d5/d9/d14/d26/d36/db4/d102/c117 0 2026-03-10T14:08:51.749 INFO:tasks.workunit.client.0.vm03.stdout:6/994: creat d8/d11/da0/f132 x:0 0 0 2026-03-10T14:08:51.758 INFO:tasks.workunit.client.0.vm03.stdout:0/924: fdatasync d3/d11/d66/f86 0 2026-03-10T14:08:51.783 INFO:tasks.workunit.client.0.vm03.stdout:9/981: getdents d2/d14/d2b/d76 0 2026-03-10T14:08:51.783 INFO:tasks.workunit.client.0.vm03.stdout:9/982: chown d2/fd2 246465 1 2026-03-10T14:08:51.786 INFO:tasks.workunit.client.0.vm03.stdout:9/983: chown d2/d29/d38/f10f 1408872 1 2026-03-10T14:08:51.789 INFO:tasks.workunit.client.0.vm03.stdout:9/984: stat d2/d29/d33/d60/d8c/dd3/dfe/f12d 0 2026-03-10T14:08:51.903 INFO:tasks.workunit.client.0.vm03.stdout:0/925: unlink d3/d11/l84 0 2026-03-10T14:08:51.903 INFO:tasks.workunit.client.0.vm03.stdout:0/926: chown d3/d11/d2c/d4a/d4b/f7d 12 1 2026-03-10T14:08:51.939 INFO:tasks.workunit.client.0.vm03.stdout:6/995: rmdir d8/db/df 39 2026-03-10T14:08:51.962 INFO:tasks.workunit.client.0.vm03.stdout:0/927: symlink d3/d4d/l126 0 2026-03-10T14:08:51.967 INFO:tasks.workunit.client.0.vm03.stdout:0/928: dwrite d3/d16/d21/d3c/f5b [0,4194304] 0 2026-03-10T14:08:51.972 INFO:tasks.workunit.client.0.vm03.stdout:1/970: dwrite d0/d18/d3b/d50/fd6 [0,4194304] 0 2026-03-10T14:08:52.004 INFO:tasks.workunit.client.0.vm03.stdout:0/929: dwrite d3/d4d/fec [0,4194304] 0 2026-03-10T14:08:52.041 INFO:tasks.workunit.client.0.vm03.stdout:9/985: rename d2/fd2 to d2/f13a 0 2026-03-10T14:08:52.041 INFO:tasks.workunit.client.0.vm03.stdout:7/838: truncate d5/d9/d14/d26/f5c 1865766 0 2026-03-10T14:08:52.042 
INFO:tasks.workunit.client.0.vm03.stdout:7/839: chown d5/d9/d14/d26/d36/d51/dc8/c104 74 1 2026-03-10T14:08:52.043 INFO:tasks.workunit.client.0.vm03.stdout:6/996: creat d8/d11/d7a/f133 x:0 0 0 2026-03-10T14:08:52.075 INFO:tasks.workunit.client.0.vm03.stdout:0/930: mkdir d3/d11/d76/d127 0 2026-03-10T14:08:52.077 INFO:tasks.workunit.client.0.vm03.stdout:0/931: truncate d3/d4d/d8f/fb3 405017 0 2026-03-10T14:08:52.079 INFO:tasks.workunit.client.0.vm03.stdout:9/986: dread d2/d29/ddc/f128 [0,4194304] 0 2026-03-10T14:08:52.079 INFO:tasks.workunit.client.0.vm03.stdout:0/932: symlink d3/d11/d2c/d4a/d4b/l128 0 2026-03-10T14:08:52.080 INFO:tasks.workunit.client.0.vm03.stdout:0/933: mkdir d3/d4d/d116/d129 0 2026-03-10T14:08:52.083 INFO:tasks.workunit.client.0.vm03.stdout:7/840: creat d5/d9/d14/d26/d36/db4/f118 x:0 0 0 2026-03-10T14:08:52.084 INFO:tasks.workunit.client.0.vm03.stdout:6/997: creat d8/d11/d10c/d12e/f134 x:0 0 0 2026-03-10T14:08:52.084 INFO:tasks.workunit.client.0.vm03.stdout:0/934: creat d3/d17/f12a x:0 0 0 2026-03-10T14:08:52.085 INFO:tasks.workunit.client.0.vm03.stdout:0/935: fsync d3/d11/d2c/d4a/d4b/d89/db9/f124 0 2026-03-10T14:08:52.088 INFO:tasks.workunit.client.0.vm03.stdout:9/987: fdatasync d2/d29/d33/d60/fc6 0 2026-03-10T14:08:52.089 INFO:tasks.workunit.client.0.vm03.stdout:0/936: dwrite d3/d11/d2c/d4a/fcf [4194304,4194304] 0 2026-03-10T14:08:52.101 INFO:tasks.workunit.client.0.vm03.stdout:7/841: fdatasync d5/d9/d35/da8/ff6 0 2026-03-10T14:08:52.110 INFO:tasks.workunit.client.0.vm03.stdout:1/971: write d0/d2/f46 [516271,122938] 0 2026-03-10T14:08:52.117 INFO:tasks.workunit.client.0.vm03.stdout:9/988: symlink d2/d29/d33/d60/d8c/dd3/dfe/l13b 0 2026-03-10T14:08:52.177 INFO:tasks.workunit.client.0.vm03.stdout:7/842: mkdir d5/d9/d14/d26/d36/d51/d7b/d83/d119 0 2026-03-10T14:08:52.194 INFO:tasks.workunit.client.0.vm03.stdout:1/972: truncate d0/d2/df/d16/f61 9376113 0 2026-03-10T14:08:52.195 INFO:tasks.workunit.client.0.vm03.stdout:1/973: read d0/d2/d71/d90/f9e 
[2397873,130862] 0 2026-03-10T14:08:52.209 INFO:tasks.workunit.client.0.vm03.stdout:9/989: mkdir d2/d29/dcd/d72/d13c 0 2026-03-10T14:08:52.210 INFO:tasks.workunit.client.0.vm03.stdout:0/937: creat d3/d11/d2c/f12b x:0 0 0 2026-03-10T14:08:52.215 INFO:tasks.workunit.client.0.vm03.stdout:0/938: write d3/d16/d21/f78 [860424,101959] 0 2026-03-10T14:08:52.215 INFO:tasks.workunit.client.0.vm03.stdout:0/939: chown d3/d4d/d30/f7a 19724044 1 2026-03-10T14:08:52.216 INFO:tasks.workunit.client.0.vm03.stdout:0/940: readlink d3/d11/d2c/d4a/d11e/l7f 0 2026-03-10T14:08:52.222 INFO:tasks.workunit.client.0.vm03.stdout:0/941: truncate d3/d11/d2c/d4a/d4b/d89/db6/df4/ffd 2266198 0 2026-03-10T14:08:52.223 INFO:tasks.workunit.client.0.vm03.stdout:0/942: creat d3/d46/de0/f12c x:0 0 0 2026-03-10T14:08:52.224 INFO:tasks.workunit.client.0.vm03.stdout:0/943: mkdir d3/d11/d2c/d4a/d4b/d110/d12d 0 2026-03-10T14:08:52.225 INFO:tasks.workunit.client.0.vm03.stdout:7/843: truncate d5/d9/d14/d26/d5f/d89/fbd 1467604 0 2026-03-10T14:08:52.239 INFO:tasks.workunit.client.0.vm03.stdout:6/998: mkdir d8/db/d12/d64/de0/d135 0 2026-03-10T14:08:52.240 INFO:tasks.workunit.client.0.vm03.stdout:6/999: stat d8/d1b/d1c/c4d 0 2026-03-10T14:08:52.251 INFO:tasks.workunit.client.0.vm03.stdout:0/944: dread d3/d46/dac/d79/f8d [0,4194304] 0 2026-03-10T14:08:52.253 INFO:tasks.workunit.client.0.vm03.stdout:0/945: symlink d3/d11/d2c/d4a/d4b/d89/db9/l12e 0 2026-03-10T14:08:52.254 INFO:tasks.workunit.client.0.vm03.stdout:0/946: mkdir d3/d11/d2c/d4a/d4b/d110/d12d/d12f 0 2026-03-10T14:08:52.255 INFO:tasks.workunit.client.0.vm03.stdout:0/947: stat d3/d16/lf3 0 2026-03-10T14:08:52.263 INFO:tasks.workunit.client.0.vm03.stdout:0/948: dwrite d3/d11/d2c/d4a/d11e/fd9 [0,4194304] 0 2026-03-10T14:08:52.272 INFO:tasks.workunit.client.0.vm03.stdout:9/990: unlink d2/d29/dcd/dd8/lda 0 2026-03-10T14:08:52.298 INFO:tasks.workunit.client.0.vm03.stdout:0/949: sync 2026-03-10T14:08:52.299 INFO:tasks.workunit.client.0.vm03.stdout:0/950: mkdir 
d3/d46/dac/d130 0 2026-03-10T14:08:52.299 INFO:tasks.workunit.client.0.vm03.stdout:7/844: mknod d5/d9/d14/d26/d36/de1/c11a 0 2026-03-10T14:08:52.300 INFO:tasks.workunit.client.0.vm03.stdout:9/991: fdatasync d2/d14/d2b/d79/fd1 0 2026-03-10T14:08:52.303 INFO:tasks.workunit.client.0.vm03.stdout:1/974: getdents d0/d2/df/dab/dc3 0 2026-03-10T14:08:52.304 INFO:tasks.workunit.client.0.vm03.stdout:0/951: sync 2026-03-10T14:08:52.305 INFO:tasks.workunit.client.0.vm03.stdout:0/952: readlink d3/d11/d76/lae 0 2026-03-10T14:08:52.311 INFO:tasks.workunit.client.0.vm03.stdout:7/845: rename d5/d9/da7/ffa to d5/d9/d14/d26/d5f/d89/d112/f11b 0 2026-03-10T14:08:52.312 INFO:tasks.workunit.client.0.vm03.stdout:7/846: stat d5/d9/d14/d26/d36/de1/dcc/ce4 0 2026-03-10T14:08:52.357 INFO:tasks.workunit.client.0.vm03.stdout:1/975: mkdir d0/d18/d3b/d13f 0 2026-03-10T14:08:52.379 INFO:tasks.workunit.client.0.vm03.stdout:0/953: read d3/d11/d2c/d4a/d4b/d89/db6/df4/f87 [1751171,58773] 0 2026-03-10T14:08:52.403 INFO:tasks.workunit.client.0.vm03.stdout:1/976: rmdir d0/d2 39 2026-03-10T14:08:52.445 INFO:tasks.workunit.client.0.vm03.stdout:0/954: write d3/d11/d2c/d4a/d11e/f74 [1385112,78777] 0 2026-03-10T14:08:52.449 INFO:tasks.workunit.client.0.vm03.stdout:0/955: creat d3/d11/d2c/d4a/d4b/d89/d108/f131 x:0 0 0 2026-03-10T14:08:52.500 INFO:tasks.workunit.client.0.vm03.stdout:7/847: dread d5/d9/d35/f7c [0,4194304] 0 2026-03-10T14:08:52.529 INFO:tasks.workunit.client.0.vm03.stdout:9/992: rename d2/f37 to d2/d29/f13d 0 2026-03-10T14:08:52.534 INFO:tasks.workunit.client.0.vm03.stdout:0/956: rename d3/d46/dac/d79/f93 to d3/d16/d21/f132 0 2026-03-10T14:08:52.536 INFO:tasks.workunit.client.0.vm03.stdout:7/848: mkdir d5/d9/d14/d26/d36/d51/d11c 0 2026-03-10T14:08:52.542 INFO:tasks.workunit.client.0.vm03.stdout:1/977: mknod d0/d18/daa/c140 0 2026-03-10T14:08:52.545 INFO:tasks.workunit.client.0.vm03.stdout:0/957: chown d3/cd2 3 1 2026-03-10T14:08:52.545 INFO:tasks.workunit.client.0.vm03.stdout:0/958: dread - 
d3/d46/dac/fce zero size 2026-03-10T14:08:52.549 INFO:tasks.workunit.client.0.vm03.stdout:0/959: creat d3/d4d/d8f/f133 x:0 0 0 2026-03-10T14:08:52.552 INFO:tasks.workunit.client.0.vm03.stdout:0/960: creat d3/d16/d21/df0/f134 x:0 0 0 2026-03-10T14:08:52.557 INFO:tasks.workunit.client.0.vm03.stdout:0/961: creat d3/d16/d21/f135 x:0 0 0 2026-03-10T14:08:52.558 INFO:tasks.workunit.client.0.vm03.stdout:0/962: truncate d3/d16/f3e 9344043 0 2026-03-10T14:08:52.558 INFO:tasks.workunit.client.0.vm03.stdout:7/849: mkdir d5/d9/d14/d26/d39/d11d 0 2026-03-10T14:08:52.563 INFO:tasks.workunit.client.0.vm03.stdout:0/963: getdents d3/d11/d2c/d4a/d4b/d110 0 2026-03-10T14:08:52.563 INFO:tasks.workunit.client.0.vm03.stdout:7/850: dread - d5/d9/d14/d26/d5f/de7/fef zero size 2026-03-10T14:08:52.568 INFO:tasks.workunit.client.0.vm03.stdout:0/964: dwrite d3/d16/d21/d3c/f99 [0,4194304] 0 2026-03-10T14:08:52.577 INFO:tasks.workunit.client.0.vm03.stdout:0/965: write d3/d11/d76/db5/ddb/ff1 [1404051,93585] 0 2026-03-10T14:08:52.579 INFO:tasks.workunit.client.0.vm03.stdout:9/993: truncate d2/d14/d2b/d79/fb8 516465 0 2026-03-10T14:08:52.583 INFO:tasks.workunit.client.0.vm03.stdout:0/966: mknod d3/d11/d2c/d4a/d4b/d89/db6/c136 0 2026-03-10T14:08:52.588 INFO:tasks.workunit.client.0.vm03.stdout:0/967: creat d3/d11/d2c/d10e/f137 x:0 0 0 2026-03-10T14:08:52.590 INFO:tasks.workunit.client.0.vm03.stdout:9/994: creat d2/d29/d33/d41/d46/dd0/f13e x:0 0 0 2026-03-10T14:08:52.590 INFO:tasks.workunit.client.0.vm03.stdout:9/995: write d2/d29/dcd/dd8/fed [235638,13041] 0 2026-03-10T14:08:52.592 INFO:tasks.workunit.client.0.vm03.stdout:0/968: fsync d3/fbf 0 2026-03-10T14:08:52.593 INFO:tasks.workunit.client.0.vm03.stdout:7/851: stat d5/d9/d14/d26/d39/c86 0 2026-03-10T14:08:52.594 INFO:tasks.workunit.client.0.vm03.stdout:7/852: write d5/d9/d14/d21/d28/f6e [4380542,45159] 0 2026-03-10T14:08:52.615 INFO:tasks.workunit.client.0.vm03.stdout:9/996: mkdir d2/d29/dcd/d72/d9d/d13f 0 2026-03-10T14:08:52.644 
INFO:tasks.workunit.client.0.vm03.stdout:1/978: truncate d0/d2/df/fce 2711963 0 2026-03-10T14:08:52.648 INFO:tasks.workunit.client.0.vm03.stdout:1/979: dwrite d0/d2/df/d27/f52 [0,4194304] 0 2026-03-10T14:08:52.649 INFO:tasks.workunit.client.0.vm03.stdout:1/980: stat d0/d18/d3b/db7/fbd 0 2026-03-10T14:08:52.651 INFO:tasks.workunit.client.0.vm03.stdout:0/969: creat d3/d11/d66/da2/d104/f138 x:0 0 0 2026-03-10T14:08:52.656 INFO:tasks.workunit.client.0.vm03.stdout:7/853: mkdir d5/d9/d14/d26/d36/de1/d84/d11e 0 2026-03-10T14:08:52.656 INFO:tasks.workunit.client.0.vm03.stdout:9/997: dread d2/d29/d38/f4b [0,4194304] 0 2026-03-10T14:08:52.657 INFO:tasks.workunit.client.0.vm03.stdout:9/998: stat d2/d29/dcd/d72/d9d/fa6 0 2026-03-10T14:08:52.664 INFO:tasks.workunit.client.0.vm03.stdout:0/970: mkdir d3/d11/d2c/d4a/d4b/d89/db6/d139 0 2026-03-10T14:08:52.681 INFO:tasks.workunit.client.0.vm03.stdout:1/981: mkdir d0/d2/df/d91/d141 0 2026-03-10T14:08:52.692 INFO:tasks.workunit.client.0.vm03.stdout:0/971: write d3/d11/d76/dbc/f117 [3592141,52780] 0 2026-03-10T14:08:52.695 INFO:tasks.workunit.client.0.vm03.stdout:0/972: truncate d3/d46/ffc 772619 0 2026-03-10T14:08:52.705 INFO:tasks.workunit.client.0.vm03.stdout:1/982: symlink d0/d2/df/dab/dc3/l142 0 2026-03-10T14:08:52.707 INFO:tasks.workunit.client.0.vm03.stdout:7/854: rename d5/d9/d14/d26/d5f/l70 to d5/d9/d14/d26/d36/d51/d11c/l11f 0 2026-03-10T14:08:52.726 INFO:tasks.workunit.client.0.vm03.stdout:1/983: sync 2026-03-10T14:08:52.727 INFO:tasks.workunit.client.0.vm03.stdout:1/984: write d0/d2/d71/d90/fc1 [2366558,22671] 0 2026-03-10T14:08:52.735 INFO:tasks.workunit.client.0.vm03.stdout:0/973: dread d3/f94 [0,4194304] 0 2026-03-10T14:08:52.742 INFO:tasks.workunit.client.0.vm03.stdout:9/999: creat d2/d14/d2b/f140 x:0 0 0 2026-03-10T14:08:52.757 INFO:tasks.workunit.client.0.vm03.stdout:7/855: creat d5/d9/d14/d26/d5f/de2/f120 x:0 0 0 2026-03-10T14:08:52.771 INFO:tasks.workunit.client.0.vm03.stdout:0/974: rename d3/d11/ca6 to 
d3/d11/d2c/d4a/d4b/d89/db9/de6/c13a 0 2026-03-10T14:08:52.773 INFO:tasks.workunit.client.0.vm03.stdout:0/975: creat d3/d11/d2c/d4a/d4b/d110/d12d/f13b x:0 0 0 2026-03-10T14:08:52.773 INFO:tasks.workunit.client.0.vm03.stdout:0/976: fdatasync d3/f10 0 2026-03-10T14:08:52.775 INFO:tasks.workunit.client.0.vm03.stdout:0/977: chown d3/d16/d21/df0/f134 4937119 1 2026-03-10T14:08:52.777 INFO:tasks.workunit.client.0.vm03.stdout:0/978: symlink d3/d11/d66/da2/l13c 0 2026-03-10T14:08:52.779 INFO:tasks.workunit.client.0.vm03.stdout:0/979: creat d3/d11/d76/db5/ddb/f13d x:0 0 0 2026-03-10T14:08:52.780 INFO:tasks.workunit.client.0.vm03.stdout:0/980: dread - d3/d17/f12a zero size 2026-03-10T14:08:52.784 INFO:tasks.workunit.client.0.vm03.stdout:0/981: rmdir d3/d11/d2c/d4a/d4b/d89/db6/d139 0 2026-03-10T14:08:52.785 INFO:tasks.workunit.client.0.vm03.stdout:0/982: dread - d3/d11/d2c/d4a/d4b/d89/db9/f124 zero size 2026-03-10T14:08:52.788 INFO:tasks.workunit.client.0.vm03.stdout:0/983: mkdir d3/d11/d66/da2/d13e 0 2026-03-10T14:08:52.791 INFO:tasks.workunit.client.0.vm03.stdout:0/984: rename d3/d11/d2c/d4a/d4b/d89/db6/c136 to d3/d4d/d116/c13f 0 2026-03-10T14:08:52.797 INFO:tasks.workunit.client.0.vm03.stdout:0/985: dwrite d3/d16/d21/fb8 [0,4194304] 0 2026-03-10T14:08:52.815 INFO:tasks.workunit.client.0.vm03.stdout:1/985: dwrite d0/d2/df/d16/f106 [0,4194304] 0 2026-03-10T14:08:52.816 INFO:tasks.workunit.client.0.vm03.stdout:7/856: dwrite d5/d9/d14/d26/d36/de1/d84/fc1 [0,4194304] 0 2026-03-10T14:08:52.819 INFO:tasks.workunit.client.0.vm03.stdout:0/986: link d3/d16/d21/c3f d3/d16/d21/d3c/c140 0 2026-03-10T14:08:52.833 INFO:tasks.workunit.client.0.vm03.stdout:7/857: write d5/d9/d14/d26/d36/de1/dcc/ff0 [395086,122112] 0 2026-03-10T14:08:52.884 INFO:tasks.workunit.client.0.vm03.stdout:0/987: write d3/d17/f35 [2097090,32549] 0 2026-03-10T14:08:52.897 INFO:tasks.workunit.client.0.vm03.stdout:0/988: link d3/d4d/l59 d3/d11/d66/da2/d104/l141 0 2026-03-10T14:08:52.898 
INFO:tasks.workunit.client.0.vm03.stdout:0/989: chown d3/d11/d66/l77 465 1 2026-03-10T14:08:52.902 INFO:tasks.workunit.client.0.vm03.stdout:1/986: write d0/d2/df/d27/d7e/d81/f109 [371822,130478] 0 2026-03-10T14:08:52.907 INFO:tasks.workunit.client.0.vm03.stdout:7/858: dwrite d5/d9/f7a [4194304,4194304] 0 2026-03-10T14:08:52.908 INFO:tasks.workunit.client.0.vm03.stdout:0/990: stat d3/d11/d2c/d4a/d4b/d89/db9/de6/c13a 0 2026-03-10T14:08:52.918 INFO:tasks.workunit.client.0.vm03.stdout:7/859: chown d5/d9/d14/d26/d39/db1/f116 483232 1 2026-03-10T14:08:52.918 INFO:tasks.workunit.client.0.vm03.stdout:0/991: dread d3/d11/d76/db5/ddb/ff1 [0,4194304] 0 2026-03-10T14:08:52.919 INFO:tasks.workunit.client.0.vm03.stdout:1/987: dwrite d0/d2/df/d16/d41/f119 [0,4194304] 0 2026-03-10T14:08:52.927 INFO:tasks.workunit.client.0.vm03.stdout:7/860: mkdir d5/d9/d14/d26/d39/db1/d121 0 2026-03-10T14:08:52.936 INFO:tasks.workunit.client.0.vm03.stdout:1/988: chown d0/d2/df/dab/f9a 38050 1 2026-03-10T14:08:52.936 INFO:tasks.workunit.client.0.vm03.stdout:7/861: dwrite d5/d9/f7a [4194304,4194304] 0 2026-03-10T14:08:52.937 INFO:tasks.workunit.client.0.vm03.stdout:7/862: read d5/d9/d35/f7c [3960164,37095] 0 2026-03-10T14:08:52.954 INFO:tasks.workunit.client.0.vm03.stdout:7/863: getdents d5/d9/d35/d101/d115 0 2026-03-10T14:08:52.956 INFO:tasks.workunit.client.0.vm03.stdout:7/864: fsync d5/d9/d14/f41 0 2026-03-10T14:08:52.958 INFO:tasks.workunit.client.0.vm03.stdout:7/865: dread - d5/d9/d14/d26/d39/db1/f116 zero size 2026-03-10T14:08:52.966 INFO:tasks.workunit.client.0.vm03.stdout:1/989: dread d0/d2/df/f43 [0,4194304] 0 2026-03-10T14:08:52.972 INFO:tasks.workunit.client.0.vm03.stdout:1/990: rmdir d0/d2/df/d16/d20 39 2026-03-10T14:08:52.979 INFO:tasks.workunit.client.0.vm03.stdout:7/866: getdents d5/d9/d14/d26/d5f/dce 0 2026-03-10T14:08:52.981 INFO:tasks.workunit.client.0.vm03.stdout:7/867: read d5/d9/d14/d21/d28/f6e [2782800,7374] 0 2026-03-10T14:08:52.982 
INFO:tasks.workunit.client.0.vm03.stdout:1/991: getdents d0/d2/d71/d90 0 2026-03-10T14:08:52.987 INFO:tasks.workunit.client.0.vm03.stdout:0/992: write d3/d11/f13 [498950,100170] 0 2026-03-10T14:08:52.988 INFO:tasks.workunit.client.0.vm03.stdout:0/993: stat d3/d16/d21/cff 0 2026-03-10T14:08:52.997 INFO:tasks.workunit.client.0.vm03.stdout:0/994: dwrite d3/d11/d2c/d4a/d11e/fd9 [0,4194304] 0 2026-03-10T14:08:52.998 INFO:tasks.workunit.client.0.vm03.stdout:1/992: mkdir d0/d2/df/d27/d143 0 2026-03-10T14:08:53.000 INFO:tasks.workunit.client.0.vm03.stdout:0/995: truncate d3/f94 4874205 0 2026-03-10T14:08:53.006 INFO:tasks.workunit.client.0.vm03.stdout:7/868: creat d5/d9/d14/d26/d5f/f122 x:0 0 0 2026-03-10T14:08:53.017 INFO:tasks.workunit.client.0.vm03.stdout:1/993: dread d0/d2/f9 [0,4194304] 0 2026-03-10T14:08:53.028 INFO:tasks.workunit.client.0.vm03.stdout:0/996: truncate d3/d11/d76/db5/ddb/f11b 82048 0 2026-03-10T14:08:53.031 INFO:tasks.workunit.client.0.vm03.stdout:7/869: write d5/d9/d14/d26/d36/d51/d7b/d83/f9e [3657134,85930] 0 2026-03-10T14:08:53.035 INFO:tasks.workunit.client.0.vm03.stdout:0/997: unlink d3/d46/dac/fce 0 2026-03-10T14:08:53.037 INFO:tasks.workunit.client.0.vm03.stdout:1/994: dwrite d0/d2/df/d16/d41/dba/fcb [0,4194304] 0 2026-03-10T14:08:53.045 INFO:tasks.workunit.client.0.vm03.stdout:0/998: write d3/d16/d21/f132 [727108,21078] 0 2026-03-10T14:08:53.054 INFO:tasks.workunit.client.0.vm03.stdout:0/999: dwrite d3/d11/d2c/d10e/f137 [0,4194304] 0 2026-03-10T14:08:53.091 INFO:tasks.workunit.client.0.vm03.stdout:7/870: truncate d5/d9/d14/d26/d5f/d89/fe8 623356 0 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local 
ceph-mon[49718]: pgmap v13: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 41 MiB/s rd, 77 MiB/s wr, 243 op/s 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.rwbbep"}]: dispatch 2026-03-10T14:08:53.108 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.rwbbep"}]: dispatch 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.rwbbep"}]': finished 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.ywwcto"}]: dispatch 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.ywwcto"}]: dispatch 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.ywwcto"}]': finished 2026-03-10T14:08:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:52 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T14:08:53.109 INFO:tasks.workunit.client.0.vm03.stdout:7/871: readlink d5/d9/d14/d21/d28/la9 0 2026-03-10T14:08:53.116 INFO:tasks.workunit.client.0.vm03.stdout:7/872: creat d5/d9/da7/d10a/f123 x:0 0 0 2026-03-10T14:08:53.120 INFO:tasks.workunit.client.0.vm03.stdout:1/995: dwrite d0/d2/df/d16/d20/fc2 [0,4194304] 0 2026-03-10T14:08:53.132 INFO:tasks.workunit.client.0.vm03.stdout:7/873: truncate d5/d9/d14/d26/d39/f6a 3424056 0 2026-03-10T14:08:53.133 INFO:tasks.workunit.client.0.vm03.stdout:1/996: mknod d0/d2/db6/c144 0 
2026-03-10T14:08:53.139 INFO:tasks.workunit.client.0.vm03.stdout:1/997: chown d0/d18/d3b/f3d 3900703 1 2026-03-10T14:08:53.139 INFO:tasks.workunit.client.0.vm03.stdout:7/874: mknod d5/d9/d14/d26/d36/de1/dcc/c124 0 2026-03-10T14:08:53.151 INFO:tasks.workunit.client.0.vm03.stdout:1/998: fdatasync d0/d2/df/d16/d20/fa9 0 2026-03-10T14:08:53.157 INFO:tasks.workunit.client.0.vm03.stdout:1/999: mkdir d0/d2/df/d27/dd7/d145 0 2026-03-10T14:08:53.218 INFO:tasks.workunit.client.0.vm03.stdout:7/875: sync 2026-03-10T14:08:53.238 INFO:tasks.workunit.client.0.vm03.stdout:7/876: truncate d5/d9/d14/d26/d39/fd4 622172 0 2026-03-10T14:08:53.248 INFO:tasks.workunit.client.0.vm03.stdout:7/877: readlink d5/d9/d14/d26/d39/lbf 0 2026-03-10T14:08:53.252 INFO:tasks.workunit.client.0.vm03.stdout:7/878: truncate d5/f32 181455 0 2026-03-10T14:08:53.260 INFO:tasks.workunit.client.0.vm03.stdout:7/879: dread d5/fc4 [0,4194304] 0 2026-03-10T14:08:53.287 INFO:tasks.workunit.client.0.vm03.stdout:7/880: truncate d5/d9/d14/d26/d36/d51/d7b/f87 689190 0 2026-03-10T14:08:53.297 INFO:tasks.workunit.client.0.vm03.stdout:7/881: write d5/d9/d14/f4d [4471771,84315] 0 2026-03-10T14:08:53.301 INFO:tasks.workunit.client.0.vm03.stdout:7/882: rename d5/d9/d35/da8 to d5/d9/d14/d26/d36/d51/d7b/d83/d119/d125 0 2026-03-10T14:08:53.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: pgmap v13: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 41 MiB/s rd, 77 MiB/s wr, 243 op/s 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.rwbbep"}]: dispatch 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.rwbbep"}]: dispatch 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local 
ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.rwbbep"}]': finished 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.ywwcto"}]: dispatch 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.ywwcto"}]: dispatch 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm04.ywwcto"}]': finished 2026-03-10T14:08:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:52 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T14:08:53.314 INFO:tasks.workunit.client.0.vm03.stdout:7/883: dread d5/d9/f17 [0,4194304] 0 2026-03-10T14:08:53.315 INFO:tasks.workunit.client.0.vm03.stdout:7/884: stat d5/d9/d35/laf 0 2026-03-10T14:08:53.329 INFO:tasks.workunit.client.0.vm03.stdout:7/885: dwrite d5/d9/d14/d26/d36/db4/feb [0,4194304] 0 2026-03-10T14:08:53.353 INFO:tasks.workunit.client.0.vm03.stdout:7/886: getdents d5/d9/d14/d26/d36/db4 0 2026-03-10T14:08:53.362 INFO:tasks.workunit.client.0.vm03.stdout:7/887: mkdir d5/d9/d14/d126 0 2026-03-10T14:08:53.433 INFO:tasks.workunit.client.0.vm03.stdout:7/888: getdents d5/d9/d14/d26/d36/d51/d7b/d83 0 2026-03-10T14:08:53.435 INFO:tasks.workunit.client.0.vm03.stdout:7/889: read d5/d9/d14/d26/d36/fa4 [3536023,68872] 0 2026-03-10T14:08:53.438 INFO:tasks.workunit.client.0.vm03.stdout:7/890: rmdir d5/d9/d14/d26/d36/de1/d84 39 
2026-03-10T14:08:53.439 INFO:tasks.workunit.client.0.vm03.stdout:7/891: stat d5/d9/d14/d26/d36/d51/d7b/d83/d119/d125/ff6 0 2026-03-10T14:08:53.440 INFO:tasks.workunit.client.0.vm03.stdout:7/892: chown d5/d9/d14/d26/d36/db4/d102 3 1 2026-03-10T14:08:53.457 INFO:tasks.workunit.client.0.vm03.stdout:7/893: dwrite d5/f32 [0,4194304] 0 2026-03-10T14:08:53.493 INFO:tasks.workunit.client.0.vm03.stdout:7/894: dwrite d5/d9/d14/d26/d5f/f8e [4194304,4194304] 0 2026-03-10T14:08:53.512 INFO:tasks.workunit.client.0.vm03.stdout:7/895: dread d5/d9/d14/d26/d5f/d89/d112/f11b [0,4194304] 0 2026-03-10T14:08:53.516 INFO:tasks.workunit.client.0.vm03.stdout:7/896: truncate d5/d9/d14/d21/d6f/f9d 1060842 0 2026-03-10T14:08:53.528 INFO:tasks.workunit.client.0.vm03.stdout:7/897: dread d5/d9/d35/d101/f107 [0,4194304] 0 2026-03-10T14:08:53.539 INFO:tasks.workunit.client.0.vm03.stdout:7/898: mkdir d5/d9/d14/d26/d36/de1/d84/d11e/d127 0 2026-03-10T14:08:53.540 INFO:tasks.workunit.client.0.vm03.stdout:7/899: write d5/d9/d14/d26/d39/db1/f116 [106979,16562] 0 2026-03-10T14:08:53.555 INFO:tasks.workunit.client.0.vm03.stdout:7/900: mkdir d5/d9/d14/d21/d28/d128 0 2026-03-10T14:08:53.559 INFO:tasks.workunit.client.0.vm03.stdout:7/901: rename d5/d9/d14/f41 to d5/d9/d14/d26/d5f/d89/f129 0 2026-03-10T14:08:53.559 INFO:tasks.workunit.client.0.vm03.stdout:7/902: chown d5/d9/d14/d26/f9f 2107237053 1 2026-03-10T14:08:53.569 INFO:tasks.workunit.client.0.vm03.stdout:7/903: symlink d5/d9/d14/d26/d39/d92/l12a 0 2026-03-10T14:08:53.575 INFO:tasks.workunit.client.0.vm03.stdout:7/904: dread d5/d9/f1f [0,4194304] 0 2026-03-10T14:08:53.587 INFO:tasks.workunit.client.0.vm03.stdout:7/905: dwrite d5/d9/d35/f7c [4194304,4194304] 0 2026-03-10T14:08:53.603 INFO:tasks.workunit.client.0.vm03.stdout:7/906: creat d5/d9/d14/d26/d36/d51/f12b x:0 0 0 2026-03-10T14:08:53.607 INFO:tasks.workunit.client.0.vm03.stdout:7/907: fdatasync d5/fc4 0 2026-03-10T14:08:53.608 INFO:tasks.workunit.client.0.vm03.stdout:7/908: write d5/d9/d35/f7c 
[2458909,13262] 0 2026-03-10T14:08:53.697 INFO:tasks.workunit.client.0.vm03.stdout:7/909: getdents d5/d9/d35 0 2026-03-10T14:08:53.707 INFO:tasks.workunit.client.0.vm03.stdout:7/910: chown d5/d9/d14/d26/d39/lb3 42157871 1 2026-03-10T14:08:53.714 INFO:tasks.workunit.client.0.vm03.stdout:7/911: rmdir d5/d9/d14/d26/d36/de1/d84/d11e 39 2026-03-10T14:08:53.723 INFO:tasks.workunit.client.0.vm03.stdout:7/912: rename d5/d9/d14/d21/d6f/c82 to d5/d9/d35/d101/d115/c12c 0 2026-03-10T14:08:53.755 INFO:tasks.workunit.client.0.vm03.stdout:7/913: dwrite d5/d9/f17 [0,4194304] 0 2026-03-10T14:08:53.757 INFO:tasks.workunit.client.0.vm03.stdout:7/914: chown d5/d9/d14/d26/d36/de1/c11a 3126 1 2026-03-10T14:08:53.807 INFO:tasks.workunit.client.0.vm03.stdout:7/915: write d5/d9/d14/d26/d36/d51/d7b/d9c/fc3 [666276,114271] 0 2026-03-10T14:08:53.810 INFO:tasks.workunit.client.0.vm03.stdout:7/916: rmdir d5/d9/d14/d26/d36/d51/d7b/d9c/dd5 39 2026-03-10T14:08:53.836 INFO:tasks.workunit.client.0.vm03.stdout:7/917: sync 2026-03-10T14:08:53.845 INFO:tasks.workunit.client.0.vm03.stdout:7/918: dwrite d5/d9/d14/d26/d39/f111 [0,4194304] 0 2026-03-10T14:08:53.878 INFO:tasks.workunit.client.0.vm03.stdout:7/919: dwrite d5/d9/d14/d26/d36/d51/d7b/f87 [0,4194304] 0 2026-03-10T14:08:53.885 INFO:tasks.workunit.client.0.vm03.stdout:7/920: fdatasync d5/d9/da7/d10a/f123 0 2026-03-10T14:08:53.905 INFO:tasks.workunit.client.0.vm03.stdout:7/921: creat d5/d9/d14/d26/f12d x:0 0 0 2026-03-10T14:08:53.915 INFO:tasks.workunit.client.0.vm03.stdout:7/922: rename d5/d9/da7/lc2 to d5/d9/d14/d26/d36/db4/l12e 0 2026-03-10T14:08:53.917 INFO:tasks.workunit.client.0.vm03.stdout:7/923: dread d5/d9/d14/d26/d36/d51/ff2 [0,4194304] 0 2026-03-10T14:08:53.918 INFO:tasks.workunit.client.0.vm03.stdout:7/924: chown d5/d9/d14/d26/d5f/f78 36004 1 2026-03-10T14:08:53.946 INFO:tasks.workunit.client.0.vm03.stdout:7/925: rename d5/d9/d14/d21/d28 to d5/d9/d14/d26/d36/d51/d7b/d83/d119/d125/d12f 0 2026-03-10T14:08:53.947 
INFO:tasks.workunit.client.0.vm03.stdout:7/926: dread d5/fc4 [0,4194304] 0 2026-03-10T14:08:53.969 INFO:tasks.workunit.client.0.vm03.stdout:7/927: dread d5/d9/d14/d26/d36/d51/d7b/d83/d119/d125/d12f/f6e [0,4194304] 0 2026-03-10T14:08:53.975 INFO:tasks.workunit.client.0.vm03.stdout:7/928: dwrite d5/d9/d14/d26/d39/f111 [0,4194304] 0 2026-03-10T14:08:53.983 INFO:tasks.workunit.client.0.vm03.stdout:7/929: dread d5/d9/d14/d26/d5f/f68 [0,4194304] 0 2026-03-10T14:08:54.008 INFO:tasks.workunit.client.0.vm03.stdout:7/930: dwrite d5/d9/d35/d101/f107 [0,4194304] 0 2026-03-10T14:08:54.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:53 vm04.local ceph-mon[55966]: Upgrade: Setting container_image for all mgr 2026-03-10T14:08:54.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:53 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:54.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:53 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:08:54.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:53 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T14:08:54.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:53 vm04.local ceph-mon[55966]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:08:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:53 vm03.local ceph-mon[49718]: Upgrade: Setting container_image for all mgr 2026-03-10T14:08:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:53 vm03.local ceph-mon[49718]: from='mgr.14704 ' entity='mgr.vm03.rwbbep' 2026-03-10T14:08:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:53 
vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:08:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:53 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T14:08:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:53 vm03.local ceph-mon[49718]: from='mgr.14704 192.168.123.103:0/59187310' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:08:54.149 INFO:tasks.workunit.client.0.vm03.stdout:7/931: rmdir d5 39 2026-03-10T14:08:54.160 INFO:tasks.workunit.client.0.vm03.stdout:7/932: dread d5/f6 [0,4194304] 0 2026-03-10T14:08:54.210 INFO:tasks.workunit.client.0.vm03.stdout:7/933: dwrite d5/d9/d14/d26/f5c [0,4194304] 0 2026-03-10T14:08:54.219 INFO:tasks.workunit.client.0.vm03.stdout:7/934: creat d5/d9/d14/d26/d36/d51/d7b/d83/f130 x:0 0 0 2026-03-10T14:08:54.227 INFO:tasks.workunit.client.0.vm03.stdout:7/935: dread d5/d9/f19 [0,4194304] 0 2026-03-10T14:08:54.233 INFO:tasks.workunit.client.0.vm03.stdout:7/936: fsync d5/d9/d14/d26/d36/d51/f12b 0 2026-03-10T14:08:54.234 INFO:tasks.workunit.client.0.vm03.stdout:7/937: chown d5/d9/d14/d26/d5f/f122 451792311 1 2026-03-10T14:08:54.234 INFO:tasks.workunit.client.0.vm03.stdout:7/938: symlink d5/d9/d35/d101/l131 0 2026-03-10T14:08:54.248 INFO:tasks.workunit.client.0.vm03.stdout:7/939: write d5/d9/d14/d26/d5f/d89/fcd [1370618,116866] 0 2026-03-10T14:08:54.259 INFO:tasks.workunit.client.0.vm03.stdout:7/940: unlink d5/d9/d14/d26/d36/d51/d7b/fc9 0 2026-03-10T14:08:54.262 INFO:tasks.workunit.client.0.vm03.stdout:7/941: write d5/d9/d14/d26/d5f/d89/f129 [2047172,79852] 0 2026-03-10T14:08:54.267 INFO:tasks.workunit.client.0.vm03.stdout:7/942: symlink d5/d9/d14/d26/d5f/d89/l132 0 2026-03-10T14:08:54.268 
INFO:tasks.workunit.client.0.vm03.stdout:7/943: truncate d5/d9/d14/d26/d5f/de2/f105 5067819 0 2026-03-10T14:08:54.281 INFO:tasks.workunit.client.0.vm03.stdout:7/944: dwrite d5/d9/d14/f23 [0,4194304] 0 2026-03-10T14:08:54.295 INFO:tasks.workunit.client.0.vm03.stdout:7/945: unlink d5/f53 0 2026-03-10T14:08:54.297 INFO:tasks.workunit.client.0.vm03.stdout:7/946: chown d5/d9/d14/d26/d36/de1/l4b 396498 1 2026-03-10T14:08:54.304 INFO:tasks.workunit.client.0.vm03.stdout:7/947: rmdir d5/d9/d14/d26/d36/d51/d7b/d9c/dd5 39 2026-03-10T14:08:54.319 INFO:tasks.workunit.client.0.vm03.stdout:7/948: dwrite d5/d9/d14/d26/d36/db4/fb7 [0,4194304] 0 2026-03-10T14:08:54.320 INFO:tasks.workunit.client.0.vm03.stdout:7/949: chown d5/d9/d14/f23 1 1 2026-03-10T14:08:54.337 INFO:tasks.workunit.client.0.vm03.stdout:7/950: link d5/d9/d14/d26/d36/db4/f118 d5/d9/d14/d26/d5f/d89/f133 0 2026-03-10T14:08:54.337 INFO:tasks.workunit.client.0.vm03.stdout:7/951: write d5/d9/f7a [32868,16112] 0 2026-03-10T14:08:54.354 INFO:tasks.workunit.client.0.vm03.stdout:7/952: creat d5/d9/d14/d26/d5f/d89/f134 x:0 0 0 2026-03-10T14:08:54.355 INFO:tasks.workunit.client.0.vm03.stdout:7/953: readlink d5/d9/d14/d26/d36/de1/d84/l10f 0 2026-03-10T14:08:54.357 INFO:tasks.workunit.client.0.vm03.stdout:7/954: dread - d5/d9/ff8 zero size 2026-03-10T14:08:54.370 INFO:tasks.workunit.client.0.vm03.stdout:7/955: mknod d5/d9/d14/d21/c135 0 2026-03-10T14:08:54.371 INFO:tasks.workunit.client.0.vm03.stdout:7/956: chown d5/d9/d14/f23 116384453 1 2026-03-10T14:08:54.374 INFO:tasks.workunit.client.0.vm03.stdout:7/957: symlink d5/d9/d14/d26/d36/d51/d7b/d83/d119/d125/l136 0 2026-03-10T14:08:54.413 INFO:tasks.workunit.client.0.vm03.stdout:7/958: creat d5/d9/d35/d101/d115/f137 x:0 0 0 2026-03-10T14:08:54.413 INFO:tasks.workunit.client.0.vm03.stdout:7/959: stat d5/d9/d14/d26/d36/db4 0 2026-03-10T14:08:54.414 INFO:tasks.workunit.client.0.vm03.stdout:7/960: read - d5/d9/d14/d26/d5f/de7/fef zero size 2026-03-10T14:08:54.423 
INFO:tasks.workunit.client.0.vm03.stdout:7/961: creat d5/d9/d14/d26/d36/de1/d84/d11e/f138 x:0 0 0 2026-03-10T14:08:54.438 INFO:tasks.workunit.client.0.vm03.stdout:7/962: dwrite d5/d9/d14/d26/d36/d51/dc8/fdd [0,4194304] 0 2026-03-10T14:08:54.449 INFO:tasks.workunit.client.0.vm03.stdout:7/963: link d5/d9/d14/d26/d5f/d89/fcd d5/d9/d14/d26/d36/d51/d11c/f139 0 2026-03-10T14:08:54.451 INFO:tasks.workunit.client.0.vm03.stdout:7/964: mknod d5/d9/d14/d26/d36/db4/c13a 0 2026-03-10T14:08:54.455 INFO:tasks.workunit.client.0.vm03.stdout:7/965: dread d5/d9/d14/d26/d36/d51/dc8/de3/f108 [0,4194304] 0 2026-03-10T14:08:54.458 INFO:tasks.workunit.client.0.vm03.stdout:7/966: dread d5/d9/d14/d26/d5f/d89/d112/f11b [0,4194304] 0 2026-03-10T14:08:54.459 INFO:tasks.workunit.client.0.vm03.stdout:7/967: read d5/d9/d14/d26/d36/de1/f4e [549537,116635] 0 2026-03-10T14:08:54.476 INFO:tasks.workunit.client.0.vm03.stdout:7/968: dwrite d5/d9/d14/d26/d36/db4/f118 [0,4194304] 0 2026-03-10T14:08:54.487 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local systemd[1]: Stopping Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:08:54.487 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[49714]: 2026-03-10T14:08:54.315+0000 7ffba7fb7700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm03 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:08:54.487 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[49714]: 2026-03-10T14:08:54.315+0000 7ffba7fb7700 -1 mon.vm03@0(leader) e2 *** Got Signal Terminated *** 2026-03-10T14:08:54.520 INFO:tasks.workunit.client.0.vm03.stdout:7/969: sync 2026-03-10T14:08:54.524 INFO:tasks.workunit.client.0.vm03.stdout:7/970: symlink d5/d9/d14/d26/d36/de1/d84/l13b 0 2026-03-10T14:08:54.526 INFO:tasks.workunit.client.0.vm03.stdout:7/971: symlink d5/d9/d14/d21/l13c 0 2026-03-10T14:08:54.532 INFO:tasks.workunit.client.0.vm03.stdout:7/972: dwrite d5/d9/d14/d26/d5f/f122 [0,4194304] 0 2026-03-10T14:08:54.570 INFO:tasks.workunit.client.0.vm03.stdout:7/973: dwrite d5/d9/d14/d26/d36/d51/d7b/d9c/dd5/df7/f109 [0,4194304] 0 2026-03-10T14:08:54.591 INFO:tasks.workunit.client.0.vm03.stdout:7/974: dread d5/d9/d14/d26/d39/db1/fe0 [0,4194304] 0 2026-03-10T14:08:54.593 INFO:tasks.workunit.client.0.vm03.stdout:7/975: creat d5/d98/f13d x:0 0 0 2026-03-10T14:08:54.594 INFO:tasks.workunit.client.0.vm03.stdout:7/976: chown d5/l5e 36 1 2026-03-10T14:08:54.597 INFO:tasks.workunit.client.0.vm03.stdout:7/977: chown d5/d9/d14/d26/d36/de1/d84/d11e 35691 1 2026-03-10T14:08:54.607 INFO:tasks.workunit.client.0.vm03.stdout:7/978: creat d5/d9/d14/d26/f13e x:0 0 0 2026-03-10T14:08:54.607 INFO:tasks.workunit.client.0.vm03.stdout:7/979: dread - d5/d9/da7/d10a/f123 zero size 2026-03-10T14:08:54.607 
INFO:tasks.workunit.client.0.vm03.stdout:7/980: stat d5/d9/d35/laf 0 2026-03-10T14:08:54.616 INFO:tasks.workunit.client.0.vm03.stdout:7/981: write d5/d9/d14/d26/d5f/f8c [3365611,75544] 0 2026-03-10T14:08:54.620 INFO:tasks.workunit.client.0.vm03.stdout:7/982: symlink d5/d9/d14/d26/d36/d51/d7b/d9c/dd5/l13f 0 2026-03-10T14:08:54.623 INFO:tasks.workunit.client.0.vm03.stdout:7/983: chown d5/d9/d14/d21/fb0 16362787 1 2026-03-10T14:08:54.626 INFO:tasks.workunit.client.0.vm03.stdout:7/984: fsync d5/d9/d14/d26/d36/de1/d84/f95 0 2026-03-10T14:08:54.636 INFO:tasks.workunit.client.0.vm03.stdout:7/985: creat d5/d9/d35/f140 x:0 0 0 2026-03-10T14:08:54.637 INFO:tasks.workunit.client.0.vm03.stdout:7/986: stat d5/d9/d14/d26/d36/d51/d11c/f139 0 2026-03-10T14:08:54.647 INFO:tasks.workunit.client.0.vm03.stdout:7/987: dwrite d5/d9/d14/d26/d36/d51/d7b/d83/d119/d125/ff6 [0,4194304] 0 2026-03-10T14:08:54.651 INFO:tasks.workunit.client.0.vm03.stdout:7/988: rename d5/d9/d14/c50 to d5/d9/da7/d10a/c141 0 2026-03-10T14:08:54.653 INFO:tasks.workunit.client.0.vm03.stdout:7/989: write d5/d9/d14/d26/d36/d51/d7b/d9c/dd5/df7/f109 [2820436,58898] 0 2026-03-10T14:08:54.671 INFO:tasks.workunit.client.0.vm03.stdout:7/990: symlink d5/d9/d14/d26/d5f/dce/l142 0 2026-03-10T14:08:54.673 INFO:tasks.workunit.client.0.vm03.stdout:7/991: rename d5/d9/d14/d21/c135 to d5/d9/d14/d26/d5f/de7/c143 0 2026-03-10T14:08:54.686 INFO:tasks.workunit.client.0.vm03.stdout:7/992: creat d5/d9/d14/d126/f144 x:0 0 0 2026-03-10T14:08:54.699 INFO:tasks.workunit.client.0.vm03.stdout:7/993: creat d5/d9/d14/d26/d36/de1/f145 x:0 0 0 2026-03-10T14:08:54.713 INFO:tasks.workunit.client.0.vm03.stdout:7/994: dwrite d5/d9/d14/d26/d5f/de7/fef [0,4194304] 0 2026-03-10T14:08:54.732 INFO:tasks.workunit.client.0.vm03.stdout:7/995: rmdir d5/d9/d14/d26/d36/d51/d7b/d83/d119/d125/d12f 39 2026-03-10T14:08:54.736 INFO:tasks.workunit.client.0.vm03.stdout:7/996: dwrite d5/d9/d14/d26/d5f/d89/f133 [4194304,4194304] 0 2026-03-10T14:08:54.746 
INFO:tasks.workunit.client.0.vm03.stdout:7/997: truncate d5/d9/d14/d26/d36/de1/d84/d11e/f138 1035375 0 2026-03-10T14:08:54.753 INFO:tasks.workunit.client.0.vm03.stdout:7/998: dwrite d5/d9/d14/d26/d36/de1/f145 [0,4194304] 0 2026-03-10T14:08:54.755 INFO:tasks.workunit.client.0.vm03.stdout:7/999: readlink d5/d9/d14/d26/d39/lb3 0 2026-03-10T14:08:54.770 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local podman[102973]: 2026-03-10 14:08:54.488870953 +0000 UTC m=+0.197140326 container died f59cc7d5bdfd721a1c607cf5c3921dd1e5f48688868f7fe7c9f2066755202e6e (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, ceph=True, org.label-schema.build-date=20231212, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.0, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 2026-03-10T14:08:54.770 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local podman[102973]: 2026-03-10 14:08:54.563407488 +0000 UTC m=+0.271676852 container remove f59cc7d5bdfd721a1c607cf5c3921dd1e5f48688868f7fe7c9f2066755202e6e (image=quay.io/ceph/ceph:v18.2.0, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True, org.label-schema.license=GPLv2, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, CEPH_POINT_RELEASE=-18.2.0) 2026-03-10T14:08:54.770 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local bash[102973]: 
ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03 2026-03-10T14:08:54.770 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03.service: Deactivated successfully. 2026-03-10T14:08:54.770 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local systemd[1]: Stopped Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:08:54.770 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03.service: Consumed 6.631s CPU time. 2026-03-10T14:08:54.770 INFO:tasks.workunit.client.0.vm03.stderr:+ rm -rf -- ./tmp.Ji2F8ZhDoI 2026-03-10T14:08:55.208 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:54 vm03.local systemd[1]: Starting Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:08:55.208 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local podman[103084]: 2026-03-10 14:08:55.171714295 +0000 UTC m=+0.084318638 container create c2a0f005ef9d25695d3e74722de7ae656b06ac4b6bde2933041a0986fe82fead (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, ceph=True) 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 
vm03.local podman[103084]: 2026-03-10 14:08:55.127938776 +0000 UTC m=+0.040543129 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local podman[103084]: 2026-03-10 14:08:55.26563516 +0000 UTC m=+0.178239514 container init c2a0f005ef9d25695d3e74722de7ae656b06ac4b6bde2933041a0986fe82fead (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local podman[103084]: 2026-03-10 14:08:55.281081796 +0000 UTC m=+0.193686139 container start c2a0f005ef9d25695d3e74722de7ae656b06ac4b6bde2933041a0986fe82fead (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20260223, 
org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local bash[103084]: c2a0f005ef9d25695d3e74722de7ae656b06ac4b6bde2933041a0986fe82fead 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local systemd[1]: Started Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: pidfile_write: ignore empty --pid-file 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: load: jerasure load: lrc 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: RocksDB version: 7.9.2 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Git sha 0 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T14:08:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: DB SUMMARY 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local 
ceph-mon[103098]: rocksdb: DB Session ID: 6SAE4GIV7Y6TF84I4PBB 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: CURRENT file: CURRENT 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: MANIFEST file: MANIFEST-000015 size: 775 Bytes 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm03/store.db dir, Total Num: 1, files: 000023.sst 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm03/store.db: 000021.log size: 2072506 ; 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.error_if_exists: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.create_if_missing: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.paranoid_checks: 1 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.env: 
0x555b4a165dc0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.info_log: 0x555b4c6e9900 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.statistics: (nil) 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.use_fsync: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_log_file_size: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.allow_fallocate: 1 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T14:08:55.609 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.use_direct_reads: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.db_log_dir: 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.wal_dir: 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T14:08:55.609 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.write_buffer_manager: 0x555b4c6ed900 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T14:08:55.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.unordered_write: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: 
Options.write_thread_max_yield_usec: 100 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.row_cache: None 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.wal_filter: None 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.two_write_queues: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.wal_compression: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.atomic_flush: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: 
Options.log_readahead_size: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_background_jobs: 2 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_background_compactions: -1 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_subcompactions: 1 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T14:08:55.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T14:08:55.610 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_open_files: -1 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_background_flushes: -1 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Compression algorithms supported: 
2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: kZSTD supported: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: kXpressCompression supported: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: kBZip2Compression supported: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: kLZ4Compression supported: 1 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: kZlibCompression supported: 1 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: kSnappyCompression supported: 1 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000015 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T14:08:55.611 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.merge_operator: 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_filter: None 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.sst_partitioner_factory: None 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555b4c6e9580) 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks: 1 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_top_level_index_and_filter: 1 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_type: 0 2026-03-10T14:08:55.611 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_index_type: 0 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_shortening: 1 
2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: checksum: 4 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: no_block_cache: 0 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache: 0x555b4c70c9b0 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_name: BinnedLRUCache 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_options: 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: capacity : 536870912 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_shard_bits : 4 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: strict_capacity_limit : 0 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: high_pri_pool_ratio: 0.000 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_compressed: (nil) 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: persistent_cache: (nil) 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size: 4096 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size_deviation: 10 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_restart_interval: 16 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_block_restart_interval: 1 2026-03-10T14:08:55.612 INFO:journalctl@ceph.mon.vm03.vm03.stdout: metadata_block_size: 4096 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: partition_filters: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: use_delta_encoding: 1 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: filter_policy: bloomfilter 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: whole_key_filtering: 1 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 
verify_compression: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: read_amp_bytes_per_bit: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: format_version: 5 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_index_compression: 1 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_align: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_auto_readahead_size: 262144 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: prepopulate_block_cache: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: initial_auto_readahead_size: 8192 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression: NoCompression 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.num_levels: 7 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: 
Options.min_write_buffer_number_to_merge: 1 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T14:08:55.613 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T14:08:55.613 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T14:08:55.614 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 
2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: 
Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.inplace_update_support: 0 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T14:08:55.614 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local 
ceph-mon[103098]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.bloom_locality: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.max_successive_merges: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.ttl: 2592000 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.enable_blob_files: false 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.min_blob_size: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local 
ceph-mon[103098]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 25, last_sequence is 8197, log_number is 21,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 21 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 21 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 6793a155-1d93-4889-a6c4-fc7b890f7683 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151735318545, "job": 1, "event": "recovery_started", "wal_files": [21]} 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #21 mode 2 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151735329398, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 26, "file_size": 1879053, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8198, "largest_seqno": 8836, "table_properties": {"data_size": 1874922, "index_size": 2258, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 7541, "raw_average_key_size": 24, "raw_value_size": 1867750, "raw_average_value_size": 5948, "num_data_blocks": 108, "num_entries": 314, "num_filter_entries": 314, "num_deletions": 2, "num_merge_operands": 0, 
"num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773151735, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "6793a155-1d93-4889-a6c4-fc7b890f7683", "db_session_id": "6SAE4GIV7Y6TF84I4PBB", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151735329632, "job": 1, "event": "recovery_finished"} 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/version_set.cc:5047] Creating manifest 28 2026-03-10T14:08:55.615 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:55 vm03.local ceph-mon[103098]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
2026-03-10T14:08:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: mon.vm03 calling monitor election 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: mon.vm03 is new leader, mons vm03,vm04 in quorum (ranks 0,1) 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: monmap epoch 2 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: fsid b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: last_changed 2026-03-10T14:03:54.187855+0000 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: created 2026-03-10T14:02:36.685064+0000 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: min_mon_release 18 (reef) 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: election_strategy: 1 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: 1: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.vm04 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: mgrmap e30: 
vm03.rwbbep(active, since 23s), standbys: vm04.ywwcto 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: overall HEALTH_OK 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: from='mgr.14704 ' entity='' 2026-03-10T14:08:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:56 vm04.local ceph-mon[55966]: mgrmap e31: vm03.rwbbep(active, since 23s), standbys: vm04.ywwcto 2026-03-10T14:08:56.332 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: mon.vm03 calling monitor election 2026-03-10T14:08:56.332 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: mon.vm03 is new leader, mons vm03,vm04 in quorum (ranks 0,1) 2026-03-10T14:08:56.332 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: monmap epoch 2 2026-03-10T14:08:56.332 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: fsid b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:08:56.332 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: last_changed 2026-03-10T14:03:54.187855+0000 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: created 2026-03-10T14:02:36.685064+0000 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: min_mon_release 18 (reef) 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: election_strategy: 1 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: 1: 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.vm04 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: mgrmap e30: vm03.rwbbep(active, since 23s), standbys: vm04.ywwcto 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: overall HEALTH_OK 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: from='mgr.14704 ' entity='' 2026-03-10T14:08:56.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:56 vm03.local ceph-mon[103098]: mgrmap e31: vm03.rwbbep(active, since 23s), standbys: vm04.ywwcto 2026-03-10T14:08:57.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:57 vm03.local ceph-mon[103098]: Standby manager daemon vm04.ywwcto restarted 2026-03-10T14:08:57.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:57 vm03.local ceph-mon[103098]: Standby manager daemon vm04.ywwcto started 2026-03-10T14:08:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:57 vm04.local ceph-mon[55966]: Standby manager daemon vm04.ywwcto restarted 2026-03-10T14:08:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:57 vm04.local ceph-mon[55966]: Standby manager daemon vm04.ywwcto started 2026-03-10T14:08:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:08:58 vm04.local ceph-mon[55966]: mgrmap e32: vm03.rwbbep(active, since 25s), standbys: vm04.ywwcto 2026-03-10T14:08:58.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:08:58 vm03.local ceph-mon[103098]: mgrmap e32: vm03.rwbbep(active, since 25s), standbys: 
vm04.ywwcto 2026-03-10T14:09:00.764 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:00 vm03.local ceph-mon[103098]: Active manager daemon vm03.rwbbep restarted 2026-03-10T14:09:00.764 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:00 vm03.local ceph-mon[103098]: Activating manager daemon vm03.rwbbep 2026-03-10T14:09:00.987 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:00 vm04.local ceph-mon[55966]: Active manager daemon vm03.rwbbep restarted 2026-03-10T14:09:00.987 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:00 vm04.local ceph-mon[55966]: Activating manager daemon vm03.rwbbep 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: mgrmap e33: vm03.rwbbep(active, starting, since 0.137664s), standbys: vm04.ywwcto 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:09:01.752 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm04.ywwcto", "id": "vm04.ywwcto"}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:09:01.752 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:09:01.752 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: Manager daemon vm03.rwbbep is now available 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 
cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:09:01.753 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:01 vm03.local ceph-mon[103098]: mgrmap e34: vm03.rwbbep(active, since 1.15433s), standbys: vm04.ywwcto 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: mgrmap e33: vm03.rwbbep(active, starting, since 0.137664s), standbys: vm04.ywwcto 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm03.rwbbep", "id": "vm03.rwbbep"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr metadata", "who": "vm04.ywwcto", "id": "vm04.ywwcto"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: Manager daemon vm03.rwbbep is now available 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/mirror_snapshot_schedule"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:01.929 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.rwbbep/trash_purge_schedule"}]: dispatch 2026-03-10T14:09:01.929 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:01 vm04.local ceph-mon[55966]: mgrmap e34: vm03.rwbbep(active, since 1.15433s), standbys: vm04.ywwcto 2026-03-10T14:09:02.534 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T14:09:02.535 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp 2026-03-10T14:09:02.637 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:02 vm03.local ceph-mon[103098]: [10/Mar/2026:14:09:01] ENGINE Bus STARTING 2026-03-10T14:09:02.637 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:02 vm03.local ceph-mon[103098]: [10/Mar/2026:14:09:02] ENGINE Serving on http://192.168.123.103:8765 2026-03-10T14:09:03.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:02 vm04.local ceph-mon[55966]: [10/Mar/2026:14:09:01] ENGINE Bus STARTING 2026-03-10T14:09:03.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:02 vm04.local ceph-mon[55966]: [10/Mar/2026:14:09:02] ENGINE Serving on http://192.168.123.103:8765 2026-03-10T14:09:03.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.564+0000 7f38e9940700 1 -- 192.168.123.103:0/3324998469 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41002b0 msgr2=0x7f38e41006c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:03.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.564+0000 7f38e9940700 1 --2- 192.168.123.103:0/3324998469 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41002b0 0x7f38e41006c0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f38d8009b00 tx=0x7f38d8009e10 comp rx=0 tx=0).stop 2026-03-10T14:09:03.564 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.565+0000 7f38e9940700 1 -- 192.168.123.103:0/3324998469 shutdown_connections 2026-03-10T14:09:03.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.565+0000 7f38e9940700 1 --2- 192.168.123.103:0/3324998469 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e41014b0 0x7f38e4101900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.565+0000 7f38e9940700 1 --2- 192.168.123.103:0/3324998469 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41002b0 0x7f38e41006c0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.565+0000 7f38e9940700 1 -- 192.168.123.103:0/3324998469 >> 192.168.123.103:0/3324998469 conn(0x7f38e40fb860 msgr2=0x7f38e40fdc90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:03.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.565+0000 7f38e9940700 1 -- 192.168.123.103:0/3324998469 shutdown_connections 2026-03-10T14:09:03.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.565+0000 7f38e9940700 1 -- 192.168.123.103:0/3324998469 wait complete. 
2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.566+0000 7f38e9940700 1 Processor -- start 2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.566+0000 7f38e9940700 1 -- start start 2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.566+0000 7f38e9940700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e41002b0 0x7f38e4197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.566+0000 7f38e9940700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41014b0 0x7f38e41982e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.566+0000 7f38e9940700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38e4198900 con 0x7f38e41014b0 2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.566+0000 7f38e9940700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38e4198a40 con 0x7f38e41002b0 2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.567+0000 7f38e3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41014b0 0x7f38e41982e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.567+0000 7f38e3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41014b0 0x7f38e41982e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:37314/0 (socket says 192.168.123.103:37314) 2026-03-10T14:09:03.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.567+0000 7f38e3fff700 1 -- 192.168.123.103:0/954161168 learned_addr learned my addr 192.168.123.103:0/954161168 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:03.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.567+0000 7f38e893e700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e41002b0 0x7f38e4197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:03.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.567+0000 7f38e3fff700 1 -- 192.168.123.103:0/954161168 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e41002b0 msgr2=0x7f38e4197da0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:03.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.567+0000 7f38e3fff700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e41002b0 0x7f38e4197da0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.567+0000 7f38e3fff700 1 -- 192.168.123.103:0/954161168 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38d80097e0 con 0x7f38e41014b0 2026-03-10T14:09:03.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.567+0000 7f38e3fff700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41014b0 0x7f38e41982e0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f38d000d8d0 tx=0x7f38d000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:09:03.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.568+0000 7f38e1ffb700 1 -- 192.168.123.103:0/954161168 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38d0009880 con 0x7f38e41014b0 2026-03-10T14:09:03.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.568+0000 7f38e1ffb700 1 -- 192.168.123.103:0/954161168 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f38d0010460 con 0x7f38e41014b0 2026-03-10T14:09:03.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.568+0000 7f38e1ffb700 1 -- 192.168.123.103:0/954161168 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38d000f5d0 con 0x7f38e41014b0 2026-03-10T14:09:03.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.568+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38e419d4f0 con 0x7f38e41014b0 2026-03-10T14:09:03.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.568+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38e419da40 con 0x7f38e41014b0 2026-03-10T14:09:03.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.569+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38e4105590 con 0x7f38e41014b0 2026-03-10T14:09:03.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.571+0000 7f38e1ffb700 1 -- 192.168.123.103:0/954161168 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 35) v1 ==== 100161+0+0 (secure 0 0 0) 0x7f38d00105d0 con 0x7f38e41014b0 2026-03-10T14:09:03.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.571+0000 
7f38e1ffb700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f38d4077a70 0x7f38d4079f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:03.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.571+0000 7f38e1ffb700 1 -- 192.168.123.103:0/954161168 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f38d0020030 con 0x7f38e41014b0 2026-03-10T14:09:03.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.572+0000 7f38e893e700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f38d4077a70 0x7f38d4079f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:03.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.572+0000 7f38e1ffb700 1 -- 192.168.123.103:0/954161168 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f38d0099e60 con 0x7f38e41014b0 2026-03-10T14:09:03.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.573+0000 7f38e893e700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f38d4077a70 0x7f38d4079f20 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f38d80052d0 tx=0x7f38d8005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:03.716 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: [10/Mar/2026:14:09:02] ENGINE Serving on https://192.168.123.103:7150 2026-03-10T14:09:03.716 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: [10/Mar/2026:14:09:02] ENGINE Bus STARTED 2026-03-10T14:09:03.716 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: [10/Mar/2026:14:09:02] ENGINE Client ('192.168.123.103', 41480) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: pgmap v4: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: mgrmap e35: vm03.rwbbep(active, since 2s), standbys: vm04.ywwcto 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": 
"osd_memory_target"}]: dispatch 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.717 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:03 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.718+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f38e4061190 con 0x7f38d4077a70 2026-03-10T14:09:03.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.720+0000 7f38e1ffb700 1 -- 192.168.123.103:0/954161168 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+337 (secure 0 0 0) 0x7f38e4061190 con 0x7f38d4077a70 2026-03-10T14:09:03.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f38d4077a70 msgr2=0x7f38d4079f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:03.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f38d4077a70 0x7f38d4079f20 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f38d80052d0 tx=0x7f38d8005fb0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41014b0 msgr2=0x7f38e41982e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41014b0 0x7f38e41982e0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f38d000d8d0 tx=0x7f38d000dbe0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 shutdown_connections 2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f38d4077a70 0x7f38d4079f20 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f38e41002b0 0x7f38e4197da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 --2- 192.168.123.103:0/954161168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f38e41014b0 0x7f38e41982e0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 >> 192.168.123.103:0/954161168 conn(0x7f38e40fb860 msgr2=0x7f38e41026d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 -- 192.168.123.103:0/954161168 shutdown_connections 2026-03-10T14:09:03.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.723+0000 7f38e9940700 1 -- 
192.168.123.103:0/954161168 wait complete. 2026-03-10T14:09:03.735 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: [10/Mar/2026:14:09:02] ENGINE Serving on https://192.168.123.103:7150 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: [10/Mar/2026:14:09:02] ENGINE Bus STARTED 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: [10/Mar/2026:14:09:02] ENGINE Client ('192.168.123.103', 41480) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: pgmap v4: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: mgrmap e35: vm03.rwbbep(active, since 2s), standbys: vm04.ywwcto 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.827 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:09:03.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:03.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.828+0000 7fd6ab59e700 1 -- 192.168.123.103:0/1664889108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 msgr2=0x7fd6ac0fef60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:03.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.828+0000 7fd6ab59e700 1 --2- 192.168.123.103:0/1664889108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 0x7fd6ac0fef60 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fd694009b50 tx=0x7fd694009e60 comp rx=0 tx=0).stop 2026-03-10T14:09:03.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.833+0000 7fd6ab59e700 1 -- 192.168.123.103:0/1664889108 shutdown_connections 2026-03-10T14:09:03.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.833+0000 7fd6ab59e700 1 --2- 192.168.123.103:0/1664889108 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6ac0ff4a0 0x7fd6ac0ff910 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:09:03.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.833+0000 7fd6ab59e700 1 --2- 192.168.123.103:0/1664889108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 0x7fd6ac0fef60 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.833+0000 7fd6ab59e700 1 -- 192.168.123.103:0/1664889108 >> 192.168.123.103:0/1664889108 conn(0x7fd6ac0fa760 msgr2=0x7fd6ac0fcbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:03.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.834+0000 7fd6ab59e700 1 -- 192.168.123.103:0/1664889108 shutdown_connections 2026-03-10T14:09:03.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.836+0000 7fd6ab59e700 1 -- 192.168.123.103:0/1664889108 wait complete. 2026-03-10T14:09:03.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.839+0000 7fd6ab59e700 1 Processor -- start 2026-03-10T14:09:03.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.839+0000 7fd6ab59e700 1 -- start start 2026-03-10T14:09:03.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.839+0000 7fd6ab59e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 0x7fd6ac071d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:03.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.839+0000 7fd6ab59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6ac0ff4a0 0x7fd6ac072240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:03.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.839+0000 7fd6ab59e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6ac072780 con 0x7fd6ac0feb50 2026-03-10T14:09:03.838 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.839+0000 7fd6ab59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6ac0728c0 con 0x7fd6ac0ff4a0 2026-03-10T14:09:03.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6aa59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 0x7fd6ac071d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:03.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6aa59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 0x7fd6ac071d00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60026/0 (socket says 192.168.123.103:60026) 2026-03-10T14:09:03.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6aa59c700 1 -- 192.168.123.103:0/244246534 learned_addr learned my addr 192.168.123.103:0/244246534 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:03.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6aa59c700 1 -- 192.168.123.103:0/244246534 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6ac0ff4a0 msgr2=0x7fd6ac072240 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:03.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6aa59c700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6ac0ff4a0 0x7fd6ac072240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6aa59c700 1 -- 192.168.123.103:0/244246534 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6940097e0 con 0x7fd6ac0feb50 2026-03-10T14:09:03.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6aa59c700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 0x7fd6ac071d00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fd69400b5c0 tx=0x7fd6940057f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:03.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6a37fe700 1 -- 192.168.123.103:0/244246534 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd69401d070 con 0x7fd6ac0feb50 2026-03-10T14:09:03.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.840+0000 7fd6a37fe700 1 -- 192.168.123.103:0/244246534 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd69400bc30 con 0x7fd6ac0feb50 2026-03-10T14:09:03.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.841+0000 7fd6a37fe700 1 -- 192.168.123.103:0/244246534 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd694022620 con 0x7fd6ac0feb50 2026-03-10T14:09:03.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.842+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6ac1a18d0 con 0x7fd6ac0feb50 2026-03-10T14:09:03.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.842+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6ac1a1cf0 con 0x7fd6ac0feb50 2026-03-10T14:09:03.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.842+0000 7fd6ab59e700 1 -- 
192.168.123.103:0/244246534 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd6ac04ea50 con 0x7fd6ac0feb50 2026-03-10T14:09:03.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.843+0000 7fd6a37fe700 1 -- 192.168.123.103:0/244246534 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 35) v1 ==== 100161+0+0 (secure 0 0 0) 0x7fd694022780 con 0x7fd6ac0feb50 2026-03-10T14:09:03.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.844+0000 7fd6a37fe700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd698077760 0x7fd698079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:03.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.844+0000 7fd6a9d9b700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd698077760 0x7fd698079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:03.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.844+0000 7fd6a9d9b700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd698077760 0x7fd698079c10 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fd6ac072df0 tx=0x7fd69c006d20 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:03.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.845+0000 7fd6a37fe700 1 -- 192.168.123.103:0/244246534 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd69400bda0 con 0x7fd6ac0feb50 2026-03-10T14:09:03.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.848+0000 7fd6a37fe700 1 -- 
192.168.123.103:0/244246534 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd694064670 con 0x7fd6ac0feb50 2026-03-10T14:09:03.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.979+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd6ac1088a0 con 0x7fd698077760 2026-03-10T14:09:03.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.981+0000 7fd6a37fe700 1 -- 192.168.123.103:0/244246534 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+337 (secure 0 0 0) 0x7fd6ac1088a0 con 0x7fd698077760 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd698077760 msgr2=0x7fd698079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd698077760 0x7fd698079c10 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fd6ac072df0 tx=0x7fd69c006d20 comp rx=0 tx=0).stop 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 msgr2=0x7fd6ac071d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fd6ac0feb50 0x7fd6ac071d00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fd69400b5c0 tx=0x7fd6940057f0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 shutdown_connections 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd698077760 0x7fd698079c10 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6ac0feb50 0x7fd6ac071d00 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 --2- 192.168.123.103:0/244246534 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd6ac0ff4a0 0x7fd6ac072240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 >> 192.168.123.103:0/244246534 conn(0x7fd6ac0fa760 msgr2=0x7fd6ac107180 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 shutdown_connections 2026-03-10T14:09:03.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:03.984+0000 7fd6ab59e700 1 -- 192.168.123.103:0/244246534 wait complete. 
2026-03-10T14:09:04.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.066+0000 7f74d923a700 1 -- 192.168.123.103:0/2141475375 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc0987d0 msgr2=0x7f74cc09abb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.066+0000 7f74d923a700 1 --2- 192.168.123.103:0/2141475375 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc0987d0 0x7f74cc09abb0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f74c0009ab0 tx=0x7f74c0009dc0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.066+0000 7f74d923a700 1 -- 192.168.123.103:0/2141475375 shutdown_connections 2026-03-10T14:09:04.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.066+0000 7f74d923a700 1 --2- 192.168.123.103:0/2141475375 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc0987d0 0x7f74cc09abb0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.066+0000 7f74d923a700 1 --2- 192.168.123.103:0/2141475375 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74cc095eb0 0x7f74cc098290 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.066+0000 7f74d923a700 1 -- 192.168.123.103:0/2141475375 >> 192.168.123.103:0/2141475375 conn(0x7f74cc08f8f0 msgr2=0x7f74cc091d00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:04.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.067+0000 7f74d923a700 1 -- 192.168.123.103:0/2141475375 shutdown_connections 2026-03-10T14:09:04.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.067+0000 7f74d923a700 1 -- 192.168.123.103:0/2141475375 
wait complete. 2026-03-10T14:09:04.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d923a700 1 Processor -- start 2026-03-10T14:09:04.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d923a700 1 -- start start 2026-03-10T14:09:04.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d923a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc095eb0 0x7f74cc0957d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d923a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74cc0987d0 0x7f74cc093e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d923a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74cc12e930 con 0x7f74cc095eb0 2026-03-10T14:09:04.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d923a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74cc12eaa0 con 0x7f74cc0987d0 2026-03-10T14:09:04.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc095eb0 0x7f74cc0957d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc095eb0 0x7f74cc0957d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:60046/0 (socket says 192.168.123.103:60046) 2026-03-10T14:09:04.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d3fff700 1 -- 192.168.123.103:0/3028740819 learned_addr learned my addr 192.168.123.103:0/3028740819 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:04.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d37fe700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74cc0987d0 0x7f74cc093e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d3fff700 1 -- 192.168.123.103:0/3028740819 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74cc0987d0 msgr2=0x7f74cc093e60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d3fff700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74cc0987d0 0x7f74cc093e60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.068+0000 7f74d3fff700 1 -- 192.168.123.103:0/3028740819 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f74c0009710 con 0x7f74cc095eb0 2026-03-10T14:09:04.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.069+0000 7f74d3fff700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc095eb0 0x7f74cc0957d0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f74c800da40 tx=0x7f74c800de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:09:04.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.069+0000 7f74d17fa700 1 -- 192.168.123.103:0/3028740819 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f74c80099a0 con 0x7f74cc095eb0 2026-03-10T14:09:04.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.069+0000 7f74d17fa700 1 -- 192.168.123.103:0/3028740819 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f74c800b6a0 con 0x7f74cc095eb0 2026-03-10T14:09:04.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.069+0000 7f74d17fa700 1 -- 192.168.123.103:0/3028740819 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f74c800f5d0 con 0x7f74cc095eb0 2026-03-10T14:09:04.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.069+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f74cc0943a0 con 0x7f74cc095eb0 2026-03-10T14:09:04.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.069+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f74cc0948f0 con 0x7f74cc095eb0 2026-03-10T14:09:04.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.071+0000 7f74d17fa700 1 -- 192.168.123.103:0/3028740819 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 35) v1 ==== 100161+0+0 (secure 0 0 0) 0x7f74c800f730 con 0x7f74cc095eb0 2026-03-10T14:09:04.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.072+0000 7f74d17fa700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f74c4077960 0x7f74c4079e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.071 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.072+0000 7f74d17fa700 1 -- 192.168.123.103:0/3028740819 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f74c809ab00 con 0x7f74cc095eb0 2026-03-10T14:09:04.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.072+0000 7f74d37fe700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f74c4077960 0x7f74c4079e10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.073+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f74cc124ee0 con 0x7f74cc095eb0 2026-03-10T14:09:04.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.073+0000 7f74d37fe700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f74c4077960 0x7f74c4079e10 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f74c0005e90 tx=0x7f74c0005e20 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:04.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.076+0000 7f74d17fa700 1 -- 192.168.123.103:0/3028740819 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f74c80631d0 con 0x7f74cc095eb0 2026-03-10T14:09:04.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.220+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f74cc002420 con 
0x7f74c4077960 2026-03-10T14:09:04.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.226+0000 7f74d17fa700 1 -- 192.168.123.103:0/3028740819 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f74cc002420 con 0x7f74c4077960 2026-03-10T14:09:04.225 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (5m) 1s ago 5m 22.0M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (3m) 1s ago 5m 8598k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (5m) 1s ago 5m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 1s ago 5m 7419k - 18.2.0 dc2bc1663786 57962aef7443 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (5m) 1s ago 5m 7407k - 18.2.0 dc2bc1663786 0918365fa827 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (4m) 1s ago 5m 90.9M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (3m) 1s ago 3m 157M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (3m) 1s ago 3m 101M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (3m) 1s ago 3m 88.5M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (3m) 1s ago 3m 172M - 18.2.0 dc2bc1663786 
e286b66e6f5a 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (44s) 1s ago 6m 590M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (16s) 1s ago 5m 304M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (9s) 1s ago 6m 42.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (5m) 1s ago 5m 46.3M 2048M 18.2.0 dc2bc1663786 4113774b34c7 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (5m) 1s ago 5m 15.0M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (5m) 1s ago 5m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 1s ago 4m 361M 4096M 18.2.0 dc2bc1663786 5a222b855ee3 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (4m) 1s ago 4m 360M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (4m) 1s ago 4m 292M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (4m) 1s ago 4m 430M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (4m) 1s ago 4m 390M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (4m) 1s ago 4m 395M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:09:04.226 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (19s) 1s ago 5m 51.3M - 
2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.230+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f74c4077960 msgr2=0x7f74c4079e10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.230+0000 7f74d923a700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f74c4077960 0x7f74c4079e10 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f74c0005e90 tx=0x7f74c0005e20 comp rx=0 tx=0).stop 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.230+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc095eb0 msgr2=0x7f74cc0957d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.230+0000 7f74d923a700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc095eb0 0x7f74cc0957d0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f74c800da40 tx=0x7f74c800de00 comp rx=0 tx=0).stop 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.231+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 shutdown_connections 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.231+0000 7f74d923a700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f74c4077960 0x7f74c4079e10 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.231+0000 7f74d923a700 1 --2- 192.168.123.103:0/3028740819 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74cc095eb0 0x7f74cc0957d0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.231+0000 7f74d923a700 1 --2- 192.168.123.103:0/3028740819 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f74cc0987d0 0x7f74cc093e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.231+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 >> 192.168.123.103:0/3028740819 conn(0x7f74cc08f8f0 msgr2=0x7f74cc091cd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:04.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.231+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 shutdown_connections 2026-03-10T14:09:04.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.232+0000 7f74d923a700 1 -- 192.168.123.103:0/3028740819 wait complete. 
2026-03-10T14:09:04.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.314+0000 7f22b702f700 1 -- 192.168.123.103:0/2052992688 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a42e0 msgr2=0x7f22a80a46f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.314+0000 7f22b702f700 1 --2- 192.168.123.103:0/2052992688 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a42e0 0x7f22a80a46f0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f22ac009b00 tx=0x7f22ac009e10 comp rx=0 tx=0).stop 2026-03-10T14:09:04.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.315+0000 7f22b702f700 1 -- 192.168.123.103:0/2052992688 shutdown_connections 2026-03-10T14:09:04.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.315+0000 7f22b702f700 1 --2- 192.168.123.103:0/2052992688 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f22a80a5420 0x7f22a80a5890 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.315+0000 7f22b702f700 1 --2- 192.168.123.103:0/2052992688 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a42e0 0x7f22a80a46f0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.315+0000 7f22b702f700 1 -- 192.168.123.103:0/2052992688 >> 192.168.123.103:0/2052992688 conn(0x7f22a809f7b0 msgr2=0x7f22a80a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:04.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.315+0000 7f22b702f700 1 -- 192.168.123.103:0/2052992688 shutdown_connections 2026-03-10T14:09:04.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.315+0000 7f22b702f700 1 -- 192.168.123.103:0/2052992688 
wait complete. 2026-03-10T14:09:04.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b702f700 1 Processor -- start 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b702f700 1 -- start start 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b702f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f22a80a42e0 0x7f22a80b3630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b702f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a5420 0x7f22a80b3b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b702f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22a814bf90 con 0x7f22a80a5420 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b702f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22a814c100 con 0x7f22a80a42e0 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b582c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a5420 0x7f22a80b3b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b582c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a5420 0x7f22a80b3b70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:60072/0 (socket says 192.168.123.103:60072) 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b582c700 1 -- 192.168.123.103:0/2327384346 learned_addr learned my addr 192.168.123.103:0/2327384346 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b582c700 1 -- 192.168.123.103:0/2327384346 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f22a80a42e0 msgr2=0x7f22a80b3630 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b582c700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f22a80a42e0 0x7f22a80b3630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.316+0000 7f22b582c700 1 -- 192.168.123.103:0/2327384346 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f22ac0097e0 con 0x7f22a80a5420 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.317+0000 7f22b582c700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a5420 0x7f22a80b3b70 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f22a000b700 tx=0x7f22a000bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:04.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.317+0000 7f22a77fe700 1 -- 192.168.123.103:0/2327384346 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f22a0010840 con 0x7f22a80a5420 2026-03-10T14:09:04.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.317+0000 
7f22b702f700 1 -- 192.168.123.103:0/2327384346 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f22a814c380 con 0x7f22a80a5420 2026-03-10T14:09:04.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.317+0000 7f22a77fe700 1 -- 192.168.123.103:0/2327384346 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f22a0010e80 con 0x7f22a80a5420 2026-03-10T14:09:04.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.317+0000 7f22b702f700 1 -- 192.168.123.103:0/2327384346 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f22a814c8d0 con 0x7f22a80a5420 2026-03-10T14:09:04.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.317+0000 7f22a77fe700 1 -- 192.168.123.103:0/2327384346 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f22a0010b50 con 0x7f22a80a5420 2026-03-10T14:09:04.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.319+0000 7f22b702f700 1 -- 192.168.123.103:0/2327384346 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f22a8004f40 con 0x7f22a80a5420 2026-03-10T14:09:04.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.319+0000 7f22a77fe700 1 -- 192.168.123.103:0/2327384346 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 35) v1 ==== 100161+0+0 (secure 0 0 0) 0x7f22a000f3e0 con 0x7f22a80a5420 2026-03-10T14:09:04.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.320+0000 7f22a77fe700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f229c077700 0x7f229c079bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.320+0000 7f22a77fe700 1 -- 
192.168.123.103:0/2327384346 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f22a00991b0 con 0x7f22a80a5420 2026-03-10T14:09:04.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.320+0000 7f22b602d700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f229c077700 0x7f229c079bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.320+0000 7f22b602d700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f229c077700 0x7f229c079bb0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f22ac00b5c0 tx=0x7f22ac009f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:04.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.323+0000 7f22a77fe700 1 -- 192.168.123.103:0/2327384346 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f22a00618c0 con 0x7f22a80a5420 2026-03-10T14:09:04.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.516+0000 7f22b702f700 1 -- 192.168.123.103:0/2327384346 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f22a81511f0 con 0x7f22a80a5420 2026-03-10T14:09:04.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.517+0000 7f22a77fe700 1 -- 192.168.123.103:0/2327384346 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+785 (secure 0 0 0) 0x7f22a0061010 con 0x7f22a80a5420 2026-03-10T14:09:04.516 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:09:04.517 
INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 1, 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 11, 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:09:04.517 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:09:04.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 -- 
192.168.123.103:0/2327384346 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f229c077700 msgr2=0x7f229c079bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f229c077700 0x7f229c079bb0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f22ac00b5c0 tx=0x7f22ac009f90 comp rx=0 tx=0).stop 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 -- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a5420 msgr2=0x7f22a80b3b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a5420 0x7f22a80b3b70 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f22a000b700 tx=0x7f22a000bac0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 -- 192.168.123.103:0/2327384346 shutdown_connections 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f229c077700 0x7f229c079bb0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f22a80a42e0 0x7f22a80b3630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 --2- 192.168.123.103:0/2327384346 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f22a80a5420 0x7f22a80b3b70 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 -- 192.168.123.103:0/2327384346 >> 192.168.123.103:0/2327384346 conn(0x7f22a809f7b0 msgr2=0x7f22a80a8650 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 -- 192.168.123.103:0/2327384346 shutdown_connections 2026-03-10T14:09:04.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.521+0000 7f22b702f700 1 -- 192.168.123.103:0/2327384346 wait complete. 2026-03-10T14:09:04.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.598+0000 7ff915df5700 1 -- 192.168.123.103:0/974492533 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910107d50 msgr2=0x7ff9101081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.598+0000 7ff915df5700 1 --2- 192.168.123.103:0/974492533 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910107d50 0x7ff9101081c0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7ff900009b00 tx=0x7ff900009e10 comp rx=0 tx=0).stop 2026-03-10T14:09:04.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 -- 192.168.123.103:0/974492533 shutdown_connections 2026-03-10T14:09:04.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 --2- 192.168.123.103:0/974492533 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910107d50 0x7ff9101081c0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T14:09:04.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 --2- 192.168.123.103:0/974492533 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff910071db0 0x7ff9100721c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 -- 192.168.123.103:0/974492533 >> 192.168.123.103:0/974492533 conn(0x7ff91006d3e0 msgr2=0x7ff91006f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:04.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 -- 192.168.123.103:0/974492533 shutdown_connections 2026-03-10T14:09:04.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 -- 192.168.123.103:0/974492533 wait complete. 2026-03-10T14:09:04.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 Processor -- start 2026-03-10T14:09:04.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 -- start start 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910071db0 0x7ff9101169f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff910107d50 0x7ff910116f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff910117550 con 0x7ff910071db0 2026-03-10T14:09:04.599 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.599+0000 7ff915df5700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9101b29c0 con 0x7ff910107d50 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.600+0000 7ff90f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910071db0 0x7ff9101169f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.600+0000 7ff90f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910071db0 0x7ff9101169f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60100/0 (socket says 192.168.123.103:60100) 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.600+0000 7ff90f7fe700 1 -- 192.168.123.103:0/4059371198 learned_addr learned my addr 192.168.123.103:0/4059371198 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.600+0000 7ff90f7fe700 1 -- 192.168.123.103:0/4059371198 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff910107d50 msgr2=0x7ff910116f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.600+0000 7ff90f7fe700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff910107d50 0x7ff910116f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.600+0000 7ff90f7fe700 1 -- 192.168.123.103:0/4059371198 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9000097e0 con 0x7ff910071db0 2026-03-10T14:09:04.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.601+0000 7ff90f7fe700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910071db0 0x7ff9101169f0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7ff8f800b700 tx=0x7ff8f800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:04.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.601+0000 7ff90cff9700 1 -- 192.168.123.103:0/4059371198 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8f8010840 con 0x7ff910071db0 2026-03-10T14:09:04.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.601+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff9101b2bc0 con 0x7ff910071db0 2026-03-10T14:09:04.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.601+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff9101b30c0 con 0x7ff910071db0 2026-03-10T14:09:04.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.601+0000 7ff90cff9700 1 -- 192.168.123.103:0/4059371198 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff8f8010e80 con 0x7ff910071db0 2026-03-10T14:09:04.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.601+0000 7ff90cff9700 1 -- 192.168.123.103:0/4059371198 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8f8019600 con 0x7ff910071db0 2026-03-10T14:09:04.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.603+0000 7ff90cff9700 1 -- 
192.168.123.103:0/4059371198 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 35) v1 ==== 100161+0+0 (secure 0 0 0) 0x7ff8f8019760 con 0x7ff910071db0 2026-03-10T14:09:04.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.603+0000 7ff90cff9700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff8fc0779b0 0x7ff8fc079e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.603+0000 7ff90cff9700 1 -- 192.168.123.103:0/4059371198 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ff8f809a7b0 con 0x7ff910071db0 2026-03-10T14:09:04.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.604+0000 7ff90effd700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff8fc0779b0 0x7ff8fc079e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.604+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff8f0005320 con 0x7ff910071db0 2026-03-10T14:09:04.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.607+0000 7ff90effd700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff8fc0779b0 0x7ff8fc079e60 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7ff900009ad0 tx=0x7ff900009f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:04.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.607+0000 7ff90cff9700 1 -- 
192.168.123.103:0/4059371198 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff8f8062e00 con 0x7ff910071db0 2026-03-10T14:09:04.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.673+0000 7ff90cff9700 1 -- 192.168.123.103:0/4059371198 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ff8f805f070 con 0x7ff910071db0 2026-03-10T14:09:04.757 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:04 vm04.local ceph-mon[55966]: from='client.34130 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:04.757 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:04 vm04.local ceph-mon[55966]: from='client.34134 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:04.757 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:04 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/2327384346' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:04.768 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:04 vm03.local ceph-mon[103098]: from='client.34130 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:04.769 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:04 vm03.local ceph-mon[103098]: from='client.34134 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:04.769 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:04 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/2327384346' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:04.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.769+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7ff8f0005cc0 con 0x7ff910071db0 2026-03-10T14:09:04.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.772+0000 7ff90cff9700 1 -- 192.168.123.103:0/4059371198 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1973 (secure 0 0 0) 0x7ff8f80753c0 con 0x7ff910071db0 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:e14 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:09:04.775 
INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:balancer 
2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff8fc0779b0 msgr2=0x7ff8fc079e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff8fc0779b0 0x7ff8fc079e60 secure :-1 s=READY 
pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7ff900009ad0 tx=0x7ff900009f90 comp rx=0 tx=0).stop 2026-03-10T14:09:04.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910071db0 msgr2=0x7ff9101169f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910071db0 0x7ff9101169f0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7ff8f800b700 tx=0x7ff8f800bac0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 shutdown_connections 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff8fc0779b0 0x7ff8fc079e60 secure :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7ff900009ad0 tx=0x7ff900009f90 comp rx=0 tx=0).stop 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff910071db0 0x7ff9101169f0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 --2- 192.168.123.103:0/4059371198 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff910107d50 0x7ff910116f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 -- 
192.168.123.103:0/4059371198 >> 192.168.123.103:0/4059371198 conn(0x7ff91006d3e0 msgr2=0x7ff91010af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 shutdown_connections 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.776+0000 7ff915df5700 1 -- 192.168.123.103:0/4059371198 wait complete. 2026-03-10T14:09:04.776 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:09:04.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.855+0000 7f0efbd40700 1 -- 192.168.123.103:0/76994492 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4102760 msgr2=0x7f0ef4102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.855+0000 7f0efbd40700 1 --2- 192.168.123.103:0/76994492 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4102760 0x7f0ef4102b70 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f0ee4009b00 tx=0x7f0ee4009e10 comp rx=0 tx=0).stop 2026-03-10T14:09:04.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.855+0000 7f0efbd40700 1 -- 192.168.123.103:0/76994492 shutdown_connections 2026-03-10T14:09:04.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.855+0000 7f0efbd40700 1 --2- 192.168.123.103:0/76994492 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ef4103960 0x7f0ef4103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.855+0000 7f0efbd40700 1 --2- 192.168.123.103:0/76994492 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4102760 0x7f0ef4102b70 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:09:04.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.855+0000 7f0efbd40700 1 -- 192.168.123.103:0/76994492 >> 192.168.123.103:0/76994492 conn(0x7f0ef40fdcf0 msgr2=0x7f0ef4100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:04.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.855+0000 7f0efbd40700 1 -- 192.168.123.103:0/76994492 shutdown_connections 2026-03-10T14:09:04.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.855+0000 7f0efbd40700 1 -- 192.168.123.103:0/76994492 wait complete. 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.856+0000 7f0efbd40700 1 Processor -- start 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.856+0000 7f0efbd40700 1 -- start start 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.856+0000 7f0efbd40700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ef4102760 0x7f0ef41980c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.856+0000 7f0efbd40700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4103960 0x7f0ef4198600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.856+0000 7f0efbd40700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ef4198c20 con 0x7f0ef4103960 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.856+0000 7f0efbd40700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ef4198d60 con 0x7f0ef4102760 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.857+0000 7f0ef92db700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4103960 0x7f0ef4198600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.857+0000 7f0ef92db700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4103960 0x7f0ef4198600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60128/0 (socket says 192.168.123.103:60128) 2026-03-10T14:09:04.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.857+0000 7f0ef92db700 1 -- 192.168.123.103:0/813573519 learned_addr learned my addr 192.168.123.103:0/813573519 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:04.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.857+0000 7f0ef92db700 1 -- 192.168.123.103:0/813573519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ef4102760 msgr2=0x7f0ef41980c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:04.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.857+0000 7f0ef92db700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ef4102760 0x7f0ef41980c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:04.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.857+0000 7f0ef92db700 1 -- 192.168.123.103:0/813573519 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0ee40097e0 con 0x7f0ef4103960 2026-03-10T14:09:04.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.857+0000 7f0ef92db700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4103960 
0x7f0ef4198600 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f0ef000ba70 tx=0x7f0ef000be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:04.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.858+0000 7f0eeaffd700 1 -- 192.168.123.103:0/813573519 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ef000c780 con 0x7f0ef4103960 2026-03-10T14:09:04.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.858+0000 7f0eeaffd700 1 -- 192.168.123.103:0/813573519 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0ef000cdc0 con 0x7f0ef4103960 2026-03-10T14:09:04.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.858+0000 7f0eeaffd700 1 -- 192.168.123.103:0/813573519 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ef0012550 con 0x7f0ef4103960 2026-03-10T14:09:04.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.858+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ef419d810 con 0x7f0ef4103960 2026-03-10T14:09:04.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.858+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ef40754f0 con 0x7f0ef4103960 2026-03-10T14:09:04.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.859+0000 7f0eeaffd700 1 -- 192.168.123.103:0/813573519 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f0ef00126b0 con 0x7f0ef4103960 2026-03-10T14:09:04.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.859+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ef4066e40 con 0x7f0ef4103960 2026-03-10T14:09:04.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.859+0000 7f0eeaffd700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0ee00779a0 0x7f0ee0079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:04.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.859+0000 7f0eeaffd700 1 -- 192.168.123.103:0/813573519 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f0ef0098dc0 con 0x7f0ef4103960 2026-03-10T14:09:04.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.862+0000 7f0ef9adc700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0ee00779a0 0x7f0ee0079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:04.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.862+0000 7f0ef9adc700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0ee00779a0 0x7f0ee0079e50 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f0ee400b5c0 tx=0x7f0ee4009f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:04.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:04.862+0000 7f0eeaffd700 1 -- 192.168.123.103:0/813573519 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0ef009c030 con 0x7f0ef4103960 2026-03-10T14:09:05.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.034+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 --> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0ef41082b0 con 0x7f0ee00779a0 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.038+0000 7f0eeaffd700 1 -- 192.168.123.103:0/813573519 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+337 (secure 0 0 0) 0x7f0ef41082b0 con 0x7f0ee00779a0 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "3/23 daemons upgraded", 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: "message": "", 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:09:05.037 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:09:05.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.041+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0ee00779a0 msgr2=0x7f0ee0079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:05.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.041+0000 7f0efbd40700 1 --2- 192.168.123.103:0/813573519 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0ee00779a0 0x7f0ee0079e50 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f0ee400b5c0 tx=0x7f0ee4009f90 comp rx=0 tx=0).stop 2026-03-10T14:09:05.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.041+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4103960 msgr2=0x7f0ef4198600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:05.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.041+0000 7f0efbd40700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4103960 0x7f0ef4198600 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f0ef000ba70 tx=0x7f0ef000be30 comp rx=0 tx=0).stop 2026-03-10T14:09:05.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.042+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 shutdown_connections 2026-03-10T14:09:05.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.042+0000 7f0efbd40700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0ee00779a0 0x7f0ee0079e50 secure :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f0ee400b5c0 tx=0x7f0ee4009f90 comp rx=0 tx=0).stop 2026-03-10T14:09:05.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.042+0000 7f0efbd40700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0ef4102760 0x7f0ef41980c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:05.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.042+0000 7f0efbd40700 1 --2- 192.168.123.103:0/813573519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ef4103960 0x7f0ef4198600 secure :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f0ef000ba70 
tx=0x7f0ef000be30 comp rx=0 tx=0).stop 2026-03-10T14:09:05.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.042+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 >> 192.168.123.103:0/813573519 conn(0x7f0ef40fdcf0 msgr2=0x7f0ef4106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:05.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.044+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 shutdown_connections 2026-03-10T14:09:05.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.044+0000 7f0efbd40700 1 -- 192.168.123.103:0/813573519 wait complete. 2026-03-10T14:09:05.129 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.130+0000 7fbd3359e700 1 -- 192.168.123.103:0/1016079394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34072330 msgr2=0x7fbd340770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.130+0000 7fbd3359e700 1 --2- 192.168.123.103:0/1016079394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34072330 0x7fbd340770b0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7fbd2c00d3f0 tx=0x7fbd2c00d700 comp rx=0 tx=0).stop 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.130+0000 7fbd3359e700 1 -- 192.168.123.103:0/1016079394 shutdown_connections 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.130+0000 7fbd3359e700 1 --2- 192.168.123.103:0/1016079394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34072330 0x7fbd340770b0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.130+0000 7fbd3359e700 1 --2- 192.168.123.103:0/1016079394 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd34071950 0x7fbd34071d60 unknown :-1 s=CLOSED pgs=0 cs=0 
l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.130+0000 7fbd3359e700 1 -- 192.168.123.103:0/1016079394 >> 192.168.123.103:0/1016079394 conn(0x7fbd3406d1a0 msgr2=0x7fbd3406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.130+0000 7fbd3359e700 1 -- 192.168.123.103:0/1016079394 shutdown_connections 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.130+0000 7fbd3359e700 1 -- 192.168.123.103:0/1016079394 wait complete. 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.131+0000 7fbd3359e700 1 Processor -- start 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.131+0000 7fbd3359e700 1 -- start start 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.131+0000 7fbd3359e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34071950 0x7fbd34082510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.131+0000 7fbd3359e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd34082a50 0x7fbd34082ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.131+0000 7fbd3359e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd3412dd80 con 0x7fbd34071950 2026-03-10T14:09:05.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.131+0000 7fbd3359e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd3412def0 con 0x7fbd34082a50 2026-03-10T14:09:05.131 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.132+0000 7fbd3259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34071950 0x7fbd34082510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:05.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.132+0000 7fbd31d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd34082a50 0x7fbd34082ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:05.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.132+0000 7fbd31d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd34082a50 0x7fbd34082ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:34242/0 (socket says 192.168.123.103:34242) 2026-03-10T14:09:05.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.132+0000 7fbd31d9b700 1 -- 192.168.123.103:0/741895863 learned_addr learned my addr 192.168.123.103:0/741895863 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:05.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.132+0000 7fbd3259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34071950 0x7fbd34082510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60142/0 (socket says 192.168.123.103:60142) 2026-03-10T14:09:05.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.133+0000 7fbd3259c700 1 -- 192.168.123.103:0/741895863 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd34082a50 msgr2=0x7fbd34082ec0 unknown :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:05.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.133+0000 7fbd3259c700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd34082a50 0x7fbd34082ec0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:05.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.133+0000 7fbd3259c700 1 -- 192.168.123.103:0/741895863 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbd2c007ed0 con 0x7fbd34071950 2026-03-10T14:09:05.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.133+0000 7fbd3259c700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34071950 0x7fbd34082510 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fbd2400b700 tx=0x7fbd2400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:05.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.134+0000 7fbd237fe700 1 -- 192.168.123.103:0/741895863 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd24010820 con 0x7fbd34071950 2026-03-10T14:09:05.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.134+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbd3412e110 con 0x7fbd34071950 2026-03-10T14:09:05.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.134+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbd3412e660 con 0x7fbd34071950 2026-03-10T14:09:05.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.135+0000 7fbd237fe700 1 -- 192.168.123.103:0/741895863 
<== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbd24010e60 con 0x7fbd34071950 2026-03-10T14:09:05.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.135+0000 7fbd237fe700 1 -- 192.168.123.103:0/741895863 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd24017570 con 0x7fbd34071950 2026-03-10T14:09:05.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.137+0000 7fbd237fe700 1 -- 192.168.123.103:0/741895863 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fbd24010980 con 0x7fbd34071950 2026-03-10T14:09:05.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.137+0000 7fbd237fe700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbd1c077a40 0x7fbd1c079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:05.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.137+0000 7fbd237fe700 1 -- 192.168.123.103:0/741895863 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fbd24099510 con 0x7fbd34071950 2026-03-10T14:09:05.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.138+0000 7fbd31d9b700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbd1c077a40 0x7fbd1c079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:05.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.138+0000 7fbd31d9b700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbd1c077a40 0x7fbd1c079ef0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fbd2c000f80 tx=0x7fbd2c00db00 comp rx=0 
tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:05.139 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.139+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbd14005320 con 0x7fbd34071950 2026-03-10T14:09:05.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.144+0000 7fbd237fe700 1 -- 192.168.123.103:0/741895863 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbd24061b20 con 0x7fbd34071950 2026-03-10T14:09:05.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.355+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fbd14005190 con 0x7fbd34071950 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.356+0000 7fbd237fe700 1 -- 192.168.123.103:0/741895863 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fbd2401e020 con 0x7fbd34071950 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_OK 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.359+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbd1c077a40 msgr2=0x7fbd1c079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.359+0000 7fbd3359e700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbd1c077a40 0x7fbd1c079ef0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto 
rx=0x7fbd2c000f80 tx=0x7fbd2c00db00 comp rx=0 tx=0).stop 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.359+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34071950 msgr2=0x7fbd34082510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.359+0000 7fbd3359e700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34071950 0x7fbd34082510 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fbd2400b700 tx=0x7fbd2400bac0 comp rx=0 tx=0).stop 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.360+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 shutdown_connections 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.360+0000 7fbd3359e700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbd1c077a40 0x7fbd1c079ef0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.360+0000 7fbd3359e700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd34071950 0x7fbd34082510 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.360+0000 7fbd3359e700 1 --2- 192.168.123.103:0/741895863 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbd34082a50 0x7fbd34082ec0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.360+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 >> 192.168.123.103:0/741895863 
conn(0x7fbd3406d1a0 msgr2=0x7fbd34076380 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.360+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 shutdown_connections 2026-03-10T14:09:05.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:05.360+0000 7fbd3359e700 1 -- 192.168.123.103:0/741895863 wait complete. 2026-03-10T14:09:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='client.34138 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: pgmap v5: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: mgrmap e36: vm03.rwbbep(active, since 4s), standbys: vm04.ywwcto 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/4059371198' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: Updating vm03:/etc/ceph/ceph.conf 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='client.34150 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: Standby manager daemon vm04.ywwcto restarted 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local 
ceph-mon[103098]: Standby manager daemon vm04.ywwcto started 2026-03-10T14:09:06.386 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.? 192.168.123.104:0/426369740' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/crt"}]: dispatch 2026-03-10T14:09:06.387 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.? 192.168.123.104:0/426369740' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:09:06.387 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.? 192.168.123.104:0/426369740' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/key"}]: dispatch 2026-03-10T14:09:06.387 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='mgr.? 192.168.123.104:0/426369740' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T14:09:06.387 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:06 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/741895863' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='client.34138 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: pgmap v5: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: mgrmap e36: vm03.rwbbep(active, since 4s), standbys: vm04.ywwcto 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/4059371198' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: Updating vm03:/etc/ceph/ceph.conf 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: Updating vm04:/etc/ceph/ceph.conf 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='client.34150 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: Standby manager daemon vm04.ywwcto restarted 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: Standby manager daemon vm04.ywwcto started 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.104:0/426369740' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/crt"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.104:0/426369740' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.? 192.168.123.104:0/426369740' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm04.ywwcto/key"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='mgr.? 
192.168.123.104:0/426369740' entity='mgr.vm04.ywwcto' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T14:09:06.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:06 vm04.local ceph-mon[55966]: from='client.? 192.168.123.103:0/741895863' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: Updating vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.270 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: mgrmap e37: vm03.rwbbep(active, since 6s), standbys: vm04.ywwcto 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: pgmap v6: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T14:09:07.271 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-mon[55966]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.conf 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: Updating vm04:/etc/ceph/ceph.client.admin.keyring 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: Updating vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: Updating vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/config/ceph.client.admin.keyring 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: mgrmap e37: vm03.rwbbep(active, since 6s), standbys: vm04.ywwcto 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: pgmap v6: 65 pgs: 65 active+clean; 2.0 GiB data, 8.2 GiB used, 112 GiB / 120 GiB avail 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local 
ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T14:09:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:08.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local systemd[1]: Stopping Ceph mon.vm04 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:09:08.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04[55962]: 2026-03-10T14:09:07.925+0000 7f3bd31b6700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm04 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:09:08.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:07 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04[55962]: 2026-03-10T14:09:07.925+0000 7f3bd31b6700 -1 mon.vm04@1(peon) e2 *** Got Signal Terminated *** 2026-03-10T14:09:08.364 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local podman[91963]: 2026-03-10 14:09:08.104358594 +0000 UTC m=+0.212667424 container died 4113774b34c7ef7776a2e7615fe1d8ffedfbfaab7a2ef4d4620d5fe0fab8e0ed (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04, ceph=True, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, 
GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.29.1, org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, RELEASE=HEAD) 2026-03-10T14:09:08.364 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local podman[91963]: 2026-03-10 14:09:08.137612755 +0000 UTC m=+0.245921585 container remove 4113774b34c7ef7776a2e7615fe1d8ffedfbfaab7a2ef4d4620d5fe0fab8e0ed (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04, org.label-schema.license=GPLv2, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, ceph=True, org.label-schema.schema-version=1.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux ) 2026-03-10T14:09:08.364 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local bash[91963]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04 2026-03-10T14:09:08.364 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm04.service: Deactivated successfully. 2026-03-10T14:09:08.364 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local systemd[1]: Stopped Ceph mon.vm04 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:09:08.364 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm04.service: Consumed 3.669s CPU time. 2026-03-10T14:09:08.793 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local systemd[1]: Starting Ceph mon.vm04 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:09:08.793 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local podman[92070]: 2026-03-10 14:09:08.683206291 +0000 UTC m=+0.041149651 container create 111e2285827974b1bf30ba12303f7bca9e3c39a4934cc50785de31c8619c29f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) 2026-03-10T14:09:08.793 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local podman[92070]: 2026-03-10 14:09:08.747561761 +0000 UTC m=+0.105505132 container init 111e2285827974b1bf30ba12303f7bca9e3c39a4934cc50785de31c8619c29f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0) 2026-03-10T14:09:08.793 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local podman[92070]: 2026-03-10 14:09:08.750510071 +0000 UTC m=+0.108453431 container start 111e2285827974b1bf30ba12303f7bca9e3c39a4934cc50785de31c8619c29f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid) 2026-03-10T14:09:08.793 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local bash[92070]: 111e2285827974b1bf30ba12303f7bca9e3c39a4934cc50785de31c8619c29f0 2026-03-10T14:09:08.793 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local podman[92070]: 2026-03-10 14:09:08.657036433 +0000 UTC m=+0.014979802 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:09:08.793 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local systemd[1]: Started Ceph mon.vm04 for b81bf660-1c89-11f1-b612-27d302cdb124. 
2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: pidfile_write: ignore empty --pid-file 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: load: jerasure load: lrc 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: RocksDB version: 7.9.2 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Git sha 0 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: DB SUMMARY 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: DB Session ID: BY7IZUMFV9RJVQV46PL3 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: CURRENT file: CURRENT 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: MANIFEST file: MANIFEST-000010 size: 668 Bytes 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm04/store.db dir, Total Num: 
1, files: 000018.sst 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm04/store.db: 000016.log size: 6530908 ; 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.error_if_exists: 0 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.create_if_missing: 0 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.paranoid_checks: 1 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.env: 0x556e31ab7dc0 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.info_log: 0x556e338fd900 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.statistics: (nil) 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local 
ceph-mon[92084]: rocksdb: Options.use_fsync: 0 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_log_file_size: 0 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T14:09:09.065 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.allow_fallocate: 1 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.use_direct_reads: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.db_log_dir: 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: 
Options.wal_dir: 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.write_buffer_manager: 0x556e33901900 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local 
ceph-mon[92084]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.unordered_write: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.row_cache: None 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.wal_filter: None 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 
vm04.local ceph-mon[92084]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.two_write_queues: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.wal_compression: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.atomic_flush: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.log_readahead_size: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 
vm04.local ceph-mon[92084]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_background_jobs: 2 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_background_compactions: -1 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_subcompactions: 1 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T14:09:09.066 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T14:09:09.066 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_open_files: -1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_background_flushes: -1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Compression algorithms supported: 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: kZSTD supported: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: kXpressCompression supported: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: kBZip2Compression supported: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: kLZ4Compression supported: 1 2026-03-10T14:09:09.067 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: kZlibCompression supported: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: kSnappyCompression supported: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm04/store.db/MANIFEST-000010 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.merge_operator: 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_filter: None 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.sst_partitioner_factory: None 2026-03-10T14:09:09.067 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556e338fd580) 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: cache_index_and_filter_blocks: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: pin_top_level_index_and_filter: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_type: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: data_block_index_type: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_shortening: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: checksum: 4 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: no_block_cache: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache: 0x556e339209b0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_name: BinnedLRUCache 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_options: 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: capacity : 536870912 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: num_shard_bits : 4 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 
strict_capacity_limit : 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: high_pri_pool_ratio: 0.000 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_cache_compressed: (nil) 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: persistent_cache: (nil) 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_size: 4096 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_size_deviation: 10 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_restart_interval: 16 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: index_block_restart_interval: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: metadata_block_size: 4096 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: partition_filters: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: use_delta_encoding: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: filter_policy: bloomfilter 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: whole_key_filtering: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: verify_compression: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: read_amp_bytes_per_bit: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: format_version: 5 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: enable_index_compression: 1 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: block_align: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_auto_readahead_size: 262144 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: prepopulate_block_cache: 0 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: initial_auto_readahead_size: 8192 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T14:09:09.067 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression: NoCompression 2026-03-10T14:09:09.067 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.num_levels: 7 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local 
ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T14:09:09.068 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T14:09:09.068 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 
2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T14:09:09.068 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.inplace_update_support: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.bloom_locality: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.max_successive_merges: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T14:09:09.068 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T14:09:09.069 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.ttl: 2592000 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.enable_blob_files: false 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.min_blob_size: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: 
Options.blob_compaction_readahead_size: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm04/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 20, last_sequence is 8139, log_number is 16,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 16 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 16 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0a019e39-2c77-41b0-964b-de0a9f352abd 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151748804854, "job": 1, "event": "recovery_started", "wal_files": [16]} 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #16 mode 2 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151748824948, "cf_name": "default", "job": 1, 
"event": "table_file_creation", "file_number": 21, "file_size": 3926028, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8144, "largest_seqno": 9262, "table_properties": {"data_size": 3920006, "index_size": 3696, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 12028, "raw_average_key_size": 23, "raw_value_size": 3908762, "raw_average_value_size": 7740, "num_data_blocks": 175, "num_entries": 505, "num_filter_entries": 505, "num_deletions": 2, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773151748, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0a019e39-2c77-41b0-964b-de0a9f352abd", "db_session_id": "BY7IZUMFV9RJVQV46PL3", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773151748825055, "job": 1, "event": "recovery_finished"} 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/version_set.cc:5047] Creating manifest 23 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. 
max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm04/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x556e33922e00 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: DB pointer 0x556e33932000 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** DB Stats ** 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T14:09:09.069 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** Compaction Stats [default] ** 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: L0 1/0 3.74 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 273.3 0.01 0.00 1 0.014 0 0 0.0 0.0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: L6 1/0 6.42 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Sum 2/0 10.17 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 273.3 0.01 0.00 1 0.014 0 0 0.0 0.0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 273.3 0.01 0.00 1 0.014 0 0 0.0 0.0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** Compaction Stats [default] ** 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 273.3 0.01 0.00 1 0.014 0 0 0.0 0.0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Flush(GB): cumulative 0.004, interval 0.004 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Cumulative compaction: 0.00 GB write, 133.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Interval compaction: 0.00 GB write, 133.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T14:09:09.069 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: Block cache BinnedLRUCache@0x556e339209b0#2 capacity: 512.00 MB usage: 5.48 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.8e-05 secs_since: 0 2026-03-10T14:09:09.069 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,1.41 KB,0.000268221%) IndexBlock(1,4.08 KB,0.000777841%) Misc(1,0.00 KB,0%) 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: starting mon.vm04 rank 1 at public addrs [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] at bind addrs [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon_data /var/lib/ceph/mon/ceph-vm04 fsid b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: mon.vm04@-1(???) 
e2 preinit fsid b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: mon.vm04@-1(???).mds e14 new map 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: mon.vm04@-1(???).mds e14 print_map 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: e14 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: legacy client fscid: 1 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: Filesystem 'cephfs' (1) 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: fs_name cephfs 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: epoch 14 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: modified 2026-03-10T14:07:48.854532+0000 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: tableserver 0 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: root 0 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: session_timeout 60 2026-03-10T14:09:09.070 
INFO:journalctl@ceph.mon.vm04.vm04.stdout: session_autoclose 300 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_file_size 1099511627776 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_xattr_size 65536 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: required_client_features {} 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: last_failure 0 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: last_failure_osd_epoch 0 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: max_mds 2 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: in 0,1 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: up {0=24263,1=14470} 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: failed 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: damaged 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: stopped 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: data_pools [3] 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: metadata_pool 2 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: inline_data disabled 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: balancer 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: bal_rank_mask -1 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: standby_count_wanted 1 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: qdb_cluster leader: 0 members: 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 
[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: [mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: [mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: [mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout: 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: mon.vm04@-1(???).osd e43 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: mon.vm04@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: mon.vm04@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local ceph-mon[92084]: mon.vm04@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T14:09:09.070 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:08 vm04.local 
ceph-mon[92084]: mon.vm04@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-10T14:09:10.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:10 vm04.local ceph-mon[92084]: Upgrade: Updating mon.vm04 2026-03-10T14:09:10.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:10 vm04.local ceph-mon[92084]: Deploying daemon mon.vm04 on vm04 2026-03-10T14:09:10.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:10 vm03.local ceph-mon[103098]: Upgrade: Updating mon.vm04 2026-03-10T14:09:10.863 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:10 vm03.local ceph-mon[103098]: Deploying daemon mon.vm04 on vm04 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: pgmap v8: 65 pgs: 65 active+clean; 1.7 GiB data, 7.1 GiB used, 113 GiB / 120 GiB avail; 275 KiB/s rd, 227 KiB/s wr, 51 op/s 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: mon.vm03 calling monitor election 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: mon.vm04 calling monitor election 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: mon.vm03 is new leader, mons vm03,vm04 in quorum (ranks 0,1) 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: monmap epoch 3 2026-03-10T14:09:11.703 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: fsid b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: last_changed 2026-03-10T14:09:10.552336+0000 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: created 2026-03-10T14:02:36.685064+0000 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: min_mon_release 19 (squid) 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: election_strategy: 1 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: 1: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.vm04 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: mgrmap e37: vm03.rwbbep(active, since 10s), standbys: vm04.ywwcto 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: overall HEALTH_OK 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: 
from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:11.703 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: pgmap v8: 65 pgs: 65 active+clean; 1.7 GiB data, 7.1 GiB used, 113 GiB / 120 GiB avail; 275 KiB/s rd, 227 KiB/s wr, 51 op/s 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mon metadata", "id": "vm04"}]: dispatch 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: mon.vm03 calling monitor election 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: mon.vm04 calling monitor election 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: mon.vm03 is new leader, mons vm03,vm04 in quorum (ranks 0,1) 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: monmap epoch 3 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: fsid b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: last_changed 2026-03-10T14:09:10.552336+0000 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:09:11 vm03.local ceph-mon[103098]: created 2026-03-10T14:02:36.685064+0000 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: min_mon_release 19 (squid) 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: election_strategy: 1 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: 1: [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] mon.vm04 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: mgrmap e37: vm03.rwbbep(active, since 10s), standbys: vm04.ywwcto 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: overall HEALTH_OK 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:14.108 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:13 vm03.local ceph-mon[103098]: pgmap v9: 65 pgs: 65 active+clean; 1.4 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 639 KiB/s rd, 605 KiB/s wr, 94 op/s 2026-03-10T14:09:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:14.180 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:13 vm04.local ceph-mon[92084]: pgmap v9: 65 pgs: 65 active+clean; 1.4 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 639 KiB/s rd, 605 KiB/s wr, 94 op/s 2026-03-10T14:09:14.180 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:14.180 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:15.225 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:14 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:15.225 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:14 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:15.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:14 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:15.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:14 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local 
ceph-mon[92084]: pgmap v10: 65 pgs: 65 active+clean; 1.4 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 584 KiB/s rd, 553 KiB/s wr, 86 op/s 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:16.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:15 
vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: pgmap v10: 65 pgs: 65 active+clean; 1.4 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 584 KiB/s rd, 553 KiB/s wr, 86 op/s 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T14:09:16.109 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:16.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:09:18.249 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: Reconfiguring mon.vm03 (monmap changed)... 2026-03-10T14:09:18.249 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: Reconfiguring daemon mon.vm03 on vm03 2026-03-10T14:09:18.249 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:18.250 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:18.250 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: Reconfiguring mgr.vm03.rwbbep (monmap changed)... 
2026-03-10T14:09:18.250 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.rwbbep", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:09:18.250 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T14:09:18.250 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:18.250 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:17 vm03.local ceph-mon[103098]: Reconfiguring daemon mgr.vm03.rwbbep on vm03 2026-03-10T14:09:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: Reconfiguring mon.vm03 (monmap changed)... 2026-03-10T14:09:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: Reconfiguring daemon mon.vm03 on vm03 2026-03-10T14:09:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: Reconfiguring mgr.vm03.rwbbep (monmap changed)... 
2026-03-10T14:09:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.rwbbep", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T14:09:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T14:09:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:17 vm04.local ceph-mon[92084]: Reconfiguring daemon mgr.vm03.rwbbep on vm03 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: pgmap v11: 65 pgs: 65 active+clean; 1.4 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 584 KiB/s rd, 553 KiB/s wr, 86 op/s 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: Reconfiguring ceph-exporter.vm03 (monmap changed)... 
2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: Unable to update caps for client.ceph-exporter.vm03 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm03"}]: dispatch 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: Reconfiguring daemon ceph-exporter.vm03 on vm03 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local 
ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T14:09:19.159 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: pgmap v11: 65 pgs: 65 active+clean; 1.4 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 584 KiB/s rd, 553 KiB/s wr, 86 op/s 2026-03-10T14:09:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 
vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: Reconfiguring ceph-exporter.vm03 (monmap changed)... 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: Unable to update caps for client.ceph-exporter.vm03 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm03"}]: dispatch 2026-03-10T14:09:19.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: Reconfiguring daemon ceph-exporter.vm03 on vm03 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T14:09:19.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:18 vm04.local 
ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: Reconfiguring crash.vm03 (monmap changed)... 2026-03-10T14:09:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: Reconfiguring daemon crash.vm03 on vm03 2026-03-10T14:09:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: pgmap v12: 65 pgs: 65 active+clean; 1.1 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 961 KiB/s rd, 993 KiB/s wr, 191 op/s 2026-03-10T14:09:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: Reconfiguring osd.0 (monmap changed)... 2026-03-10T14:09:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: Reconfiguring daemon osd.0 on vm03 2026-03-10T14:09:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:20.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:20.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T14:09:20.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:20.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:09:20.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:20.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T14:09:20.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:20.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: Reconfiguring crash.vm03 (monmap changed)... 2026-03-10T14:09:20.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: Reconfiguring daemon crash.vm03 on vm03 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: pgmap v12: 65 pgs: 65 active+clean; 1.1 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 961 KiB/s rd, 993 KiB/s wr, 191 op/s 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: Reconfiguring osd.0 (monmap changed)... 
2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: Reconfiguring daemon osd.0 on vm03 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T14:09:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: Reconfiguring osd.1 (monmap changed)... 
2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: Reconfiguring daemon osd.1 on vm03 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: Reconfiguring osd.2 (monmap changed)... 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: Reconfiguring daemon osd.2 on vm03 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.aqaspa", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": 
"mds.cephfs.vm03.itwezo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:09:21.216 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: Reconfiguring osd.1 (monmap changed)... 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: Reconfiguring daemon osd.1 on vm03 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: Reconfiguring osd.2 (monmap changed)... 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: Reconfiguring daemon osd.2 on vm03 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.aqaspa", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local 
ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.itwezo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:09:21.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: Reconfiguring mds.cephfs.vm03.aqaspa (monmap changed)... 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: Reconfiguring daemon mds.cephfs.vm03.aqaspa on vm03 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: pgmap v13: 65 pgs: 65 active+clean; 1.1 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 755 KiB/s rd, 823 KiB/s wr, 153 op/s 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: Reconfiguring mds.cephfs.vm03.itwezo (monmap changed)... 
2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: Reconfiguring daemon mds.cephfs.vm03.itwezo on vm03 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: Reconfiguring ceph-exporter.vm04 (monmap changed)... 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: Unable to update caps for client.ceph-exporter.vm04 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 
10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm04"}]: dispatch 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: Reconfiguring daemon ceph-exporter.vm04 on vm04 2026-03-10T14:09:22.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:22.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:22.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T14:09:22.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: Reconfiguring mds.cephfs.vm03.aqaspa (monmap changed)... 
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: Reconfiguring daemon mds.cephfs.vm03.aqaspa on vm03
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: pgmap v13: 65 pgs: 65 active+clean; 1.1 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 755 KiB/s rd, 823 KiB/s wr, 153 op/s
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: Reconfiguring mds.cephfs.vm03.itwezo (monmap changed)...
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: Reconfiguring daemon mds.cephfs.vm03.itwezo on vm03
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: Reconfiguring ceph-exporter.vm04 (monmap changed)...
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: Unable to update caps for client.ceph-exporter.vm04
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm04"}]: dispatch
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: Reconfiguring daemon ceph-exporter.vm04 on vm04
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T14:09:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: Reconfiguring crash.vm04 (monmap changed)...
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: Reconfiguring daemon crash.vm04 on vm04
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: Reconfiguring mgr.vm04.ywwcto (monmap changed)...
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: Reconfiguring daemon mgr.vm04.ywwcto on vm04
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T14:09:23.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:23 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: Reconfiguring crash.vm04 (monmap changed)...
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: Reconfiguring daemon crash.vm04 on vm04
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: Reconfiguring mgr.vm04.ywwcto (monmap changed)...
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm04.ywwcto", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: Reconfiguring daemon mgr.vm04.ywwcto on vm04
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T14:09:23.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:23 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: Reconfiguring mon.vm04 (monmap changed)...
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: Reconfiguring daemon mon.vm04 on vm04
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: pgmap v14: 65 pgs: 65 active+clean; 903 MiB data, 5.0 GiB used, 115 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.4 MiB/s wr, 236 op/s
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-10T14:09:24.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: Reconfiguring mon.vm04 (monmap changed)...
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: Reconfiguring daemon mon.vm04 on vm04
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: pgmap v14: 65 pgs: 65 active+clean; 903 MiB data, 5.0 GiB used, 115 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.4 MiB/s wr, 236 op/s
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-10T14:09:24.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: Reconfiguring osd.3 (monmap changed)...
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: Reconfiguring daemon osd.3 on vm04
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: Reconfiguring osd.4 (monmap changed)...
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: Reconfiguring daemon osd.4 on vm04
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.sslxuq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.puavjd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T14:09:25.305 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: Reconfiguring osd.3 (monmap changed)...
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: Reconfiguring daemon osd.3 on vm04
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: Reconfiguring osd.4 (monmap changed)...
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: Reconfiguring daemon osd.4 on vm04
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.sslxuq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.puavjd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T14:09:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:26.306 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: Reconfiguring osd.5 (monmap changed)...
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: Reconfiguring daemon osd.5 on vm04
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: pgmap v15: 65 pgs: 65 active+clean; 903 MiB data, 5.0 GiB used, 115 GiB / 120 GiB avail; 859 KiB/s rd, 1013 KiB/s wr, 188 op/s
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: Reconfiguring mds.cephfs.vm04.sslxuq (monmap changed)...
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: Reconfiguring daemon mds.cephfs.vm04.sslxuq on vm04
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: Reconfiguring mds.cephfs.vm04.puavjd (monmap changed)...
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: Reconfiguring daemon mds.cephfs.vm04.puavjd on vm04
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]: dispatch
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]': finished
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm04"}]: dispatch
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm04"}]': finished
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T14:09:26.307 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: Reconfiguring osd.5 (monmap changed)...
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: Reconfiguring daemon osd.5 on vm04
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: pgmap v15: 65 pgs: 65 active+clean; 903 MiB data, 5.0 GiB used, 115 GiB / 120 GiB avail; 859 KiB/s rd, 1013 KiB/s wr, 188 op/s
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: Reconfiguring mds.cephfs.vm04.sslxuq (monmap changed)...
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: Reconfiguring daemon mds.cephfs.vm04.sslxuq on vm04
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: Reconfiguring mds.cephfs.vm04.puavjd (monmap changed)...
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: Reconfiguring daemon mds.cephfs.vm04.puavjd on vm04
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:09:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:26.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]: dispatch
2026-03-10T14:09:26.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]': finished
2026-03-10T14:09:26.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm04"}]: dispatch
2026-03-10T14:09:26.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm04"}]': finished
2026-03-10T14:09:26.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:26.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T14:09:26.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:27.256 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-10T14:09:27.256 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-10T14:09:27.269 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:27 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all mon
2026-03-10T14:09:27.269 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:27 vm03.local ceph-mon[103098]: Upgrade: Updating crash.vm03 (1/2)
2026-03-10T14:09:27.270 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:27 vm03.local ceph-mon[103098]: Deploying daemon crash.vm03 on vm03
2026-03-10T14:09:27.270 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:27 vm03.local ceph-mon[103098]: pgmap v16: 65 pgs: 65 active+clean; 903 MiB data, 5.0 GiB used, 115 GiB / 120 GiB avail; 859 KiB/s rd, 1013 KiB/s wr, 188 op/s
2026-03-10T14:09:27.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:27 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all mon
2026-03-10T14:09:27.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:27 vm04.local ceph-mon[92084]: Upgrade: Updating crash.vm03 (1/2)
2026-03-10T14:09:27.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:27 vm04.local ceph-mon[92084]: Deploying daemon crash.vm03 on vm03
2026-03-10T14:09:27.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:27 vm04.local ceph-mon[92084]: pgmap v16: 65 pgs: 65 active+clean; 903 MiB data, 5.0 GiB used, 115 GiB / 120 GiB avail; 859 KiB/s rd, 1013 KiB/s wr, 188 op/s
2026-03-10T14:09:29.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:29.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:29.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:29.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T14:09:29.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:29.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:29.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:29.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:29.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm04", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T14:09:29.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:29.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:29 vm04.local ceph-mon[92084]: Upgrade: Updating crash.vm04 (2/2)
2026-03-10T14:09:29.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:29 vm04.local ceph-mon[92084]: Deploying daemon crash.vm04 on vm04
2026-03-10T14:09:29.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:29 vm04.local ceph-mon[92084]: pgmap v17: 65 pgs: 65 active+clean; 602 MiB data, 4.1 GiB used, 116 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.3 MiB/s wr, 315 op/s
2026-03-10T14:09:30.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:29 vm03.local ceph-mon[103098]: Upgrade: Updating crash.vm04 (2/2)
2026-03-10T14:09:30.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:29 vm03.local ceph-mon[103098]: Deploying daemon crash.vm04 on vm04
2026-03-10T14:09:30.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:29 vm03.local ceph-mon[103098]: pgmap v17: 65 pgs: 65 active+clean; 602 MiB data, 4.1 GiB used, 116 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.3 MiB/s wr, 315 op/s
2026-03-10T14:09:32.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:32.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:32.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:09:32.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:09:32.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:32.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:32.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:32.069 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:09:32.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:32 vm03.local ceph-mon[103098]: pgmap v18: 65 pgs: 65 active+clean; 602 MiB data, 4.1 GiB used, 116 GiB / 120 GiB avail; 879 KiB/s rd, 890 KiB/s wr, 210 op/s 2026-03-10T14:09:32.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:32 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:32.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:32 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:33.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:32 vm04.local ceph-mon[92084]: pgmap v18: 65 pgs: 65 active+clean; 602 MiB data, 4.1 GiB used, 116 GiB / 120 GiB avail; 879 KiB/s rd, 890 KiB/s wr, 210 op/s 2026-03-10T14:09:33.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:32 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:33.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:32 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:34.198 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:34 vm04.local ceph-mon[92084]: pgmap v19: 65 pgs: 
65 active+clean; 305 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 303 op/s 2026-03-10T14:09:34.198 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:34 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:34.198 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:34 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:34.198 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:34 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:34.198 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:34 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:34.366 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:34 vm03.local ceph-mon[103098]: pgmap v19: 65 pgs: 65 active+clean; 305 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.4 MiB/s wr, 303 op/s 2026-03-10T14:09:34.366 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:34 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:34.366 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:34 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:34.366 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:34 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:34.366 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:34 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 -- 192.168.123.103:0/911607168 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fc064072360 msgr2=0x7fc0640770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 --2- 192.168.123.103:0/911607168 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064072360 0x7fc0640770e0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc05c00a200 tx=0x7fc05c00a510 comp rx=0 tx=0).stop 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 -- 192.168.123.103:0/911607168 shutdown_connections 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 --2- 192.168.123.103:0/911607168 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064072360 0x7fc0640770e0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 --2- 192.168.123.103:0/911607168 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc064071980 0x7fc064071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 -- 192.168.123.103:0/911607168 >> 192.168.123.103:0/911607168 conn(0x7fc06406d1a0 msgr2=0x7fc06406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 -- 192.168.123.103:0/911607168 shutdown_connections 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 -- 192.168.123.103:0/911607168 wait complete. 
2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.487+0000 7fc0690c0700 1 Processor -- start 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.488+0000 7fc0690c0700 1 -- start start 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.488+0000 7fc0690c0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064071980 0x7fc0640824f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.488+0000 7fc0690c0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc064082a30 0x7fc064082ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.488+0000 7fc0690c0700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0641b2a90 con 0x7fc064082a30 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.488+0000 7fc0690c0700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0641b2bd0 con 0x7fc064071980 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.488+0000 7fc062d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064071980 0x7fc0640824f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.488+0000 7fc062d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064071980 0x7fc0640824f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:40762/0 (socket says 192.168.123.103:40762) 2026-03-10T14:09:35.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.488+0000 7fc062d9d700 1 -- 192.168.123.103:0/2868798848 learned_addr learned my addr 192.168.123.103:0/2868798848 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:35.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.489+0000 7fc062d9d700 1 -- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc064082a30 msgr2=0x7fc064082ea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:35.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.489+0000 7fc062d9d700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc064082a30 0x7fc064082ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.489+0000 7fc062d9d700 1 -- 192.168.123.103:0/2868798848 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc05c009e30 con 0x7fc064071980 2026-03-10T14:09:35.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.489+0000 7fc062d9d700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064071980 0x7fc0640824f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc05400cae0 tx=0x7fc05400cea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:35.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.491+0000 7fc04bfff700 1 -- 192.168.123.103:0/2868798848 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc054012040 con 0x7fc064071980 2026-03-10T14:09:35.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.491+0000 7fc04bfff700 1 -- 
192.168.123.103:0/2868798848 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc054004830 con 0x7fc064071980 2026-03-10T14:09:35.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.491+0000 7fc04bfff700 1 -- 192.168.123.103:0/2868798848 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc054005600 con 0x7fc064071980 2026-03-10T14:09:35.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.492+0000 7fc0690c0700 1 -- 192.168.123.103:0/2868798848 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0641b2d10 con 0x7fc064071980 2026-03-10T14:09:35.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.492+0000 7fc0690c0700 1 -- 192.168.123.103:0/2868798848 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0641bb9f0 con 0x7fc064071980 2026-03-10T14:09:35.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.492+0000 7fc0690c0700 1 -- 192.168.123.103:0/2868798848 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc06407c8b0 con 0x7fc064071980 2026-03-10T14:09:35.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.494+0000 7fc04bfff700 1 -- 192.168.123.103:0/2868798848 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc0540049a0 con 0x7fc064071980 2026-03-10T14:09:35.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.495+0000 7fc04bfff700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc04c077a40 0x7fc04c079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:35.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.495+0000 7fc04bfff700 1 -- 192.168.123.103:0/2868798848 
<== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc054099bb0 con 0x7fc064071980 2026-03-10T14:09:35.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.496+0000 7fc06259c700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc04c077a40 0x7fc04c079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:35.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.497+0000 7fc04bfff700 1 -- 192.168.123.103:0/2868798848 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc054062240 con 0x7fc064071980 2026-03-10T14:09:35.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.497+0000 7fc06259c700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc04c077a40 0x7fc04c079ef0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fc05c00b8b0 tx=0x7fc05c009d10 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:35.787 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.788+0000 7fc0690c0700 1 -- 192.168.123.103:0/2868798848 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc064061190 con 0x7fc04c077a40 2026-03-10T14:09:35.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:35 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:35.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:35 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:35.788 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:35 vm03.local ceph-mon[103098]: pgmap v20: 65 pgs: 65 active+clean; 305 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 906 KiB/s rd, 841 KiB/s wr, 220 op/s 2026-03-10T14:09:35.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:35 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:35.788 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:35 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:35.791 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.792+0000 7fc04bfff700 1 -- 192.168.123.103:0/2868798848 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+402 (secure 0 0 0) 0x7fc064061190 con 0x7fc04c077a40 2026-03-10T14:09:35.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 -- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc04c077a40 msgr2=0x7fc04c079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:35.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc04c077a40 0x7fc04c079ef0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fc05c00b8b0 tx=0x7fc05c009d10 comp rx=0 tx=0).stop 2026-03-10T14:09:35.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 -- 192.168.123.103:0/2868798848 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064071980 msgr2=0x7fc0640824f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:35.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 --2- 192.168.123.103:0/2868798848 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064071980 0x7fc0640824f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc05400cae0 tx=0x7fc05400cea0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 -- 192.168.123.103:0/2868798848 shutdown_connections 2026-03-10T14:09:35.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc04c077a40 0x7fc04c079ef0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc064071980 0x7fc0640824f0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 --2- 192.168.123.103:0/2868798848 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc064082a30 0x7fc064082ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.797+0000 7fc049ffb700 1 -- 192.168.123.103:0/2868798848 >> 192.168.123.103:0/2868798848 conn(0x7fc06406d1a0 msgr2=0x7fc064076470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:35.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.798+0000 7fc049ffb700 1 -- 192.168.123.103:0/2868798848 shutdown_connections 2026-03-10T14:09:35.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.799+0000 7fc049ffb700 1 -- 192.168.123.103:0/2868798848 wait complete. 
2026-03-10T14:09:35.822 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.973+0000 7f430155f700 1 -- 192.168.123.103:0/83001744 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071a90 msgr2=0x7f42fc071ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.973+0000 7f430155f700 1 --2- 192.168.123.103:0/83001744 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071a90 0x7f42fc071ea0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f42ec009a60 tx=0x7f42ec009d70 comp rx=0 tx=0).stop 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.974+0000 7f430155f700 1 -- 192.168.123.103:0/83001744 shutdown_connections 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.974+0000 7f430155f700 1 --2- 192.168.123.103:0/83001744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f42fc072470 0x7f42fc10beb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.974+0000 7f430155f700 1 --2- 192.168.123.103:0/83001744 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071a90 0x7f42fc071ea0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.974+0000 7f430155f700 1 -- 192.168.123.103:0/83001744 >> 192.168.123.103:0/83001744 conn(0x7f42fc06d1a0 msgr2=0x7f42fc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.974+0000 7f430155f700 1 -- 192.168.123.103:0/83001744 shutdown_connections 2026-03-10T14:09:35.973 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.974+0000 7f430155f700 1 -- 192.168.123.103:0/83001744 wait complete. 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.974+0000 7f430155f700 1 Processor -- start 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.974+0000 7f430155f700 1 -- start start 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f430155f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071a90 0x7f42fc116a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:35.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f430155f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f42fc072470 0x7f42fc116f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f430155f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42fc117580 con 0x7f42fc072470 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f430155f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42fc1b27d0 con 0x7f42fc071a90 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f42fa7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f42fc072470 0x7f42fc116f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f42fa7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f42fc072470 0x7f42fc116f60 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38462/0 (socket says 192.168.123.103:38462) 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f42fa7fc700 1 -- 192.168.123.103:0/4212968736 learned_addr learned my addr 192.168.123.103:0/4212968736 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f42fa7fc700 1 -- 192.168.123.103:0/4212968736 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071a90 msgr2=0x7f42fc116a20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f42fa7fc700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071a90 0x7f42fc116a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.975+0000 7f42fa7fc700 1 -- 192.168.123.103:0/4212968736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f42ec009710 con 0x7f42fc072470 2026-03-10T14:09:35.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.976+0000 7f42fa7fc700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f42fc072470 0x7f42fc116f60 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f42f4007f00 tx=0x7f42f400d3b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:35.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.977+0000 7f42e3fff700 1 -- 192.168.123.103:0/4212968736 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f42f400dcf0 con 
0x7f42fc072470 2026-03-10T14:09:35.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.977+0000 7f430155f700 1 -- 192.168.123.103:0/4212968736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f42fc1b2970 con 0x7f42fc072470 2026-03-10T14:09:35.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.977+0000 7f430155f700 1 -- 192.168.123.103:0/4212968736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f42fc1b2e40 con 0x7f42fc072470 2026-03-10T14:09:35.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.978+0000 7f430155f700 1 -- 192.168.123.103:0/4212968736 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f42fc110c20 con 0x7f42fc072470 2026-03-10T14:09:35.977 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.978+0000 7f42e3fff700 1 -- 192.168.123.103:0/4212968736 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f42f400f040 con 0x7f42fc072470 2026-03-10T14:09:35.977 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.978+0000 7f42e3fff700 1 -- 192.168.123.103:0/4212968736 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f42f4027920 con 0x7f42fc072470 2026-03-10T14:09:35.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.980+0000 7f42e3fff700 1 -- 192.168.123.103:0/4212968736 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f42f401dcb0 con 0x7f42fc072470 2026-03-10T14:09:35.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.980+0000 7f42e3fff700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f42e40779a0 0x7f42e4079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T14:09:35.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.980+0000 7f42e3fff700 1 -- 192.168.123.103:0/4212968736 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f42f409a300 con 0x7f42fc072470 2026-03-10T14:09:35.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.980+0000 7f42faffd700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f42e40779a0 0x7f42e4079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:35.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.981+0000 7f42faffd700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f42e40779a0 0x7f42e4079e50 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f42ec009fd0 tx=0x7f42ec005dc0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:35.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:35.984+0000 7f42e3fff700 1 -- 192.168.123.103:0/4212968736 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f42f4062a50 con 0x7f42fc072470 2026-03-10T14:09:35.988 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:35 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:35.988 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:35 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:35.988 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:35 vm04.local ceph-mon[92084]: pgmap v20: 65 pgs: 65 active+clean; 305 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 906 KiB/s rd, 841 KiB/s wr, 
220 op/s 2026-03-10T14:09:35.988 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:35 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:35.988 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:35 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:36.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.199+0000 7f430155f700 1 -- 192.168.123.103:0/4212968736 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f42fc061190 con 0x7f42e40779a0 2026-03-10T14:09:36.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.200+0000 7f42e3fff700 1 -- 192.168.123.103:0/4212968736 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+402 (secure 0 0 0) 0x7f42fc061190 con 0x7f42e40779a0 2026-03-10T14:09:36.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.205+0000 7f42e1ffb700 1 -- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f42e40779a0 msgr2=0x7f42e4079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:36.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.205+0000 7f42e1ffb700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f42e40779a0 0x7f42e4079e50 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f42ec009fd0 tx=0x7f42ec005dc0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.205+0000 7f42e1ffb700 1 -- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f42fc072470 msgr2=0x7f42fc116f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:36.204 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.205+0000 7f42e1ffb700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f42fc072470 0x7f42fc116f60 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f42f4007f00 tx=0x7f42f400d3b0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.206+0000 7f42e1ffb700 1 -- 192.168.123.103:0/4212968736 shutdown_connections 2026-03-10T14:09:36.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.206+0000 7f42e1ffb700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f42e40779a0 0x7f42e4079e50 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.206+0000 7f42e1ffb700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f42fc071a90 0x7f42fc116a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.206+0000 7f42e1ffb700 1 --2- 192.168.123.103:0/4212968736 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f42fc072470 0x7f42fc116f60 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.206+0000 7f42e1ffb700 1 -- 192.168.123.103:0/4212968736 >> 192.168.123.103:0/4212968736 conn(0x7f42fc06d1a0 msgr2=0x7f42fc10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:36.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.206+0000 7f42e1ffb700 1 -- 192.168.123.103:0/4212968736 shutdown_connections 2026-03-10T14:09:36.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.206+0000 7f42e1ffb700 1 -- 
192.168.123.103:0/4212968736 wait complete. 2026-03-10T14:09:36.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.320+0000 7fa57ac75700 1 -- 192.168.123.103:0/1431876309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa574071950 msgr2=0x7fa574071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.320+0000 7fa57ac75700 1 --2- 192.168.123.103:0/1431876309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa574071950 0x7fa574071d60 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fa570007780 tx=0x7fa570007a90 comp rx=0 tx=0).stop 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 -- 192.168.123.103:0/1431876309 shutdown_connections 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 --2- 192.168.123.103:0/1431876309 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa574072330 0x7fa5740770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 --2- 192.168.123.103:0/1431876309 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa574071950 0x7fa574071d60 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 -- 192.168.123.103:0/1431876309 >> 192.168.123.103:0/1431876309 conn(0x7fa57406d1a0 msgr2=0x7fa57406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 -- 192.168.123.103:0/1431876309 shutdown_connections 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 
7fa57ac75700 1 -- 192.168.123.103:0/1431876309 wait complete. 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 Processor -- start 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 -- start start 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa574072330 0x7fa574082380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5740828c0 0x7fa574082d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa574083d30 con 0x7fa574072330 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.321+0000 7fa57ac75700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa57412dd80 con 0x7fa5740828c0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa579472700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5740828c0 0x7fa574082d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa579472700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5740828c0 0x7fa574082d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:40816/0 (socket says 192.168.123.103:40816) 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa579472700 1 -- 192.168.123.103:0/3159592964 learned_addr learned my addr 192.168.123.103:0/3159592964 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa579c73700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa574072330 0x7fa574082380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa579472700 1 -- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa574072330 msgr2=0x7fa574082380 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa579472700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa574072330 0x7fa574082380 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa579472700 1 -- 192.168.123.103:0/3159592964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa570007430 con 0x7fa5740828c0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa579472700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5740828c0 0x7fa574082d30 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fa56c00c370 tx=0x7fa56c00c680 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa56affd700 1 -- 192.168.123.103:0/3159592964 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa56c00e050 con 0x7fa5740828c0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa57ac75700 1 -- 192.168.123.103:0/3159592964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa57412e000 con 0x7fa5740828c0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.322+0000 7fa57ac75700 1 -- 192.168.123.103:0/3159592964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa57412e550 con 0x7fa5740828c0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.323+0000 7fa56affd700 1 -- 192.168.123.103:0/3159592964 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa56c00f040 con 0x7fa5740828c0 2026-03-10T14:09:36.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.323+0000 7fa56affd700 1 -- 192.168.123.103:0/3159592964 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa56c013400 con 0x7fa5740828c0 2026-03-10T14:09:36.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.325+0000 7fa56affd700 1 -- 192.168.123.103:0/3159592964 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa56c0090d0 con 0x7fa5740828c0 2026-03-10T14:09:36.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.326+0000 7fa56affd700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa560077a50 0x7fa560079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T14:09:36.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.326+0000 7fa579c73700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa560077a50 0x7fa560079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:36.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.326+0000 7fa579c73700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa560077a50 0x7fa560079f00 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fa570007e60 tx=0x7fa5700058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:36.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.327+0000 7fa56affd700 1 -- 192.168.123.103:0/3159592964 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fa56c0998c0 con 0x7fa5740828c0 2026-03-10T14:09:36.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.327+0000 7fa57ac75700 1 -- 192.168.123.103:0/3159592964 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa558005320 con 0x7fa5740828c0 2026-03-10T14:09:36.331 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.331+0000 7fa56affd700 1 -- 192.168.123.103:0/3159592964 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa56c061800 con 0x7fa5740828c0 2026-03-10T14:09:36.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.521+0000 7fa57ac75700 1 -- 192.168.123.103:0/3159592964 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 
-- 0x7fa558000bf0 con 0x7fa560077a50 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (5m) 2s ago 6m 22.0M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (4m) 2s ago 6m 8892k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (5m) 4s ago 5m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (9s) 2s ago 6m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (6s) 4s ago 5m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (5m) 2s ago 6m 91.1M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (4m) 2s ago 4m 159M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (4m) 2s ago 4m 93.9M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:09:36.529 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (4m) 4s ago 4m 91.2M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (4m) 4s ago 4m 166M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (76s) 2s ago 6m 610M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:09:36.530 
INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (48s) 4s ago 5m 496M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (41s) 2s ago 6m 54.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (27s) 4s ago 5m 49.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (6m) 2s ago 6m 15.5M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (5m) 4s ago 5m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 2s ago 5m 345M 4096M 18.2.0 dc2bc1663786 5a222b855ee3 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (5m) 2s ago 5m 360M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (5m) 2s ago 5m 282M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (4m) 4s ago 4m 434M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (4m) 4s ago 4m 397M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (4m) 4s ago 4m 352M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (51s) 2s ago 5m 52.7M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.527+0000 7fa56affd700 1 -- 192.168.123.103:0/3159592964 <== mgr.34100 
v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fa558000bf0 con 0x7fa560077a50 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 -- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa560077a50 msgr2=0x7fa560079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa560077a50 0x7fa560079f00 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fa570007e60 tx=0x7fa5700058e0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 -- 192.168.123.103:0/3159592964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5740828c0 msgr2=0x7fa574082d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5740828c0 0x7fa574082d30 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fa56c00c370 tx=0x7fa56c00c680 comp rx=0 tx=0).stop 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 -- 192.168.123.103:0/3159592964 shutdown_connections 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa560077a50 0x7fa560079f00 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.530 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa574072330 0x7fa574082380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 --2- 192.168.123.103:0/3159592964 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5740828c0 0x7fa574082d30 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 -- 192.168.123.103:0/3159592964 >> 192.168.123.103:0/3159592964 conn(0x7fa57406d1a0 msgr2=0x7fa57406e0f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 -- 192.168.123.103:0/3159592964 shutdown_connections 2026-03-10T14:09:36.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.530+0000 7fa568ff9700 1 -- 192.168.123.103:0/3159592964 wait complete. 
2026-03-10T14:09:36.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.670+0000 7f14123e2700 1 -- 192.168.123.103:0/2363358216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c072360 msgr2=0x7f140c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:36.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.670+0000 7f14123e2700 1 --2- 192.168.123.103:0/2363358216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c072360 0x7f140c0770e0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f140400d3f0 tx=0x7f140400d700 comp rx=0 tx=0).stop 2026-03-10T14:09:36.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.670+0000 7f14123e2700 1 -- 192.168.123.103:0/2363358216 shutdown_connections 2026-03-10T14:09:36.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.670+0000 7f14123e2700 1 --2- 192.168.123.103:0/2363358216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c072360 0x7f140c0770e0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.670+0000 7f14123e2700 1 --2- 192.168.123.103:0/2363358216 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f140c071980 0x7f140c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.670+0000 7f14123e2700 1 -- 192.168.123.103:0/2363358216 >> 192.168.123.103:0/2363358216 conn(0x7f140c06d1a0 msgr2=0x7f140c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:36.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.670+0000 7f14123e2700 1 -- 192.168.123.103:0/2363358216 shutdown_connections 2026-03-10T14:09:36.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.670+0000 7f14123e2700 1 -- 192.168.123.103:0/2363358216 
wait complete. 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f14123e2700 1 Processor -- start 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f14123e2700 1 -- start start 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f14123e2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c071980 0x7f140c1313c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f14123e2700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f140c131900 0x7f140c07f590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f14123e2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f140c131e00 con 0x7f140c071980 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f14123e2700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f140c131f70 con 0x7f140c131900 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f140bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c071980 0x7f140c1313c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f140bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c071980 0x7f140c1313c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:38502/0 (socket says 192.168.123.103:38502) 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f140bfff700 1 -- 192.168.123.103:0/333656674 learned_addr learned my addr 192.168.123.103:0/333656674 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.671+0000 7f140b7fe700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f140c131900 0x7f140c07f590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.672+0000 7f140bfff700 1 -- 192.168.123.103:0/333656674 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f140c131900 msgr2=0x7f140c07f590 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.672+0000 7f140bfff700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f140c131900 0x7f140c07f590 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.672+0000 7f140bfff700 1 -- 192.168.123.103:0/333656674 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1404007ed0 con 0x7f140c071980 2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.672+0000 7f140bfff700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c071980 0x7f140c1313c0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f13fc00c8a0 tx=0x7f13fc00cc60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:09:36.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.672+0000 7f14097fa700 1 -- 192.168.123.103:0/333656674 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13fc00cea0 con 0x7f140c071980 2026-03-10T14:09:36.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.672+0000 7f14123e2700 1 -- 192.168.123.103:0/333656674 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f140c07fb30 con 0x7f140c071980 2026-03-10T14:09:36.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.672+0000 7f14123e2700 1 -- 192.168.123.103:0/333656674 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f140c080050 con 0x7f140c071980 2026-03-10T14:09:36.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.677+0000 7f14097fa700 1 -- 192.168.123.103:0/333656674 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f13fc004830 con 0x7f140c071980 2026-03-10T14:09:36.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.677+0000 7f14097fa700 1 -- 192.168.123.103:0/333656674 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13fc005650 con 0x7f140c071980 2026-03-10T14:09:36.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.678+0000 7f14097fa700 1 -- 192.168.123.103:0/333656674 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f13fc00f470 con 0x7f140c071980 2026-03-10T14:09:36.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.687+0000 7f14097fa700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f13f4077b00 0x7f13f4079fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:36.687 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.688+0000 7f140b7fe700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f13f4077b00 0x7f13f4079fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:36.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.689+0000 7f14097fa700 1 -- 192.168.123.103:0/333656674 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f13fc09a890 con 0x7f140c071980 2026-03-10T14:09:36.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.689+0000 7f14123e2700 1 -- 192.168.123.103:0/333656674 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f13f8005320 con 0x7f140c071980 2026-03-10T14:09:36.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.702+0000 7f14097fa700 1 -- 192.168.123.103:0/333656674 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f13fc062f20 con 0x7f140c071980 2026-03-10T14:09:36.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:36.702+0000 7f140b7fe700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f13f4077b00 0x7f13f4079fb0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f1404000f80 tx=0x7f140400db00 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 
vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]': finished 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm04"}]: dispatch 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm04"}]': finished 2026-03-10T14:09:37.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.099+0000 7f14123e2700 1 -- 192.168.123.103:0/333656674 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f13f8005cc0 con 0x7f140c071980 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.101+0000 7f14097fa700 1 -- 192.168.123.103:0/333656674 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+694 (secure 0 0 0) 0x7f13fc01d020 con 0x7f140c071980 
2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:09:37.100 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:09:37.101 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:09:37.101 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:09:37.101 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 10, 2026-03-10T14:09:37.101 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T14:09:37.101 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:09:37.101 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:09:37.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.104+0000 7f13f2ffd700 1 -- 192.168.123.103:0/333656674 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f13f4077b00 msgr2=0x7f13f4079fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.104+0000 7f13f2ffd700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f13f4077b00 0x7f13f4079fb0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f1404000f80 tx=0x7f140400db00 comp rx=0 tx=0).stop 2026-03-10T14:09:37.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.105+0000 7f13f2ffd700 1 -- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c071980 msgr2=0x7f140c1313c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.105+0000 7f13f2ffd700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c071980 0x7f140c1313c0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f13fc00c8a0 tx=0x7f13fc00cc60 comp rx=0 tx=0).stop 2026-03-10T14:09:37.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.105+0000 7f13f2ffd700 1 -- 192.168.123.103:0/333656674 shutdown_connections 2026-03-10T14:09:37.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.105+0000 7f13f2ffd700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f13f4077b00 0x7f13f4079fb0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.105+0000 7f13f2ffd700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f140c071980 0x7f140c1313c0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.104 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.105+0000 7f13f2ffd700 1 --2- 192.168.123.103:0/333656674 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f140c131900 0x7f140c07f590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.105+0000 7f13f2ffd700 1 -- 192.168.123.103:0/333656674 >> 192.168.123.103:0/333656674 conn(0x7f140c06d1a0 msgr2=0x7f140c0764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:37.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.105+0000 7f13f2ffd700 1 -- 192.168.123.103:0/333656674 shutdown_connections 2026-03-10T14:09:37.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.106+0000 7f13f2ffd700 1 -- 192.168.123.103:0/333656674 wait complete. 2026-03-10T14:09:37.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.240+0000 7fb04bfff700 1 -- 192.168.123.103:0/1334226870 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c071a60 msgr2=0x7fb04c071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.240+0000 7fb04bfff700 1 --2- 192.168.123.103:0/1334226870 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c071a60 0x7fb04c071e70 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb03c009b00 tx=0x7fb03c009e10 comp rx=0 tx=0).stop 2026-03-10T14:09:37.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 -- 192.168.123.103:0/1334226870 shutdown_connections 2026-03-10T14:09:37.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 --2- 192.168.123.103:0/1334226870 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb04c072440 0x7fb04c10be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 --2- 192.168.123.103:0/1334226870 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c071a60 0x7fb04c071e70 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 -- 192.168.123.103:0/1334226870 >> 192.168.123.103:0/1334226870 conn(0x7fb04c06d1a0 msgr2=0x7fb04c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 -- 192.168.123.103:0/1334226870 shutdown_connections 2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 -- 192.168.123.103:0/1334226870 wait complete. 2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 Processor -- start 2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 -- start start 2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb04c071a60 0x7fb04c116970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c072440 0x7fb04c116eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:37.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb04c1174f0 con 0x7fb04c072440 2026-03-10T14:09:37.242 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.243+0000 7fb04bfff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb04c117660 con 0x7fb04c071a60 2026-03-10T14:09:37.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.244+0000 7fb04a7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c072440 0x7fb04c116eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:37.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.244+0000 7fb04a7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c072440 0x7fb04c116eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38530/0 (socket says 192.168.123.103:38530) 2026-03-10T14:09:37.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.244+0000 7fb04a7fc700 1 -- 192.168.123.103:0/878634784 learned_addr learned my addr 192.168.123.103:0/878634784 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:37.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.244+0000 7fb04affd700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb04c071a60 0x7fb04c116970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:37.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.245+0000 7fb04a7fc700 1 -- 192.168.123.103:0/878634784 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb04c071a60 msgr2=0x7fb04c116970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.245+0000 7fb04a7fc700 1 --2- 
192.168.123.103:0/878634784 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb04c071a60 0x7fb04c116970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.245+0000 7fb04a7fc700 1 -- 192.168.123.103:0/878634784 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb03c0097e0 con 0x7fb04c072440 2026-03-10T14:09:37.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.245+0000 7fb04a7fc700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c072440 0x7fb04c116eb0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb040009fd0 tx=0x7fb04000eea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:37.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.248+0000 7fb050943700 1 -- 192.168.123.103:0/878634784 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb040009980 con 0x7fb04c072440 2026-03-10T14:09:37.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.248+0000 7fb050943700 1 -- 192.168.123.103:0/878634784 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb040004d10 con 0x7fb04c072440 2026-03-10T14:09:37.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.248+0000 7fb050943700 1 -- 192.168.123.103:0/878634784 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb040010470 con 0x7fb04c072440 2026-03-10T14:09:37.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.248+0000 7fb04bfff700 1 -- 192.168.123.103:0/878634784 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb04c1a15c0 con 0x7fb04c072440 2026-03-10T14:09:37.248 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.248+0000 7fb04bfff700 1 -- 192.168.123.103:0/878634784 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb04c1a1ae0 con 0x7fb04c072440 2026-03-10T14:09:37.248 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.249+0000 7fb04bfff700 1 -- 192.168.123.103:0/878634784 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb04c110c20 con 0x7fb04c072440 2026-03-10T14:09:37.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.253+0000 7fb050943700 1 -- 192.168.123.103:0/878634784 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb04000cca0 con 0x7fb04c072440 2026-03-10T14:09:37.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.254+0000 7fb050943700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb034077a30 0x7fb034079ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:37.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.254+0000 7fb050943700 1 -- 192.168.123.103:0/878634784 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb040014070 con 0x7fb04c072440 2026-03-10T14:09:37.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.255+0000 7fb04affd700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb034077a30 0x7fb034079ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:37.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.255+0000 7fb04affd700 1 --2- 192.168.123.103:0/878634784 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb034077a30 0x7fb034079ee0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb04c117c60 tx=0x7fb03c01a040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:37.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.255+0000 7fb050943700 1 -- 192.168.123.103:0/878634784 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb040062db0 con 0x7fb04c072440 2026-03-10T14:09:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]': finished 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm04"}]: dispatch 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", 
"name": "container_image", "who": "client.crash.vm04"}]': finished 2026-03-10T14:09:37.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T14:09:37.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.453+0000 7fb04bfff700 1 -- 192.168.123.103:0/878634784 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb04c04ea50 con 0x7fb04c072440 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:e14 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000 2026-03-10T14:09:37.457 
INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:09:37.457 
INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:37.457 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.455+0000 7fb050943700 1 -- 192.168.123.103:0/878634784 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1973 (secure 0 0 0) 0x7fb040062500 con 0x7fb04c072440 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.458+0000 7fb0327fc700 1 -- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb034077a30 msgr2=0x7fb034079ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.458 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.458+0000 7fb0327fc700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb034077a30 0x7fb034079ee0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb04c117c60 tx=0x7fb03c01a040 comp rx=0 tx=0).stop 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.458+0000 7fb0327fc700 1 -- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c072440 msgr2=0x7fb04c116eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.458+0000 7fb0327fc700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c072440 0x7fb04c116eb0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb040009fd0 tx=0x7fb04000eea0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.459+0000 7fb0327fc700 1 -- 192.168.123.103:0/878634784 shutdown_connections 2026-03-10T14:09:37.458 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.459+0000 7fb0327fc700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb034077a30 0x7fb034079ee0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.459+0000 7fb0327fc700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb04c071a60 0x7fb04c116970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.459+0000 7fb0327fc700 1 --2- 192.168.123.103:0/878634784 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb04c072440 
0x7fb04c116eb0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.459+0000 7fb0327fc700 1 -- 192.168.123.103:0/878634784 >> 192.168.123.103:0/878634784 conn(0x7fb04c06d1a0 msgr2=0x7fb04c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:37.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.460+0000 7fb0327fc700 1 -- 192.168.123.103:0/878634784 shutdown_connections 2026-03-10T14:09:37.459 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.460+0000 7fb0327fc700 1 -- 192.168.123.103:0/878634784 wait complete. 2026-03-10T14:09:37.465 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:09:37.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.573+0000 7f15b342f700 1 -- 192.168.123.103:0/1398191374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f15ac071a90 msgr2=0x7f15ac071ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.573+0000 7f15b342f700 1 --2- 192.168.123.103:0/1398191374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f15ac071a90 0x7f15ac071ea0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f15a400b3a0 tx=0x7f15a400b6b0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.574+0000 7f15b342f700 1 -- 192.168.123.103:0/1398191374 shutdown_connections 2026-03-10T14:09:37.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.574+0000 7f15b342f700 1 --2- 192.168.123.103:0/1398191374 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15ac072470 0x7f15ac10beb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.574+0000 7f15b342f700 1 --2- 
192.168.123.103:0/1398191374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f15ac071a90 0x7f15ac071ea0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.574+0000 7f15b342f700 1 -- 192.168.123.103:0/1398191374 >> 192.168.123.103:0/1398191374 conn(0x7f15ac06d1a0 msgr2=0x7f15ac06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:37.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.576+0000 7f15b342f700 1 -- 192.168.123.103:0/1398191374 shutdown_connections 2026-03-10T14:09:37.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.576+0000 7f15b342f700 1 -- 192.168.123.103:0/1398191374 wait complete. 2026-03-10T14:09:37.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b342f700 1 Processor -- start 2026-03-10T14:09:37.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b342f700 1 -- start start 2026-03-10T14:09:37.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b342f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15ac072470 0x7f15ac116b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:37.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b342f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f15ac117040 0x7f15ac1b2820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:37.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b342f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15ac117540 con 0x7f15ac117040 2026-03-10T14:09:37.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b342f700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15ac1176b0 con 0x7f15ac072470 2026-03-10T14:09:37.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b11cb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15ac072470 0x7f15ac116b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:37.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b11cb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15ac072470 0x7f15ac116b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:40870/0 (socket says 192.168.123.103:40870) 2026-03-10T14:09:37.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b11cb700 1 -- 192.168.123.103:0/660364913 learned_addr learned my addr 192.168.123.103:0/660364913 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:37.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.577+0000 7f15b09ca700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f15ac117040 0x7f15ac1b2820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:37.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.578+0000 7f15b11cb700 1 -- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f15ac117040 msgr2=0x7f15ac1b2820 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.578+0000 7f15b11cb700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f15ac117040 0x7f15ac1b2820 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.578+0000 7f15b11cb700 1 -- 192.168.123.103:0/660364913 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15a400b050 con 0x7f15ac072470 2026-03-10T14:09:37.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.578+0000 7f15b11cb700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15ac072470 0x7f15ac116b00 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f15a400b370 tx=0x7f15a40096c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:37.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.580+0000 7f15a27fc700 1 -- 192.168.123.103:0/660364913 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15a400e040 con 0x7f15ac072470 2026-03-10T14:09:37.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.580+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15ac1b2d60 con 0x7f15ac072470 2026-03-10T14:09:37.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.581+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15ac1b31c0 con 0x7f15ac072470 2026-03-10T14:09:37.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.581+0000 7f15a27fc700 1 -- 192.168.123.103:0/660364913 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f15a4007950 con 0x7f15ac072470 2026-03-10T14:09:37.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.581+0000 7f15a27fc700 1 -- 
192.168.123.103:0/660364913 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15a40129a0 con 0x7f15ac072470 2026-03-10T14:09:37.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.584+0000 7f15a27fc700 1 -- 192.168.123.103:0/660364913 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f15a4019070 con 0x7f15ac072470 2026-03-10T14:09:37.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.585+0000 7f15a27fc700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1598077b10 0x7f1598079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:37.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.585+0000 7f15a27fc700 1 -- 192.168.123.103:0/660364913 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f15a409ac90 con 0x7f15ac072470 2026-03-10T14:09:37.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.585+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1590005320 con 0x7f15ac072470 2026-03-10T14:09:37.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.586+0000 7f15b09ca700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1598077b10 0x7f1598079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:37.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.588+0000 7f15b09ca700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1598077b10 0x7f1598079fc0 secure :-1 s=READY pgs=32 cs=0 l=1 
rev1=1 crypto rx=0x7f15a8005950 tx=0x7f15a80058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:37.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.588+0000 7f15a27fc700 1 -- 192.168.123.103:0/660364913 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f15a4063320 con 0x7f15ac072470 2026-03-10T14:09:37.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.823+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1590000bf0 con 0x7f1598077b10 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "6/23 daemons upgraded", 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:09:37.835 
INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:09:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.833+0000 7f15a27fc700 1 -- 192.168.123.103:0/660364913 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f1590000bf0 con 0x7f1598077b10 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1598077b10 msgr2=0x7f1598079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1598077b10 0x7f1598079fc0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f15a8005950 tx=0x7f15a80058e0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15ac072470 msgr2=0x7f15ac116b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15ac072470 0x7f15ac116b00 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f15a400b370 tx=0x7f15a40096c0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 shutdown_connections 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 --2- 192.168.123.103:0/660364913 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1598077b10 0x7f1598079fc0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f15ac072470 0x7f15ac116b00 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 --2- 192.168.123.103:0/660364913 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f15ac117040 0x7f15ac1b2820 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:37.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.839+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 >> 192.168.123.103:0/660364913 conn(0x7f15ac06d1a0 msgr2=0x7f15ac10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:37.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.842+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 shutdown_connections 2026-03-10T14:09:37.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:37.842+0000 7f15b342f700 1 -- 192.168.123.103:0/660364913 wait complete. 
2026-03-10T14:09:38.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.029+0000 7f54e7508700 1 -- 192.168.123.103:0/1375313905 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0071b80 msgr2=0x7f54e0071f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:38.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.029+0000 7f54e7508700 1 --2- 192.168.123.103:0/1375313905 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0071b80 0x7f54e0071f90 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f54dc009b00 tx=0x7f54dc009e10 comp rx=0 tx=0).stop 2026-03-10T14:09:38.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.030+0000 7f54e7508700 1 -- 192.168.123.103:0/1375313905 shutdown_connections 2026-03-10T14:09:38.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.030+0000 7f54e7508700 1 --2- 192.168.123.103:0/1375313905 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54e0072560 0x7f54e0107db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:38.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.030+0000 7f54e7508700 1 --2- 192.168.123.103:0/1375313905 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0071b80 0x7f54e0071f90 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:38.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.030+0000 7f54e7508700 1 -- 192.168.123.103:0/1375313905 >> 192.168.123.103:0/1375313905 conn(0x7f54e006d310 msgr2=0x7f54e006f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:38.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.030+0000 7f54e7508700 1 -- 192.168.123.103:0/1375313905 shutdown_connections 2026-03-10T14:09:38.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.030+0000 7f54e7508700 1 -- 192.168.123.103:0/1375313905 
wait complete. 2026-03-10T14:09:38.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.031+0000 7f54e7508700 1 Processor -- start 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.031+0000 7f54e7508700 1 -- start start 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.031+0000 7f54e7508700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54e0071b80 0x7f54e01a4a70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.032+0000 7f54e52a4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54e0071b80 0x7f54e01a4a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.032+0000 7f54e52a4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54e0071b80 0x7f54e01a4a70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:40890/0 (socket says 192.168.123.103:40890) 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.032+0000 7f54e7508700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0072560 0x7f54e01a4fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.032+0000 7f54e7508700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f54e01a55d0 con 0x7f54e0072560 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.032+0000 7f54e7508700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f54e01a5710 con 0x7f54e0071b80 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.032+0000 7f54e52a4700 1 -- 192.168.123.103:0/3856072490 learned_addr learned my addr 192.168.123.103:0/3856072490 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:09:38.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.032+0000 7f54e4aa3700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0072560 0x7f54e01a4fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:38.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.033+0000 7f54e4aa3700 1 -- 192.168.123.103:0/3856072490 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54e0071b80 msgr2=0x7f54e01a4a70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:38.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.033+0000 7f54e4aa3700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54e0071b80 0x7f54e01a4a70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:38.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.033+0000 7f54e4aa3700 1 -- 192.168.123.103:0/3856072490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f54dc0097e0 con 0x7f54e0072560 2026-03-10T14:09:38.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.033+0000 7f54e4aa3700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0072560 0x7f54e01a4fb0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f54e0072950 tx=0x7f54d000dbb0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:38.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.033+0000 7f54d67fc700 1 -- 192.168.123.103:0/3856072490 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54d000f840 con 0x7f54e0072560 2026-03-10T14:09:38.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.033+0000 7f54d67fc700 1 -- 192.168.123.103:0/3856072490 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f54d000fe80 con 0x7f54e0072560 2026-03-10T14:09:38.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.033+0000 7f54e7508700 1 -- 192.168.123.103:0/3856072490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f54e01aa170 con 0x7f54e0072560 2026-03-10T14:09:38.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.033+0000 7f54e7508700 1 -- 192.168.123.103:0/3856072490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f54e01aa6c0 con 0x7f54e0072560 2026-03-10T14:09:38.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.037+0000 7f54d67fc700 1 -- 192.168.123.103:0/3856072490 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54d000e5c0 con 0x7f54e0072560 2026-03-10T14:09:38.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.037+0000 7f54e7508700 1 -- 192.168.123.103:0/3856072490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f54e004ea50 con 0x7f54e0072560 2026-03-10T14:09:38.039 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.038+0000 7f54d67fc700 1 -- 192.168.123.103:0/3856072490 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f54d0010460 con 0x7f54e0072560 2026-03-10T14:09:38.040 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.038+0000 7f54d67fc700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54cc077a50 0x7f54cc079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:09:38.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.038+0000 7f54d67fc700 1 -- 192.168.123.103:0/3856072490 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f54d0099e60 con 0x7f54e0072560 2026-03-10T14:09:38.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.041+0000 7f54e52a4700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54cc077a50 0x7f54cc079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:09:38.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.041+0000 7f54d67fc700 1 -- 192.168.123.103:0/3856072490 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f54d0062470 con 0x7f54e0072560 2026-03-10T14:09:38.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.047+0000 7f54e52a4700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54cc077a50 0x7f54cc079f00 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f54dc000c00 tx=0x7f54dc0054a0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:09:38.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all crash 2026-03-10T14:09:38.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: 
from='client.34160 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:38.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T14:09:38.111 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: Upgrade: osd.0 is safe to restart 2026-03-10T14:09:38.112 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: from='client.44107 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:38.112 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: pgmap v21: 65 pgs: 65 active+clean; 305 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 906 KiB/s rd, 840 KiB/s wr, 220 op/s 2026-03-10T14:09:38.112 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:38.112 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/333656674' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:38.112 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T14:09:38.112 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:38.112 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:38 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/878634784' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all crash 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: from='client.34160 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: Upgrade: osd.0 is safe to restart 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: from='client.44107 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: pgmap v21: 65 pgs: 65 active+clean; 305 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 906 KiB/s rd, 840 KiB/s wr, 220 op/s 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/333656674' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:38 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/878634784' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:09:38.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.489+0000 7f54e7508700 1 -- 192.168.123.103:0/3856072490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f54e01aaa90 con 0x7f54e0072560 2026-03-10T14:09:38.493 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_OK 2026-03-10T14:09:38.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.492+0000 7f54d67fc700 1 -- 192.168.123.103:0/3856072490 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f54d0061bc0 con 0x7f54e0072560 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 -- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54cc077a50 msgr2=0x7f54cc079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 --2- 192.168.123.103:0/3856072490 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54cc077a50 0x7f54cc079f00 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f54dc000c00 tx=0x7f54dc0054a0 comp rx=0 tx=0).stop 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 -- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0072560 msgr2=0x7f54e01a4fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0072560 0x7f54e01a4fb0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f54e0072950 tx=0x7f54d000dbb0 comp rx=0 tx=0).stop 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 -- 192.168.123.103:0/3856072490 shutdown_connections 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54cc077a50 0x7f54cc079f00 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f54e0071b80 0x7f54e01a4a70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 --2- 192.168.123.103:0/3856072490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f54e0072560 0x7f54e01a4fb0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.497+0000 7f54cbfff700 1 -- 192.168.123.103:0/3856072490 >> 192.168.123.103:0/3856072490 conn(0x7f54e006d310 msgr2=0x7f54e010af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.498+0000 7f54cbfff700 1 -- 192.168.123.103:0/3856072490 shutdown_connections 2026-03-10T14:09:38.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:09:38.498+0000 7f54cbfff700 1 -- 192.168.123.103:0/3856072490 wait complete. 2026-03-10T14:09:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:39 vm03.local ceph-mon[103098]: Upgrade: Updating osd.0 2026-03-10T14:09:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:39 vm03.local ceph-mon[103098]: Deploying daemon osd.0 on vm03 2026-03-10T14:09:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:39 vm03.local ceph-mon[103098]: from='client.44117 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:39 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3856072490' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:09:39.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:39 vm03.local ceph-mon[103098]: osd.0 marked itself down and dead 2026-03-10T14:09:39.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:38 vm03.local systemd[1]: Stopping Ceph osd.0 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:09:39.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[67663]: 2026-03-10T14:09:39.007+0000 7f299c8af700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:09:39.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[67663]: 2026-03-10T14:09:39.007+0000 7f299c8af700 -1 osd.0 43 *** Got signal Terminated *** 2026-03-10T14:09:39.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[67663]: 2026-03-10T14:09:39.007+0000 7f299c8af700 -1 osd.0 43 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:09:39.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:39 vm04.local ceph-mon[92084]: Upgrade: Updating osd.0 2026-03-10T14:09:39.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:39 vm04.local ceph-mon[92084]: Deploying daemon osd.0 on vm03 2026-03-10T14:09:39.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:39 vm04.local ceph-mon[92084]: from='client.44117 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:09:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:39 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/3856072490' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:09:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:39 vm04.local ceph-mon[92084]: osd.0 marked itself down and dead 2026-03-10T14:09:39.413 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108855]: 2026-03-10 14:09:39.152345684 +0000 UTC m=+0.176978056 container died 5a222b855ee3617401ab2b98f104692bbe7d11b8783661bfcb72f72ca17210f9 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, ceph=True, org.label-schema.build-date=20231212, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git) 2026-03-10T14:09:39.413 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108855]: 2026-03-10 14:09:39.172185423 +0000 UTC m=+0.196817786 container remove 5a222b855ee3617401ab2b98f104692bbe7d11b8783661bfcb72f72ca17210f9 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0, org.label-schema.build-date=20231212, GIT_BRANCH=HEAD, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True) 2026-03-10T14:09:39.413 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 
vm03.local bash[108855]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0 2026-03-10T14:09:39.413 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108925]: 2026-03-10 14:09:39.33370514 +0000 UTC m=+0.027582976 container create af839bbc4fdb07cc2bc5be4f6b56ba9b0908c01be37a69decd6f06f059b50b65 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T14:09:39.413 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108925]: 2026-03-10 14:09:39.394929452 +0000 UTC m=+0.088807298 container init af839bbc4fdb07cc2bc5be4f6b56ba9b0908c01be37a69decd6f06f059b50b65 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True) 2026-03-10T14:09:39.413 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108925]: 2026-03-10 14:09:39.398762969 +0000 UTC m=+0.092640805 container start af839bbc4fdb07cc2bc5be4f6b56ba9b0908c01be37a69decd6f06f059b50b65 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default) 2026-03-10T14:09:39.413 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108925]: 2026-03-10 14:09:39.400270541 +0000 UTC m=+0.094148377 container attach af839bbc4fdb07cc2bc5be4f6b56ba9b0908c01be37a69decd6f06f059b50b65 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.vendor=CentOS) 2026-03-10T14:09:39.413 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108925]: 2026-03-10 14:09:39.315312979 +0000 UTC m=+0.009190826 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:09:39.680 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108925]: 2026-03-10 14:09:39.547830044 +0000 UTC m=+0.241707880 container died af839bbc4fdb07cc2bc5be4f6b56ba9b0908c01be37a69decd6f06f059b50b65 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:09:39.680 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[108925]: 2026-03-10 14:09:39.563524704 +0000 UTC m=+0.257402541 container remove af839bbc4fdb07cc2bc5be4f6b56ba9b0908c01be37a69decd6f06f059b50b65 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3) 2026-03-10T14:09:39.680 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.0.service: Deactivated successfully. 2026-03-10T14:09:39.680 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.0.service: Unit process 108936 (conmon) remains running after unit stopped. 2026-03-10T14:09:39.680 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.0.service: Unit process 108945 (podman) remains running after unit stopped. 2026-03-10T14:09:39.680 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local systemd[1]: Stopped Ceph osd.0 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:09:39.680 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.0.service: Consumed 32.212s CPU time, 566.2M memory peak. 
2026-03-10T14:09:40.097 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-mon[103098]: pgmap v22: 65 pgs: 65 active+clean; 302 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.3 MiB/s wr, 349 op/s 2026-03-10T14:09:40.097 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-mon[103098]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:09:40.097 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-mon[103098]: osdmap e44: 6 total, 5 up, 6 in 2026-03-10T14:09:40.097 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local systemd[1]: Starting Ceph osd.0 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[109027]: 2026-03-10 14:09:39.888758014 +0000 UTC m=+0.015066186 container create e07b2796dbf1cddfa0560623c8e394ede0e14d991288f33e79851a82e308cd7a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[109027]: 2026-03-10 14:09:39.926898112 +0000 UTC m=+0.053206284 container init e07b2796dbf1cddfa0560623c8e394ede0e14d991288f33e79851a82e308cd7a 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[109027]: 2026-03-10 14:09:39.931097673 +0000 UTC m=+0.057405845 container start e07b2796dbf1cddfa0560623c8e394ede0e14d991288f33e79851a82e308cd7a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[109027]: 2026-03-10 14:09:39.932108145 +0000 UTC 
m=+0.058416317 container attach e07b2796dbf1cddfa0560623c8e394ede0e14d991288f33e79851a82e308cd7a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default) 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:39 vm03.local podman[109027]: 2026-03-10 14:09:39.882475644 +0000 UTC m=+0.008783816 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local bash[109027]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:09:40.098 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local bash[109027]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-10T14:09:40.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:40 vm04.local ceph-mon[92084]: pgmap v22: 65 pgs: 65 active+clean; 302 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 1.3 MiB/s rd, 1.3 MiB/s wr, 349 op/s 2026-03-10T14:09:40.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:40 vm04.local ceph-mon[92084]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:09:40.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:40 vm04.local ceph-mon[92084]: osdmap e44: 6 total, 5 up, 6 in 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local bash[109027]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local bash[109027]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local bash[109027]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local 
bash[109027]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-efd3c0eb-206e-4f36-955f-350c14ee0a77/osd-block-32845076-88bb-4ae3-87fc-7369eed22d26 --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-10T14:09:41.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:40 vm03.local bash[109027]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-efd3c0eb-206e-4f36-955f-350c14ee0a77/osd-block-32845076-88bb-4ae3-87fc-7369eed22d26 --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-10T14:09:41.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:41 vm04.local ceph-mon[92084]: osdmap e45: 6 total, 5 up, 6 in 2026-03-10T14:09:41.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:41 vm04.local ceph-mon[92084]: pgmap v25: 65 pgs: 9 stale+active+clean, 56 active+clean; 302 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 675 KiB/s rd, 789 KiB/s wr, 193 op/s 2026-03-10T14:09:41.424 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:41 vm03.local ceph-mon[103098]: osdmap e45: 6 total, 5 up, 6 in 2026-03-10T14:09:41.424 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:41 vm03.local ceph-mon[103098]: pgmap v25: 65 pgs: 9 stale+active+clean, 56 active+clean; 302 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 675 KiB/s rd, 789 KiB/s wr, 193 op/s 2026-03-10T14:09:41.424 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/ln -snf /dev/ceph-efd3c0eb-206e-4f36-955f-350c14ee0a77/osd-block-32845076-88bb-4ae3-87fc-7369eed22d26 /var/lib/ceph/osd/ceph-0/block 2026-03-10T14:09:41.424 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local 
ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-10T14:09:41.424 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local bash[109027]: Running command: /usr/bin/ln -snf /dev/ceph-efd3c0eb-206e-4f36-955f-350c14ee0a77/osd-block-32845076-88bb-4ae3-87fc-7369eed22d26 /var/lib/ceph/osd/ceph-0/block 2026-03-10T14:09:41.424 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local bash[109027]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-10T14:09:41.424 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T14:09:41.424 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local bash[109027]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T14:09:41.424 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T14:09:41.425 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local bash[109027]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T14:09:41.425 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate[109038]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-10T14:09:41.425 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local bash[109027]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-10T14:09:41.425 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local conmon[109038]: conmon e07b2796dbf1cddfa056 : Failed to open cgroups file: 
/sys/fs/cgroup/machine.slice/libpod-e07b2796dbf1cddfa0560623c8e394ede0e14d991288f33e79851a82e308cd7a.scope/container/memory.events 2026-03-10T14:09:41.425 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local podman[109027]: 2026-03-10 14:09:41.178860848 +0000 UTC m=+1.305169020 container died e07b2796dbf1cddfa0560623c8e394ede0e14d991288f33e79851a82e308cd7a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default) 2026-03-10T14:09:41.425 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local podman[109027]: 2026-03-10 14:09:41.228695368 +0000 UTC m=+1.355003540 container remove e07b2796dbf1cddfa0560623c8e394ede0e14d991288f33e79851a82e308cd7a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:09:41.677 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local podman[109282]: 2026-03-10 14:09:41.426656162 +0000 UTC m=+0.057667695 container create 6e24e5898f4d2cc880aa183a560b9a87a532849eec08cd87bbf36ff8eda787a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:09:41.677 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local podman[109282]: 2026-03-10 14:09:41.382773183 +0000 UTC m=+0.013784726 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:09:41.677 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local podman[109282]: 2026-03-10 14:09:41.561178229 +0000 UTC m=+0.192189772 container init 6e24e5898f4d2cc880aa183a560b9a87a532849eec08cd87bbf36ff8eda787a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS) 2026-03-10T14:09:41.677 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local podman[109282]: 2026-03-10 14:09:41.569035568 +0000 UTC m=+0.200047101 container start 6e24e5898f4d2cc880aa183a560b9a87a532849eec08cd87bbf36ff8eda787a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3) 2026-03-10T14:09:42.113 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local bash[109282]: 6e24e5898f4d2cc880aa183a560b9a87a532849eec08cd87bbf36ff8eda787a4 2026-03-10T14:09:42.113 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local systemd[1]: 
Started Ceph osd.0 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:09:42.113 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:41 vm03.local ceph-osd[109298]: -- 192.168.123.103:0/975503506 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 ==== 327+0+0 (secure 0 0 0) 0x562b55314680 con 0x562b55295c00 2026-03-10T14:09:42.594 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:42 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[109293]: 2026-03-10T14:09:42.290+0000 7fbac0642740 -1 Falling back to public interface 2026-03-10T14:09:42.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:42 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:42.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:42 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:42.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:42 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:43.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:42 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:43.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:42 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:43.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:42 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:09:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:43 vm04.local ceph-mon[92084]: pgmap v26: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 298 MiB data, 3.2 GiB used, 117 GiB / 120 GiB 
avail; 991 KiB/s rd, 1.6 MiB/s wr, 415 op/s; 6594/44970 objects degraded (14.663%) 2026-03-10T14:09:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:43 vm04.local ceph-mon[92084]: Health check failed: Degraded data redundancy: 6594/44970 objects degraded (14.663%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T14:09:44.095 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:43 vm03.local ceph-mon[103098]: pgmap v26: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 298 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 991 KiB/s rd, 1.6 MiB/s wr, 415 op/s; 6594/44970 objects degraded (14.663%) 2026-03-10T14:09:44.095 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:43 vm03.local ceph-mon[103098]: Health check failed: Degraded data redundancy: 6594/44970 objects degraded (14.663%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T14:09:46.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:45 vm03.local ceph-mon[103098]: pgmap v27: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 298 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 992 KiB/s rd, 1.6 MiB/s wr, 415 op/s; 6594/44970 objects degraded (14.663%) 2026-03-10T14:09:46.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:09:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:09:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:45 vm04.local ceph-mon[92084]: pgmap v27: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 298 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 992 KiB/s rd, 1.6 MiB/s wr, 415 op/s; 6594/44970 objects degraded (14.663%) 2026-03-10T14:09:46.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:46.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:09:47.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:47 vm04.local ceph-mon[92084]: pgmap v28: 65 pgs: 1 active+undersized, 33 
active+undersized+degraded, 31 active+clean; 298 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 316 KiB/s rd, 800 KiB/s wr, 222 op/s; 6594/44970 objects degraded (14.663%) 2026-03-10T14:09:47.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:47 vm03.local ceph-mon[103098]: pgmap v28: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 298 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 316 KiB/s rd, 800 KiB/s wr, 222 op/s; 6594/44970 objects degraded (14.663%) 2026-03-10T14:09:48.358 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[109293]: 2026-03-10T14:09:48.099+0000 7fbac0642740 -1 osd.0 0 read_superblock omap replica is missing. 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": 
"json"}]: dispatch 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T14:09:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:48 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline)
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T14:09:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:48 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline)
2026-03-10T14:09:50.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:49 vm03.local ceph-mon[103098]: pgmap v29: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 892 KiB/s rd, 1.3 MiB/s wr, 403 op/s; 5801/39570 objects degraded (14.660%)
2026-03-10T14:09:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:49 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 5801/39570 objects degraded (14.660%), 33 pgs degraded (PG_DEGRADED)
2026-03-10T14:09:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:49 vm03.local ceph-mon[103098]: from='osd.0 [v2:192.168.123.103:6802/2572490447,v1:192.168.123.103:6803/2572490447]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
2026-03-10T14:09:50.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:49 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[109293]: 2026-03-10T14:09:49.839+0000 7fbac0642740 -1 osd.0 43 log_to_monitors true
2026-03-10T14:09:50.108 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:09:49 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[109293]: 2026-03-10T14:09:49.951+0000 7fbab83dc640 -1 osd.0 43 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-10T14:09:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:49 vm04.local ceph-mon[92084]: pgmap v29: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 892 KiB/s rd, 1.3 MiB/s wr, 403 op/s; 5801/39570 objects degraded (14.660%)
2026-03-10T14:09:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:49 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 5801/39570 objects degraded (14.660%), 33 pgs degraded (PG_DEGRADED)
2026-03-10T14:09:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:49 vm04.local ceph-mon[92084]: from='osd.0 [v2:192.168.123.103:6802/2572490447,v1:192.168.123.103:6803/2572490447]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
2026-03-10T14:09:51.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:51 vm04.local ceph-mon[92084]: from='osd.0 [v2:192.168.123.103:6802/2572490447,v1:192.168.123.103:6803/2572490447]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
2026-03-10T14:09:51.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:51 vm04.local ceph-mon[92084]: osdmap e46: 6 total, 5 up, 6 in
2026-03-10T14:09:51.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:51 vm04.local ceph-mon[92084]: from='osd.0 [v2:192.168.123.103:6802/2572490447,v1:192.168.123.103:6803/2572490447]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch
2026-03-10T14:09:51.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:51 vm03.local ceph-mon[103098]: from='osd.0 [v2:192.168.123.103:6802/2572490447,v1:192.168.123.103:6803/2572490447]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
2026-03-10T14:09:51.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:51 vm03.local ceph-mon[103098]: osdmap e46: 6 total, 5 up, 6 in
2026-03-10T14:09:51.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:51 vm03.local ceph-mon[103098]: from='osd.0 [v2:192.168.123.103:6802/2572490447,v1:192.168.123.103:6803/2572490447]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch
2026-03-10T14:09:52.316 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:52 vm04.local ceph-mon[92084]: pgmap v31: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 851 KiB/s rd, 1.2 MiB/s wr, 385 op/s; 5801/39570 objects degraded (14.660%)
2026-03-10T14:09:52.316 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:52 vm04.local ceph-mon[92084]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-10T14:09:52.316 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:52 vm04.local ceph-mon[92084]: osd.0 [v2:192.168.123.103:6802/2572490447,v1:192.168.123.103:6803/2572490447] boot
2026-03-10T14:09:52.316 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:52 vm04.local ceph-mon[92084]: osdmap e47: 6 total, 6 up, 6 in
2026-03-10T14:09:52.316 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:52 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T14:09:52.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:52 vm03.local ceph-mon[103098]: pgmap v31: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 297 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 851 KiB/s rd, 1.2 MiB/s wr, 385 op/s; 5801/39570 objects degraded (14.660%)
2026-03-10T14:09:52.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:52 vm03.local ceph-mon[103098]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-10T14:09:52.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:52 vm03.local ceph-mon[103098]: osd.0 [v2:192.168.123.103:6802/2572490447,v1:192.168.123.103:6803/2572490447] boot
2026-03-10T14:09:52.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:52 vm03.local ceph-mon[103098]: osdmap e47: 6 total, 6 up, 6 in
2026-03-10T14:09:52.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:52 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T14:09:53.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:53 vm03.local ceph-mon[103098]: osdmap e48: 6 total, 6 up, 6 in
2026-03-10T14:09:53.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:53 vm03.local ceph-mon[103098]: pgmap v34: 65 pgs: 1 unknown, 24 peering, 9 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 537 op/s; 2428/37119 objects degraded (6.541%)
2026-03-10T14:09:53.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:53 vm04.local ceph-mon[92084]: osdmap e48: 6 total, 6 up, 6 in
2026-03-10T14:09:53.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:53 vm04.local ceph-mon[92084]: pgmap v34: 65 pgs: 1 unknown, 24 peering, 9 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 537 op/s; 2428/37119 objects degraded (6.541%)
2026-03-10T14:09:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:54 vm04.local ceph-mon[92084]: osdmap e49: 6 total, 6 up, 6 in
2026-03-10T14:09:54.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:54 vm03.local ceph-mon[103098]: osdmap e49: 6 total, 6 up, 6 in
2026-03-10T14:09:55.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:55 vm04.local ceph-mon[92084]: osdmap e50: 6 total, 6 up, 6 in
2026-03-10T14:09:55.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:55 vm04.local ceph-mon[92084]: pgmap v37: 65 pgs: 1 unknown, 24 peering, 9 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.7 MiB/s wr, 288 op/s; 2428/37119 objects degraded (6.541%)
2026-03-10T14:09:55.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:55 vm03.local ceph-mon[103098]: osdmap e50: 6 total, 6 up, 6 in
2026-03-10T14:09:55.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:55 vm03.local ceph-mon[103098]: pgmap v37: 65 pgs: 1 unknown, 24 peering, 9 active+undersized+degraded, 31 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.7 MiB/s wr, 288 op/s; 2428/37119 objects degraded (6.541%)
2026-03-10T14:09:56.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:56 vm04.local ceph-mon[92084]: osdmap e51: 6 total, 6 up, 6 in
2026-03-10T14:09:56.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:56 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 2428/37119 objects degraded (6.541%), 9 pgs degraded (PG_DEGRADED)
2026-03-10T14:09:56.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:56 vm03.local ceph-mon[103098]: osdmap e51: 6 total, 6 up, 6 in
2026-03-10T14:09:56.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:56 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 2428/37119 objects degraded (6.541%), 9 pgs degraded (PG_DEGRADED)
2026-03-10T14:09:57.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:57 vm04.local ceph-mon[92084]: pgmap v39: 65 pgs: 1 active+recovery_wait+degraded, 1 unknown, 24 peering, 1 active+recovering, 4 active+undersized+degraded, 34 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 42 KiB/s wr, 35 op/s; 1698/36768 objects degraded (4.618%); 40 KiB/s, 5 objects/s recovering
2026-03-10T14:09:57.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:57 vm03.local ceph-mon[103098]: pgmap v39: 65 pgs: 1 active+recovery_wait+degraded, 1 unknown, 24 peering, 1 active+recovering, 4 active+undersized+degraded, 34 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 42 KiB/s wr, 35 op/s; 1698/36768 objects degraded (4.618%); 40 KiB/s, 5 objects/s recovering
2026-03-10T14:10:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:09:59 vm04.local ceph-mon[92084]: pgmap v40: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering, 51 active+clean; 302 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 792 KiB/s rd, 1.0 MiB/s wr, 350 op/s; 1362/31653 objects degraded (4.303%); 2.5 MiB/s, 8 objects/s recovering
2026-03-10T14:10:00.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:09:59 vm03.local ceph-mon[103098]: pgmap v40: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering, 51 active+clean; 302 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 792 KiB/s rd, 1.0 MiB/s wr, 350 op/s; 1362/31653 objects degraded (4.303%); 2.5 MiB/s, 8 objects/s recovering
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: Health detail: HEALTH_WARN Degraded data redundancy: 1362/31653 objects degraded (4.303%), 12 pgs degraded
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: [WRN] PG_DEGRADED: Degraded data redundancy: 1362/31653 objects degraded (4.303%), 12 pgs degraded
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.3 is active+recovery_wait+undersized+degraded+remapped, acting [4,3]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.12 is active+recovery_wait+degraded, acting [0,3,1]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3]
2026-03-10T14:10:01.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:01 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1362/31653 objects degraded (4.303%), 12 pgs degraded (PG_DEGRADED)
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: Health detail: HEALTH_WARN Degraded data redundancy: 1362/31653 objects degraded (4.303%), 12 pgs degraded
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: [WRN] PG_DEGRADED: Degraded data redundancy: 1362/31653 objects degraded (4.303%), 12 pgs degraded
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.3 is active+recovery_wait+undersized+degraded+remapped, acting [4,3]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.12 is active+recovery_wait+degraded, acting [0,3,1]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3]
2026-03-10T14:10:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:01 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1362/31653 objects degraded (4.303%), 12 pgs degraded (PG_DEGRADED)
2026-03-10T14:10:02.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:02 vm04.local ceph-mon[92084]: pgmap v41: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 51 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 657 KiB/s rd, 849 KiB/s wr, 299 op/s; 1362/31365 objects degraded (4.342%); 2.0 MiB/s, 12 objects/s recovering
2026-03-10T14:10:02.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:10:02.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:10:02.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:02 vm03.local ceph-mon[103098]: pgmap v41: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 51 active+clean; 301 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 657 KiB/s rd, 849 KiB/s wr, 299 op/s; 1362/31365 objects degraded (4.342%); 2.0 MiB/s, 12 objects/s recovering
2026-03-10T14:10:02.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:10:02.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:10:03.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T14:10:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T14:10:04.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:04 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T14:10:04.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:04 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-10T14:10:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:04 vm03.local ceph-mon[103098]: pgmap v42: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+degraded, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 51 active+clean; 293 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.7 MiB/s wr, 502 op/s; 1362/26394 objects degraded (5.160%); 1.8 MiB/s, 15 objects/s recovering
2026-03-10T14:10:04.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:04 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T14:10:04.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:04 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-10T14:10:04.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:04 vm04.local ceph-mon[92084]: pgmap v42: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+degraded, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 51 active+clean; 293 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.7 MiB/s wr, 502 op/s; 1362/26394 objects degraded (5.160%); 1.8 MiB/s, 15 objects/s recovering
2026-03-10T14:10:05.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:05 vm04.local ceph-mon[92084]: pgmap v43: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+degraded, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 51 active+clean; 293 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.4 MiB/s wr, 422 op/s; 1362/26394 objects degraded (5.160%); 1.5 MiB/s, 12 objects/s recovering
2026-03-10T14:10:05.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:05 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1362/26394 objects degraded (5.160%), 12 pgs degraded (PG_DEGRADED)
2026-03-10T14:10:05.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:05 vm03.local ceph-mon[103098]: pgmap v43: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+degraded, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 51 active+clean; 293 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.4 MiB/s wr, 422 op/s; 1362/26394 objects degraded (5.160%); 1.5 MiB/s, 12 objects/s recovering
2026-03-10T14:10:05.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:05 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1362/26394 objects degraded (5.160%), 12 pgs degraded (PG_DEGRADED)
2026-03-10T14:10:08.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.625+0000 7f91ebfff700 1 -- 192.168.123.103:0/325817958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f91e4095550 msgr2=0x7f91e40959a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:08.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.625+0000 7f91ebfff700 1 --2- 192.168.123.103:0/325817958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f91e4095550 0x7f91e40959a0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f91ec0669f0 tx=0x7f91ec06a460 comp rx=0 tx=0).stop
2026-03-10T14:10:08.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.628+0000 7f91ebfff700 1 -- 192.168.123.103:0/325817958 shutdown_connections
2026-03-10T14:10:08.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.628+0000 7f91ebfff700 1 --2- 192.168.123.103:0/325817958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f91e4095550 0x7f91e40959a0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.628+0000 7f91ebfff700 1 --2- 192.168.123.103:0/325817958 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f91e4094350 0x7f91e4094760 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.628+0000 7f91ebfff700 1 -- 192.168.123.103:0/325817958 >> 192.168.123.103:0/325817958 conn(0x7f91e408f8e0 msgr2=0x7f91e4091d30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.630+0000 7f91ebfff700 1 -- 192.168.123.103:0/325817958 shutdown_connections
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.630+0000 7f91ebfff700 1 -- 192.168.123.103:0/325817958 wait complete.
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.630+0000 7f91ebfff700 1 Processor -- start
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.630+0000 7f91ebfff700 1 -- start start
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.630+0000 7f91ebfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f91e4094350 0x7f91e40a3b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.630+0000 7f91ebfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f91e4095550 0x7f91e40a40b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.630+0000 7f91ebfff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91e40a46d0 con 0x7f91e4094350
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.630+0000 7f91ebfff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91e40a4810 con 0x7f91e4095550
2026-03-10T14:10:08.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.631+0000 7f91ea7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f91e4095550 0x7f91e40a40b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:08.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.631+0000 7f91ea7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f91e4095550 0x7f91e40a40b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:47676/0 (socket says 192.168.123.103:47676)
2026-03-10T14:10:08.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.631+0000 7f91ea7fc700 1 -- 192.168.123.103:0/1187651434 learned_addr learned my addr 192.168.123.103:0/1187651434 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:10:08.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.631+0000 7f91eaffd700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f91e4094350 0x7f91e40a3b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:08.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.631+0000 7f91ea7fc700 1 -- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f91e4094350 msgr2=0x7f91e40a3b70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:08.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.631+0000 7f91ea7fc700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f91e4094350 0x7f91e40a3b70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.631+0000 7f91ea7fc700 1 -- 192.168.123.103:0/1187651434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91ec067050 con 0x7f91e4095550
2026-03-10T14:10:08.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.631+0000 7f91ea7fc700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f91e4095550 0x7f91e40a40b0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f91ec066080 tx=0x7f91ec07ca90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:10:08.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.632+0000 7f91d3fff700 1 -- 192.168.123.103:0/1187651434 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91ec069da0 con 0x7f91e4095550
2026-03-10T14:10:08.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.632+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f91e40a9540 con 0x7f91e4095550
2026-03-10T14:10:08.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.632+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f91e40a9a00 con 0x7f91e4095550
2026-03-10T14:10:08.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.632+0000 7f91d3fff700 1 -- 192.168.123.103:0/1187651434 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f91ec07c380 con 0x7f91e4095550
2026-03-10T14:10:08.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.632+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f91e4005810 con 0x7f91e4095550
2026-03-10T14:10:08.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.632+0000 7f91d3fff700 1 -- 192.168.123.103:0/1187651434 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91ec088600 con 0x7f91e4095550
2026-03-10T14:10:08.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.634+0000 7f91d3fff700 1 -- 192.168.123.103:0/1187651434 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f91ec085070 con 0x7f91e4095550
2026-03-10T14:10:08.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.634+0000 7f91d3fff700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f91d4077aa0 0x7f91d4079f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:08.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.634+0000 7f91d3fff700 1 -- 192.168.123.103:0/1187651434 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6484+0+0 (secure 0 0 0) 0x7f91ec078050 con 0x7f91e4095550
2026-03-10T14:10:08.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.634+0000 7f91eaffd700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f91d4077aa0 0x7f91d4079f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:08.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.635+0000 7f91eaffd700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f91d4077aa0 0x7f91d4079f50 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f91e40953b0 tx=0x7f91dc009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:10:08.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.638+0000 7f91d3fff700 1 -- 192.168.123.103:0/1187651434 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f91ec0ce3b0 con 0x7f91e4095550
2026-03-10T14:10:08.788 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.789+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f91e4099ea0 con 0x7f91d4077aa0
2026-03-10T14:10:08.789 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.790+0000 7f91d3fff700 1 -- 192.168.123.103:0/1187651434 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f91e4099ea0 con 0x7f91d4077aa0
2026-03-10T14:10:08.791 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f91d4077aa0 msgr2=0x7f91d4079f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f91d4077aa0 0x7f91d4079f50 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f91e40953b0 tx=0x7f91dc009450 comp rx=0 tx=0).stop
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f91e4095550 msgr2=0x7f91e40a40b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f91e4095550 0x7f91e40a40b0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f91ec066080 tx=0x7f91ec07ca90 comp rx=0 tx=0).stop
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 shutdown_connections
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f91d4077aa0 0x7f91d4079f50 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f91e4094350 0x7f91e40a3b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 --2- 192.168.123.103:0/1187651434 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f91e4095550 0x7f91e40a40b0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 >> 192.168.123.103:0/1187651434 conn(0x7f91e408f8e0 msgr2=0x7f91e4098780 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 shutdown_connections
2026-03-10T14:10:08.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.793+0000 7f91ebfff700 1 -- 192.168.123.103:0/1187651434 wait complete.
2026-03-10T14:10:08.801 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.865+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/2501648571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 msgr2=0x7ff6f8102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.865+0000 7ff6fdf5d700 1 --2- 192.168.123.103:0/2501648571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 0x7ff6f8102b70 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7ff6e8009b50 tx=0x7ff6e8009e60 comp rx=0 tx=0).stop
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.865+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/2501648571 shutdown_connections
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.865+0000 7ff6fdf5d700 1 --2- 192.168.123.103:0/2501648571 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8103960 0x7ff6f8103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.865+0000 7ff6fdf5d700 1 --2- 192.168.123.103:0/2501648571 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 0x7ff6f8102b70 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.865+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/2501648571 >> 192.168.123.103:0/2501648571 conn(0x7ff6f80fdcf0 msgr2=0x7ff6f8100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.865+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/2501648571 shutdown_connections
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.865+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/2501648571 wait complete.
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6fdf5d700 1 Processor -- start
2026-03-10T14:10:08.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6fdf5d700 1 -- start start
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6fdf5d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 0x7ff6f8198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6fdf5d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8103960 0x7ff6f8198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6fdf5d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6f8198b80 con 0x7ff6f8102760
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6fdf5d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6f8198cc0 con 0x7ff6f8103960
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6f6ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8103960 0x7ff6f8198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6f77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 0x7ff6f8198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6f77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 0x7ff6f8198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54934/0 (socket says 192.168.123.103:54934)
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.866+0000 7ff6f77fe700 1 -- 192.168.123.103:0/376066537 learned_addr learned my addr 192.168.123.103:0/376066537 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.867+0000 7ff6f6ffd700 1 -- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 msgr2=0x7ff6f8198020 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:08.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.867+0000 7ff6f6ffd700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 0x7ff6f8198020 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:08.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.867+0000 7ff6f6ffd700 1 -- 192.168.123.103:0/376066537 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6e80097e0 con 0x7ff6f8103960
2026-03-10T14:10:08.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.867+0000 7ff6f77fe700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 0x7ff6f8198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-10T14:10:08.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.867+0000 7ff6f6ffd700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8103960 0x7ff6f8198560 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7ff6ec00eab0 tx=0x7ff6ec00edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:10:08.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.868+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/376066537 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6ec00cb20 con 0x7ff6f8103960
2026-03-10T14:10:08.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.868+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6f819d770 con 0x7ff6f8103960
2026-03-10T14:10:08.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.868+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6f819dcc0 con 0x7ff6f8103960
2026-03-10T14:10:08.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.868+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/376066537 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff6ec00cc80 con 0x7ff6f8103960
2026-03-10T14:10:08.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.868+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/376066537 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6ec018860 con 0x7ff6f8103960
2026-03-10T14:10:08.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.869+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/376066537 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ff6ec0189c0 con 0x7ff6f8103960
2026-03-10T14:10:08.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.869+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff6e4005320 con 0x7ff6f8103960
2026-03-10T14:10:08.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.869+0000 7ff6f4ff9700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e0077a00 0x7ff6e0079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:08.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.870+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/376066537 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6484+0+0 (secure 0 0 0) 0x7ff6ec014070 con 0x7ff6f8103960
2026-03-10T14:10:08.869 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.870+0000 7ff6f77fe700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e0077a00 0x7ff6e0079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:08.869 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.870+0000 7ff6f77fe700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e0077a00 0x7ff6e0079eb0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7ff6e800b5c0 tx=0x7ff6e80058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:10:08.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:08.872+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/376066537 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff6ec062a50 con 0x7ff6f8103960
2026-03-10T14:10:09.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.028+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff6e4000bf0 con 0x7ff6e0077a00
2026-03-10T14:10:09.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.029+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/376066537 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ff6e4000bf0 con 0x7ff6e0077a00
2026-03-10T14:10:09.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e0077a00 msgr2=0x7ff6e0079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:09.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e0077a00 0x7ff6e0079eb0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7ff6e800b5c0 tx=0x7ff6e80058e0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8103960 msgr2=0x7ff6f8198560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:09.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8103960 0x7ff6f8198560 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7ff6ec00eab0 tx=0x7ff6ec00edc0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 shutdown_connections
2026-03-10T14:10:09.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e0077a00 0x7ff6e0079eb0 secure :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7ff6e800b5c0 tx=0x7ff6e80058e0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f8102760 0x7ff6f8198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 --2- 192.168.123.103:0/376066537 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8103960 0x7ff6f8198560 secure :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7ff6ec00eab0 tx=0x7ff6ec00edc0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 >> 192.168.123.103:0/376066537 conn(0x7ff6f80fdcf0 msgr2=0x7ff6f8106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:10:09.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 shutdown_connections
2026-03-10T14:10:09.031 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.032+0000 7ff6fdf5d700 1 -- 192.168.123.103:0/376066537 wait complete.
2026-03-10T14:10:09.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.116+0000 7fe2c26ca700 1 -- 192.168.123.103:0/1195243365 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc069180 msgr2=0x7fe2bc103140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.116+0000 7fe2c26ca700 1 --2- 192.168.123.103:0/1195243365 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc069180 0x7fe2bc103140 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fe2ac009b00 tx=0x7fe2ac009e10 comp rx=0 tx=0).stop
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.117+0000 7fe2c26ca700 1 -- 192.168.123.103:0/1195243365 shutdown_connections
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.117+0000 7fe2c26ca700 1 --2- 192.168.123.103:0/1195243365 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2bc103680 0x7fe2bc105ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.117+0000 7fe2c26ca700 1 --2- 192.168.123.103:0/1195243365 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc069180 0x7fe2bc103140 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.117+0000 7fe2c26ca700 1 -- 192.168.123.103:0/1195243365 >> 192.168.123.103:0/1195243365 conn(0x7fe2bc0faa70 msgr2=0x7fe2bc0fcee0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.117+0000 7fe2c26ca700 1 -- 192.168.123.103:0/1195243365 shutdown_connections
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.117+0000 7fe2c26ca700 1 -- 192.168.123.103:0/1195243365 wait complete.
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.117+0000 7fe2c26ca700 1 Processor -- start
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2c26ca700 1 -- start start
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2c26ca700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2bc069180 0x7fe2bc198030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2c26ca700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc103680 0x7fe2bc198570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2c26ca700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2bc198b00 con 0x7fe2bc103680
2026-03-10T14:10:09.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2c26ca700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe2bc198c40 con 0x7fe2bc069180
2026-03-10T14:10:09.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2bb7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc103680 0x7fe2bc198570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:09.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2bb7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc103680 0x7fe2bc198570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54956/0 (socket says 192.168.123.103:54956)
2026-03-10T14:10:09.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2bb7fe700 1 -- 192.168.123.103:0/861334344 learned_addr learned my addr 192.168.123.103:0/861334344 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:10:09.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.118+0000 7fe2bbfff700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2bc069180 0x7fe2bc198030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:09.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.119+0000 7fe2bbfff700 1 -- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc103680 msgr2=0x7fe2bc198570 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:09.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.119+0000 7fe2bbfff700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc103680 0x7fe2bc198570 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.119+0000 7fe2bbfff700 1 -- 192.168.123.103:0/861334344 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2ac0097e0 con 0x7fe2bc069180
2026-03-10T14:10:09.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.119+0000 7fe2bb7fe700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc103680 0x7fe2bc198570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-10T14:10:09.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.119+0000 7fe2bbfff700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2bc069180 0x7fe2bc198030 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fe2ac009ad0 tx=0x7fe2ac004c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:10:09.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:08 vm03.local ceph-mon[103098]: pgmap v44: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 292 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.0 MiB/s rd, 1.2 MiB/s wr, 369 op/s; 1266/26028 objects degraded (4.864%); 1.3 MiB/s, 12 objects/s recovering
2026-03-10T14:10:09.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.346+0000 7fe2b97fa700 1 -- 192.168.123.103:0/861334344 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2ac01d070 con 0x7fe2bc069180
2026-03-10T14:10:09.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.347+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe2bc19d6a0 con 0x7fe2bc069180
2026-03-10T14:10:09.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.347+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe2bc19db60 con 0x7fe2bc069180
2026-03-10T14:10:09.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.347+0000 7fe2b97fa700 1 -- 192.168.123.103:0/861334344 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe2ac00bc50 con 0x7fe2bc069180
2026-03-10T14:10:09.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.347+0000 7fe2b97fa700 1 -- 192.168.123.103:0/861334344 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2ac00f670 con 0x7fe2bc069180
2026-03-10T14:10:09.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.347+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe2bc0fc670 con 0x7fe2bc069180
2026-03-10T14:10:09.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.351+0000 7fe2b97fa700 1 -- 192.168.123.103:0/861334344 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fe2ac022470 con 0x7fe2bc069180
2026-03-10T14:10:09.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.351+0000 7fe2b97fa700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2a80779f0 0x7fe2a8079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:09.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.351+0000 7fe2b97fa700 1 -- 192.168.123.103:0/861334344 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6484+0+0 (secure 0 0 0) 0x7fe2ac09bcc0 con 0x7fe2bc069180
2026-03-10T14:10:09.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.353+0000 7fe2bb7fe700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2a80779f0 0x7fe2a8079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:09.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.353+0000 7fe2b97fa700 1 -- 192.168.123.103:0/861334344 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe2ac0641f0 con 0x7fe2bc069180
2026-03-10T14:10:09.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.353+0000 7fe2bb7fe700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2a80779f0 0x7fe2a8079ea0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fe2a400f4d0 tx=0x7fe2a4005f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:10:09.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.536+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fe2bc061190 con 0x7fe2a80779f0
2026-03-10T14:10:09.549 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (6m) 25s ago 6m 22.0M - 0.25.0 c8568f914cd2 0b9f79a1191a
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (4m) 25s ago 6m 8904k - 18.2.0 dc2bc1663786 7f20e4fc0ed9
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (6m) 37s ago 6m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (42s) 25s ago 6m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (39s) 37s ago 6m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (6m) 25s ago 6m 91.2M - 9.4.7 954c08fa6188 1fb5820b250e
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (4m) 25s ago 4m 158M - 18.2.0 dc2bc1663786 db33bf4450b8
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (4m) 25s ago 4m 93.7M - 18.2.0 dc2bc1663786 5d05b227aa40
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (4m) 37s ago 4m 91.2M - 18.2.0 dc2bc1663786 2494aff9d6c9
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (4m) 37s ago 4m 166M - 18.2.0 dc2bc1663786 e286b66e6f5a
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (109s) 25s ago 7m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (81s) 37s ago 6m 496M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (74s) 25s ago 7m 56.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (60s) 37s ago 6m 49.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (6m) 25s ago 6m 15.5M - 1.5.0 0da6a335fe13 ea3faf07c01f
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (6m) 37s ago 6m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (28s) 25s ago 5m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (5m) 25s ago 5m 368M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (5m) 25s ago 5m 288M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (5m) 37s ago 5m 434M 4096M 18.2.0 dc2bc1663786 99f4c3155942
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (5m) 37s ago 5m 397M 4096M 18.2.0 dc2bc1663786 127d95fabe23
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (5m) 37s ago 5m 352M 4096M 18.2.0 dc2bc1663786 63def67884f8
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (84s) 25s ago 6m 53.0M - 2.43.0 a07b618ecd1d 2e394cc74058
2026-03-10T14:10:09.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.549+0000 7fe2b97fa700 1 -- 192.168.123.103:0/861334344 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fe2bc061190 con 0x7fe2a80779f0
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2a80779f0 msgr2=0x7fe2a8079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2a80779f0 0x7fe2a8079ea0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fe2a400f4d0 tx=0x7fe2a4005f90 comp rx=0 tx=0).stop
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2bc069180 msgr2=0x7fe2bc198030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2bc069180 0x7fe2bc198030 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fe2ac009ad0 tx=0x7fe2ac004c30 comp rx=0 tx=0).stop
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 shutdown_connections
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2a80779f0 0x7fe2a8079ea0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe2bc069180 0x7fe2bc198030 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 --2- 192.168.123.103:0/861334344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe2bc103680 0x7fe2bc198570 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 >> 192.168.123.103:0/861334344 conn(0x7fe2bc0faa70 msgr2=0x7fe2bc104340 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.552+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 shutdown_connections
2026-03-10T14:10:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.553+0000 7fe2c26ca700 1 -- 192.168.123.103:0/861334344 wait complete.
2026-03-10T14:10:09.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.642+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/2521626782 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24072360 msgr2=0x7fbc240770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:09.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.642+0000 7fbc2bb6b700 1 --2- 192.168.123.103:0/2521626782 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24072360 0x7fbc240770e0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fbc1c00d3e0 tx=0x7fbc1c00d6f0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.642+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/2521626782 shutdown_connections
2026-03-10T14:10:09.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.642+0000 7fbc2bb6b700 1 --2- 192.168.123.103:0/2521626782 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24072360 0x7fbc240770e0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.642+0000 7fbc2bb6b700 1 --2- 192.168.123.103:0/2521626782 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbc24071980 0x7fbc24071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.642+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/2521626782 >> 192.168.123.103:0/2521626782 conn(0x7fbc2406d1a0 msgr2=0x7fbc2406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:10:09.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.643+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/2521626782 shutdown_connections
2026-03-10T14:10:09.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.643+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/2521626782 wait complete.
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.643+0000 7fbc2bb6b700 1 Processor -- start
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.643+0000 7fbc2bb6b700 1 -- start start
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.643+0000 7fbc2bb6b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbc24071980 0x7fbc24082440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.643+0000 7fbc2bb6b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24082980 0x7fbc24082df0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.643+0000 7fbc2bb6b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc24083df0 con 0x7fbc24071980
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.643+0000 7fbc2bb6b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc2412dd80 con 0x7fbc24082980
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.645+0000 7fbc29106700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24082980 0x7fbc24082df0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.645+0000 7fbc29106700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24082980 0x7fbc24082df0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:47736/0 (socket says 192.168.123.103:47736)
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.645+0000 7fbc29106700 1 -- 192.168.123.103:0/1197274045 learned_addr learned my addr 192.168.123.103:0/1197274045 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:10:09.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.645+0000 7fbc29907700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbc24071980 0x7fbc24082440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:10:09.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.645+0000 7fbc29106700 1 -- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbc24071980 msgr2=0x7fbc24082440 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:09.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.645+0000 7fbc29106700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbc24071980 0x7fbc24082440 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:10:09.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.645+0000 7fbc29106700 1 -- 192.168.123.103:0/1197274045 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbc1c00d090 con 0x7fbc24082980
2026-03-10T14:10:09.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.645+0000 7fbc29106700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24082980 0x7fbc24082df0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fbc1c003fa0 tx=0x7fbc1c009b40 comp rx=0 tx=0).ready entity=mon.1
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:09.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.647+0000 7fbc1affd700 1 -- 192.168.123.103:0/1197274045 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc1c010040 con 0x7fbc24082980 2026-03-10T14:10:09.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.647+0000 7fbc1affd700 1 -- 192.168.123.103:0/1197274045 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbc1c004250 con 0x7fbc24082980 2026-03-10T14:10:09.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.647+0000 7fbc1affd700 1 -- 192.168.123.103:0/1197274045 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc1c01d850 con 0x7fbc24082980 2026-03-10T14:10:09.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.647+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/1197274045 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbc2412dfd0 con 0x7fbc24082980 2026-03-10T14:10:09.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.647+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/1197274045 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbc2412e4c0 con 0x7fbc24082980 2026-03-10T14:10:09.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.648+0000 7fbc1affd700 1 -- 192.168.123.103:0/1197274045 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fbc1c0043c0 con 0x7fbc24082980 2026-03-10T14:10:09.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.648+0000 7fbc1affd700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbc10077a40 0x7fbc10079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T14:10:09.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.649+0000 7fbc1affd700 1 -- 192.168.123.103:0/1197274045 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6484+0+0 (secure 0 0 0) 0x7fbc1c09b3a0 con 0x7fbc24082980 2026-03-10T14:10:09.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.650+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/1197274045 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbc2404f2a0 con 0x7fbc24082980 2026-03-10T14:10:09.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.650+0000 7fbc29907700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbc10077a40 0x7fbc10079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:09.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.650+0000 7fbc29907700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbc10077a40 0x7fbc10079ef0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fbc2000aa40 tx=0x7fbc2000c040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:09.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.653+0000 7fbc1affd700 1 -- 192.168.123.103:0/1197274045 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbc1c0638d0 con 0x7fbc24082980 2026-03-10T14:10:09.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:09 vm04.local ceph-mon[92084]: pgmap v44: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 292 MiB data, 2.9 GiB used, 
117 GiB / 120 GiB avail; 1.0 MiB/s rd, 1.2 MiB/s wr, 369 op/s; 1266/26028 objects degraded (4.864%); 1.3 MiB/s, 12 objects/s recovering 2026-03-10T14:10:09.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.884+0000 7fbc2bb6b700 1 -- 192.168.123.103:0/1197274045 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fbc2412e7a0 con 0x7fbc24082980 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.886+0000 7fbc1affd700 1 -- 192.168.123.103:0/1197274045 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fbc1c063020 con 0x7fbc24082980 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-10T14:10:09.884 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T14:10:09.885 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:10:09.885 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:10:09.885 
INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:10:09.885 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:10:09.885 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:10:09.885 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9, 2026-03-10T14:10:09.885 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T14:10:09.885 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:10:09.885 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:10:09.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 -- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbc10077a40 msgr2=0x7fbc10079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:09.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbc10077a40 0x7fbc10079ef0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fbc2000aa40 tx=0x7fbc2000c040 comp rx=0 tx=0).stop 2026-03-10T14:10:09.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 -- 192.168.123.103:0/1197274045 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24082980 msgr2=0x7fbc24082df0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:09.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24082980 0x7fbc24082df0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fbc1c003fa0 
tx=0x7fbc1c009b40 comp rx=0 tx=0).stop 2026-03-10T14:10:09.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 -- 192.168.123.103:0/1197274045 shutdown_connections 2026-03-10T14:10:09.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbc10077a40 0x7fbc10079ef0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:09.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbc24071980 0x7fbc24082440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:09.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 --2- 192.168.123.103:0/1197274045 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbc24082980 0x7fbc24082df0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:09.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.889+0000 7fbc18ff9700 1 -- 192.168.123.103:0/1197274045 >> 192.168.123.103:0/1197274045 conn(0x7fbc2406d1a0 msgr2=0x7fbc24076470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:09.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.891+0000 7fbc18ff9700 1 -- 192.168.123.103:0/1197274045 shutdown_connections 2026-03-10T14:10:09.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.891+0000 7fbc18ff9700 1 -- 192.168.123.103:0/1197274045 wait complete. 
2026-03-10T14:10:09.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.996+0000 7fe09f23c700 1 -- 192.168.123.103:0/976808057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 msgr2=0x7fe098103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:09.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:09.996+0000 7fe09f23c700 1 --2- 192.168.123.103:0/976808057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 0x7fe098103db0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fe084009b00 tx=0x7fe084009e10 comp rx=0 tx=0).stop 2026-03-10T14:10:10.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.003+0000 7fe09f23c700 1 -- 192.168.123.103:0/976808057 shutdown_connections 2026-03-10T14:10:10.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.003+0000 7fe09f23c700 1 --2- 192.168.123.103:0/976808057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 0x7fe098103db0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.003+0000 7fe09f23c700 1 --2- 192.168.123.103:0/976808057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe098102760 0x7fe098102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.003+0000 7fe09f23c700 1 -- 192.168.123.103:0/976808057 >> 192.168.123.103:0/976808057 conn(0x7fe0980fdcf0 msgr2=0x7fe098100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:10.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.003+0000 7fe09f23c700 1 -- 192.168.123.103:0/976808057 shutdown_connections 2026-03-10T14:10:10.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.003+0000 7fe09f23c700 1 -- 192.168.123.103:0/976808057 wait 
complete. 2026-03-10T14:10:10.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.004+0000 7fe09f23c700 1 Processor -- start 2026-03-10T14:10:10.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.004+0000 7fe09f23c700 1 -- start start 2026-03-10T14:10:10.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe09f23c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe098102760 0x7fe098198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe09f23c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 0x7fe098198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe09cfd8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe098102760 0x7fe098198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:10.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe09cfd8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe098102760 0x7fe098198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:47758/0 (socket says 192.168.123.103:47758) 2026-03-10T14:10:10.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe097fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 0x7fe098198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:10.004 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe097fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 0x7fe098198560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54982/0 (socket says 192.168.123.103:54982) 2026-03-10T14:10:10.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe09cfd8700 1 -- 192.168.123.103:0/2359134163 learned_addr learned my addr 192.168.123.103:0/2359134163 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:10.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe09f23c700 1 -- 192.168.123.103:0/2359134163 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe098198b80 con 0x7fe098103960 2026-03-10T14:10:10.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.005+0000 7fe09f23c700 1 -- 192.168.123.103:0/2359134163 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe098198cc0 con 0x7fe098102760 2026-03-10T14:10:10.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.010+0000 7fe097fff700 1 -- 192.168.123.103:0/2359134163 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe098102760 msgr2=0x7fe098198020 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.010+0000 7fe097fff700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe098102760 0x7fe098198020 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.010+0000 7fe097fff700 1 -- 192.168.123.103:0/2359134163 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe0840097e0 con 0x7fe098103960 2026-03-10T14:10:10.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.011+0000 7fe097fff700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 0x7fe098198560 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe084000c00 tx=0x7fe084004910 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:10.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.011+0000 7fe095ffb700 1 -- 192.168.123.103:0/2359134163 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe08401d070 con 0x7fe098103960 2026-03-10T14:10:10.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.011+0000 7fe09f23c700 1 -- 192.168.123.103:0/2359134163 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe09819d710 con 0x7fe098103960 2026-03-10T14:10:10.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.011+0000 7fe09f23c700 1 -- 192.168.123.103:0/2359134163 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe09819dc00 con 0x7fe098103960 2026-03-10T14:10:10.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.012+0000 7fe09f23c700 1 -- 192.168.123.103:0/2359134163 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe098066e40 con 0x7fe098103960 2026-03-10T14:10:10.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.012+0000 7fe095ffb700 1 -- 192.168.123.103:0/2359134163 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe08400f460 con 0x7fe098103960 2026-03-10T14:10:10.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.013+0000 7fe095ffb700 1 -- 192.168.123.103:0/2359134163 
<== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe084017610 con 0x7fe098103960 2026-03-10T14:10:10.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.014+0000 7fe095ffb700 1 -- 192.168.123.103:0/2359134163 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fe08400fab0 con 0x7fe098103960 2026-03-10T14:10:10.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.015+0000 7fe095ffb700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe088077a00 0x7fe088079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.015+0000 7fe095ffb700 1 -- 192.168.123.103:0/2359134163 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6484+0+0 (secure 0 0 0) 0x7fe08409b590 con 0x7fe098103960 2026-03-10T14:10:10.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.018+0000 7fe095ffb700 1 -- 192.168.123.103:0/2359134163 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe084063bb0 con 0x7fe098103960 2026-03-10T14:10:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.020+0000 7fe09cfd8700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe088077a00 0x7fe088079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:10.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.021+0000 7fe09cfd8700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe088077a00 0x7fe088079eb0 secure :-1 s=READY pgs=39 cs=0 l=1 
rev1=1 crypto rx=0x7fe08c005950 tx=0x7fe08c009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:10.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.211+0000 7fe09f23c700 1 -- 192.168.123.103:0/2359134163 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fe09819dea0 con 0x7fe098103960 2026-03-10T14:10:10.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.213+0000 7fe095ffb700 1 -- 192.168.123.103:0/2359134163 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1973 (secure 0 0 0) 0x7fe084063300 con 0x7fe098103960 2026-03-10T14:10:10.211 INFO:teuthology.orchestra.run.vm03.stdout:e14 2026-03-10T14:10:10.211 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T14:10:10.211 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:10:10.211 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:10:10.211 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:10:10.211 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:10:10.211 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 
2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:10:10.212 
INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:10:10.212 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:10:10.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.217+0000 7fe0837fe700 1 -- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe088077a00 msgr2=0x7fe088079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.217+0000 7fe0837fe700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] 
conn(0x7fe088077a00 0x7fe088079eb0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fe08c005950 tx=0x7fe08c009450 comp rx=0 tx=0).stop 2026-03-10T14:10:10.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.217+0000 7fe0837fe700 1 -- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 msgr2=0x7fe098198560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.217+0000 7fe0837fe700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 0x7fe098198560 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe084000c00 tx=0x7fe084004910 comp rx=0 tx=0).stop 2026-03-10T14:10:10.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.218+0000 7fe0837fe700 1 -- 192.168.123.103:0/2359134163 shutdown_connections 2026-03-10T14:10:10.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.218+0000 7fe0837fe700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe088077a00 0x7fe088079eb0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.218+0000 7fe0837fe700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe098102760 0x7fe098198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.218+0000 7fe0837fe700 1 --2- 192.168.123.103:0/2359134163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe098103960 0x7fe098198560 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.217 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.218+0000 7fe0837fe700 1 -- 192.168.123.103:0/2359134163 >> 192.168.123.103:0/2359134163 conn(0x7fe0980fdcf0 msgr2=0x7fe098106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:10.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.219+0000 7fe0837fe700 1 -- 192.168.123.103:0/2359134163 shutdown_connections 2026-03-10T14:10:10.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.219+0000 7fe0837fe700 1 -- 192.168.123.103:0/2359134163 wait complete. 2026-03-10T14:10:10.223 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:10:10.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:10 vm04.local ceph-mon[92084]: pgmap v45: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 539 op/s; 1266/20394 objects degraded (6.208%); 1.2 MiB/s, 15 objects/s recovering 2026-03-10T14:10:10.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:10 vm04.local ceph-mon[92084]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:10.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:10 vm04.local ceph-mon[92084]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:10.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:10 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/1197274045' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:10.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.327+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/3442936683 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c072360 msgr2=0x7fb89c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.327+0000 7fb8a2f2c700 1 --2- 192.168.123.103:0/3442936683 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c072360 0x7fb89c0770e0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fb894008280 tx=0x7fb894008590 comp rx=0 tx=0).stop 2026-03-10T14:10:10.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.327+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/3442936683 shutdown_connections 2026-03-10T14:10:10.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.327+0000 7fb8a2f2c700 1 --2- 192.168.123.103:0/3442936683 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c072360 0x7fb89c0770e0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.327+0000 7fb8a2f2c700 1 --2- 192.168.123.103:0/3442936683 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb89c071980 0x7fb89c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.327+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/3442936683 >> 192.168.123.103:0/3442936683 conn(0x7fb89c06d1a0 msgr2=0x7fb89c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/3442936683 shutdown_connections 2026-03-10T14:10:10.327 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/3442936683 wait complete. 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a2f2c700 1 Processor -- start 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a2f2c700 1 -- start start 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a2f2c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c071980 0x7fb89c082550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a2f2c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb89c082a90 0x7fb89c082f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a2f2c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb89c12dd80 con 0x7fb89c071980 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a2f2c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb89c12def0 con 0x7fb89c082a90 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a0cc8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c071980 0x7fb89c082550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a0cc8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c071980 0x7fb89c082550 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55004/0 (socket says 192.168.123.103:55004) 2026-03-10T14:10:10.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.328+0000 7fb8a0cc8700 1 -- 192.168.123.103:0/2933627433 learned_addr learned my addr 192.168.123.103:0/2933627433 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:10.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:10 vm03.local ceph-mon[103098]: pgmap v45: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 539 op/s; 1266/20394 objects degraded (6.208%); 1.2 MiB/s, 15 objects/s recovering 2026-03-10T14:10:10.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:10 vm03.local ceph-mon[103098]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:10.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:10 vm03.local ceph-mon[103098]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:10.327 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:10 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1197274045' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:10.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.329+0000 7fb8a0cc8700 1 -- 192.168.123.103:0/2933627433 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb89c082a90 msgr2=0x7fb89c082f00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.329+0000 7fb8a0cc8700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb89c082a90 0x7fb89c082f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.329+0000 7fb8a0cc8700 1 -- 192.168.123.103:0/2933627433 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb894007ed0 con 0x7fb89c071980 2026-03-10T14:10:10.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.329+0000 7fb8a0cc8700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c071980 0x7fb89c082550 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fb88c00d8d0 tx=0x7fb88c00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:10.332 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.334+0000 7fb899ffb700 1 -- 192.168.123.103:0/2933627433 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb88c009940 con 0x7fb89c071980 2026-03-10T14:10:10.333 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.334+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/2933627433 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb89c12e170 con 0x7fb89c071980 2026-03-10T14:10:10.333 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.334+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/2933627433 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb89c12e6c0 con 0x7fb89c071980 2026-03-10T14:10:10.333 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.335+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/2933627433 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb89c04ea50 con 0x7fb89c071980 2026-03-10T14:10:10.333 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.335+0000 7fb899ffb700 1 -- 192.168.123.103:0/2933627433 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb88c025460 con 0x7fb89c071980 2026-03-10T14:10:10.334 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.335+0000 7fb899ffb700 1 -- 192.168.123.103:0/2933627433 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb88c009c50 con 0x7fb89c071980 2026-03-10T14:10:10.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.337+0000 7fb899ffb700 1 -- 192.168.123.103:0/2933627433 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb88c0255d0 con 0x7fb89c071980 2026-03-10T14:10:10.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.337+0000 7fb899ffb700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb884077a40 0x7fb884079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.337+0000 7fb899ffb700 1 -- 192.168.123.103:0/2933627433 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(51..51 src has 1..51) v4 ==== 6484+0+0 (secure 0 0 0) 0x7fb88c09a230 con 0x7fb89c071980 2026-03-10T14:10:10.337 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.339+0000 7fb89bfff700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb884077a40 0x7fb884079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:10.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.339+0000 7fb89bfff700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb884077a40 0x7fb884079ef0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fb894008990 tx=0x7fb894006040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:10.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.341+0000 7fb899ffb700 1 -- 192.168.123.103:0/2933627433 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb88c0626e0 con 0x7fb89c071980 2026-03-10T14:10:10.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.516+0000 7fb8a2f2c700 1 -- 192.168.123.103:0/2933627433 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb89c07c760 con 0x7fb884077a40 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.517+0000 7fb899ffb700 1 -- 192.168.123.103:0/2933627433 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fb89c07c760 con 0x7fb884077a40 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 
2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:10:10.516 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:10:10.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 -- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb884077a40 msgr2=0x7fb884079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb884077a40 0x7fb884079ef0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fb894008990 tx=0x7fb894006040 comp rx=0 tx=0).stop 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 -- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c071980 msgr2=0x7fb89c082550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.521 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c071980 0x7fb89c082550 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fb88c00d8d0 tx=0x7fb88c00dc90 comp rx=0 tx=0).stop 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 -- 192.168.123.103:0/2933627433 shutdown_connections 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb884077a40 0x7fb884079ef0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb89c071980 0x7fb89c082550 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 --2- 192.168.123.103:0/2933627433 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb89c082a90 0x7fb89c082f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.522+0000 7fb8837fe700 1 -- 192.168.123.103:0/2933627433 >> 192.168.123.103:0/2933627433 conn(0x7fb89c06d1a0 msgr2=0x7fb89c076520 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.523+0000 7fb8837fe700 1 -- 192.168.123.103:0/2933627433 shutdown_connections 2026-03-10T14:10:10.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.523+0000 7fb8837fe700 1 -- 
192.168.123.103:0/2933627433 wait complete. 2026-03-10T14:10:10.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.602+0000 7f9856705700 1 -- 192.168.123.103:0/4245422674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850072360 msgr2=0x7f98500770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.602+0000 7f9856705700 1 --2- 192.168.123.103:0/4245422674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850072360 0x7f98500770e0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f984800d3f0 tx=0x7f984800d700 comp rx=0 tx=0).stop 2026-03-10T14:10:10.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 -- 192.168.123.103:0/4245422674 shutdown_connections 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 --2- 192.168.123.103:0/4245422674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850072360 0x7f98500770e0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 --2- 192.168.123.103:0/4245422674 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9850071980 0x7f9850071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 -- 192.168.123.103:0/4245422674 >> 192.168.123.103:0/4245422674 conn(0x7f985006d1a0 msgr2=0x7f985006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 -- 192.168.123.103:0/4245422674 shutdown_connections 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 
7f9856705700 1 -- 192.168.123.103:0/4245422674 wait complete. 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 Processor -- start 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 -- start start 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9850071980 0x7f98501313c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850131900 0x7f985007f590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9850131e00 con 0x7f9850131900 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f9856705700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9850131f70 con 0x7f9850071980 2026-03-10T14:10:10.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.605+0000 7f984f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850131900 0x7f985007f590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:10.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.606+0000 7f984f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850131900 0x7f985007f590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55020/0 (socket says 192.168.123.103:55020) 2026-03-10T14:10:10.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.606+0000 7f984f7fe700 1 -- 192.168.123.103:0/3613046177 learned_addr learned my addr 192.168.123.103:0/3613046177 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:10.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.606+0000 7f984ffff700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9850071980 0x7f98501313c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:10.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.606+0000 7f984f7fe700 1 -- 192.168.123.103:0/3613046177 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9850071980 msgr2=0x7f98501313c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.606+0000 7f984f7fe700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9850071980 0x7f98501313c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.606+0000 7f984f7fe700 1 -- 192.168.123.103:0/3613046177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9848007ed0 con 0x7f9850131900 2026-03-10T14:10:10.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.606+0000 7f984f7fe700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850131900 0x7f985007f590 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9848003c30 tx=0x7f9848004b40 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:10.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.607+0000 7f984d7fa700 1 -- 192.168.123.103:0/3613046177 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f984801c070 con 0x7f9850131900 2026-03-10T14:10:10.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.607+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f985007fad0 con 0x7f9850131900 2026-03-10T14:10:10.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.607+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f985007ff90 con 0x7f9850131900 2026-03-10T14:10:10.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.608+0000 7f984d7fa700 1 -- 192.168.123.103:0/3613046177 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f984800deb0 con 0x7f9850131900 2026-03-10T14:10:10.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.608+0000 7f984d7fa700 1 -- 192.168.123.103:0/3613046177 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9848017bc0 con 0x7f9850131900 2026-03-10T14:10:10.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.608+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f983c005320 con 0x7f9850131900 2026-03-10T14:10:10.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.611+0000 7f984d7fa700 1 -- 192.168.123.103:0/3613046177 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f9848017420 con 0x7f9850131900 2026-03-10T14:10:10.610 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.611+0000 7f984d7fa700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9838077b00 0x7f9838079fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:10.610 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.611+0000 7f984d7fa700 1 -- 192.168.123.103:0/3613046177 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(52..52 src has 1..52) v4 ==== 6455+0+0 (secure 0 0 0) 0x7f9848013070 con 0x7f9850131900 2026-03-10T14:10:10.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.613+0000 7f984ffff700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9838077b00 0x7f9838079fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:10.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.614+0000 7f984ffff700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9838077b00 0x7f9838079fb0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9840005950 tx=0x7f98400058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:10.614 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.615+0000 7f984d7fa700 1 -- 192.168.123.103:0/3613046177 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f98480642b0 con 0x7f9850131900 2026-03-10T14:10:10.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.838+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 
0x7f983c005190 con 0x7f9850131900 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.838+0000 7f984d7fa700 1 -- 192.168.123.103:0/3613046177 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1054 (secure 0 0 0) 0x7f9848025430 con 0x7f9850131900 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 1266/20394 objects degraded (6.208%), 11 pgs degraded 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1266/20394 objects degraded (6.208%), 11 pgs degraded 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.3 is active+recovery_wait+undersized+degraded+remapped, acting [4,3] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T14:10:10.837 
INFO:teuthology.orchestra.run.vm03.stdout: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-10T14:10:10.837 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-10T14:10:10.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9838077b00 msgr2=0x7f9838079fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9838077b00 0x7f9838079fb0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9840005950 tx=0x7f98400058e0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850131900 msgr2=0x7f985007f590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850131900 0x7f985007f590 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9848003c30 tx=0x7f9848004b40 comp rx=0 tx=0).stop 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 shutdown_connections 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9838077b00 0x7f9838079fb0 unknown :-1 s=CLOSED 
pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9850071980 0x7f98501313c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 --2- 192.168.123.103:0/3613046177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9850131900 0x7f985007f590 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 >> 192.168.123.103:0/3613046177 conn(0x7f985006d1a0 msgr2=0x7f98500764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 shutdown_connections 2026-03-10T14:10:10.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:10.841+0000 7f9856705700 1 -- 192.168.123.103:0/3613046177 wait complete. 2026-03-10T14:10:11.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:11 vm04.local ceph-mon[92084]: from='client.44135 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:11.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:11 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2359134163' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:10:11.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:11 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1266/20394 objects degraded (6.208%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:11.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:11 vm04.local ceph-mon[92084]: osdmap e52: 6 total, 6 up, 6 in 2026-03-10T14:10:11.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:11 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/3613046177' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:10:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:11 vm03.local ceph-mon[103098]: from='client.44135 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:11 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2359134163' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:10:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:11 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1266/20394 objects degraded (6.208%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:11 vm03.local ceph-mon[103098]: osdmap e52: 6 total, 6 up, 6 in 2026-03-10T14:10:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:11 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/3613046177' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:10:12.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:12 vm04.local ceph-mon[92084]: from='client.34208 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:12.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:12 vm04.local ceph-mon[92084]: pgmap v47: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.6 MiB/s wr, 452 op/s; 1266/20076 objects degraded (6.306%); 0 B/s, 15 objects/s recovering 2026-03-10T14:10:12.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:12 vm04.local ceph-mon[92084]: osdmap e53: 6 total, 6 up, 6 in 2026-03-10T14:10:12.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:12 vm03.local ceph-mon[103098]: from='client.34208 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:12.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:12 vm03.local ceph-mon[103098]: pgmap v47: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.6 MiB/s wr, 452 op/s; 1266/20076 objects degraded (6.306%); 0 B/s, 15 objects/s recovering 2026-03-10T14:10:12.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:12 vm03.local ceph-mon[103098]: osdmap e53: 6 total, 6 up, 6 in 2026-03-10T14:10:13.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:13 vm03.local ceph-mon[103098]: pgmap v49: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 288 MiB data, 3.0 GiB used, 117 
GiB / 120 GiB avail; 1.7 MiB/s rd, 1.6 MiB/s wr, 500 op/s; 1266/16881 objects degraded (7.500%); 0 B/s, 14 objects/s recovering 2026-03-10T14:10:13.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:13 vm04.local ceph-mon[92084]: pgmap v49: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.6 MiB/s wr, 500 op/s; 1266/16881 objects degraded (7.500%); 0 B/s, 14 objects/s recovering 2026-03-10T14:10:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:15 vm04.local ceph-mon[92084]: pgmap v50: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.5 MiB/s wr, 483 op/s; 1266/16881 objects degraded (7.500%); 0 B/s, 9 objects/s recovering 2026-03-10T14:10:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:15 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1266/16881 objects degraded (7.500%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:10:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:15 vm03.local ceph-mon[103098]: pgmap v50: 65 pgs: 1 active+recovering+undersized+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 52 active+clean; 288 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 MiB/s rd, 1.5 MiB/s wr, 483 op/s; 1266/16881 objects degraded (7.500%); 0 B/s, 9 objects/s recovering 2026-03-10T14:10:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:15 vm03.local ceph-mon[103098]: Health check update: 
Degraded data redundancy: 1266/16881 objects degraded (7.500%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:10:18.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:17 vm04.local ceph-mon[92084]: pgmap v51: 65 pgs: 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 53 active+clean; 287 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 461 KiB/s rd, 611 KiB/s wr, 216 op/s; 1266/16560 objects degraded (7.645%); 0 B/s, 12 objects/s recovering 2026-03-10T14:10:18.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:17 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:18.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:17 vm03.local ceph-mon[103098]: pgmap v51: 65 pgs: 11 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 53 active+clean; 287 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 461 KiB/s rd, 611 KiB/s wr, 216 op/s; 1266/16560 objects degraded (7.645%); 0 B/s, 12 objects/s recovering 2026-03-10T14:10:18.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:17 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:18.438 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1... 
2026-03-10T14:10:18.438 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1 2026-03-10T14:10:18.838 DEBUG:teuthology.parallel:result is None 2026-03-10T14:10:19.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:19 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:19 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T14:10:19.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:19 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:19.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:19 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T14:10:20.698 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:20 vm03.local ceph-mon[103098]: pgmap v52: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 53 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.5 MiB/s wr, 426 op/s; 1139/12747 objects degraded (8.935%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:20.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:20 vm04.local ceph-mon[92084]: pgmap v52: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 53 active+clean; 290 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.4 MiB/s rd, 1.5 MiB/s wr, 426 op/s; 1139/12747 objects degraded (8.935%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:21.715 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:21 vm04.local ceph-mon[92084]: Health check update: 
Degraded data redundancy: 1139/12747 objects degraded (8.935%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:21.715 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:21 vm04.local ceph-mon[92084]: pgmap v53: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 289 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.3 MiB/s wr, 369 op/s; 1139/12309 objects degraded (9.253%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:21.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:21 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1139/12747 objects degraded (8.935%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:21.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:21 vm03.local ceph-mon[103098]: pgmap v53: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 289 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.3 MiB/s wr, 369 op/s; 1139/12309 objects degraded (9.253%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:24.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:23 vm04.local ceph-mon[92084]: pgmap v54: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.6 MiB/s wr, 489 op/s; 1139/7965 objects degraded (14.300%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:24.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:23 vm03.local ceph-mon[103098]: pgmap v54: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.6 MiB/s wr, 489 op/s; 1139/7965 objects degraded (14.300%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:26.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
14:10:25 vm04.local ceph-mon[92084]: pgmap v55: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.1 MiB/s wr, 334 op/s; 1139/7965 objects degraded (14.300%); 0 B/s, 10 objects/s recovering 2026-03-10T14:10:26.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:25 vm03.local ceph-mon[103098]: pgmap v55: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.1 MiB/s wr, 334 op/s; 1139/7965 objects degraded (14.300%); 0 B/s, 10 objects/s recovering 2026-03-10T14:10:27.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:26 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1139/7965 objects degraded (14.300%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:27.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:26 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1139/7965 objects degraded (14.300%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:28.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:27 vm04.local ceph-mon[92084]: pgmap v56: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.1 MiB/s wr, 345 op/s; 1139/7617 objects degraded (14.953%); 0 B/s, 10 objects/s recovering 2026-03-10T14:10:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:27 vm03.local ceph-mon[103098]: pgmap v56: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.1 MiB/s rd, 1.1 MiB/s wr, 345 op/s; 1139/7617 objects degraded (14.953%); 0 B/s, 10 objects/s recovering 
2026-03-10T14:10:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:29 vm04.local ceph-mon[92084]: pgmap v57: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.5 MiB/s wr, 442 op/s; 1139/4938 objects degraded (23.066%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:29 vm03.local ceph-mon[103098]: pgmap v57: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.5 MiB/s rd, 1.5 MiB/s wr, 442 op/s; 1139/4938 objects degraded (23.066%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:31.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:30 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1139/4938 objects degraded (23.066%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:31.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:10:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:30 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1139/4938 objects degraded (23.066%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:10:31.869 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0... 
2026-03-10T14:10:31.869 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-10T14:10:32.105 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:32 vm03.local ceph-mon[103098]: pgmap v58: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 833 KiB/s rd, 875 KiB/s wr, 295 op/s; 1139/4572 objects degraded (24.913%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:32.105 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:32 vm03.local ceph-mon[103098]: osdmap e54: 6 total, 6 up, 6 in 2026-03-10T14:10:32.366 DEBUG:teuthology.parallel:result is None 2026-03-10T14:10:32.366 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-10T14:10:32.411 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-10T14:10:32.411 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1 2026-03-10T14:10:32.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:32 vm04.local ceph-mon[92084]: pgmap v58: 65 pgs: 1 active+recovering+undersized+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 833 KiB/s rd, 875 KiB/s wr, 295 op/s; 1139/4572 objects degraded (24.913%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:32.433 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:32 vm04.local ceph-mon[92084]: osdmap e54: 6 total, 6 up, 6 in 2026-03-10T14:10:32.443 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1 2026-03-10T14:10:32.443 DEBUG:teuthology.parallel:result is None 2026-03-10T14:10:33.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:33 vm03.local ceph-mon[103098]: osdmap e55: 6 total, 6 up, 6 in 2026-03-10T14:10:33.502 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:33.502 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:33 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:33.502 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:33 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T14:10:33.502 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:33 vm03.local ceph-mon[103098]: pgmap v61: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 348 op/s; 1027/2436 objects degraded (42.159%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:33 vm04.local ceph-mon[92084]: osdmap e55: 6 total, 6 up, 6 in 2026-03-10T14:10:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:33 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:33 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T14:10:33.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:33 vm04.local ceph-mon[92084]: pgmap v61: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 348 op/s; 1027/2436 objects degraded (42.159%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:34 vm04.local ceph-mon[92084]: mgrmap e38: vm03.rwbbep(active, since 92s), standbys: vm04.ywwcto 2026-03-10T14:10:34.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:34 vm03.local ceph-mon[103098]: mgrmap e38: vm03.rwbbep(active, since 92s), standbys: vm04.ywwcto 2026-03-10T14:10:35.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:35 vm04.local ceph-mon[92084]: pgmap v62: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 331 op/s; 1027/2436 objects degraded (42.159%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:35.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:35 vm03.local ceph-mon[103098]: pgmap v62: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 331 op/s; 1027/2436 objects degraded (42.159%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:36.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:36 vm04.local ceph-mon[92084]: Health check update: Degraded data 
redundancy: 1027/2436 objects degraded (42.159%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:36.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:36 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1027/2436 objects degraded (42.159%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:37.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:37 vm04.local ceph-mon[92084]: pgmap v63: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 677 KiB/s rd, 696 KiB/s wr, 181 op/s; 1027/2217 objects degraded (46.324%); 0 B/s, 5 objects/s recovering 2026-03-10T14:10:37.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:37 vm03.local ceph-mon[103098]: pgmap v63: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 677 KiB/s rd, 696 KiB/s wr, 181 op/s; 1027/2217 objects degraded (46.324%); 0 B/s, 5 objects/s recovering 2026-03-10T14:10:40.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:39 vm04.local ceph-mon[92084]: pgmap v64: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 794 KiB/s rd, 782 KiB/s wr, 250 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:39 vm03.local ceph-mon[103098]: pgmap v64: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 794 KiB/s rd, 782 KiB/s wr, 250 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:41.153 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.153+0000 7f3bec196700 1 -- 192.168.123.103:0/3694152235 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4103680 msgr2=0x7f3be4105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.153+0000 7f3bec196700 1 --2- 192.168.123.103:0/3694152235 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4103680 0x7f3be4105ac0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f3be0009b00 tx=0x7f3be0009e10 comp rx=0 tx=0).stop 2026-03-10T14:10:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.154+0000 7f3bec196700 1 -- 192.168.123.103:0/3694152235 shutdown_connections 2026-03-10T14:10:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.154+0000 7f3bec196700 1 --2- 192.168.123.103:0/3694152235 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4103680 0x7f3be4105ac0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.154+0000 7f3bec196700 1 --2- 192.168.123.103:0/3694152235 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3be4069180 0x7f3be4103140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.154+0000 7f3bec196700 1 -- 192.168.123.103:0/3694152235 >> 192.168.123.103:0/3694152235 conn(0x7f3be40faa70 msgr2=0x7f3be40fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.154+0000 7f3bec196700 1 -- 192.168.123.103:0/3694152235 shutdown_connections 2026-03-10T14:10:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.155+0000 7f3bec196700 1 -- 192.168.123.103:0/3694152235 wait complete. 
2026-03-10T14:10:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.155+0000 7f3bec196700 1 Processor -- start 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.155+0000 7f3bec196700 1 -- start start 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.155+0000 7f3bec196700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4069180 0x7f3be4193c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.155+0000 7f3bec196700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3be4103680 0x7f3be4194160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.155+0000 7f3bec196700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3be41946f0 con 0x7f3be4069180 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.155+0000 7f3bec196700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3be4194860 con 0x7f3be4103680 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9731700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3be4103680 0x7f3be4194160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9731700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3be4103680 0x7f3be4194160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:38116/0 (socket says 192.168.123.103:38116) 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9731700 1 -- 192.168.123.103:0/399867034 learned_addr learned my addr 192.168.123.103:0/399867034 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:41.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9f32700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4069180 0x7f3be4193c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9731700 1 -- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4069180 msgr2=0x7f3be4193c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9731700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4069180 0x7f3be4193c20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9731700 1 -- 192.168.123.103:0/399867034 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3be00097e0 con 0x7f3be4103680 2026-03-10T14:10:41.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9f32700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4069180 0x7f3be4193c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T14:10:41.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.156+0000 7f3be9731700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3be4103680 0x7f3be4194160 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f3be0005230 tx=0x7f3be0004c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:41.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.301+0000 7f3bdaffd700 1 -- 192.168.123.103:0/399867034 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3be001d070 con 0x7f3be4103680 2026-03-10T14:10:41.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.301+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3be4199250 con 0x7f3be4103680 2026-03-10T14:10:41.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.301+0000 7f3bdaffd700 1 -- 192.168.123.103:0/399867034 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3be000bc50 con 0x7f3be4103680 2026-03-10T14:10:41.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.302+0000 7f3bdaffd700 1 -- 192.168.123.103:0/399867034 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3be000f710 con 0x7f3be4103680 2026-03-10T14:10:41.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.302+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3be41996e0 con 0x7f3be4103680 2026-03-10T14:10:41.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.303+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f3be418ddc0 con 0x7f3be4103680 2026-03-10T14:10:41.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.329+0000 7f3bdaffd700 1 -- 192.168.123.103:0/399867034 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3be0022470 con 0x7f3be4103680 2026-03-10T14:10:41.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.330+0000 7f3bdaffd700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3bd007bb80 0x7f3bd007e030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.330+0000 7f3bdaffd700 1 -- 192.168.123.103:0/399867034 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f3be009bcd0 con 0x7f3be4103680 2026-03-10T14:10:41.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.330+0000 7f3be9f32700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3bd007bb80 0x7f3bd007e030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.331+0000 7f3be9f32700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3bd007bb80 0x7f3bd007e030 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f3be4195100 tx=0x7f3bd400a3b0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:41.332 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.333+0000 7f3bdaffd700 1 -- 192.168.123.103:0/399867034 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f3be0064460 con 0x7f3be4103680 2026-03-10T14:10:41.461 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:41 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1027/336 objects degraded (305.655%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:41.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.461+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3be4061190 con 0x7f3bd007bb80 2026-03-10T14:10:41.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.463+0000 7f3bdaffd700 1 -- 192.168.123.103:0/399867034 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f3be4061190 con 0x7f3bd007bb80 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.465+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3bd007bb80 msgr2=0x7f3bd007e030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.465+0000 7f3bec196700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3bd007bb80 0x7f3bd007e030 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f3be4195100 tx=0x7f3bd400a3b0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.465+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3be4103680 msgr2=0x7f3be4194160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.465+0000 7f3bec196700 1 --2- 
192.168.123.103:0/399867034 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3be4103680 0x7f3be4194160 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f3be0005230 tx=0x7f3be0004c30 comp rx=0 tx=0).stop 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.466+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 shutdown_connections 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.466+0000 7f3bec196700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3bd007bb80 0x7f3bd007e030 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.466+0000 7f3bec196700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3be4069180 0x7f3be4193c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.466+0000 7f3bec196700 1 --2- 192.168.123.103:0/399867034 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3be4103680 0x7f3be4194160 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.466+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 >> 192.168.123.103:0/399867034 conn(0x7f3be40faa70 msgr2=0x7f3be40fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:41.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.466+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 shutdown_connections 2026-03-10T14:10:41.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.466+0000 7f3bec196700 1 -- 192.168.123.103:0/399867034 wait complete. 
2026-03-10T14:10:41.474 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:10:41.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.538+0000 7f7f65c78700 1 -- 192.168.123.103:0/353939148 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 msgr2=0x7f7f60105520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.538+0000 7f7f65c78700 1 --2- 192.168.123.103:0/353939148 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 0x7f7f60105520 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7f48009b50 tx=0x7f7f48009e60 comp rx=0 tx=0).stop 2026-03-10T14:10:41.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.538+0000 7f7f65c78700 1 -- 192.168.123.103:0/353939148 shutdown_connections 2026-03-10T14:10:41.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.538+0000 7f7f65c78700 1 --2- 192.168.123.103:0/353939148 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f60105a60 0x7f7f60107e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.538+0000 7f7f65c78700 1 --2- 192.168.123.103:0/353939148 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 0x7f7f60105520 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.538+0000 7f7f65c78700 1 -- 192.168.123.103:0/353939148 >> 192.168.123.103:0/353939148 conn(0x7f7f600faa70 msgr2=0x7f7f600fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:41.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.539+0000 7f7f65c78700 1 -- 192.168.123.103:0/353939148 shutdown_connections 2026-03-10T14:10:41.537 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.539+0000 7f7f65c78700 1 -- 192.168.123.103:0/353939148 wait complete. 2026-03-10T14:10:41.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.539+0000 7f7f65c78700 1 Processor -- start 2026-03-10T14:10:41.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.539+0000 7f7f65c78700 1 -- start start 2026-03-10T14:10:41.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.539+0000 7f7f65c78700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 0x7f7f60198080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.540+0000 7f7f65c78700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f60105a60 0x7f7f601985c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.540+0000 7f7f65c78700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f60198be0 con 0x7f7f600691a0 2026-03-10T14:10:41.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.540+0000 7f7f65c78700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f60198d20 con 0x7f7f60105a60 2026-03-10T14:10:41.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.540+0000 7f7f5effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f60105a60 0x7f7f601985c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.540+0000 7f7f5effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f60105a60 0x7f7f601985c0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:38124/0 (socket says 192.168.123.103:38124) 2026-03-10T14:10:41.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.540+0000 7f7f5effd700 1 -- 192.168.123.103:0/2590660343 learned_addr learned my addr 192.168.123.103:0/2590660343 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:41.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.540+0000 7f7f5f7fe700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 0x7f7f60198080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.540+0000 7f7f5effd700 1 -- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 msgr2=0x7f7f60198080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.541+0000 7f7f5effd700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 0x7f7f60198080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.541+0000 7f7f5effd700 1 -- 192.168.123.103:0/2590660343 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f50009710 con 0x7f7f60105a60 2026-03-10T14:10:41.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.541+0000 7f7f5f7fe700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 0x7f7f60198080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:10:41.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.541+0000 7f7f5effd700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f60105a60 0x7f7f601985c0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f7f5000ec80 tx=0x7f7f5000ef90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:41.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.541+0000 7f7f5cff9700 1 -- 192.168.123.103:0/2590660343 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f5000ccd0 con 0x7f7f60105a60 2026-03-10T14:10:41.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.541+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f480097e0 con 0x7f7f60105a60 2026-03-10T14:10:41.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.541+0000 7f7f5cff9700 1 -- 192.168.123.103:0/2590660343 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7f50004500 con 0x7f7f60105a60 2026-03-10T14:10:41.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.542+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f6019db90 con 0x7f7f60105a60 2026-03-10T14:10:41.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.542+0000 7f7f5cff9700 1 -- 192.168.123.103:0/2590660343 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f50005350 con 0x7f7f60105a60 2026-03-10T14:10:41.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.543+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f601921a0 con 0x7f7f60105a60 2026-03-10T14:10:41.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.543+0000 7f7f5cff9700 1 -- 192.168.123.103:0/2590660343 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7f50004ac0 con 0x7f7f60105a60 2026-03-10T14:10:41.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.543+0000 7f7f5cff9700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7f4c077870 0x7f7f4c079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.543+0000 7f7f5cff9700 1 -- 192.168.123.103:0/2590660343 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f7f50014070 con 0x7f7f60105a60 2026-03-10T14:10:41.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.544+0000 7f7f5f7fe700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7f4c077870 0x7f7f4c079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.544+0000 7f7f5f7fe700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7f4c077870 0x7f7f4c079d20 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f7f4800b5c0 tx=0x7f7f48005fd0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:41.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.547+0000 7f7f5cff9700 1 -- 192.168.123.103:0/2590660343 <== mon.1 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7f500627c0 con 0x7f7f60105a60 2026-03-10T14:10:41.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.682+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7f60061190 con 0x7f7f4c077870 2026-03-10T14:10:41.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.684+0000 7f7f5cff9700 1 -- 192.168.123.103:0/2590660343 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f7f60061190 con 0x7f7f4c077870 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7f4c077870 msgr2=0x7f7f4c079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7f4c077870 0x7f7f4c079d20 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f7f4800b5c0 tx=0x7f7f48005fd0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f60105a60 msgr2=0x7f7f601985c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f60105a60 0x7f7f601985c0 secure 
:-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f7f5000ec80 tx=0x7f7f5000ef90 comp rx=0 tx=0).stop 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 shutdown_connections 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7f4c077870 0x7f7f4c079d20 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f600691a0 0x7f7f60198080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 --2- 192.168.123.103:0/2590660343 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7f60105a60 0x7f7f601985c0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 >> 192.168.123.103:0/2590660343 conn(0x7f7f600faa70 msgr2=0x7f7f600fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.686+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 shutdown_connections 2026-03-10T14:10:41.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.687+0000 7f7f65c78700 1 -- 192.168.123.103:0/2590660343 wait complete. 
2026-03-10T14:10:41.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.754+0000 7f630a13a700 1 -- 192.168.123.103:0/897803658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 msgr2=0x7f6304102b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.754+0000 7f630a13a700 1 --2- 192.168.123.103:0/897803658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 0x7f6304102b90 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f62f4009b50 tx=0x7f62f4009e60 comp rx=0 tx=0).stop 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.754+0000 7f630a13a700 1 -- 192.168.123.103:0/897803658 shutdown_connections 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.754+0000 7f630a13a700 1 --2- 192.168.123.103:0/897803658 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304103980 0x7f6304103dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.754+0000 7f630a13a700 1 --2- 192.168.123.103:0/897803658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 0x7f6304102b90 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.754+0000 7f630a13a700 1 -- 192.168.123.103:0/897803658 >> 192.168.123.103:0/897803658 conn(0x7f63040fdd50 msgr2=0x7f6304100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.754+0000 7f630a13a700 1 -- 192.168.123.103:0/897803658 shutdown_connections 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.754+0000 7f630a13a700 1 -- 192.168.123.103:0/897803658 wait 
complete. 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f630a13a700 1 Processor -- start 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f630a13a700 1 -- start start 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f630a13a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 0x7f6304071d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f630a13a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304103980 0x7f6304072270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f630a13a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6304072890 con 0x7f6304102780 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f630a13a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63040729d0 con 0x7f6304103980 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f63037fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 0x7f6304071d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f63037fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 0x7f6304071d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I 
am v2:192.168.123.103:33026/0 (socket says 192.168.123.103:33026) 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f63037fe700 1 -- 192.168.123.103:0/3730903855 learned_addr learned my addr 192.168.123.103:0/3730903855 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f63037fe700 1 -- 192.168.123.103:0/3730903855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304103980 msgr2=0x7f6304072270 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f63037fe700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304103980 0x7f6304072270 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f63037fe700 1 -- 192.168.123.103:0/3730903855 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f62f40097e0 con 0x7f6304102780 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f63037fe700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 0x7f6304071d30 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f62f4006010 tx=0x7f62f400b920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f63017fa700 1 -- 192.168.123.103:0/3730903855 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f62f401d070 con 0x7f6304102780 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f630a13a700 1 -- 
192.168.123.103:0/3730903855 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6304194ff0 con 0x7f6304102780 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.755+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6304195480 con 0x7f6304102780 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.757+0000 7f63017fa700 1 -- 192.168.123.103:0/3730903855 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f62f4004500 con 0x7f6304102780 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.757+0000 7f63017fa700 1 -- 192.168.123.103:0/3730903855 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f62f400f670 con 0x7f6304102780 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.757+0000 7f63017fa700 1 -- 192.168.123.103:0/3730903855 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f62f400f850 con 0x7f6304102780 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.757+0000 7f63017fa700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f62f0077a60 0x7f62f0079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.756 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.758+0000 7f62fbfff700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f62f0077a60 0x7f62f0079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.757 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.758+0000 7f63017fa700 1 -- 192.168.123.103:0/3730903855 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f62f409c570 con 0x7f6304102780 2026-03-10T14:10:41.757 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.758+0000 7f62fbfff700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f62f0077a60 0x7f62f0079f10 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f62ec009710 tx=0x7f62ec006c60 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:41.758 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.759+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f62e4005320 con 0x7f6304102780 2026-03-10T14:10:41.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.762+0000 7f63017fa700 1 -- 192.168.123.103:0/3730903855 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f62f4064d00 con 0x7f6304102780 2026-03-10T14:10:41.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:41 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1027/336 objects degraded (305.655%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:41.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.885+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f62e4000bf0 con 0x7f62f0077a60 2026-03-10T14:10:41.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.891+0000 7f63017fa700 1 -- 192.168.123.103:0/3730903855 <== 
mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f62e4000bf0 con 0x7f62f0077a60 2026-03-10T14:10:41.890 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:10:41.890 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (6m) 57s ago 7m 22.0M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:10:41.890 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (5m) 57s ago 7m 8904k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:10:41.890 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (6m) 69s ago 6m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:10:41.890 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (74s) 57s ago 7m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:10:41.890 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (71s) 69s ago 6m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (6m) 57s ago 7m 91.2M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (5m) 57s ago 5m 158M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (5m) 57s ago 5m 93.7M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (5m) 69s ago 5m 91.2M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (5m) 69s ago 5m 166M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (2m) 
57s ago 8m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (113s) 69s ago 6m 496M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (106s) 57s ago 8m 56.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (93s) 69s ago 6m 49.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (7m) 57s ago 7m 15.5M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (6m) 69s ago 6m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (60s) 57s ago 6m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (6m) 57s ago 6m 368M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (6m) 57s ago 6m 288M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (5m) 69s ago 5m 434M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (5m) 69s ago 5m 397M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (5m) 69s ago 5m 352M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:10:41.891 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (116s) 57s ago 7m 53.0M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:10:41.893 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f62f0077a60 msgr2=0x7f62f0079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f62f0077a60 0x7f62f0079f10 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f62ec009710 tx=0x7f62ec006c60 comp rx=0 tx=0).stop 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 msgr2=0x7f6304071d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 0x7f6304071d30 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f62f4006010 tx=0x7f62f400b920 comp rx=0 tx=0).stop 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 shutdown_connections 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f62f0077a60 0x7f62f0079f10 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6304102780 
0x7f6304071d30 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 --2- 192.168.123.103:0/3730903855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6304103980 0x7f6304072270 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 >> 192.168.123.103:0/3730903855 conn(0x7f63040fdd50 msgr2=0x7f6304106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 shutdown_connections 2026-03-10T14:10:41.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.895+0000 7f630a13a700 1 -- 192.168.123.103:0/3730903855 wait complete. 2026-03-10T14:10:41.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.965+0000 7fe700739700 1 -- 192.168.123.103:0/1733321400 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f8100620 msgr2=0x7fe6f8100a70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:41.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.965+0000 7fe700739700 1 --2- 192.168.123.103:0/1733321400 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f8100620 0x7fe6f8100a70 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fe6f4009b50 tx=0x7fe6f4009e60 comp rx=0 tx=0).stop 2026-03-10T14:10:41.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.965+0000 7fe700739700 1 -- 192.168.123.103:0/1733321400 shutdown_connections 2026-03-10T14:10:41.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.965+0000 7fe700739700 1 --2- 192.168.123.103:0/1733321400 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fe6f8100620 0x7fe6f8100a70 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.965+0000 7fe700739700 1 --2- 192.168.123.103:0/1733321400 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f80ff420 0x7fe6f80ff830 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.965+0000 7fe700739700 1 -- 192.168.123.103:0/1733321400 >> 192.168.123.103:0/1733321400 conn(0x7fe6f80fa9b0 msgr2=0x7fe6f80fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:41.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.966+0000 7fe700739700 1 -- 192.168.123.103:0/1733321400 shutdown_connections 2026-03-10T14:10:41.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.966+0000 7fe700739700 1 -- 192.168.123.103:0/1733321400 wait complete. 
2026-03-10T14:10:41.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.966+0000 7fe700739700 1 Processor -- start 2026-03-10T14:10:41.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.966+0000 7fe700739700 1 -- start start 2026-03-10T14:10:41.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.966+0000 7fe700739700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f80ff420 0x7fe6f8198220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.966+0000 7fe700739700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f8198760 0x7fe6f819d7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.966+0000 7fe700739700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe6f8198c60 con 0x7fe6f80ff420 2026-03-10T14:10:41.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.966+0000 7fe700739700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe6f8198dd0 con 0x7fe6f8198760 2026-03-10T14:10:41.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.967+0000 7fe6fe4d5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f80ff420 0x7fe6f8198220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.967+0000 7fe6fe4d5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f80ff420 0x7fe6f8198220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:33054/0 (socket says 192.168.123.103:33054) 2026-03-10T14:10:41.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.967+0000 7fe6fe4d5700 1 -- 192.168.123.103:0/1350316476 learned_addr learned my addr 192.168.123.103:0/1350316476 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:41.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.967+0000 7fe6fe4d5700 1 -- 192.168.123.103:0/1350316476 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f8198760 msgr2=0x7fe6f819d7d0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:10:41.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.967+0000 7fe6fdcd4700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f8198760 0x7fe6f819d7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.967+0000 7fe6fe4d5700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f8198760 0x7fe6f819d7d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:41.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.967+0000 7fe6fe4d5700 1 -- 192.168.123.103:0/1350316476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe6e8009710 con 0x7fe6f80ff420 2026-03-10T14:10:41.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.967+0000 7fe6fe4d5700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f80ff420 0x7fe6f8198220 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fe6e800ec50 tx=0x7fe6e800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:10:41.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.968+0000 7fe6ef7fe700 1 -- 192.168.123.103:0/1350316476 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe6e800cdd0 con 0x7fe6f80ff420 2026-03-10T14:10:41.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.968+0000 7fe6ef7fe700 1 -- 192.168.123.103:0/1350316476 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe6e8004d10 con 0x7fe6f80ff420 2026-03-10T14:10:41.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.968+0000 7fe6ef7fe700 1 -- 192.168.123.103:0/1350316476 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe6e8010760 con 0x7fe6f80ff420 2026-03-10T14:10:41.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.968+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe6f40097e0 con 0x7fe6f80ff420 2026-03-10T14:10:41.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.968+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe6f818ea20 con 0x7fe6f80ff420 2026-03-10T14:10:41.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.969+0000 7fe6ef7fe700 1 -- 192.168.123.103:0/1350316476 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe6e8010910 con 0x7fe6f80ff420 2026-03-10T14:10:41.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.970+0000 7fe6ef7fe700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6e40778c0 0x7fe6e4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:41.969 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.970+0000 7fe6fdcd4700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6e40778c0 0x7fe6e4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:41.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.970+0000 7fe6ef7fe700 1 -- 192.168.123.103:0/1350316476 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6426+0+0 (secure 0 0 0) 0x7fe6e8014070 con 0x7fe6f80ff420 2026-03-10T14:10:41.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.970+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe6f8106710 con 0x7fe6f80ff420 2026-03-10T14:10:41.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.972+0000 7fe6fdcd4700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6e40778c0 0x7fe6e4079d70 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fe6f4009b20 tx=0x7fe6f40058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:41.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:41.973+0000 7fe6ef7fe700 1 -- 192.168.123.103:0/1350316476 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe6e8063980 con 0x7fe6f80ff420 2026-03-10T14:10:42.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.137+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fe6f8066e40 con 0x7fe6f80ff420 2026-03-10T14:10:42.137 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.138+0000 7fe6ef7fe700 1 -- 192.168.123.103:0/1350316476 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fe6e80630d0 con 0x7fe6f80ff420
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5,
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9,
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-10T14:10:42.137 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-10T14:10:42.139 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6e40778c0 msgr2=0x7fe6e4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6e40778c0 0x7fe6e4079d70 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fe6f4009b20 tx=0x7fe6f40058e0 comp rx=0 tx=0).stop
2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f80ff420 msgr2=0x7fe6f8198220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f80ff420 0x7fe6f8198220 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fe6e800ec50 tx=0x7fe6e800c5b0 comp rx=0 tx=0).stop
2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 shutdown_connections
2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6e40778c0 0x7fe6e4079d70
unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f80ff420 0x7fe6f8198220 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 --2- 192.168.123.103:0/1350316476 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f8198760 0x7fe6f819d7d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 >> 192.168.123.103:0/1350316476 conn(0x7fe6f80fa9b0 msgr2=0x7fe6f8103850 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 shutdown_connections 2026-03-10T14:10:42.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.141+0000 7fe700739700 1 -- 192.168.123.103:0/1350316476 wait complete. 
2026-03-10T14:10:42.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.212+0000 7f56a4ddf700 1 -- 192.168.123.103:0/1806471774 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 msgr2=0x7f56a0105d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.212+0000 7f56a4ddf700 1 --2- 192.168.123.103:0/1806471774 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 0x7f56a0105d70 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f5690009b00 tx=0x7f5690009e10 comp rx=0 tx=0).stop 2026-03-10T14:10:42.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.213+0000 7f56a4ddf700 1 -- 192.168.123.103:0/1806471774 shutdown_connections 2026-03-10T14:10:42.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.213+0000 7f56a4ddf700 1 --2- 192.168.123.103:0/1806471774 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 0x7f56a0105d70 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.213+0000 7f56a4ddf700 1 --2- 192.168.123.103:0/1806471774 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f56a0101070 0x7f56a0103450 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.213+0000 7f56a4ddf700 1 -- 192.168.123.103:0/1806471774 >> 192.168.123.103:0/1806471774 conn(0x7f56a00fa9b0 msgr2=0x7f56a00fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.213+0000 7f56a4ddf700 1 -- 192.168.123.103:0/1806471774 shutdown_connections 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.213+0000 7f56a4ddf700 1 -- 192.168.123.103:0/1806471774 
wait complete. 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.213+0000 7f56a4ddf700 1 Processor -- start 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.213+0000 7f56a4ddf700 1 -- start start 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f56a4ddf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f56a0101070 0x7f56a0195ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f56a4ddf700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 0x7f56a01963e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f56a4ddf700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56a0196a00 con 0x7f56a0103990 2026-03-10T14:10:42.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f56a4ddf700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56a0196b40 con 0x7f56a0101070 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f569dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 0x7f56a01963e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f569dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 0x7f56a01963e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:33072/0 (socket says 192.168.123.103:33072) 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f569dd9b700 1 -- 192.168.123.103:0/3211397340 learned_addr learned my addr 192.168.123.103:0/3211397340 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f569e59c700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f56a0101070 0x7f56a0195ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f569dd9b700 1 -- 192.168.123.103:0/3211397340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f56a0101070 msgr2=0x7f56a0195ea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f569dd9b700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f56a0101070 0x7f56a0195ea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f569dd9b700 1 -- 192.168.123.103:0/3211397340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56900097e0 con 0x7f56a0103990 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.214+0000 7f569dd9b700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 0x7f56a01963e0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f5690009fd0 tx=0x7f5690004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.215+0000 7f56977fe700 1 -- 192.168.123.103:0/3211397340 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f569001d070 con 0x7f56a0103990 2026-03-10T14:10:42.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.215+0000 7f56977fe700 1 -- 192.168.123.103:0/3211397340 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f569000bd10 con 0x7f56a0103990 2026-03-10T14:10:42.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.215+0000 7f56977fe700 1 -- 192.168.123.103:0/3211397340 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f569000f890 con 0x7f56a0103990 2026-03-10T14:10:42.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.215+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f56a019b590 con 0x7f56a0103990 2026-03-10T14:10:42.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.215+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f56a019bb00 con 0x7f56a0103990 2026-03-10T14:10:42.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.216+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f56a0190060 con 0x7f56a0103990 2026-03-10T14:10:42.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.219+0000 7f56977fe700 1 -- 192.168.123.103:0/3211397340 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5690022b70 con 0x7f56a0103990 2026-03-10T14:10:42.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.219+0000 
7f56977fe700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f568c0778c0 0x7f568c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.219+0000 7f56977fe700 1 -- 192.168.123.103:0/3211397340 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f569009bcf0 con 0x7f56a0103990 2026-03-10T14:10:42.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.220+0000 7f569e59c700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f568c0778c0 0x7f568c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.220+0000 7f569e59c700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f568c0778c0 0x7f568c079d70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f5688006fd0 tx=0x7f5688008040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:42.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.221+0000 7f56977fe700 1 -- 192.168.123.103:0/3211397340 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5690064480 con 0x7f56a0103990 2026-03-10T14:10:42.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.366+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f56a0066e40 con 0x7f56a0103990 2026-03-10T14:10:42.366 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.367+0000 7f56977fe700 1 -- 192.168.123.103:0/3211397340 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1973 (secure 0 0 0) 0x7f5690063bd0 con 0x7f56a0103990
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:e14
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470}
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-10T14:10:42.366 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-10T14:10:42.367 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-10T14:10:42.367 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:10:42.367 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:10:42.367 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:10:42.367 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:10:42.367 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:10:42.367 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:10:42.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.370+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f568c0778c0 msgr2=0x7f568c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:42.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.370+0000 7f56a4ddf700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f568c0778c0 0x7f568c079d70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f5688006fd0 tx=0x7f5688008040 comp rx=0 tx=0).stop
2026-03-10T14:10:42.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.370+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 msgr2=0x7f56a01963e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:10:42.369
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.370+0000 7f56a4ddf700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 0x7f56a01963e0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f5690009fd0 tx=0x7f5690004ab0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.371+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 shutdown_connections 2026-03-10T14:10:42.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.371+0000 7f56a4ddf700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f568c0778c0 0x7f568c079d70 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.371+0000 7f56a4ddf700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f56a0101070 0x7f56a0195ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.371+0000 7f56a4ddf700 1 --2- 192.168.123.103:0/3211397340 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56a0103990 0x7f56a01963e0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.371+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 >> 192.168.123.103:0/3211397340 conn(0x7f56a00fa9b0 msgr2=0x7f56a00fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:42.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.371+0000 7f56a4ddf700 1 -- 192.168.123.103:0/3211397340 shutdown_connections 2026-03-10T14:10:42.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.371+0000 7f56a4ddf700 1 -- 
192.168.123.103:0/3211397340 wait complete. 2026-03-10T14:10:42.370 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:10:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.444+0000 7fac402bd700 1 -- 192.168.123.103:0/340797796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 msgr2=0x7fac380ff870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.444+0000 7fac402bd700 1 --2- 192.168.123.103:0/340797796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 0x7fac380ff870 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fac28009ab0 tx=0x7fac28009dc0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:42 vm03.local ceph-mon[103098]: pgmap v65: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 659 KiB/s rd, 649 KiB/s wr, 208 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 9 objects/s recovering 2026-03-10T14:10:42.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:42 vm03.local ceph-mon[103098]: from='client.44151 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:42.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:42 vm03.local ceph-mon[103098]: from='client.44155 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:42.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:42 vm03.local ceph-mon[103098]: from='client.34224 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:42.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:42 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1350316476' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.445+0000 7fac402bd700 1 -- 192.168.123.103:0/340797796 shutdown_connections 2026-03-10T14:10:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.445+0000 7fac402bd700 1 --2- 192.168.123.103:0/340797796 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac38100700 0x7fac38100b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.445+0000 7fac402bd700 1 --2- 192.168.123.103:0/340797796 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 0x7fac380ff870 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.445+0000 7fac402bd700 1 -- 192.168.123.103:0/340797796 >> 192.168.123.103:0/340797796 conn(0x7fac380faa70 msgr2=0x7fac380fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.446+0000 7fac402bd700 1 -- 192.168.123.103:0/340797796 shutdown_connections 2026-03-10T14:10:42.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.446+0000 7fac402bd700 1 -- 192.168.123.103:0/340797796 wait complete. 
2026-03-10T14:10:42.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.446+0000 7fac402bd700 1 Processor -- start 2026-03-10T14:10:42.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.447+0000 7fac402bd700 1 -- start start 2026-03-10T14:10:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.447+0000 7fac402bd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 0x7fac38193c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.447+0000 7fac402bd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac38100700 0x7fac38194140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.447+0000 7fac402bd700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac38194760 con 0x7fac380ff460 2026-03-10T14:10:42.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.447+0000 7fac402bd700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac381aa190 con 0x7fac38100700 2026-03-10T14:10:42.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.448+0000 7fac3d858700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac38100700 0x7fac38194140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.448+0000 7fac3d858700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac38100700 0x7fac38194140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:38168/0 (socket says 192.168.123.103:38168) 2026-03-10T14:10:42.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.448+0000 7fac3d858700 1 -- 192.168.123.103:0/1787425139 learned_addr learned my addr 192.168.123.103:0/1787425139 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:42.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.448+0000 7fac3e059700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 0x7fac38193c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.449+0000 7fac3d858700 1 -- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 msgr2=0x7fac38193c00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.449+0000 7fac3d858700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 0x7fac38193c00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.449+0000 7fac3d858700 1 -- 192.168.123.103:0/1787425139 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac28009710 con 0x7fac38100700 2026-03-10T14:10:42.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.449+0000 7fac3e059700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 0x7fac38193c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:10:42.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.449+0000 7fac3d858700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac38100700 0x7fac38194140 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fac3400ea30 tx=0x7fac3400edf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:42.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.450+0000 7fac2f7fe700 1 -- 192.168.123.103:0/1787425139 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac3400cc40 con 0x7fac38100700 2026-03-10T14:10:42.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.450+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fac3806a830 con 0x7fac38100700 2026-03-10T14:10:42.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.450+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fac3806ad30 con 0x7fac38100700 2026-03-10T14:10:42.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.451+0000 7fac2f7fe700 1 -- 192.168.123.103:0/1787425139 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fac3400cda0 con 0x7fac38100700 2026-03-10T14:10:42.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.451+0000 7fac2f7fe700 1 -- 192.168.123.103:0/1787425139 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac34010430 con 0x7fac38100700 2026-03-10T14:10:42.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.452+0000 7fac2f7fe700 1 -- 192.168.123.103:0/1787425139 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fac34010670 con 
0x7fac38100700 2026-03-10T14:10:42.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.452+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fac38066e40 con 0x7fac38100700 2026-03-10T14:10:42.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.452+0000 7fac2f7fe700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fac24077990 0x7fac24079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.453+0000 7fac3e059700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fac24077990 0x7fac24079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.454+0000 7fac3e059700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fac24077990 0x7fac24079e40 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fac28005e90 tx=0x7fac28005d80 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:42.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.454+0000 7fac2f7fe700 1 -- 192.168.123.103:0/1787425139 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6426+0+0 (secure 0 0 0) 0x7fac34014070 con 0x7fac38100700 2026-03-10T14:10:42.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.458+0000 7fac2f7fe700 1 -- 192.168.123.103:0/1787425139 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7fac3409e030 con 0x7fac38100700 2026-03-10T14:10:42.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:42 vm04.local ceph-mon[92084]: pgmap v65: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 659 KiB/s rd, 649 KiB/s wr, 208 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 9 objects/s recovering 2026-03-10T14:10:42.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:42 vm04.local ceph-mon[92084]: from='client.44151 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:42.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:42 vm04.local ceph-mon[92084]: from='client.44155 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:42.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:42 vm04.local ceph-mon[92084]: from='client.34224 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:42.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:42 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/1350316476' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:42.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.582+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fac38105050 con 0x7fac24077990 2026-03-10T14:10:42.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.583+0000 7fac2f7fe700 1 -- 192.168.123.103:0/1787425139 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fac38105050 con 0x7fac24077990 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:10:42.585 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:10:42.587 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.588+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fac24077990 msgr2=0x7fac24079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.588+0000 7fac402bd700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fac24077990 0x7fac24079e40 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fac28005e90 tx=0x7fac28005d80 comp rx=0 tx=0).stop 2026-03-10T14:10:42.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.589+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac38100700 msgr2=0x7fac38194140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.589+0000 7fac402bd700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac38100700 0x7fac38194140 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fac3400ea30 tx=0x7fac3400edf0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.589+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 shutdown_connections 2026-03-10T14:10:42.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.589+0000 7fac402bd700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fac24077990 0x7fac24079e40 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.589+0000 7fac402bd700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fac380ff460 
0x7fac38193c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.589+0000 7fac402bd700 1 --2- 192.168.123.103:0/1787425139 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fac38100700 0x7fac38194140 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.589+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 >> 192.168.123.103:0/1787425139 conn(0x7fac380faa70 msgr2=0x7fac38103930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:42.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.590+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 shutdown_connections 2026-03-10T14:10:42.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.590+0000 7fac402bd700 1 -- 192.168.123.103:0/1787425139 wait complete. 2026-03-10T14:10:42.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.663+0000 7f2d470b3700 1 -- 192.168.123.103:0/520648485 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 msgr2=0x7f2d40103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.663+0000 7f2d470b3700 1 --2- 192.168.123.103:0/520648485 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 0x7f2d40103db0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f2d34009b00 tx=0x7f2d34009e10 comp rx=0 tx=0).stop 2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.664+0000 7f2d470b3700 1 -- 192.168.123.103:0/520648485 shutdown_connections 2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.664+0000 7f2d470b3700 1 --2- 192.168.123.103:0/520648485 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f2d40103960 0x7f2d40103db0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.664+0000 7f2d470b3700 1 --2- 192.168.123.103:0/520648485 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d40102760 0x7f2d40102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.664+0000 7f2d470b3700 1 -- 192.168.123.103:0/520648485 >> 192.168.123.103:0/520648485 conn(0x7f2d400fdcf0 msgr2=0x7f2d40100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.664+0000 7f2d470b3700 1 -- 192.168.123.103:0/520648485 shutdown_connections 2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.664+0000 7f2d470b3700 1 -- 192.168.123.103:0/520648485 wait complete. 
2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d470b3700 1 Processor -- start 2026-03-10T14:10:42.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d470b3700 1 -- start start 2026-03-10T14:10:42.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d470b3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d40102760 0x7f2d40198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d470b3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 0x7f2d40198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d470b3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d40198b80 con 0x7f2d40103960 2026-03-10T14:10:42.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d470b3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d40198cc0 con 0x7f2d40102760 2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d3ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 0x7f2d40198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d3ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 0x7f2d40198560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:33102/0 (socket says 192.168.123.103:33102) 2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.665+0000 7f2d3ffff700 1 -- 192.168.123.103:0/1032412036 learned_addr learned my addr 192.168.123.103:0/1032412036 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.666+0000 7f2d44e4f700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d40102760 0x7f2d40198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.666+0000 7f2d3ffff700 1 -- 192.168.123.103:0/1032412036 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d40102760 msgr2=0x7f2d40198020 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.666+0000 7f2d3ffff700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d40102760 0x7f2d40198020 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.666+0000 7f2d3ffff700 1 -- 192.168.123.103:0/1032412036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2d340097e0 con 0x7f2d40103960 2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.666+0000 7f2d3ffff700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 0x7f2d40198560 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f2d34005850 tx=0x7f2d34004a60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:10:42.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.666+0000 7f2d3dffb700 1 -- 192.168.123.103:0/1032412036 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d3401d070 con 0x7f2d40103960 2026-03-10T14:10:42.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.666+0000 7f2d3dffb700 1 -- 192.168.123.103:0/1032412036 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2d3400bc50 con 0x7f2d40103960 2026-03-10T14:10:42.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.667+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2d4019d710 con 0x7f2d40103960 2026-03-10T14:10:42.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.667+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2d4019dc00 con 0x7f2d40103960 2026-03-10T14:10:42.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.668+0000 7f2d3dffb700 1 -- 192.168.123.103:0/1032412036 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d34022620 con 0x7f2d40103960 2026-03-10T14:10:42.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.668+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2d40066e40 con 0x7f2d40103960 2026-03-10T14:10:42.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.669+0000 7f2d3dffb700 1 -- 192.168.123.103:0/1032412036 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2d3400f7a0 con 0x7f2d40103960 2026-03-10T14:10:42.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.669+0000 
7f2d3dffb700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d280778c0 0x7f2d28079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:10:42.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.669+0000 7f2d3dffb700 1 -- 192.168.123.103:0/1032412036 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f2d3409b5a0 con 0x7f2d40103960 2026-03-10T14:10:42.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.670+0000 7f2d44e4f700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d280778c0 0x7f2d28079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:10:42.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.670+0000 7f2d44e4f700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d280778c0 0x7f2d28079d70 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f2d30006fd0 tx=0x7f2d30008040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:10:42.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.672+0000 7f2d3dffb700 1 -- 192.168.123.103:0/1032412036 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2d34063c30 con 0x7f2d40103960 2026-03-10T14:10:42.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.849+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f2d4019dee0 con 0x7f2d40103960 2026-03-10T14:10:42.852 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.853+0000 7f2d3dffb700 1 -- 192.168.123.103:0/1032412036 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+896 (secure 0 0 0) 0x7f2d34005c00 con 0x7f2d40103960 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 1027/336 objects degraded (305.655%), 9 pgs degraded 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1027/336 objects degraded (305.655%), 9 pgs degraded 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4] 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-10T14:10:42.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-10T14:10:42.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.856+0000 7f2d470b3700 1 -- 
192.168.123.103:0/1032412036 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d280778c0 msgr2=0x7f2d28079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.856+0000 7f2d470b3700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d280778c0 0x7f2d28079d70 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f2d30006fd0 tx=0x7f2d30008040 comp rx=0 tx=0).stop 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.856+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 msgr2=0x7f2d40198560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.856+0000 7f2d470b3700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 0x7f2d40198560 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f2d34005850 tx=0x7f2d34004a60 comp rx=0 tx=0).stop 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.856+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 shutdown_connections 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.856+0000 7f2d470b3700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d280778c0 0x7f2d28079d70 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.856+0000 7f2d470b3700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d40102760 0x7f2d40198020 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.856+0000 7f2d470b3700 1 --2- 192.168.123.103:0/1032412036 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d40103960 0x7f2d40198560 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.857+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 >> 192.168.123.103:0/1032412036 conn(0x7f2d400fdcf0 msgr2=0x7f2d40106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.857+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 shutdown_connections 2026-03-10T14:10:42.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:10:42.857+0000 7f2d470b3700 1 -- 192.168.123.103:0/1032412036 wait complete. 2026-03-10T14:10:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:43 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3211397340' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:10:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:43 vm03.local ceph-mon[103098]: from='client.44163 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:43 vm03.local ceph-mon[103098]: pgmap v66: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 114 KiB/s rd, 89 KiB/s wr, 80 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:43 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1032412036' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:10:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:43 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/3211397340' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:10:44.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:43 vm04.local ceph-mon[92084]: from='client.44163 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:10:44.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:43 vm04.local ceph-mon[92084]: pgmap v66: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 114 KiB/s rd, 89 KiB/s wr, 80 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:44.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:43 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/1032412036' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:10:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:45 vm03.local ceph-mon[103098]: osdmap e56: 6 total, 6 up, 6 in 2026-03-10T14:10:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:45 vm03.local ceph-mon[103098]: pgmap v68: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 120 KiB/s rd, 93 KiB/s wr, 84 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:46.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:45 vm04.local ceph-mon[92084]: osdmap e56: 6 total, 6 up, 6 in 2026-03-10T14:10:46.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:45 vm04.local ceph-mon[92084]: pgmap v68: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 120 KiB/s rd, 93 KiB/s wr, 84 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:47.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:46 vm04.local ceph-mon[92084]: osdmap e57: 6 total, 6 up, 6 in 2026-03-10T14:10:47.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:10:47.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:46 vm03.local ceph-mon[103098]: osdmap e57: 6 total, 6 up, 6 in 2026-03-10T14:10:47.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:10:48.063 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:47 vm04.local ceph-mon[92084]: pgmap v70: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 4 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 5 objects/s recovering 2026-03-10T14:10:48.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:10:48.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:47 vm03.local ceph-mon[103098]: pgmap v70: 65 pgs: 1 active+recovering+undersized+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 767 B/s wr, 4 op/s; 1027/336 objects degraded (305.655%); 0 B/s, 5 objects/s recovering 2026-03-10T14:10:48.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 
vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:48 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:10:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:48 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T14:10:50.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:49 vm04.local ceph-mon[92084]: pgmap v71: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 2.9 KiB/s rd, 767 B/s wr, 6 op/s; 898/336 objects degraded (267.262%); 0 B/s, 10 objects/s recovering 2026-03-10T14:10:50.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:49 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 898/336 objects degraded (267.262%), 8 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:49 vm03.local ceph-mon[103098]: pgmap v71: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 2.9 KiB/s rd, 767 B/s wr, 6 op/s; 898/336 objects degraded (267.262%); 0 B/s, 10 objects/s recovering 2026-03-10T14:10:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:49 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 898/336 objects degraded (267.262%), 8 pgs degraded (PG_DEGRADED) 2026-03-10T14:10:51.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:51 vm04.local ceph-mon[92084]: pgmap v72: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB 
used, 117 GiB / 120 GiB avail; 1.1 KiB/s rd, 2 op/s; 898/336 objects degraded (267.262%); 0 B/s, 4 objects/s recovering 2026-03-10T14:10:52.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:51 vm03.local ceph-mon[103098]: pgmap v72: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.1 KiB/s rd, 2 op/s; 898/336 objects degraded (267.262%); 0 B/s, 4 objects/s recovering 2026-03-10T14:10:53.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:53 vm03.local ceph-mon[103098]: pgmap v73: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 2.4 KiB/s rd, 4 op/s; 898/336 objects degraded (267.262%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:54.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:53 vm04.local ceph-mon[92084]: pgmap v73: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 2.4 KiB/s rd, 4 op/s; 898/336 objects degraded (267.262%); 0 B/s, 11 objects/s recovering 2026-03-10T14:10:55.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:54 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 898/336 objects degraded (267.262%), 8 pgs degraded, 9 pgs undersized (PG_DEGRADED) 2026-03-10T14:10:55.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:54 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 898/336 objects degraded (267.262%), 8 pgs degraded, 9 pgs undersized (PG_DEGRADED) 2026-03-10T14:10:56.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:55 vm03.local ceph-mon[103098]: pgmap v74: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 
MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 898/336 objects degraded (267.262%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:55 vm04.local ceph-mon[92084]: pgmap v74: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 898/336 objects degraded (267.262%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:58.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:57 vm03.local ceph-mon[103098]: pgmap v75: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 898/336 objects degraded (267.262%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:57 vm04.local ceph-mon[92084]: pgmap v75: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 898/336 objects degraded (267.262%); 0 B/s, 8 objects/s recovering 2026-03-10T14:10:59.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:58 vm03.local ceph-mon[103098]: osdmap e58: 6 total, 6 up, 6 in 2026-03-10T14:10:59.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:58 vm04.local ceph-mon[92084]: osdmap e58: 6 total, 6 up, 6 in 2026-03-10T14:11:00.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:59 vm03.local ceph-mon[103098]: pgmap v77: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 898/336 objects degraded (267.262%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:00.108 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:10:59 vm03.local ceph-mon[103098]: osdmap e59: 6 total, 6 up, 6 in 2026-03-10T14:11:00.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:59 vm04.local ceph-mon[92084]: pgmap v77: 65 pgs: 1 active+recovering+undersized+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 275 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 898/336 objects degraded (267.262%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:00.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:10:59 vm04.local ceph-mon[92084]: osdmap e59: 6 total, 6 up, 6 in 2026-03-10T14:11:01.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:11:01.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:11:02.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:01 vm04.local ceph-mon[92084]: pgmap v79: 65 pgs: 8 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 898/336 objects degraded (267.262%); 0 B/s, 7 objects/s recovering 2026-03-10T14:11:02.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:01 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 898/336 objects degraded (267.262%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:02.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:01 vm03.local ceph-mon[103098]: pgmap v79: 65 pgs: 8 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 
898/336 objects degraded (267.262%); 0 B/s, 7 objects/s recovering 2026-03-10T14:11:02.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:01 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 898/336 objects degraded (267.262%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:04.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:03 vm03.local ceph-mon[103098]: pgmap v80: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 790/336 objects degraded (235.119%); 0 B/s, 11 objects/s recovering 2026-03-10T14:11:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:03 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:03 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T14:11:04.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:03 vm04.local ceph-mon[92084]: pgmap v80: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 790/336 objects degraded (235.119%); 0 B/s, 11 objects/s recovering 2026-03-10T14:11:04.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:04.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:03 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:04.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:03 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-10T14:11:06.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:05 vm04.local ceph-mon[92084]: pgmap v81: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 790/336 objects degraded (235.119%); 0 B/s, 11 objects/s recovering 2026-03-10T14:11:06.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:05 vm03.local ceph-mon[103098]: pgmap v81: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 790/336 objects degraded (235.119%); 0 B/s, 11 objects/s recovering 2026-03-10T14:11:08.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:07 vm04.local ceph-mon[92084]: pgmap v82: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.1 KiB/s rd, 2 op/s; 790/336 objects degraded (235.119%); 0 B/s, 5 objects/s recovering 2026-03-10T14:11:08.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:07 vm03.local ceph-mon[103098]: pgmap v82: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.1 KiB/s rd, 2 op/s; 790/336 objects degraded (235.119%); 0 B/s, 5 objects/s recovering 2026-03-10T14:11:10.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:09 vm04.local ceph-mon[92084]: pgmap v83: 65 pgs: 1 active+recovering+undersized+remapped, 7 
active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 790/336 objects degraded (235.119%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:10.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:09 vm03.local ceph-mon[103098]: pgmap v83: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 790/336 objects degraded (235.119%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:11.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:10 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 790/336 objects degraded (235.119%), 7 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:11.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:10 vm04.local ceph-mon[92084]: osdmap e60: 6 total, 6 up, 6 in 2026-03-10T14:11:11.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:10 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 790/336 objects degraded (235.119%), 7 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:11.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:10 vm03.local ceph-mon[103098]: osdmap e60: 6 total, 6 up, 6 in 2026-03-10T14:11:12.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:12 vm04.local ceph-mon[92084]: pgmap v85: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 790/336 objects degraded (235.119%); 0 B/s, 8 objects/s recovering 2026-03-10T14:11:12.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:12 vm03.local ceph-mon[103098]: pgmap v85: 65 pgs: 1 active+recovering+undersized+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 
3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 790/336 objects degraded (235.119%); 0 B/s, 8 objects/s recovering 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.929+0000 7fd48ac6d700 1 -- 192.168.123.103:0/1647349972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484103970 msgr2=0x7fd484105d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.929+0000 7fd48ac6d700 1 --2- 192.168.123.103:0/1647349972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484103970 0x7fd484105d50 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fd474009b00 tx=0x7fd474009e10 comp rx=0 tx=0).stop 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.930+0000 7fd48ac6d700 1 -- 192.168.123.103:0/1647349972 shutdown_connections 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.930+0000 7fd48ac6d700 1 --2- 192.168.123.103:0/1647349972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484103970 0x7fd484105d50 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.930+0000 7fd48ac6d700 1 --2- 192.168.123.103:0/1647349972 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd484101050 0x7fd484103430 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.930+0000 7fd48ac6d700 1 -- 192.168.123.103:0/1647349972 >> 192.168.123.103:0/1647349972 conn(0x7fd4840fa9b0 msgr2=0x7fd4840fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.930+0000 7fd48ac6d700 1 -- 192.168.123.103:0/1647349972 shutdown_connections 
2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.930+0000 7fd48ac6d700 1 -- 192.168.123.103:0/1647349972 wait complete. 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.931+0000 7fd48ac6d700 1 Processor -- start 2026-03-10T14:11:12.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.931+0000 7fd48ac6d700 1 -- start start 2026-03-10T14:11:12.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.931+0000 7fd48ac6d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484101050 0x7fd4841009b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:12.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.931+0000 7fd48ac6d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd484103970 0x7fd4840ff000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:12.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.931+0000 7fd483fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd484103970 0x7fd4840ff000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:12.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.932+0000 7fd488a09700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484101050 0x7fd4841009b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:12.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.932+0000 7fd488a09700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484101050 0x7fd4841009b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45292/0 (socket says 192.168.123.103:45292) 2026-03-10T14:11:12.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.932+0000 7fd488a09700 1 -- 192.168.123.103:0/154193008 learned_addr learned my addr 192.168.123.103:0/154193008 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:12.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.931+0000 7fd48ac6d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd4840ff540 con 0x7fd484101050 2026-03-10T14:11:12.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.932+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd4840ff680 con 0x7fd484103970 2026-03-10T14:11:12.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.932+0000 7fd488a09700 1 -- 192.168.123.103:0/154193008 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd484103970 msgr2=0x7fd4840ff000 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:12.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.932+0000 7fd488a09700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd484103970 0x7fd4840ff000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:12.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.932+0000 7fd488a09700 1 -- 192.168.123.103:0/154193008 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd4740097e0 con 0x7fd484101050 2026-03-10T14:11:12.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.933+0000 7fd488a09700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484101050 0x7fd4841009b0 secure :-1 s=READY pgs=82 
cs=0 l=1 rev1=1 crypto rx=0x7fd47800cc60 tx=0x7fd47800cf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:12.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.933+0000 7fd481ffb700 1 -- 192.168.123.103:0/154193008 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd478007960 con 0x7fd484101050 2026-03-10T14:11:12.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.933+0000 7fd481ffb700 1 -- 192.168.123.103:0/154193008 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd47800f450 con 0x7fd484101050 2026-03-10T14:11:12.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.933+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd4840ff960 con 0x7fd484101050 2026-03-10T14:11:12.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.933+0000 7fd481ffb700 1 -- 192.168.123.103:0/154193008 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd4780186a0 con 0x7fd484101050 2026-03-10T14:11:12.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.934+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd4840ffe80 con 0x7fd484101050 2026-03-10T14:11:12.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.935+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd48418fff0 con 0x7fd484101050 2026-03-10T14:11:12.936 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.935+0000 7fd481ffb700 1 -- 192.168.123.103:0/154193008 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 
(secure 0 0 0) 0x7fd478007ac0 con 0x7fd484101050 2026-03-10T14:11:12.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.938+0000 7fd481ffb700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd46c077660 0x7fd46c079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:12.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.938+0000 7fd481ffb700 1 -- 192.168.123.103:0/154193008 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6339+0+0 (secure 0 0 0) 0x7fd47801d020 con 0x7fd484101050 2026-03-10T14:11:12.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.938+0000 7fd483fff700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd46c077660 0x7fd46c079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:12.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.939+0000 7fd483fff700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd46c077660 0x7fd46c079b10 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fd474005340 tx=0x7fd474005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:12.938 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:12.939+0000 7fd481ffb700 1 -- 192.168.123.103:0/154193008 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd478062010 con 0x7fd484101050 2026-03-10T14:11:13.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.068+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd48402d070 con 0x7fd46c077660 2026-03-10T14:11:13.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.069+0000 7fd481ffb700 1 -- 192.168.123.103:0/154193008 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fd48402d070 con 0x7fd46c077660 2026-03-10T14:11:13.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd46c077660 msgr2=0x7fd46c079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:13.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd46c077660 0x7fd46c079b10 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fd474005340 tx=0x7fd474005fb0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484101050 msgr2=0x7fd4841009b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:13.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484101050 0x7fd4841009b0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fd47800cc60 tx=0x7fd47800cf70 comp rx=0 tx=0).stop 2026-03-10T14:11:13.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 shutdown_connections 2026-03-10T14:11:13.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 --2- 
192.168.123.103:0/154193008 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd46c077660 0x7fd46c079b10 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd484101050 0x7fd4841009b0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 --2- 192.168.123.103:0/154193008 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd484103970 0x7fd4840ff000 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.072+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 >> 192.168.123.103:0/154193008 conn(0x7fd4840fa9b0 msgr2=0x7fd4840fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:13.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.073+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 shutdown_connections 2026-03-10T14:11:13.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.073+0000 7fd48ac6d700 1 -- 192.168.123.103:0/154193008 wait complete. 
2026-03-10T14:11:13.083 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:11:13.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.149+0000 7fb89cc5e700 1 -- 192.168.123.103:0/326980174 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 msgr2=0x7fb8980ff220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:13.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.149+0000 7fb89cc5e700 1 --2- 192.168.123.103:0/326980174 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 0x7fb8980ff220 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fb888009a60 tx=0x7fb888009d70 comp rx=0 tx=0).stop 2026-03-10T14:11:13.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.150+0000 7fb89cc5e700 1 -- 192.168.123.103:0/326980174 shutdown_connections 2026-03-10T14:11:13.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.150+0000 7fb89cc5e700 1 --2- 192.168.123.103:0/326980174 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb8980ff760 0x7fb8980ffbd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.150+0000 7fb89cc5e700 1 --2- 192.168.123.103:0/326980174 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 0x7fb8980ff220 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.149 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.150+0000 7fb89cc5e700 1 -- 192.168.123.103:0/326980174 >> 192.168.123.103:0/326980174 conn(0x7fb8980fa980 msgr2=0x7fb8980fcdf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:13.149 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.150+0000 7fb89cc5e700 1 -- 192.168.123.103:0/326980174 shutdown_connections 2026-03-10T14:11:13.149 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.150+0000 7fb89cc5e700 1 -- 192.168.123.103:0/326980174 wait complete. 2026-03-10T14:11:13.149 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.151+0000 7fb89cc5e700 1 Processor -- start 2026-03-10T14:11:13.149 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.151+0000 7fb89cc5e700 1 -- start start 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.151+0000 7fb89cc5e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 0x7fb898197f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.151+0000 7fb89cc5e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb8980ff760 0x7fb898198480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.151+0000 7fb89cc5e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb898198aa0 con 0x7fb8980ff760 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.151+0000 7fb89cc5e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb898198be0 con 0x7fb8980fee10 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.151+0000 7fb89659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 0x7fb898197f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.151+0000 7fb89659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 0x7fb898197f40 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:54040/0 (socket says 192.168.123.103:54040) 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.152+0000 7fb89659c700 1 -- 192.168.123.103:0/3350832163 learned_addr learned my addr 192.168.123.103:0/3350832163 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.152+0000 7fb895d9b700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb8980ff760 0x7fb898198480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:13.150 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.152+0000 7fb89659c700 1 -- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb8980ff760 msgr2=0x7fb898198480 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:13.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.152+0000 7fb89659c700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb8980ff760 0x7fb898198480 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.152+0000 7fb89659c700 1 -- 192.168.123.103:0/3350832163 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb88c0097e0 con 0x7fb8980fee10 2026-03-10T14:11:13.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.152+0000 7fb895d9b700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb8980ff760 0x7fb898198480 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:11:13.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.152+0000 7fb89659c700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 0x7fb898197f40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb888009420 tx=0x7fb88800f7b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:13.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.153+0000 7fb8877fe700 1 -- 192.168.123.103:0/3350832163 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb88801d070 con 0x7fb8980fee10 2026-03-10T14:11:13.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.153+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb888009710 con 0x7fb8980fee10 2026-03-10T14:11:13.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.153+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb89819d990 con 0x7fb8980fee10 2026-03-10T14:11:13.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.153+0000 7fb8877fe700 1 -- 192.168.123.103:0/3350832163 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb88800fd40 con 0x7fb8980fee10 2026-03-10T14:11:13.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.153+0000 7fb8877fe700 1 -- 192.168.123.103:0/3350832163 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb888017800 con 0x7fb8980fee10 2026-03-10T14:11:13.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.154+0000 7fb8877fe700 1 -- 192.168.123.103:0/3350832163 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 
38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb888017960 con 0x7fb8980fee10 2026-03-10T14:11:13.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.154+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb878005320 con 0x7fb8980fee10 2026-03-10T14:11:13.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.155+0000 7fb8877fe700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb880077870 0x7fb880079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:13.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.155+0000 7fb895d9b700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb880077870 0x7fb880079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:13.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.155+0000 7fb8877fe700 1 -- 192.168.123.103:0/3350832163 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6339+0+0 (secure 0 0 0) 0x7fb88809ace0 con 0x7fb8980fee10 2026-03-10T14:11:13.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.156+0000 7fb895d9b700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb880077870 0x7fb880079d20 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb88c005fd0 tx=0x7fb88c009500 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:13.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.158+0000 7fb8877fe700 1 -- 192.168.123.103:0/3350832163 <== mon.1 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb8880646a0 con 0x7fb8980fee10 2026-03-10T14:11:13.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.295+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb878000bf0 con 0x7fb880077870 2026-03-10T14:11:13.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.296+0000 7fb8877fe700 1 -- 192.168.123.103:0/3350832163 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fb878000bf0 con 0x7fb880077870 2026-03-10T14:11:13.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.298+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb880077870 msgr2=0x7fb880079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:13.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.298+0000 7fb89cc5e700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb880077870 0x7fb880079d20 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb88c005fd0 tx=0x7fb88c009500 comp rx=0 tx=0).stop 2026-03-10T14:11:13.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 msgr2=0x7fb898197f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:13.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 0x7fb898197f40 secure :-1 s=READY pgs=21 cs=0 l=1 
rev1=1 crypto rx=0x7fb888009420 tx=0x7fb88800f7b0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 shutdown_connections 2026-03-10T14:11:13.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb880077870 0x7fb880079d20 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb8980fee10 0x7fb898197f40 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 --2- 192.168.123.103:0/3350832163 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb8980ff760 0x7fb898198480 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 >> 192.168.123.103:0/3350832163 conn(0x7fb8980fa980 msgr2=0x7fb898107440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:13.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 shutdown_connections 2026-03-10T14:11:13.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.299+0000 7fb89cc5e700 1 -- 192.168.123.103:0/3350832163 wait complete. 
2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.377+0000 7f8603609700 1 -- 192.168.123.103:0/2616470599 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc0737b0 msgr2=0x7f85fc073c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.377+0000 7f8603609700 1 --2- 192.168.123.103:0/2616470599 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc0737b0 0x7f85fc073c20 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f85f8009b50 tx=0x7f85f8009e60 comp rx=0 tx=0).stop 2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.378+0000 7f8603609700 1 -- 192.168.123.103:0/2616470599 shutdown_connections 2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.378+0000 7f8603609700 1 --2- 192.168.123.103:0/2616470599 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc0737b0 0x7f85fc073c20 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.378+0000 7f8603609700 1 --2- 192.168.123.103:0/2616470599 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85fc074d80 0x7f85fc0731e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.378+0000 7f8603609700 1 -- 192.168.123.103:0/2616470599 >> 192.168.123.103:0/2616470599 conn(0x7f85fc0fbaa0 msgr2=0x7f85fc0fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.378+0000 7f8603609700 1 -- 192.168.123.103:0/2616470599 shutdown_connections 2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.378+0000 7f8603609700 1 -- 192.168.123.103:0/2616470599 
wait complete. 2026-03-10T14:11:13.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.379+0000 7f8603609700 1 Processor -- start 2026-03-10T14:11:13.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.379+0000 7f8603609700 1 -- start start 2026-03-10T14:11:13.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.379+0000 7f8603609700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85fc0737b0 0x7f85fc19c410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:13.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.379+0000 7f8603609700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc074d80 0x7f85fc19c950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:13.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.379+0000 7f8603609700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85fc19cf70 con 0x7f85fc074d80 2026-03-10T14:11:13.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.379+0000 7f8603609700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85fc19d0b0 con 0x7f85fc0737b0 2026-03-10T14:11:13.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.380+0000 7f86013a5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85fc0737b0 0x7f85fc19c410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:13.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.380+0000 7f86013a5700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85fc0737b0 0x7f85fc19c410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.103:54058/0 (socket says 192.168.123.103:54058) 2026-03-10T14:11:13.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.380+0000 7f86013a5700 1 -- 192.168.123.103:0/1368262229 learned_addr learned my addr 192.168.123.103:0/1368262229 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:13.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.380+0000 7f8600ba4700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc074d80 0x7f85fc19c950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:13.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.380+0000 7f86013a5700 1 -- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc074d80 msgr2=0x7f85fc19c950 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:13.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.380+0000 7f86013a5700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc074d80 0x7f85fc19c950 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:13.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.380+0000 7f86013a5700 1 -- 192.168.123.103:0/1368262229 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85f80097e0 con 0x7f85fc0737b0 2026-03-10T14:11:13.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.380+0000 7f8600ba4700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc074d80 0x7f85fc19c950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:11:13.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.381+0000 7f86013a5700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85fc0737b0 0x7f85fc19c410 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f85f000ebf0 tx=0x7f85f000c2d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:13.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.381+0000 7f85ee7fc700 1 -- 192.168.123.103:0/1368262229 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85f000cd00 con 0x7f85fc0737b0 2026-03-10T14:11:13.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.381+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85fc1a1b60 con 0x7f85fc0737b0 2026-03-10T14:11:13.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.381+0000 7f85ee7fc700 1 -- 192.168.123.103:0/1368262229 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f85f000ce60 con 0x7f85fc0737b0 2026-03-10T14:11:13.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.381+0000 7f85ee7fc700 1 -- 192.168.123.103:0/1368262229 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85f0010640 con 0x7f85fc0737b0 2026-03-10T14:11:13.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.381+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85fc1a20b0 con 0x7f85fc0737b0 2026-03-10T14:11:13.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.383+0000 7f85ee7fc700 1 -- 192.168.123.103:0/1368262229 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f85f0004750 con 
0x7f85fc0737b0 2026-03-10T14:11:13.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.383+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f85fc066e40 con 0x7f85fc0737b0 2026-03-10T14:11:13.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.383+0000 7f85ee7fc700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f85e8077870 0x7f85e8079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:13.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.384+0000 7f85ee7fc700 1 -- 192.168.123.103:0/1368262229 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6339+0+0 (secure 0 0 0) 0x7f85f0014070 con 0x7f85fc0737b0 2026-03-10T14:11:13.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.384+0000 7f8600ba4700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f85e8077870 0x7f85e8079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:13.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.385+0000 7f8600ba4700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f85e8077870 0x7f85e8079d20 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f85f800b5c0 tx=0x7f85f80058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:13.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.386+0000 7f85ee7fc700 1 -- 192.168.123.103:0/1368262229 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f85f00626b0 con 0x7f85fc0737b0 2026-03-10T14:11:13.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.507+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f85fc1a2360 con 0x7f85e8077870 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.512+0000 7f85ee7fc700 1 -- 192.168.123.103:0/1368262229 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f85fc1a2360 con 0x7f85e8077870 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (7m) 88s ago 7m 22.0M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (5m) 88s ago 7m 8904k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (7m) 101s ago 7m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (106s) 88s ago 7m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (103s) 101s ago 7m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (7m) 88s ago 7m 91.2M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (5m) 88s ago 5m 158M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:11:13.511 
INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (5m) 88s ago 5m 93.7M - 18.2.0 dc2bc1663786 5d05b227aa40
2026-03-10T14:11:13.511 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (5m) 101s ago 5m 91.2M - 18.2.0 dc2bc1663786 2494aff9d6c9
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (5m) 101s ago 5m 166M - 18.2.0 dc2bc1663786 e286b66e6f5a
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (2m) 88s ago 8m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (2m) 101s ago 7m 496M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 88s ago 8m 56.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (2m) 101s ago 7m 49.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (7m) 88s ago 7m 15.5M - 1.5.0 0da6a335fe13 ea3faf07c01f
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (7m) 101s ago 7m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (92s) 88s ago 7m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (6m) 88s ago 6m 368M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (6m) 88s ago 6m 288M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (6m) 101s ago 6m 434M 4096M 18.2.0 dc2bc1663786 99f4c3155942
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (6m) 101s ago 6m 397M 4096M 18.2.0 dc2bc1663786 127d95fabe23
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (6m) 101s ago 6m 352M 4096M 18.2.0 dc2bc1663786 63def67884f8
2026-03-10T14:11:13.512 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 88s ago 7m 53.0M - 2.43.0 a07b618ecd1d 2e394cc74058
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f85e8077870 msgr2=0x7f85e8079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f85e8077870 0x7f85e8079d20 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f85f800b5c0 tx=0x7f85f80058e0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85fc0737b0 msgr2=0x7f85fc19c410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85fc0737b0 0x7f85fc19c410 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f85f000ebf0 tx=0x7f85f000c2d0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 shutdown_connections
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f85e8077870 0x7f85e8079d20 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85fc0737b0 0x7f85fc19c410 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 --2- 192.168.123.103:0/1368262229 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85fc074d80 0x7f85fc19c950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.515+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 >> 192.168.123.103:0/1368262229 conn(0x7f85fc0fbaa0 msgr2=0x7f85fc101ec0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.516+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 shutdown_connections
2026-03-10T14:11:13.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.516+0000 7f8603609700 1 -- 192.168.123.103:0/1368262229 wait complete.
2026-03-10T14:11:13.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:13 vm04.local ceph-mon[92084]: osdmap e61: 6 total, 6 up, 6 in
2026-03-10T14:11:13.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:13 vm04.local ceph-mon[92084]: pgmap v87: 65 pgs: 2 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 679/336 objects degraded (202.083%); 0 B/s, 7 objects/s recovering
2026-03-10T14:11:13.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:13 vm04.local ceph-mon[92084]: from='client.34242 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:11:13.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.588+0000 7f6c37fff700 1 -- 192.168.123.103:0/2376169812 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c380feeb0 msgr2=0x7f6c380ff370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:13.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.588+0000 7f6c37fff700 1 --2- 192.168.123.103:0/2376169812 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c380feeb0 0x7f6c380ff370 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f6c20009b00 tx=0x7f6c20009e10 comp rx=0 tx=0).stop
2026-03-10T14:11:13.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.589+0000 7f6c37fff700 1 -- 192.168.123.103:0/2376169812 shutdown_connections
2026-03-10T14:11:13.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.589+0000 7f6c37fff700 1 --2- 192.168.123.103:0/2376169812 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c380feeb0 0x7f6c380ff370 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.589+0000 7f6c37fff700 1 --2- 192.168.123.103:0/2376169812 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c380fe560 0x7f6c380fe970 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.589+0000 7f6c37fff700 1 -- 192.168.123.103:0/2376169812 >> 192.168.123.103:0/2376169812 conn(0x7f6c380f9f90 msgr2=0x7f6c380fc3a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:11:13.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:13 vm03.local ceph-mon[103098]: osdmap e61: 6 total, 6 up, 6 in
2026-03-10T14:11:13.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:13 vm03.local ceph-mon[103098]: pgmap v87: 65 pgs: 2 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 679/336 objects degraded (202.083%); 0 B/s, 7 objects/s recovering
2026-03-10T14:11:13.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:13 vm03.local ceph-mon[103098]: from='client.34242 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T14:11:13.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.591+0000 7f6c37fff700 1 -- 192.168.123.103:0/2376169812 shutdown_connections
2026-03-10T14:11:13.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.591+0000 7f6c37fff700 1 -- 192.168.123.103:0/2376169812 wait complete.
2026-03-10T14:11:13.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.591+0000 7f6c37fff700 1 Processor -- start
2026-03-10T14:11:13.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.591+0000 7f6c37fff700 1 -- start start
2026-03-10T14:11:13.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.592+0000 7f6c37fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c380fe560 0x7f6c38071de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:11:13.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.592+0000 7f6c37fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c380feeb0 0x7f6c38072320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:11:13.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.592+0000 7f6c37fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c380728b0 con 0x7f6c380feeb0
2026-03-10T14:11:13.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.592+0000 7f6c37fff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c380729f0 con 0x7f6c380fe560
2026-03-10T14:11:13.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.592+0000 7f6c36ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c380fe560 0x7f6c38071de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:11:13.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.592+0000 7f6c36ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c380fe560 0x7f6c38071de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:54084/0 (socket says 192.168.123.103:54084)
2026-03-10T14:11:13.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.592+0000 7f6c36ffd700 1 -- 192.168.123.103:0/2501453556 learned_addr learned my addr 192.168.123.103:0/2501453556 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:11:13.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.593+0000 7f6c36ffd700 1 -- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c380feeb0 msgr2=0x7f6c38072320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:13.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.593+0000 7f6c36ffd700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c380feeb0 0x7f6c38072320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.593+0000 7f6c36ffd700 1 -- 192.168.123.103:0/2501453556 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6c200097e0 con 0x7f6c380fe560
2026-03-10T14:11:13.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.593+0000 7f6c36ffd700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c380fe560 0x7f6c38071de0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f6c2800c930 tx=0x7f6c2800ccf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:11:13.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.595+0000 7f6c3c968700 1 -- 192.168.123.103:0/2501453556 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c28007ab0 con 0x7f6c380fe560
2026-03-10T14:11:13.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.595+0000 7f6c3c968700 1 -- 192.168.123.103:0/2501453556 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6c28007c10 con 0x7f6c380fe560
2026-03-10T14:11:13.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.595+0000 7f6c3c968700 1 -- 192.168.123.103:0/2501453556 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c28018730 con 0x7f6c380fe560
2026-03-10T14:11:13.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.595+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6c381a1870 con 0x7f6c380fe560
2026-03-10T14:11:13.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.595+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6c381a1d90 con 0x7f6c380fe560
2026-03-10T14:11:13.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.596+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6c3804ea50 con 0x7f6c380fe560
2026-03-10T14:11:13.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.597+0000 7f6c3c968700 1 -- 192.168.123.103:0/2501453556 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6c2801f030 con 0x7f6c380fe560
2026-03-10T14:11:13.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.598+0000 7f6c3c968700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6c240779e0 0x7f6c24079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:11:13.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.598+0000 7f6c3c968700 1 -- 192.168.123.103:0/2501453556 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6339+0+0 (secure 0 0 0) 0x7f6c28099e90 con 0x7f6c380fe560
2026-03-10T14:11:13.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.598+0000 7f6c367fc700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6c240779e0 0x7f6c24079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:11:13.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.599+0000 7f6c367fc700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6c240779e0 0x7f6c24079e90 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6c20009ad0 tx=0x7f6c20009f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:11:13.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.600+0000 7f6c3c968700 1 -- 192.168.123.103:0/2501453556 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6c28062670 con 0x7f6c380fe560
2026-03-10T14:11:13.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.769+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f6c381a2320 con 0x7f6c380fe560
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.770+0000 7f6c3c968700 1 -- 192.168.123.103:0/2501453556 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f6c28061dc0 con 0x7f6c380fe560
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5,
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9,
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-10T14:11:13.769 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-10T14:11:13.770 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-10T14:11:13.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6c240779e0 msgr2=0x7f6c24079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6c240779e0 0x7f6c24079e90 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6c20009ad0 tx=0x7f6c20009f90 comp rx=0 tx=0).stop
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c380fe560 msgr2=0x7f6c38071de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c380fe560 0x7f6c38071de0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f6c2800c930 tx=0x7f6c2800ccf0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 shutdown_connections
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6c240779e0 0x7f6c24079e90 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6c380fe560 0x7f6c38071de0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 --2- 192.168.123.103:0/2501453556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c380feeb0 0x7f6c38072320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.773+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 >> 192.168.123.103:0/2501453556 conn(0x7f6c380f9f90 msgr2=0x7f6c38107a00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.774+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 shutdown_connections
2026-03-10T14:11:13.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.774+0000 7f6c37fff700 1 -- 192.168.123.103:0/2501453556 wait complete.
2026-03-10T14:11:13.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.846+0000 7fce011d1700 1 -- 192.168.123.103:0/2049546769 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcdfc103940 msgr2=0x7fcdfc103d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:13.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.846+0000 7fce011d1700 1 --2- 192.168.123.103:0/2049546769 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcdfc103940 0x7fcdfc103d90 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fcdec009b00 tx=0x7fcdec009e10 comp rx=0 tx=0).stop
2026-03-10T14:11:13.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.848+0000 7fce011d1700 1 -- 192.168.123.103:0/2049546769 shutdown_connections
2026-03-10T14:11:13.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.848+0000 7fce011d1700 1 --2- 192.168.123.103:0/2049546769 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcdfc103940 0x7fcdfc103d90 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.848+0000 7fce011d1700 1 --2- 192.168.123.103:0/2049546769 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc102740 0x7fcdfc102b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.847 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.848+0000 7fce011d1700 1 -- 192.168.123.103:0/2049546769 >> 192.168.123.103:0/2049546769 conn(0x7fcdfc0fdcf0 msgr2=0x7fcdfc100120 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:11:13.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.849+0000 7fce011d1700 1 -- 192.168.123.103:0/2049546769 shutdown_connections
2026-03-10T14:11:13.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.849+0000 7fce011d1700 1 -- 192.168.123.103:0/2049546769 wait complete.
2026-03-10T14:11:13.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.849+0000 7fce011d1700 1 Processor -- start
2026-03-10T14:11:13.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.850+0000 7fce011d1700 1 -- start start
2026-03-10T14:11:13.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.850+0000 7fce011d1700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcdfc102740 0x7fcdfc197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:11:13.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.850+0000 7fce011d1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103940 0x7fcdfc198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:11:13.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.850+0000 7fcdfa59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103940 0x7fcdfc198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:11:13.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.850+0000 7fcdfa59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103940 0x7fcdfc198530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45346/0 (socket says 192.168.123.103:45346)
2026-03-10T14:11:13.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.850+0000 7fcdfa59c700 1 -- 192.168.123.103:0/3154364938 learned_addr learned my addr 192.168.123.103:0/3154364938 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:11:13.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.850+0000 7fce011d1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcdfc198b50 con 0x7fcdfc103940
2026-03-10T14:11:13.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.850+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcdfc198c90 con 0x7fcdfc102740
2026-03-10T14:11:13.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.851+0000 7fcdfa59c700 1 -- 192.168.123.103:0/3154364938 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcdfc102740 msgr2=0x7fcdfc197ff0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T14:11:13.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.851+0000 7fcdfa59c700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcdfc102740 0x7fcdfc197ff0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:13.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.851+0000 7fcdfa59c700 1 -- 192.168.123.103:0/3154364938 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcde4009710 con 0x7fcdfc103940
2026-03-10T14:11:13.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.851+0000 7fcdfa59c700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103940 0x7fcdfc198530 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fcdec004940 tx=0x7fcdec004a20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:11:13.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.852+0000 7fcdf3fff700 1 -- 192.168.123.103:0/3154364938 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdec01d070 con 0x7fcdfc103940
2026-03-10T14:11:13.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.852+0000 7fcdf3fff700 1 -- 192.168.123.103:0/3154364938 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcdec004e20 con 0x7fcdfc103940
2026-03-10T14:11:13.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.852+0000 7fcdf3fff700 1 -- 192.168.123.103:0/3154364938 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdec00f700 con 0x7fcdfc103940
2026-03-10T14:11:13.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.852+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcdec0097e0 con 0x7fcdfc103940
2026-03-10T14:11:13.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.852+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcdfc19da40 con 0x7fcdfc103940
2026-03-10T14:11:13.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.854+0000 7fcdf3fff700 1 -- 192.168.123.103:0/3154364938 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcdec00f860 con 0x7fcdfc103940
2026-03-10T14:11:13.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.854+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcdfc066e40 con 0x7fcdfc103940
2026-03-10T14:11:13.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.857+0000 7fcdf3fff700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcde8077870 0x7fcde8079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:11:13.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.857+0000 7fcdfad9d700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcde8077870 0x7fcde8079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:11:13.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.858+0000 7fcdfad9d700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcde8077870 0x7fcde8079d20 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fcdfc1037a0 tx=0x7fcde4009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:11:13.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.857+0000 7fcdf3fff700 1 -- 192.168.123.103:0/3154364938 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6339+0+0 (secure 0 0 0) 0x7fcdec09b1b0 con 0x7fcdfc103940
2026-03-10T14:11:13.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:13.858+0000 7fcdf3fff700 1 -- 192.168.123.103:0/3154364938 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcdec064a20 con 0x7fcdfc103940
2026-03-10T14:11:14.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.009+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fcdfc19dd20 con 0x7fcdfc103940
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.010+0000 7fcdf3fff700 1 -- 192.168.123.103:0/3154364938 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1973 (secure 0 0 0) 0x7fcdec0277d0 con 0x7fcdfc103940
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:e14
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470}
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-10T14:11:14.009 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:11:14.010 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:11:14.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.013+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcde8077870 msgr2=0x7fcde8079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.013+0000 7fce011d1700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcde8077870 0x7fcde8079d20 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fcdfc1037a0 tx=0x7fcde4009450 comp rx=0 tx=0).stop
2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.013+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103940 msgr2=0x7fcdfc198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.013+0000 7fce011d1700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103940 0x7fcdfc198530 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fcdec004940 tx=0x7fcdec004a20 comp rx=0 tx=0).stop
2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.013+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 shutdown_connections
2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.013+0000 7fce011d1700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcde8077870 0x7fcde8079d20 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.014+0000 7fce011d1700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcdfc102740 0x7fcdfc197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.014+0000 7fce011d1700 1 --2- 192.168.123.103:0/3154364938 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103940 0x7fcdfc198530 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.014+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 >> 192.168.123.103:0/3154364938 conn(0x7fcdfc0fdcf0 msgr2=0x7fcdfc106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:14.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.014+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 shutdown_connections 2026-03-10T14:11:14.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.014+0000 7fce011d1700 1 -- 192.168.123.103:0/3154364938 wait complete. 2026-03-10T14:11:14.013 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:11:14.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.089+0000 7fdb9648a700 1 -- 192.168.123.103:0/829062130 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90103680 msgr2=0x7fdb90105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:14.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.089+0000 7fdb9648a700 1 --2- 192.168.123.103:0/829062130 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90103680 0x7fdb90105ac0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fdb80009b00 tx=0x7fdb80009e10 comp rx=0 tx=0).stop 2026-03-10T14:11:14.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.089+0000 7fdb9648a700 1 -- 192.168.123.103:0/829062130 shutdown_connections 2026-03-10T14:11:14.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.089+0000 7fdb9648a700 1 --2- 192.168.123.103:0/829062130 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90103680 0x7fdb90105ac0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.089+0000 7fdb9648a700 1 --2- 192.168.123.103:0/829062130 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb90069180 
0x7fdb90103140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.089+0000 7fdb9648a700 1 -- 192.168.123.103:0/829062130 >> 192.168.123.103:0/829062130 conn(0x7fdb900faa70 msgr2=0x7fdb900fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:14.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.089+0000 7fdb9648a700 1 -- 192.168.123.103:0/829062130 shutdown_connections 2026-03-10T14:11:14.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.089+0000 7fdb9648a700 1 -- 192.168.123.103:0/829062130 wait complete. 2026-03-10T14:11:14.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.090+0000 7fdb9648a700 1 Processor -- start 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.090+0000 7fdb9648a700 1 -- start start 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.090+0000 7fdb9648a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90069180 0x7fdb90100a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.090+0000 7fdb9648a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb90103680 0x7fdb900ff0e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.091+0000 7fdb8f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb90103680 0x7fdb900ff0e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.091+0000 7fdb8f7fe700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb90103680 0x7fdb900ff0e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:57150/0 (socket says 192.168.123.103:57150) 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.091+0000 7fdb8f7fe700 1 -- 192.168.123.103:0/2844811523 learned_addr learned my addr 192.168.123.103:0/2844811523 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.091+0000 7fdb8ffff700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90069180 0x7fdb90100a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.090+0000 7fdb9648a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb900ff620 con 0x7fdb90069180 2026-03-10T14:11:14.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.091+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb900ff760 con 0x7fdb90103680 2026-03-10T14:11:14.090 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.091+0000 7fdb8ffff700 1 -- 192.168.123.103:0/2844811523 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb90103680 msgr2=0x7fdb900ff0e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:14.090 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.091+0000 7fdb8ffff700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb90103680 0x7fdb900ff0e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.090 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.091+0000 7fdb8ffff700 1 -- 192.168.123.103:0/2844811523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb800097e0 con 0x7fdb90069180 2026-03-10T14:11:14.090 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.092+0000 7fdb8ffff700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90069180 0x7fdb90100a90 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fdb90069570 tx=0x7fdb7800cd00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:14.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.092+0000 7fdb8d7fa700 1 -- 192.168.123.103:0/2844811523 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb780041d0 con 0x7fdb90069180 2026-03-10T14:11:14.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.092+0000 7fdb8d7fa700 1 -- 192.168.123.103:0/2844811523 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdb78004d10 con 0x7fdb90069180 2026-03-10T14:11:14.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.092+0000 7fdb8d7fa700 1 -- 192.168.123.103:0/2844811523 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb780056b0 con 0x7fdb90069180 2026-03-10T14:11:14.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.092+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdb900ffa40 con 0x7fdb90069180 2026-03-10T14:11:14.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.092+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fdb9019dea0 con 0x7fdb90069180 2026-03-10T14:11:14.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.094+0000 7fdb8d7fa700 1 -- 192.168.123.103:0/2844811523 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdb780076f0 con 0x7fdb90069180 2026-03-10T14:11:14.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.094+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdb9018ffe0 con 0x7fdb90069180 2026-03-10T14:11:14.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.096+0000 7fdb8d7fa700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdb7c077870 0x7fdb7c079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:14.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.096+0000 7fdb8d7fa700 1 -- 192.168.123.103:0/2844811523 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6339+0+0 (secure 0 0 0) 0x7fdb780994e0 con 0x7fdb90069180 2026-03-10T14:11:14.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.098+0000 7fdb8f7fe700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdb7c077870 0x7fdb7c079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:14.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.098+0000 7fdb8f7fe700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdb7c077870 0x7fdb7c079d20 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fdb80006010 tx=0x7fdb8000b540 comp rx=0 tx=0).ready 
entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:14.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.103+0000 7fdb8d7fa700 1 -- 192.168.123.103:0/2844811523 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdb78062d40 con 0x7fdb90069180 2026-03-10T14:11:14.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.241+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fdb900ffbd0 con 0x7fdb7c077870 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.242+0000 7fdb8d7fa700 1 -- 192.168.123.103:0/2844811523 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fdb900ffbd0 con 0x7fdb7c077870 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T14:11:14.241 
INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:11:14.241 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:11:14.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdb7c077870 msgr2=0x7fdb7c079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdb7c077870 0x7fdb7c079d20 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fdb80006010 tx=0x7fdb8000b540 comp rx=0 tx=0).stop 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90069180 msgr2=0x7fdb90100a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90069180 0x7fdb90100a90 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fdb90069570 tx=0x7fdb7800cd00 comp rx=0 tx=0).stop 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 shutdown_connections 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdb7c077870 0x7fdb7c079d20 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb90069180 0x7fdb90100a90 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 --2- 192.168.123.103:0/2844811523 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb90103680 0x7fdb900ff0e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 >> 192.168.123.103:0/2844811523 conn(0x7fdb900faa70 msgr2=0x7fdb900fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 shutdown_connections 2026-03-10T14:11:14.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.245+0000 7fdb9648a700 1 -- 192.168.123.103:0/2844811523 wait complete. 
2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.323+0000 7fe7d9d6c700 1 -- 192.168.123.103:0/1182825666 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 msgr2=0x7fe7d4105d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.323+0000 7fe7d9d6c700 1 --2- 192.168.123.103:0/1182825666 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 0x7fe7d4105d50 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fe7c4009b00 tx=0x7fe7c4009e10 comp rx=0 tx=0).stop 2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.323+0000 7fe7d9d6c700 1 -- 192.168.123.103:0/1182825666 shutdown_connections 2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.323+0000 7fe7d9d6c700 1 --2- 192.168.123.103:0/1182825666 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 0x7fe7d4105d50 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.323+0000 7fe7d9d6c700 1 --2- 192.168.123.103:0/1182825666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7d4101050 0x7fe7d4103430 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.323+0000 7fe7d9d6c700 1 -- 192.168.123.103:0/1182825666 >> 192.168.123.103:0/1182825666 conn(0x7fe7d40fa9b0 msgr2=0x7fe7d40fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.323+0000 7fe7d9d6c700 1 -- 192.168.123.103:0/1182825666 shutdown_connections 2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d9d6c700 1 -- 192.168.123.103:0/1182825666 
wait complete. 2026-03-10T14:11:14.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d9d6c700 1 Processor -- start 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d9d6c700 1 -- start start 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d9d6c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7d4101050 0x7fe7d4197fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d9d6c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 0x7fe7d4198520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d2ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 0x7fe7d4198520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d2ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 0x7fe7d4198520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:57172/0 (socket says 192.168.123.103:57172) 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d2ffd700 1 -- 192.168.123.103:0/2898017811 learned_addr learned my addr 192.168.123.103:0/2898017811 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.325+0000 7fe7d37fe700 1 --2- 192.168.123.103:0/2898017811 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7d4101050 0x7fe7d4197fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.324+0000 7fe7d9d6c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7d4198b40 con 0x7fe7d4101050 2026-03-10T14:11:14.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.325+0000 7fe7d9d6c700 1 -- 192.168.123.103:0/2898017811 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe7d4198c80 con 0x7fe7d4103970 2026-03-10T14:11:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.325+0000 7fe7d2ffd700 1 -- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7d4101050 msgr2=0x7fe7d4197fe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.325+0000 7fe7d2ffd700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7d4101050 0x7fe7d4197fe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.325+0000 7fe7d2ffd700 1 -- 192.168.123.103:0/2898017811 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe7bc009710 con 0x7fe7d4103970 2026-03-10T14:11:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.325+0000 7fe7d37fe700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7d4101050 0x7fe7d4197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T14:11:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.326+0000 7fe7d2ffd700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 0x7fe7d4198520 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fe7c400bb70 tx=0x7fe7c400bba0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.326+0000 7fe7d0ff9700 1 -- 192.168.123.103:0/2898017811 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe7c401d070 con 0x7fe7d4103970 2026-03-10T14:11:14.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.326+0000 7fe7d9d6c700 1 -- 192.168.123.103:0/2898017811 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe7c40097e0 con 0x7fe7d4103970 2026-03-10T14:11:14.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.326+0000 7fe7d9d6c700 1 -- 192.168.123.103:0/2898017811 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe7d419da30 con 0x7fe7d4103970 2026-03-10T14:11:14.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.326+0000 7fe7d0ff9700 1 -- 192.168.123.103:0/2898017811 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe7c4022470 con 0x7fe7d4103970 2026-03-10T14:11:14.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.326+0000 7fe7d0ff9700 1 -- 192.168.123.103:0/2898017811 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe7c400f650 con 0x7fe7d4103970 2026-03-10T14:11:14.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.328+0000 7fe7ca7fc700 1 -- 192.168.123.103:0/2898017811 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fe7d4066e40 con 0x7fe7d4103970 2026-03-10T14:11:14.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.328+0000 7fe7d0ff9700 1 -- 192.168.123.103:0/2898017811 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe7c400f7b0 con 0x7fe7d4103970 2026-03-10T14:11:14.327 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.329+0000 7fe7d0ff9700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe7c0077870 0x7fe7c0079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:14.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.329+0000 7fe7d37fe700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe7c0077870 0x7fe7c0079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:14.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.330+0000 7fe7d37fe700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe7c0077870 0x7fe7c0079d20 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fe7bc00f790 tx=0x7fe7bc009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:14.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.330+0000 7fe7d0ff9700 1 -- 192.168.123.103:0/2898017811 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(61..61 src has 1..61) v4 ==== 6339+0+0 (secure 0 0 0) 0x7fe7c409b4d0 con 0x7fe7d4103970 2026-03-10T14:11:14.330 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.332+0000 7fe7d0ff9700 1 -- 192.168.123.103:0/2898017811 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe7c4063c30 con 0x7fe7d4103970 2026-03-10T14:11:14.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:14 vm03.local ceph-mon[103098]: from='client.44171 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:14.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:14 vm03.local ceph-mon[103098]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:14.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:14 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2501453556' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:14.359 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:14 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3154364938' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:11:14.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.509+0000 7fe7ca7fc700 1 -- 192.168.123.103:0/2898017811 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fe7d4061b80 con 0x7fe7d4103970 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.510+0000 7fe7d0ff9700 1 -- 192.168.123.103:0/2898017811 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1187 (secure 0 0 0) 0x7fe7c4063380 con 0x7fe7d4103970 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 679/336 objects degraded (202.083%), 6 pgs degraded, 8 pgs undersized 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 679/336 objects degraded (202.083%), 6 pgs degraded, 8 pgs undersized 2026-03-10T14:11:14.509 
INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is stuck undersized for 79s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.6 is stuck undersized for 79s, current state active+recovering+undersized+remapped, last acting [1,4] 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.b is stuck undersized for 79s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.f is stuck undersized for 79s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,3] 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.10 is stuck undersized for 79s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is stuck undersized for 79s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.15 is stuck undersized for 79s, current state active+recovering+undersized+remapped, last acting [3,4] 2026-03-10T14:11:14.509 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.18 is stuck undersized for 79s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1] 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 -- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe7c0077870 msgr2=0x7fe7c0079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe7c0077870 
0x7fe7c0079d20 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fe7bc00f790 tx=0x7fe7bc009450 comp rx=0 tx=0).stop 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 -- 192.168.123.103:0/2898017811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 msgr2=0x7fe7d4198520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 0x7fe7d4198520 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fe7c400bb70 tx=0x7fe7c400bba0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 -- 192.168.123.103:0/2898017811 shutdown_connections 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe7c0077870 0x7fe7c0079d20 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7d4101050 0x7fe7d4197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 --2- 192.168.123.103:0/2898017811 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe7d4103970 0x7fe7d4198520 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.513+0000 7fe7ca7fc700 1 
-- 192.168.123.103:0/2898017811 >> 192.168.123.103:0/2898017811 conn(0x7fe7d40fa9b0 msgr2=0x7fe7d40ff6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.514+0000 7fe7ca7fc700 1 -- 192.168.123.103:0/2898017811 shutdown_connections 2026-03-10T14:11:14.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:14.514+0000 7fe7ca7fc700 1 -- 192.168.123.103:0/2898017811 wait complete. 2026-03-10T14:11:14.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:14 vm04.local ceph-mon[92084]: from='client.44171 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:14.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:14 vm04.local ceph-mon[92084]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:14.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:14 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/2501453556' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:14.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:14 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/3154364938' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:11:15.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:15 vm04.local ceph-mon[92084]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:15.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:15 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2898017811' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:11:15.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:15 vm04.local ceph-mon[92084]: pgmap v88: 65 pgs: 2 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 679/336 objects degraded (202.083%); 0 B/s, 7 objects/s recovering 2026-03-10T14:11:15.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:15 vm03.local ceph-mon[103098]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:15.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:15 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2898017811' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:11:15.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:15 vm03.local ceph-mon[103098]: pgmap v88: 65 pgs: 2 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 679/336 objects degraded (202.083%); 0 B/s, 7 objects/s recovering 2026-03-10T14:11:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:16 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 679/336 objects degraded (202.083%), 6 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:16 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:11:16.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:16 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 679/336 
objects degraded (202.083%), 6 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:16.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:16 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:11:17.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:17 vm03.local ceph-mon[103098]: pgmap v89: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 679/336 objects degraded (202.083%); 0 B/s, 4 objects/s recovering 2026-03-10T14:11:17.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:17 vm04.local ceph-mon[92084]: pgmap v89: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 679/336 objects degraded (202.083%); 0 B/s, 4 objects/s recovering 2026-03-10T14:11:19.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:19.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:18 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:19.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:18 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T14:11:19.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:19.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:18 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:19.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:18 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T14:11:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:19 vm04.local ceph-mon[92084]: pgmap v90: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.4 KiB/s rd, 4 op/s; 679/336 objects degraded (202.083%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:20.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:19 vm03.local ceph-mon[103098]: pgmap v90: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.4 KiB/s rd, 4 op/s; 679/336 objects degraded (202.083%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:21.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:21 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 679/336 objects degraded (202.083%), 6 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:21.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:11:21 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 679/336 objects degraded (202.083%), 6 pgs degraded, 7 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:22.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:22 vm04.local ceph-mon[92084]: pgmap v91: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 679/336 objects degraded (202.083%); 0 B/s, 7 objects/s recovering 2026-03-10T14:11:22.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:22 vm03.local ceph-mon[103098]: pgmap v91: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 679/336 objects degraded (202.083%); 0 B/s, 7 objects/s recovering 2026-03-10T14:11:23.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:23 vm03.local ceph-mon[103098]: pgmap v92: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 679/336 objects degraded (202.083%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:23.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:23 vm04.local ceph-mon[92084]: pgmap v92: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 679/336 objects degraded (202.083%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:25.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:25 vm04.local ceph-mon[92084]: osdmap e62: 6 total, 6 up, 6 in 2026-03-10T14:11:25.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:25 vm04.local ceph-mon[92084]: pgmap v94: 65 pgs: 1 
active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 679/336 objects degraded (202.083%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:25.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:25 vm03.local ceph-mon[103098]: osdmap e62: 6 total, 6 up, 6 in 2026-03-10T14:11:25.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:25 vm03.local ceph-mon[103098]: pgmap v94: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 679/336 objects degraded (202.083%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:26.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:26 vm04.local ceph-mon[92084]: osdmap e63: 6 total, 6 up, 6 in 2026-03-10T14:11:26.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:26 vm03.local ceph-mon[103098]: osdmap e63: 6 total, 6 up, 6 in 2026-03-10T14:11:27.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:27 vm03.local ceph-mon[103098]: pgmap v96: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 679/336 objects degraded (202.083%); 0 B/s, 4 objects/s recovering 2026-03-10T14:11:27.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:27 vm04.local ceph-mon[92084]: pgmap v96: 65 pgs: 1 active+recovering+undersized+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 679/336 objects degraded (202.083%); 0 B/s, 4 objects/s recovering 2026-03-10T14:11:29.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:29 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 561/336 
objects degraded (166.964%), 5 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:29.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:29 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 561/336 objects degraded (166.964%), 5 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:30.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:30 vm03.local ceph-mon[103098]: pgmap v97: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 561/336 objects degraded (166.964%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:30.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:30 vm04.local ceph-mon[92084]: pgmap v97: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 561/336 objects degraded (166.964%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:11:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:11:32.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:32 vm04.local ceph-mon[92084]: pgmap v98: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 561/336 objects degraded (166.964%); 0 B/s, 5 objects/s recovering 2026-03-10T14:11:32.608 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:32 vm03.local ceph-mon[103098]: pgmap v98: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 561/336 objects degraded (166.964%); 0 B/s, 5 objects/s recovering 2026-03-10T14:11:33.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:33 vm03.local ceph-mon[103098]: pgmap v99: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.4 KiB/s rd, 4 op/s; 561/336 objects degraded (166.964%); 0 B/s, 10 objects/s recovering 2026-03-10T14:11:33.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:33.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:33 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:33.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:33 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T14:11:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:33 vm04.local ceph-mon[92084]: pgmap v99: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.4 KiB/s rd, 4 op/s; 561/336 objects degraded (166.964%); 0 B/s, 10 objects/s recovering 2026-03-10T14:11:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:33 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:33 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T14:11:36.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:35 vm04.local ceph-mon[92084]: pgmap v100: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 561/336 objects degraded (166.964%); 0 B/s, 8 objects/s recovering 2026-03-10T14:11:36.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:35 vm03.local ceph-mon[103098]: pgmap v100: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 561/336 objects degraded (166.964%); 0 B/s, 8 objects/s recovering 2026-03-10T14:11:38.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:37 vm04.local ceph-mon[92084]: pgmap v101: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.2 KiB/s rd, 3 op/s; 561/336 objects degraded (166.964%); 0 B/s, 11 objects/s recovering 2026-03-10T14:11:38.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:37 vm03.local ceph-mon[103098]: pgmap v101: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.2 KiB/s rd, 3 op/s; 561/336 objects degraded (166.964%); 0 B/s, 11 objects/s recovering 2026-03-10T14:11:39.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:38 vm04.local ceph-mon[92084]: osdmap e64: 6 total, 6 up, 6 in 2026-03-10T14:11:39.357 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:38 vm03.local ceph-mon[103098]: osdmap e64: 6 total, 6 up, 6 in 2026-03-10T14:11:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:39 vm04.local ceph-mon[92084]: pgmap v103: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 561/336 objects degraded (166.964%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:39 vm04.local ceph-mon[92084]: osdmap e65: 6 total, 6 up, 6 in 2026-03-10T14:11:40.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:39 vm03.local ceph-mon[103098]: pgmap v103: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 561/336 objects degraded (166.964%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:40.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:39 vm03.local ceph-mon[103098]: osdmap e65: 6 total, 6 up, 6 in 2026-03-10T14:11:42.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:41 vm04.local ceph-mon[92084]: pgmap v105: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 561/336 objects degraded (166.964%); 0 B/s, 5 objects/s recovering 2026-03-10T14:11:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:41 vm03.local ceph-mon[103098]: pgmap v105: 65 pgs: 1 active+recovering+undersized+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 561/336 objects degraded (166.964%); 0 B/s, 5 objects/s recovering 2026-03-10T14:11:43.313 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:42 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 447/336 objects degraded (133.036%), 4 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:43.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:42 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 447/336 objects degraded (133.036%), 4 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:44.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:43 vm04.local ceph-mon[92084]: pgmap v106: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 447/336 objects degraded (133.036%); 0 B/s, 10 objects/s recovering 2026-03-10T14:11:44.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:43 vm03.local ceph-mon[103098]: pgmap v106: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 447/336 objects degraded (133.036%); 0 B/s, 10 objects/s recovering 2026-03-10T14:11:44.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.590+0000 7fcc43c97700 1 -- 192.168.123.103:0/3948739013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 msgr2=0x7fcc3c103b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.590+0000 7fcc43c97700 1 --2- 192.168.123.103:0/3948739013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 0x7fcc3c103b80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fcc38009b00 tx=0x7fcc38009e10 comp rx=0 tx=0).stop 2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.591+0000 7fcc43c97700 1 -- 
192.168.123.103:0/3948739013 shutdown_connections 2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.591+0000 7fcc43c97700 1 --2- 192.168.123.103:0/3948739013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 0x7fcc3c103b80 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.591+0000 7fcc43c97700 1 --2- 192.168.123.103:0/3948739013 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc3c0feda0 0x7fcc3c1011c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.591+0000 7fcc43c97700 1 -- 192.168.123.103:0/3948739013 >> 192.168.123.103:0/3948739013 conn(0x7fcc3c0fa9b0 msgr2=0x7fcc3c0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.591+0000 7fcc43c97700 1 -- 192.168.123.103:0/3948739013 shutdown_connections 2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.591+0000 7fcc43c97700 1 -- 192.168.123.103:0/3948739013 wait complete. 
2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc43c97700 1 Processor -- start 2026-03-10T14:11:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc43c97700 1 -- start start 2026-03-10T14:11:44.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc43c97700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc3c0feda0 0x7fcc3c19c3d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:44.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc43c97700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 0x7fcc3c19c910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:44.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc43c97700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc3c19cea0 con 0x7fcc3c101700 2026-03-10T14:11:44.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc43c97700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc3c19cfe0 con 0x7fcc3c0feda0 2026-03-10T14:11:44.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc41232700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 0x7fcc3c19c910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:44.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc41232700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 0x7fcc3c19c910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:47806/0 (socket says 192.168.123.103:47806) 2026-03-10T14:11:44.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.592+0000 7fcc41232700 1 -- 192.168.123.103:0/1965262216 learned_addr learned my addr 192.168.123.103:0/1965262216 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:44.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.593+0000 7fcc41232700 1 -- 192.168.123.103:0/1965262216 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc3c0feda0 msgr2=0x7fcc3c19c3d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:44.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.593+0000 7fcc41a33700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc3c0feda0 0x7fcc3c19c3d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:44.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.593+0000 7fcc41232700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc3c0feda0 0x7fcc3c19c3d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.593+0000 7fcc41232700 1 -- 192.168.123.103:0/1965262216 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcc380097e0 con 0x7fcc3c101700 2026-03-10T14:11:44.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.593+0000 7fcc41a33700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc3c0feda0 0x7fcc3c19c3d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:11:44.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.593+0000 7fcc41232700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 0x7fcc3c19c910 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fcc38005230 tx=0x7fcc380056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:44.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.594+0000 7fcc2effd700 1 -- 192.168.123.103:0/1965262216 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc3801d070 con 0x7fcc3c101700 2026-03-10T14:11:44.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.594+0000 7fcc2effd700 1 -- 192.168.123.103:0/1965262216 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcc3800bc50 con 0x7fcc3c101700 2026-03-10T14:11:44.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.594+0000 7fcc2effd700 1 -- 192.168.123.103:0/1965262216 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc3800f850 con 0x7fcc3c101700 2026-03-10T14:11:44.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.594+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcc3c1a1a40 con 0x7fcc3c101700 2026-03-10T14:11:44.593 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.594+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcc3c1a1f00 con 0x7fcc3c101700 2026-03-10T14:11:44.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.595+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fcc3c1965e0 con 0x7fcc3c101700 2026-03-10T14:11:44.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.596+0000 7fcc2effd700 1 -- 192.168.123.103:0/1965262216 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcc38022b70 con 0x7fcc3c101700 2026-03-10T14:11:44.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.597+0000 7fcc2effd700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcc280779e0 0x7fcc28079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:44.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.597+0000 7fcc2effd700 1 -- 192.168.123.103:0/1965262216 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6281+0+0 (secure 0 0 0) 0x7fcc3809bbc0 con 0x7fcc3c101700 2026-03-10T14:11:44.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.597+0000 7fcc41a33700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcc280779e0 0x7fcc28079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:44.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.599+0000 7fcc41a33700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcc280779e0 0x7fcc28079e90 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fcc30005fd0 tx=0x7fcc30005e20 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:44.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.599+0000 7fcc2effd700 1 -- 192.168.123.103:0/1965262216 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcc380643e0 con 0x7fcc3c101700 2026-03-10T14:11:44.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.741+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fcc3c061190 con 0x7fcc280779e0 2026-03-10T14:11:44.742 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.743+0000 7fcc2effd700 1 -- 192.168.123.103:0/1965262216 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fcc3c061190 con 0x7fcc280779e0 2026-03-10T14:11:44.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.745+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcc280779e0 msgr2=0x7fcc28079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:44.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.745+0000 7fcc43c97700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcc280779e0 0x7fcc28079e90 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fcc30005fd0 tx=0x7fcc30005e20 comp rx=0 tx=0).stop 2026-03-10T14:11:44.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.745+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 msgr2=0x7fcc3c19c910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:44.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.745+0000 7fcc43c97700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 0x7fcc3c19c910 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fcc38005230 
tx=0x7fcc380056c0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.746+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 shutdown_connections 2026-03-10T14:11:44.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.746+0000 7fcc43c97700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fcc280779e0 0x7fcc28079e90 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.746+0000 7fcc43c97700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fcc3c0feda0 0x7fcc3c19c3d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.746+0000 7fcc43c97700 1 --2- 192.168.123.103:0/1965262216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc3c101700 0x7fcc3c19c910 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.746+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 >> 192.168.123.103:0/1965262216 conn(0x7fcc3c0fa9b0 msgr2=0x7fcc3c0fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:44.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.746+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 shutdown_connections 2026-03-10T14:11:44.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.746+0000 7fcc43c97700 1 -- 192.168.123.103:0/1965262216 wait complete. 
2026-03-10T14:11:44.754 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:11:44.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.818+0000 7f2f47114700 1 -- 192.168.123.103:0/2656859829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 msgr2=0x7f2f40100b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:44.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.818+0000 7f2f47114700 1 --2- 192.168.123.103:0/2656859829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 0x7f2f40100b70 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f2f34009b00 tx=0x7f2f34009e10 comp rx=0 tx=0).stop 2026-03-10T14:11:44.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.818+0000 7f2f47114700 1 -- 192.168.123.103:0/2656859829 shutdown_connections 2026-03-10T14:11:44.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.818+0000 7f2f47114700 1 --2- 192.168.123.103:0/2656859829 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 0x7f2f40100b70 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.818+0000 7f2f47114700 1 --2- 192.168.123.103:0/2656859829 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f400ff460 0x7f2f400ff870 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.818+0000 7f2f47114700 1 -- 192.168.123.103:0/2656859829 >> 192.168.123.103:0/2656859829 conn(0x7f2f400faa70 msgr2=0x7f2f400fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:44.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.819+0000 7f2f47114700 1 -- 192.168.123.103:0/2656859829 shutdown_connections 2026-03-10T14:11:44.817 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.819+0000 7f2f47114700 1 -- 192.168.123.103:0/2656859829 wait complete. 2026-03-10T14:11:44.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.819+0000 7f2f47114700 1 Processor -- start 2026-03-10T14:11:44.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.819+0000 7f2f47114700 1 -- start start 2026-03-10T14:11:44.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.820+0000 7f2f47114700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f400ff460 0x7f2f401074b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.820+0000 7f2f47114700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 0x7f2f40105b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.820+0000 7f2f47114700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f40106040 con 0x7f2f40100700 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.820+0000 7f2f47114700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f40106180 con 0x7f2f400ff460 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.820+0000 7f2f3ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 0x7f2f40105b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.820+0000 7f2f3ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 0x7f2f40105b00 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47822/0 (socket says 192.168.123.103:47822) 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.820+0000 7f2f3ffff700 1 -- 192.168.123.103:0/3307364001 learned_addr learned my addr 192.168.123.103:0/3307364001 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.820+0000 7f2f3ffff700 1 -- 192.168.123.103:0/3307364001 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f400ff460 msgr2=0x7f2f401074b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.821+0000 7f2f3ffff700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f400ff460 0x7f2f401074b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.821+0000 7f2f3ffff700 1 -- 192.168.123.103:0/3307364001 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2f340097e0 con 0x7f2f40100700 2026-03-10T14:11:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.821+0000 7f2f3ffff700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 0x7f2f40105b00 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f2f34005230 tx=0x7f2f340056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:44.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.821+0000 7f2f3dffb700 1 -- 192.168.123.103:0/3307364001 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f3401d070 con 0x7f2f40100700 
2026-03-10T14:11:44.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.821+0000 7f2f3dffb700 1 -- 192.168.123.103:0/3307364001 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2f3400bc50 con 0x7f2f40100700 2026-03-10T14:11:44.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.821+0000 7f2f3dffb700 1 -- 192.168.123.103:0/3307364001 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f3400f800 con 0x7f2f40100700 2026-03-10T14:11:44.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.821+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2f40106400 con 0x7f2f40100700 2026-03-10T14:11:44.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.821+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2f40106810 con 0x7f2f40100700 2026-03-10T14:11:44.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.822+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2f40066e40 con 0x7f2f40100700 2026-03-10T14:11:44.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.824+0000 7f2f3dffb700 1 -- 192.168.123.103:0/3307364001 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2f34022ae0 con 0x7f2f40100700 2026-03-10T14:11:44.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.824+0000 7f2f3dffb700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2f300779e0 0x7f2f30079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:44.823 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.824+0000 7f2f3dffb700 1 -- 192.168.123.103:0/3307364001 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6281+0+0 (secure 0 0 0) 0x7f2f3409bc70 con 0x7f2f40100700 2026-03-10T14:11:44.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.824+0000 7f2f44eb0700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2f300779e0 0x7f2f30079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:44.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.825+0000 7f2f44eb0700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2f300779e0 0x7f2f30079e90 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f2f2c00a9b0 tx=0x7f2f2c005c90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:44.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.826+0000 7f2f3dffb700 1 -- 192.168.123.103:0/3307364001 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2f34064490 con 0x7f2f40100700 2026-03-10T14:11:44.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.958+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2f40105050 con 0x7f2f300779e0 2026-03-10T14:11:44.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.959+0000 7f2f3dffb700 1 -- 192.168.123.103:0/3307364001 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f2f40105050 con 
0x7f2f300779e0 2026-03-10T14:11:44.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2f300779e0 msgr2=0x7f2f30079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:44.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2f300779e0 0x7f2f30079e90 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f2f2c00a9b0 tx=0x7f2f2c005c90 comp rx=0 tx=0).stop 2026-03-10T14:11:44.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 msgr2=0x7f2f40105b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:44.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 0x7f2f40105b00 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f2f34005230 tx=0x7f2f340056c0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 shutdown_connections 2026-03-10T14:11:44.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2f300779e0 0x7f2f30079e90 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 --2- 192.168.123.103:0/3307364001 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2f400ff460 0x7f2f401074b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 --2- 192.168.123.103:0/3307364001 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f40100700 0x7f2f40105b00 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:44.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.962+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 >> 192.168.123.103:0/3307364001 conn(0x7f2f400faa70 msgr2=0x7f2f40103930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:44.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.963+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 shutdown_connections 2026-03-10T14:11:44.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:44.963+0000 7f2f47114700 1 -- 192.168.123.103:0/3307364001 wait complete. 
2026-03-10T14:11:45.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.035+0000 7feebdd2f700 1 -- 192.168.123.103:0/2573600746 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 msgr2=0x7feeb8102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.035+0000 7feebdd2f700 1 --2- 192.168.123.103:0/2573600746 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 0x7feeb8102b70 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7feea0009b50 tx=0x7feea0009e60 comp rx=0 tx=0).stop 2026-03-10T14:11:45.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.036+0000 7feebdd2f700 1 -- 192.168.123.103:0/2573600746 shutdown_connections 2026-03-10T14:11:45.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.036+0000 7feebdd2f700 1 --2- 192.168.123.103:0/2573600746 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feeb8103960 0x7feeb8103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.034 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.036+0000 7feebdd2f700 1 --2- 192.168.123.103:0/2573600746 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 0x7feeb8102b70 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.036+0000 7feebdd2f700 1 -- 192.168.123.103:0/2573600746 >> 192.168.123.103:0/2573600746 conn(0x7feeb80fdd10 msgr2=0x7feeb8100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.036+0000 7feebdd2f700 1 -- 192.168.123.103:0/2573600746 shutdown_connections 2026-03-10T14:11:45.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.036+0000 7feebdd2f700 1 -- 192.168.123.103:0/2573600746 
wait complete. 2026-03-10T14:11:45.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.036+0000 7feebdd2f700 1 Processor -- start 2026-03-10T14:11:45.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.037+0000 7feebdd2f700 1 -- start start 2026-03-10T14:11:45.035 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.037+0000 7feebdd2f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 0x7feeb8198090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.037+0000 7feeb77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 0x7feeb8198090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.037+0000 7feeb77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 0x7feeb8198090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47832/0 (socket says 192.168.123.103:47832) 2026-03-10T14:11:45.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.037+0000 7feebdd2f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feeb8103960 0x7feeb81985d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.037+0000 7feebdd2f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feeb8198bf0 con 0x7feeb8102760 2026-03-10T14:11:45.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.037+0000 7feebdd2f700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feeb8198d30 con 0x7feeb8103960 2026-03-10T14:11:45.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.037+0000 7feeb77fe700 1 -- 192.168.123.103:0/1691955573 learned_addr learned my addr 192.168.123.103:0/1691955573 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:45.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.038+0000 7feeb6ffd700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feeb8103960 0x7feeb81985d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.036 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.038+0000 7feeb6ffd700 1 -- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 msgr2=0x7feeb8198090 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.038+0000 7feeb6ffd700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 0x7feeb8198090 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.038+0000 7feeb6ffd700 1 -- 192.168.123.103:0/1691955573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feea00097e0 con 0x7feeb8103960 2026-03-10T14:11:45.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.038+0000 7feeb77fe700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 0x7feeb8198090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T14:11:45.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.038+0000 7feeb6ffd700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feeb8103960 0x7feeb81985d0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7feea800eab0 tx=0x7feea800edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.039+0000 7feeb4ff9700 1 -- 192.168.123.103:0/1691955573 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feea800cb20 con 0x7feeb8103960 2026-03-10T14:11:45.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.039+0000 7feeb4ff9700 1 -- 192.168.123.103:0/1691955573 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feea800cc80 con 0x7feeb8103960 2026-03-10T14:11:45.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.039+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feeb819d7e0 con 0x7feeb8103960 2026-03-10T14:11:45.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.039+0000 7feeb4ff9700 1 -- 192.168.123.103:0/1691955573 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feea8018930 con 0x7feeb8103960 2026-03-10T14:11:45.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.039+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feeb819dd30 con 0x7feeb8103960 2026-03-10T14:11:45.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.040+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7feeb8066e40 con 0x7feeb8103960 2026-03-10T14:11:45.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.041+0000 7feeb4ff9700 1 -- 192.168.123.103:0/1691955573 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7feea8018a90 con 0x7feeb8103960 2026-03-10T14:11:45.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.042+0000 7feeb4ff9700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feea40778c0 0x7feea4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.042+0000 7feeb77fe700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feea40778c0 0x7feea4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.042+0000 7feeb4ff9700 1 -- 192.168.123.103:0/1691955573 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6281+0+0 (secure 0 0 0) 0x7feea8026080 con 0x7feeb8103960 2026-03-10T14:11:45.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.042+0000 7feeb77fe700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feea40778c0 0x7feea4079d70 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7feea0006010 tx=0x7feea000b540 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.045+0000 7feeb4ff9700 1 -- 192.168.123.103:0/1691955573 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feea8062930 con 0x7feeb8103960 2026-03-10T14:11:45.166 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.167+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7feeb819e010 con 0x7feea40778c0 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.172+0000 7feeb4ff9700 1 -- 192.168.123.103:0/1691955573 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3528 (secure 0 0 0) 0x7feeb819e010 con 0x7feea40778c0 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (7m) 2m ago 8m 22.0M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (6m) 2m ago 8m 8904k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (7m) 2m ago 7m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 2m ago 8m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (2m) 2m ago 7m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (7m) 2m ago 8m 91.2M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (6m) 2m ago 6m 158M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:11:45.171 
INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (6m) 2m ago 6m 93.7M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (6m) 2m ago 6m 91.2M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (6m) 2m ago 6m 166M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (3m) 2m ago 9m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (2m) 2m ago 7m 496M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 2m ago 9m 56.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (2m) 2m ago 7m 49.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (8m) 2m ago 8m 15.5M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (7m) 2m ago 7m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 2m ago 7m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (7m) 2m ago 7m 368M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (7m) 2m ago 7m 288M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (7m) 2m ago 
7m 434M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (6m) 2m ago 6m 397M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (6m) 2m ago 6m 352M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:11:45.171 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 2m ago 8m 53.0M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:11:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.174+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feea40778c0 msgr2=0x7feea4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.174+0000 7feebdd2f700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feea40778c0 0x7feea4079d70 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7feea0006010 tx=0x7feea000b540 comp rx=0 tx=0).stop 2026-03-10T14:11:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feeb8103960 msgr2=0x7feeb81985d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feeb8103960 0x7feeb81985d0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7feea800eab0 tx=0x7feea800edc0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 shutdown_connections 2026-03-10T14:11:45.173 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feea40778c0 0x7feea4079d70 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7feeb8102760 0x7feeb8198090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 --2- 192.168.123.103:0/1691955573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7feeb8103960 0x7feeb81985d0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 >> 192.168.123.103:0/1691955573 conn(0x7feeb80fdd10 msgr2=0x7feeb8106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 shutdown_connections 2026-03-10T14:11:45.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.175+0000 7feebdd2f700 1 -- 192.168.123.103:0/1691955573 wait complete. 
2026-03-10T14:11:45.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.241+0000 7f338667b700 1 -- 192.168.123.103:0/1593458003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3380100700 msgr2=0x7f3380100b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.241+0000 7f338667b700 1 --2- 192.168.123.103:0/1593458003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3380100700 0x7f3380100b70 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f3370009b00 tx=0x7f3370009e10 comp rx=0 tx=0).stop 2026-03-10T14:11:45.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.243+0000 7f338667b700 1 -- 192.168.123.103:0/1593458003 shutdown_connections 2026-03-10T14:11:45.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.243+0000 7f338667b700 1 --2- 192.168.123.103:0/1593458003 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3380100700 0x7f3380100b70 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.243+0000 7f338667b700 1 --2- 192.168.123.103:0/1593458003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33800ff460 0x7f33800ff870 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.243+0000 7f338667b700 1 -- 192.168.123.103:0/1593458003 >> 192.168.123.103:0/1593458003 conn(0x7f33800faa70 msgr2=0x7f33800fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.245+0000 7f338667b700 1 -- 192.168.123.103:0/1593458003 shutdown_connections 2026-03-10T14:11:45.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.245+0000 7f338667b700 1 -- 192.168.123.103:0/1593458003 
wait complete. 2026-03-10T14:11:45.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.245+0000 7f338667b700 1 Processor -- start 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.245+0000 7f338667b700 1 -- start start 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.245+0000 7f338667b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f33800ff460 0x7f3380071d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.245+0000 7f338667b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3380100700 0x7f3380072250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.245+0000 7f338667b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3380072870 con 0x7f3380100700 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.245+0000 7f338667b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33801a3830 con 0x7f33800ff460 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.246+0000 7f337f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3380100700 0x7f3380072250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.246+0000 7f337f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3380100700 0x7f3380072250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:47854/0 (socket says 192.168.123.103:47854) 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.246+0000 7f337f7fe700 1 -- 192.168.123.103:0/2684050860 learned_addr learned my addr 192.168.123.103:0/2684050860 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.246+0000 7f337f7fe700 1 -- 192.168.123.103:0/2684050860 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f33800ff460 msgr2=0x7f3380071d10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.246+0000 7f337f7fe700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f33800ff460 0x7f3380071d10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.246+0000 7f337f7fe700 1 -- 192.168.123.103:0/2684050860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3374009710 con 0x7f3380100700 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.246+0000 7f337f7fe700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3380100700 0x7f3380072250 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f337000bb70 tx=0x7f3370004690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.247+0000 7f337d7fa700 1 -- 192.168.123.103:0/2684050860 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f337001d070 con 0x7f3380100700 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.247+0000 7f337d7fa700 1 -- 
192.168.123.103:0/2684050860 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3370022470 con 0x7f3380100700 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.247+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f33700097e0 con 0x7f3380100700 2026-03-10T14:11:45.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.247+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33801a3db0 con 0x7f3380100700 2026-03-10T14:11:45.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.247+0000 7f337d7fa700 1 -- 192.168.123.103:0/2684050860 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f337000f740 con 0x7f3380100700 2026-03-10T14:11:45.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.248+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3380066e40 con 0x7f3380100700 2026-03-10T14:11:45.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.249+0000 7f337d7fa700 1 -- 192.168.123.103:0/2684050860 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3370022ac0 con 0x7f3380100700 2026-03-10T14:11:45.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.249+0000 7f337d7fa700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f33680778c0 0x7f3368079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.249+0000 7f337d7fa700 1 -- 192.168.123.103:0/2684050860 
<== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6281+0+0 (secure 0 0 0) 0x7f337009b040 con 0x7f3380100700 2026-03-10T14:11:45.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.249+0000 7f337ffff700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f33680778c0 0x7f3368079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.250+0000 7f337ffff700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f33680778c0 0x7f3368079d70 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f3380105c50 tx=0x7f3374009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.252+0000 7f337d7fa700 1 -- 192.168.123.103:0/2684050860 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3370063910 con 0x7f3380100700 2026-03-10T14:11:45.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.419+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f33801a39f0 con 0x7f3380100700 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.420+0000 7f337d7fa700 1 -- 192.168.123.103:0/2684050860 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f33801a39f0 con 0x7f3380100700 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 
2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:11:45.419 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:11:45.420 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9, 2026-03-10T14:11:45.420 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T14:11:45.420 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:11:45.420 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f33680778c0 msgr2=0x7f3368079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f33680778c0 0x7f3368079d70 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f3380105c50 tx=0x7f3374009450 comp rx=0 tx=0).stop 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3380100700 msgr2=0x7f3380072250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3380100700 0x7f3380072250 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f337000bb70 tx=0x7f3370004690 comp rx=0 tx=0).stop 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 shutdown_connections 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f33680778c0 0x7f3368079d70 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f33800ff460 0x7f3380071d10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.422 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 --2- 192.168.123.103:0/2684050860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3380100700 0x7f3380072250 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.423+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 >> 192.168.123.103:0/2684050860 conn(0x7f33800faa70 msgr2=0x7f3380103930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.424+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 shutdown_connections 2026-03-10T14:11:45.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.424+0000 7f338667b700 1 -- 192.168.123.103:0/2684050860 wait complete. 2026-03-10T14:11:45.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.499+0000 7fde5b92b700 1 -- 192.168.123.103:0/2389061951 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54074d00 msgr2=0x7fde54073160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.499+0000 7fde5b92b700 1 --2- 192.168.123.103:0/2389061951 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54074d00 0x7fde54073160 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fde48009b50 tx=0x7fde48009e60 comp rx=0 tx=0).stop 2026-03-10T14:11:45.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.503+0000 7fde5b92b700 1 -- 192.168.123.103:0/2389061951 shutdown_connections 2026-03-10T14:11:45.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.503+0000 7fde5b92b700 1 --2- 192.168.123.103:0/2389061951 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fde54073730 0x7fde54073ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:11:45.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.503+0000 7fde5b92b700 1 --2- 192.168.123.103:0/2389061951 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54074d00 0x7fde54073160 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.503+0000 7fde5b92b700 1 -- 192.168.123.103:0/2389061951 >> 192.168.123.103:0/2389061951 conn(0x7fde540fbaa0 msgr2=0x7fde540fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.503+0000 7fde5b92b700 1 -- 192.168.123.103:0/2389061951 shutdown_connections 2026-03-10T14:11:45.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.503+0000 7fde5b92b700 1 -- 192.168.123.103:0/2389061951 wait complete. 2026-03-10T14:11:45.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.504+0000 7fde5b92b700 1 Processor -- start 2026-03-10T14:11:45.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.504+0000 7fde5b92b700 1 -- start start 2026-03-10T14:11:45.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.504+0000 7fde5b92b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54073730 0x7fde54193c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.504+0000 7fde5b92b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fde54074d00 0x7fde54194160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.504+0000 7fde5b92b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde54194730 con 0x7fde54073730 2026-03-10T14:11:45.503 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.504+0000 7fde5b92b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde54194870 con 0x7fde54074d00 2026-03-10T14:11:45.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde58ec6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fde54074d00 0x7fde54194160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde58ec6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fde54074d00 0x7fde54194160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:38634/0 (socket says 192.168.123.103:38634) 2026-03-10T14:11:45.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde58ec6700 1 -- 192.168.123.103:0/4104757222 learned_addr learned my addr 192.168.123.103:0/4104757222 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:45.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde596c7700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54073730 0x7fde54193c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde58ec6700 1 -- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54073730 msgr2=0x7fde54193c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde58ec6700 1 --2- 
192.168.123.103:0/4104757222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54073730 0x7fde54193c20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde58ec6700 1 -- 192.168.123.103:0/4104757222 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fde480097e0 con 0x7fde54074d00 2026-03-10T14:11:45.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde596c7700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54073730 0x7fde54193c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:11:45.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.505+0000 7fde58ec6700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fde54074d00 0x7fde54194160 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fde5000eb10 tx=0x7fde5000ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.506+0000 7fde467fc700 1 -- 192.168.123.103:0/4104757222 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fde5000cc40 con 0x7fde54074d00 2026-03-10T14:11:45.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.506+0000 7fde467fc700 1 -- 192.168.123.103:0/4104757222 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fde5000cda0 con 0x7fde54074d00 2026-03-10T14:11:45.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.506+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7fde541aa380 con 0x7fde54074d00 2026-03-10T14:11:45.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.506+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fde541aa8d0 con 0x7fde54074d00 2026-03-10T14:11:45.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.507+0000 7fde467fc700 1 -- 192.168.123.103:0/4104757222 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fde50018810 con 0x7fde54074d00 2026-03-10T14:11:45.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.507+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fde38005320 con 0x7fde54074d00 2026-03-10T14:11:45.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.507+0000 7fde467fc700 1 -- 192.168.123.103:0/4104757222 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fde50018aa0 con 0x7fde54074d00 2026-03-10T14:11:45.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.508+0000 7fde467fc700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fde400778c0 0x7fde40079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.508+0000 7fde467fc700 1 -- 192.168.123.103:0/4104757222 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6281+0+0 (secure 0 0 0) 0x7fde50014070 con 0x7fde54074d00 2026-03-10T14:11:45.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.508+0000 7fde596c7700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] 
conn(0x7fde400778c0 0x7fde40079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.509+0000 7fde596c7700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fde400778c0 0x7fde40079d70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fde48006010 tx=0x7fde48005a90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.510+0000 7fde467fc700 1 -- 192.168.123.103:0/4104757222 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fde50063ca0 con 0x7fde54074d00 2026-03-10T14:11:45.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.655+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fde38006200 con 0x7fde54074d00 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.656+0000 7fde467fc700 1 -- 192.168.123.103:0/4104757222 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1973 (secure 0 0 0) 0x7fde500633f0 con 0x7fde54074d00 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:e14 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode 
in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 
2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:11:45.655 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr 
[v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:11:45.656 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:11:45.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.659+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fde400778c0 msgr2=0x7fde40079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.659+0000 7fde5b92b700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fde400778c0 0x7fde40079d70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fde48006010 tx=0x7fde48005a90 comp rx=0 tx=0).stop 2026-03-10T14:11:45.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.659+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fde54074d00 msgr2=0x7fde54194160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.660+0000 7fde5b92b700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fde54074d00 0x7fde54194160 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fde5000eb10 tx=0x7fde5000ee20 comp rx=0 tx=0).stop 2026-03-10T14:11:45.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.660+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 shutdown_connections 2026-03-10T14:11:45.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.660+0000 7fde5b92b700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fde400778c0 0x7fde40079d70 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.660+0000 7fde5b92b700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde54073730 0x7fde54193c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.660+0000 7fde5b92b700 1 --2- 192.168.123.103:0/4104757222 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fde54074d00 0x7fde54194160 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.660+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 >> 192.168.123.103:0/4104757222 conn(0x7fde540fbaa0 msgr2=0x7fde54101f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.660+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 shutdown_connections 2026-03-10T14:11:45.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.660+0000 7fde5b92b700 1 -- 192.168.123.103:0/4104757222 wait complete. 
2026-03-10T14:11:45.660 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:11:45.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.734+0000 7f9c6c4de700 1 -- 192.168.123.103:0/3302079110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64102740 msgr2=0x7f9c64102b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.734+0000 7f9c6c4de700 1 --2- 192.168.123.103:0/3302079110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64102740 0x7f9c64102b50 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f9c58009b00 tx=0x7f9c58009e10 comp rx=0 tx=0).stop 2026-03-10T14:11:45.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.735+0000 7f9c6c4de700 1 -- 192.168.123.103:0/3302079110 shutdown_connections 2026-03-10T14:11:45.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.735+0000 7f9c6c4de700 1 --2- 192.168.123.103:0/3302079110 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c64103940 0x7f9c64103d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.735+0000 7f9c6c4de700 1 --2- 192.168.123.103:0/3302079110 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64102740 0x7f9c64102b50 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.735+0000 7f9c6c4de700 1 -- 192.168.123.103:0/3302079110 >> 192.168.123.103:0/3302079110 conn(0x7f9c640fdcf0 msgr2=0x7f9c64100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.735+0000 7f9c6c4de700 1 -- 192.168.123.103:0/3302079110 shutdown_connections 2026-03-10T14:11:45.734 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.735+0000 7f9c6c4de700 1 -- 192.168.123.103:0/3302079110 wait complete. 2026-03-10T14:11:45.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6c4de700 1 Processor -- start 2026-03-10T14:11:45.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6c4de700 1 -- start start 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6c4de700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c64102740 0x7f9c64198000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6c4de700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64103940 0x7f9c64198540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6c4de700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c64198b60 con 0x7f9c64103940 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6c4de700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c64198ca0 con 0x7f9c64102740 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6a27a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c64102740 0x7f9c64198000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6a27a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c64102740 0x7f9c64198000 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:38652/0 (socket says 192.168.123.103:38652) 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.736+0000 7f9c6a27a700 1 -- 192.168.123.103:0/1054325859 learned_addr learned my addr 192.168.123.103:0/1054325859 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c69a79700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64103940 0x7f9c64198540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c6a27a700 1 -- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64103940 msgr2=0x7f9c64198540 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c6a27a700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64103940 0x7f9c64198540 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c6a27a700 1 -- 192.168.123.103:0/1054325859 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9c580097e0 con 0x7f9c64102740 2026-03-10T14:11:45.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c69a79700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64103940 0x7f9c64198540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:11:45.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c6a27a700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c64102740 0x7f9c64198000 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f9c58000c00 tx=0x7f9c580056c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c577fe700 1 -- 192.168.123.103:0/1054325859 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c5801d070 con 0x7f9c64102740 2026-03-10T14:11:45.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9c6419d6f0 con 0x7f9c64102740 2026-03-10T14:11:45.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.737+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9c6419dbe0 con 0x7f9c64102740 2026-03-10T14:11:45.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.738+0000 7f9c577fe700 1 -- 192.168.123.103:0/1054325859 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9c5800bc50 con 0x7f9c64102740 2026-03-10T14:11:45.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.738+0000 7f9c577fe700 1 -- 192.168.123.103:0/1054325859 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9c5800f780 con 0x7f9c64102740 2026-03-10T14:11:45.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.739+0000 7f9c577fe700 1 -- 192.168.123.103:0/1054325859 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 
38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9c5800fa00 con 0x7f9c64102740 2026-03-10T14:11:45.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.739+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9c48005320 con 0x7f9c64102740 2026-03-10T14:11:45.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.739+0000 7f9c577fe700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9c500778c0 0x7f9c50079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.740+0000 7f9c577fe700 1 -- 192.168.123.103:0/1054325859 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6281+0+0 (secure 0 0 0) 0x7f9c5809b2f0 con 0x7f9c64102740 2026-03-10T14:11:45.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.740+0000 7f9c69a79700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9c500778c0 0x7f9c50079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.740+0000 7f9c69a79700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9c500778c0 0x7f9c50079d70 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f9c60005fd0 tx=0x7f9c60005dc0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.743+0000 7f9c577fe700 1 -- 192.168.123.103:0/1054325859 <== mon.1 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9c58063b40 con 0x7f9c64102740 2026-03-10T14:11:45.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.876+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9c48000bf0 con 0x7f9c500778c0 2026-03-10T14:11:45.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.877+0000 7f9c577fe700 1 -- 192.168.123.103:0/1054325859 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9c48000bf0 con 0x7f9c500778c0 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:11:45.876 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:11:45.878 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.879+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9c500778c0 msgr2=0x7f9c50079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.879+0000 7f9c6c4de700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9c500778c0 0x7f9c50079d70 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f9c60005fd0 tx=0x7f9c60005dc0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.879+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c64102740 msgr2=0x7f9c64198000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.880+0000 7f9c6c4de700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c64102740 0x7f9c64198000 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f9c58000c00 tx=0x7f9c580056c0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.880+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 shutdown_connections 2026-03-10T14:11:45.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.880+0000 7f9c6c4de700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9c500778c0 0x7f9c50079d70 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.880+0000 7f9c6c4de700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9c64102740 
0x7f9c64198000 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.880+0000 7f9c6c4de700 1 --2- 192.168.123.103:0/1054325859 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c64103940 0x7f9c64198540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.880+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 >> 192.168.123.103:0/1054325859 conn(0x7f9c640fdcf0 msgr2=0x7f9c64106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.880+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 shutdown_connections 2026-03-10T14:11:45.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.880+0000 7f9c6c4de700 1 -- 192.168.123.103:0/1054325859 wait complete. 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.950+0000 7f735ed4f700 1 -- 192.168.123.103:0/4158980042 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73581033c0 msgr2=0x7f7358105800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.950+0000 7f735ed4f700 1 --2- 192.168.123.103:0/4158980042 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73581033c0 0x7f7358105800 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f7354009a60 tx=0x7f7354009d70 comp rx=0 tx=0).stop 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.950+0000 7f735ed4f700 1 -- 192.168.123.103:0/4158980042 shutdown_connections 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.950+0000 7f735ed4f700 1 --2- 192.168.123.103:0/4158980042 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f73581033c0 0x7f7358105800 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.950+0000 7f735ed4f700 1 --2- 192.168.123.103:0/4158980042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7358069180 0x7f7358102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.950+0000 7f735ed4f700 1 -- 192.168.123.103:0/4158980042 >> 192.168.123.103:0/4158980042 conn(0x7f73580fa7b0 msgr2=0x7f73580fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.950+0000 7f735ed4f700 1 -- 192.168.123.103:0/4158980042 shutdown_connections 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.950+0000 7f735ed4f700 1 -- 192.168.123.103:0/4158980042 wait complete. 
2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.951+0000 7f735ed4f700 1 Processor -- start 2026-03-10T14:11:45.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.951+0000 7f735ed4f700 1 -- start start 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.951+0000 7f735ed4f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7358069180 0x7f7358197db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.951+0000 7f735dd4d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7358069180 0x7f7358197db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.951+0000 7f735dd4d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7358069180 0x7f7358197db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47918/0 (socket says 192.168.123.103:47918) 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.951+0000 7f735ed4f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73581033c0 0x7f73581982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.951+0000 7f735ed4f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7358198910 con 0x7f7358069180 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.952+0000 7f735ed4f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f735819d320 con 0x7f73581033c0 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.952+0000 7f735dd4d700 1 -- 192.168.123.103:0/2896709705 learned_addr learned my addr 192.168.123.103:0/2896709705 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.952+0000 7f735d54c700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73581033c0 0x7f73581982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.952+0000 7f735dd4d700 1 -- 192.168.123.103:0/2896709705 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73581033c0 msgr2=0x7f73581982f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:45.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.952+0000 7f735dd4d700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73581033c0 0x7f73581982f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:45.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.952+0000 7f735dd4d700 1 -- 192.168.123.103:0/2896709705 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73480097e0 con 0x7f7358069180 2026-03-10T14:11:45.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.952+0000 7f735d54c700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73581033c0 0x7f73581982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T14:11:45.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.952+0000 7f735dd4d700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7358069180 0x7f7358197db0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f7348009f90 tx=0x7f734800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.953+0000 7f734effd700 1 -- 192.168.123.103:0/2896709705 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f734800ce90 con 0x7f7358069180 2026-03-10T14:11:45.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.953+0000 7f734effd700 1 -- 192.168.123.103:0/2896709705 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7348010910 con 0x7f7358069180 2026-03-10T14:11:45.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.953+0000 7f734effd700 1 -- 192.168.123.103:0/2896709705 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7348018b40 con 0x7f7358069180 2026-03-10T14:11:45.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.953+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7354009710 con 0x7f7358069180 2026-03-10T14:11:45.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.953+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f735819d8b0 con 0x7f7358069180 2026-03-10T14:11:45.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.954+0000 7f734effd700 1 -- 192.168.123.103:0/2896709705 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7348018d00 con 
0x7f7358069180 2026-03-10T14:11:45.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.954+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f73580fc3b0 con 0x7f7358069180 2026-03-10T14:11:45.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.955+0000 7f734effd700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7344077870 0x7f7344079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:11:45.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.955+0000 7f734effd700 1 -- 192.168.123.103:0/2896709705 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6281+0+0 (secure 0 0 0) 0x7f7348014070 con 0x7f7358069180 2026-03-10T14:11:45.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.955+0000 7f735d54c700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7344077870 0x7f7344079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:11:45.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.957+0000 7f735d54c700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7344077870 0x7f7344079d20 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f73540038c0 tx=0x7f735400b540 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:11:45.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:45.958+0000 7f734effd700 1 -- 192.168.123.103:0/2896709705 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f73480629b0 con 0x7f7358069180 2026-03-10T14:11:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.133+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f735819d460 con 0x7f7358069180 2026-03-10T14:11:46.132 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:45 vm03.local ceph-mon[103098]: pgmap v107: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 447/336 objects degraded (133.036%); 0 B/s, 5 objects/s recovering 2026-03-10T14:11:46.132 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:45 vm03.local ceph-mon[103098]: from='client.34264 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:46.132 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:45 vm03.local ceph-mon[103098]: from='client.34268 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:46.132 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:45 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2684050860' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:46.132 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:11:46.132 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:45 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/4104757222' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:11:46.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.137+0000 7f734effd700 1 -- 192.168.123.103:0/2896709705 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+834 (secure 0 0 0) 0x7f7348062100 con 0x7f7358069180 2026-03-10T14:11:46.136 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 447/336 objects degraded (133.036%), 4 pgs degraded, 5 pgs undersized 2026-03-10T14:11:46.136 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 447/336 objects degraded (133.036%), 4 pgs degraded, 5 pgs undersized 2026-03-10T14:11:46.136 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is stuck undersized for 111s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T14:11:46.136 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.b is stuck undersized for 111s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-10T14:11:46.136 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.10 is stuck undersized for 111s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-10T14:11:46.136 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is stuck undersized for 111s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-10T14:11:46.136 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.18 is stuck undersized for 111s, current state active+recovering+undersized+remapped, last acting [2,1] 2026-03-10T14:11:46.139 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.140+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7344077870 msgr2=0x7f7344079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.140+0000 7f735ed4f700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7344077870 0x7f7344079d20 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f73540038c0 tx=0x7f735400b540 comp rx=0 tx=0).stop 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.140+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7358069180 msgr2=0x7f7358197db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.140+0000 7f735ed4f700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7358069180 0x7f7358197db0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f7348009f90 tx=0x7f734800c5b0 comp rx=0 tx=0).stop 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.141+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 shutdown_connections 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.141+0000 7f735ed4f700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7344077870 0x7f7344079d20 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.141+0000 7f735ed4f700 1 --2- 192.168.123.103:0/2896709705 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7358069180 0x7f7358197db0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.141+0000 7f735ed4f700 1 --2- 192.168.123.103:0/2896709705 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f73581033c0 0x7f73581982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.141+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 >> 192.168.123.103:0/2896709705 conn(0x7f73580fa7b0 msgr2=0x7f7358100710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.141+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 shutdown_connections 2026-03-10T14:11:46.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:11:46.141+0000 7f735ed4f700 1 -- 192.168.123.103:0/2896709705 wait complete. 2026-03-10T14:11:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:45 vm04.local ceph-mon[92084]: pgmap v107: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 447/336 objects degraded (133.036%); 0 B/s, 5 objects/s recovering 2026-03-10T14:11:46.367 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:45 vm04.local ceph-mon[92084]: from='client.34264 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:46.367 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:45 vm04.local ceph-mon[92084]: from='client.34268 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:46.367 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:45 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2684050860' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:46.367 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:11:46.367 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:45 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/4104757222' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:11:47.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:46 vm04.local ceph-mon[92084]: from='client.44189 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:47.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:46 vm04.local ceph-mon[92084]: from='client.44197 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:47.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:46 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/2896709705' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:11:47.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:46 vm03.local ceph-mon[103098]: from='client.44189 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:47.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:46 vm03.local ceph-mon[103098]: from='client.44197 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:11:47.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:46 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/2896709705' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:11:48.304 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:47 vm03.local ceph-mon[103098]: pgmap v108: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 2 op/s; 447/336 objects degraded (133.036%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:47 vm04.local ceph-mon[92084]: pgmap v108: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 2 op/s; 447/336 objects degraded (133.036%); 0 B/s, 9 objects/s recovering 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", 
"format": "json"}]: dispatch 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:11:49.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:49 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:49 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T14:11:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:49 vm04.local ceph-mon[92084]: pgmap v109: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 447/336 objects degraded (133.036%); 0 B/s, 8 objects/s recovering 2026-03-10T14:11:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:49 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:11:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:49 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T14:11:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:49 vm03.local ceph-mon[103098]: pgmap v109: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 447/336 objects degraded (133.036%); 0 B/s, 8 objects/s recovering 2026-03-10T14:11:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:51 vm04.local ceph-mon[92084]: osdmap e66: 6 total, 6 up, 6 in 2026-03-10T14:11:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:51 vm04.local ceph-mon[92084]: pgmap v111: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 447/336 objects degraded (133.036%); 0 B/s, 8 objects/s recovering 2026-03-10T14:11:51.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:51 vm03.local ceph-mon[103098]: osdmap e66: 6 total, 6 up, 6 in 2026-03-10T14:11:51.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:51 vm03.local ceph-mon[103098]: pgmap v111: 65 pgs: 1 active+recovering+undersized+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 447/336 objects degraded (133.036%); 0 B/s, 8 objects/s recovering 2026-03-10T14:11:52.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:52 vm04.local ceph-mon[92084]: osdmap e67: 6 total, 6 up, 6 in 2026-03-10T14:11:52.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:52 vm03.local ceph-mon[103098]: osdmap e67: 6 total, 6 up, 6 in 
2026-03-10T14:11:53.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:53 vm04.local ceph-mon[92084]: pgmap v113: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 336/336 objects degraded (100.000%); 0 B/s, 11 objects/s recovering 2026-03-10T14:11:53.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:53 vm03.local ceph-mon[103098]: pgmap v113: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 336/336 objects degraded (100.000%); 0 B/s, 11 objects/s recovering 2026-03-10T14:11:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:54 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 336/336 objects degraded (100.000%), 3 pgs degraded, 4 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:54.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:54 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 336/336 objects degraded (100.000%), 3 pgs degraded, 4 pgs undersized (PG_DEGRADED) 2026-03-10T14:11:55.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:55 vm04.local ceph-mon[92084]: pgmap v114: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 336/336 objects degraded (100.000%); 0 B/s, 6 objects/s recovering 2026-03-10T14:11:55.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:55 vm03.local ceph-mon[103098]: pgmap v114: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 336/336 objects degraded (100.000%); 0 B/s, 6 
objects/s recovering 2026-03-10T14:11:58.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:57 vm04.local ceph-mon[92084]: pgmap v115: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 336/336 objects degraded (100.000%); 0 B/s, 6 objects/s recovering 2026-03-10T14:11:58.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:57 vm03.local ceph-mon[103098]: pgmap v115: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 336/336 objects degraded (100.000%); 0 B/s, 6 objects/s recovering 2026-03-10T14:12:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:11:59 vm04.local ceph-mon[92084]: pgmap v116: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.4 KiB/s rd, 4 op/s; 336/336 objects degraded (100.000%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:00.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:11:59 vm03.local ceph-mon[103098]: pgmap v116: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.4 KiB/s rd, 4 op/s; 336/336 objects degraded (100.000%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:01.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:12:01.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:12:02.047 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:01 vm04.local ceph-mon[92084]: pgmap v117: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 336/336 objects degraded (100.000%); 0 B/s, 9 objects/s recovering 2026-03-10T14:12:02.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:01 vm03.local ceph-mon[103098]: pgmap v117: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 336/336 objects degraded (100.000%); 0 B/s, 9 objects/s recovering 2026-03-10T14:12:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:02 vm04.local ceph-mon[92084]: osdmap e68: 6 total, 6 up, 6 in 2026-03-10T14:12:03.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:02 vm03.local ceph-mon[103098]: osdmap e68: 6 total, 6 up, 6 in 2026-03-10T14:12:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:03 vm04.local ceph-mon[92084]: pgmap v119: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 232/336 objects degraded (69.048%); 0 B/s, 9 objects/s recovering 2026-03-10T14:12:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:03 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 232/336 objects degraded (69.048%), 3 pgs degraded, 4 pgs undersized (PG_DEGRADED) 2026-03-10T14:12:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:03 vm04.local ceph-mon[92084]: osdmap e69: 6 total, 6 up, 6 in 2026-03-10T14:12:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
14:12:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:04.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:03 vm03.local ceph-mon[103098]: pgmap v119: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 232/336 objects degraded (69.048%); 0 B/s, 9 objects/s recovering 2026-03-10T14:12:04.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:03 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 232/336 objects degraded (69.048%), 3 pgs degraded, 4 pgs undersized (PG_DEGRADED) 2026-03-10T14:12:04.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:03 vm03.local ceph-mon[103098]: osdmap e69: 6 total, 6 up, 6 in 2026-03-10T14:12:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:05.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:04 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:05.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:04 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-10T14:12:05.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:04 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:05.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:04 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-10T14:12:06.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:05 vm04.local ceph-mon[92084]: pgmap v121: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 232/336 objects degraded (69.048%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:06.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:05 vm03.local ceph-mon[103098]: pgmap v121: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 232/336 objects degraded (69.048%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:08.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:07 vm04.local ceph-mon[92084]: pgmap v122: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 232/336 objects degraded (69.048%); 0 B/s, 5 objects/s recovering 2026-03-10T14:12:08.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:07 vm03.local ceph-mon[103098]: pgmap v122: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 232/336 objects degraded (69.048%); 0 B/s, 5 objects/s recovering 2026-03-10T14:12:09.063 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:08 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 232/336 objects degraded (69.048%), 2 pgs degraded, 3 pgs undersized (PG_DEGRADED) 2026-03-10T14:12:09.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:08 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 232/336 objects degraded (69.048%), 2 pgs degraded, 3 pgs undersized (PG_DEGRADED) 2026-03-10T14:12:10.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:09 vm04.local ceph-mon[92084]: pgmap v123: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 232/336 objects degraded (69.048%); 0 B/s, 12 objects/s recovering 2026-03-10T14:12:10.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:09 vm03.local ceph-mon[103098]: pgmap v123: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 232/336 objects degraded (69.048%); 0 B/s, 12 objects/s recovering 2026-03-10T14:12:12.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:11 vm04.local ceph-mon[92084]: pgmap v124: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 232/336 objects degraded (69.048%); 0 B/s, 5 objects/s recovering 2026-03-10T14:12:12.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:11 vm03.local ceph-mon[103098]: pgmap v124: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 232/336 objects degraded (69.048%); 0 B/s, 5 objects/s recovering 
2026-03-10T14:12:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:13 vm04.local ceph-mon[92084]: pgmap v125: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 232/336 objects degraded (69.048%); 0 B/s, 9 objects/s recovering 2026-03-10T14:12:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:13 vm04.local ceph-mon[92084]: osdmap e70: 6 total, 6 up, 6 in 2026-03-10T14:12:14.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:13 vm03.local ceph-mon[103098]: pgmap v125: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 232/336 objects degraded (69.048%); 0 B/s, 9 objects/s recovering 2026-03-10T14:12:14.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:13 vm03.local ceph-mon[103098]: osdmap e70: 6 total, 6 up, 6 in 2026-03-10T14:12:15.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:14 vm03.local ceph-mon[103098]: osdmap e71: 6 total, 6 up, 6 in 2026-03-10T14:12:15.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:14 vm04.local ceph-mon[92084]: osdmap e71: 6 total, 6 up, 6 in 2026-03-10T14:12:16.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.218+0000 7f4493b09700 1 -- 192.168.123.103:0/3087978586 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 msgr2=0x7f448c101b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.218+0000 7f4493b09700 1 --2- 192.168.123.103:0/3087978586 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 0x7f448c101b80 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f4488009b00 tx=0x7f4488009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:16.217 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.219+0000 7f4493b09700 1 -- 192.168.123.103:0/3087978586 shutdown_connections 2026-03-10T14:12:16.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.219+0000 7f4493b09700 1 --2- 192.168.123.103:0/3087978586 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 0x7f448c101b80 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.219+0000 7f4493b09700 1 --2- 192.168.123.103:0/3087978586 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f448c100530 0x7f448c100940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.219+0000 7f4493b09700 1 -- 192.168.123.103:0/3087978586 >> 192.168.123.103:0/3087978586 conn(0x7f448c0fbaa0 msgr2=0x7f448c0fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:16.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.219+0000 7f4493b09700 1 -- 192.168.123.103:0/3087978586 shutdown_connections 2026-03-10T14:12:16.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.219+0000 7f4493b09700 1 -- 192.168.123.103:0/3087978586 wait complete. 
2026-03-10T14:12:16.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.219+0000 7f4493b09700 1 Processor -- start 2026-03-10T14:12:16.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f4493b09700 1 -- start start 2026-03-10T14:12:16.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f4493b09700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f448c100530 0x7f448c074af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f4493b09700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 0x7f448c073140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f4493b09700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f448c073680 con 0x7f448c101730 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f44910a4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 0x7f448c073140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f44910a4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 0x7f448c073140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40352/0 (socket says 192.168.123.103:40352) 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f44910a4700 1 -- 192.168.123.103:0/2826535087 learned_addr learned my addr 
192.168.123.103:0/2826535087 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f448c0737c0 con 0x7f448c100530 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f44910a4700 1 -- 192.168.123.103:0/2826535087 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f448c100530 msgr2=0x7f448c074af0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f44910a4700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f448c100530 0x7f448c074af0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.220+0000 7f44910a4700 1 -- 192.168.123.103:0/2826535087 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f44880097e0 con 0x7f448c101730 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.221+0000 7f44910a4700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 0x7f448c073140 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f4488005230 tx=0x7f4488005790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:16.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.221+0000 7f4482ffd700 1 -- 192.168.123.103:0/2826535087 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f448801d070 con 0x7f448c101730 2026-03-10T14:12:16.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.221+0000 
7f4493b09700 1 -- 192.168.123.103:0/2826535087 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f448c073a40 con 0x7f448c101730 2026-03-10T14:12:16.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.221+0000 7f4482ffd700 1 -- 192.168.123.103:0/2826535087 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f448800be00 con 0x7f448c101730 2026-03-10T14:12:16.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.221+0000 7f4482ffd700 1 -- 192.168.123.103:0/2826535087 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f448800f460 con 0x7f448c101730 2026-03-10T14:12:16.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.221+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f448c073ed0 con 0x7f448c101730 2026-03-10T14:12:16.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.223+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f448c066e40 con 0x7f448c101730 2026-03-10T14:12:16.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.224+0000 7f4482ffd700 1 -- 192.168.123.103:0/2826535087 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4488003780 con 0x7f448c101730 2026-03-10T14:12:16.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.224+0000 7f4482ffd700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f44780778c0 0x7f4478079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.226+0000 7f44918a5700 1 --2- 
192.168.123.103:0/2826535087 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f44780778c0 0x7f4478079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.227+0000 7f4482ffd700 1 -- 192.168.123.103:0/2826535087 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6194+0+0 (secure 0 0 0) 0x7f4488067930 con 0x7f448c101730 2026-03-10T14:12:16.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.227+0000 7f44918a5700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f44780778c0 0x7f4478079d70 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f448c101590 tx=0x7f447c006c60 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:16.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.227+0000 7f4482ffd700 1 -- 192.168.123.103:0/2826535087 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4488063350 con 0x7f448c101730 2026-03-10T14:12:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:15 vm04.local ceph-mon[92084]: pgmap v128: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 232/336 objects degraded (69.048%); 0 B/s, 12 objects/s recovering 2026-03-10T14:12:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:12:16.356 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.357+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f448c106080 con 0x7f44780778c0 2026-03-10T14:12:16.356 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:15 vm03.local ceph-mon[103098]: pgmap v128: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 232/336 objects degraded (69.048%); 0 B/s, 12 objects/s recovering 2026-03-10T14:12:16.356 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:12:16.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.360+0000 7f4482ffd700 1 -- 192.168.123.103:0/2826535087 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f448c106080 con 0x7f44780778c0 2026-03-10T14:12:16.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.362+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f44780778c0 msgr2=0x7f4478079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.362+0000 7f4493b09700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f44780778c0 0x7f4478079d70 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f448c101590 tx=0x7f447c006c60 comp rx=0 tx=0).stop 2026-03-10T14:12:16.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.362+0000 
7f4493b09700 1 -- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 msgr2=0x7f448c073140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.362+0000 7f4493b09700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 0x7f448c073140 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f4488005230 tx=0x7f4488005790 comp rx=0 tx=0).stop 2026-03-10T14:12:16.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.363+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 shutdown_connections 2026-03-10T14:12:16.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.363+0000 7f4493b09700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f44780778c0 0x7f4478079d70 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.363+0000 7f4493b09700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f448c100530 0x7f448c074af0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.363+0000 7f4493b09700 1 --2- 192.168.123.103:0/2826535087 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f448c101730 0x7f448c073140 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.363+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 >> 192.168.123.103:0/2826535087 conn(0x7f448c0fbaa0 msgr2=0x7f448c104960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:16.362 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.363+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 shutdown_connections 2026-03-10T14:12:16.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.364+0000 7f4493b09700 1 -- 192.168.123.103:0/2826535087 wait complete. 2026-03-10T14:12:16.373 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:12:16.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.432+0000 7f9d58d82700 1 -- 192.168.123.103:0/2548930785 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54102780 msgr2=0x7f9d54102b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.432+0000 7f9d58d82700 1 --2- 192.168.123.103:0/2548930785 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54102780 0x7f9d54102b90 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f9d3c009b00 tx=0x7f9d3c009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:16.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.432+0000 7f9d58d82700 1 -- 192.168.123.103:0/2548930785 shutdown_connections 2026-03-10T14:12:16.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.432+0000 7f9d58d82700 1 --2- 192.168.123.103:0/2548930785 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d54103980 0x7f9d54103dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.432+0000 7f9d58d82700 1 --2- 192.168.123.103:0/2548930785 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54102780 0x7f9d54102b90 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.432+0000 7f9d58d82700 1 -- 192.168.123.103:0/2548930785 >> 192.168.123.103:0/2548930785 conn(0x7f9d540fdd50 
msgr2=0x7f9d54100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:16.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.433+0000 7f9d58d82700 1 -- 192.168.123.103:0/2548930785 shutdown_connections 2026-03-10T14:12:16.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.433+0000 7f9d58d82700 1 -- 192.168.123.103:0/2548930785 wait complete. 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.433+0000 7f9d58d82700 1 Processor -- start 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.433+0000 7f9d58d82700 1 -- start start 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.433+0000 7f9d58d82700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d54102780 0x7f9d54193ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.433+0000 7f9d58d82700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54103980 0x7f9d54194220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.433+0000 7f9d58d82700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d541947f0 con 0x7f9d54103980 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.433+0000 7f9d58d82700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d54194930 con 0x7f9d54102780 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d5259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d54102780 0x7f9d54193ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d5259c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d54102780 0x7f9d54193ce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:34260/0 (socket says 192.168.123.103:34260) 2026-03-10T14:12:16.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d5259c700 1 -- 192.168.123.103:0/3214161852 learned_addr learned my addr 192.168.123.103:0/3214161852 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:16.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d5259c700 1 -- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54103980 msgr2=0x7f9d54194220 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d51d9b700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54103980 0x7f9d54194220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d5259c700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54103980 0x7f9d54194220 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d5259c700 1 -- 192.168.123.103:0/3214161852 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d3c0097e0 con 0x7f9d54102780 
2026-03-10T14:12:16.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d51d9b700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54103980 0x7f9d54194220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:12:16.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.434+0000 7f9d5259c700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d54102780 0x7f9d54193ce0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f9d3c00b5c0 tx=0x7f9d3c004a80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:16.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.435+0000 7f9d4b7fe700 1 -- 192.168.123.103:0/3214161852 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d3c01d070 con 0x7f9d54102780 2026-03-10T14:12:16.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.435+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d541aa430 con 0x7f9d54102780 2026-03-10T14:12:16.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.435+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d541aa870 con 0x7f9d54102780 2026-03-10T14:12:16.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.435+0000 7f9d4b7fe700 1 -- 192.168.123.103:0/3214161852 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d3c004500 con 0x7f9d54102780 2026-03-10T14:12:16.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.435+0000 7f9d4b7fe700 1 -- 192.168.123.103:0/3214161852 <== mon.1 
v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d3c00f460 con 0x7f9d54102780 2026-03-10T14:12:16.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.436+0000 7f9d4b7fe700 1 -- 192.168.123.103:0/3214161852 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9d3c00bc50 con 0x7f9d54102780 2026-03-10T14:12:16.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.437+0000 7f9d4b7fe700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9d400776b0 0x7f9d40079b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.437+0000 7f9d4b7fe700 1 -- 192.168.123.103:0/3214161852 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6194+0+0 (secure 0 0 0) 0x7f9d3c09b070 con 0x7f9d54102780 2026-03-10T14:12:16.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.437+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d34005320 con 0x7f9d54102780 2026-03-10T14:12:16.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.437+0000 7f9d51d9b700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9d400776b0 0x7f9d40079b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.437+0000 7f9d51d9b700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9d400776b0 0x7f9d40079b60 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f9d44006fd0 
tx=0x7f9d44008040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:16.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.440+0000 7f9d4b7fe700 1 -- 192.168.123.103:0/3214161852 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9d3c063860 con 0x7f9d54102780 2026-03-10T14:12:16.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.569+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9d34000bf0 con 0x7f9d400776b0 2026-03-10T14:12:16.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.570+0000 7f9d4b7fe700 1 -- 192.168.123.103:0/3214161852 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9d34000bf0 con 0x7f9d400776b0 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9d400776b0 msgr2=0x7f9d40079b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9d400776b0 0x7f9d40079b60 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f9d44006fd0 tx=0x7f9d44008040 comp rx=0 tx=0).stop 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d54102780 msgr2=0x7f9d54193ce0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d54102780 0x7f9d54193ce0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f9d3c00b5c0 tx=0x7f9d3c004a80 comp rx=0 tx=0).stop 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 shutdown_connections 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9d400776b0 0x7f9d40079b60 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9d54102780 0x7f9d54193ce0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 --2- 192.168.123.103:0/3214161852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d54103980 0x7f9d54194220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.573+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 >> 192.168.123.103:0/3214161852 conn(0x7f9d540fdd50 msgr2=0x7f9d54106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:16.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.574+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 shutdown_connections 2026-03-10T14:12:16.572 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.574+0000 7f9d58d82700 1 -- 192.168.123.103:0/3214161852 wait complete. 2026-03-10T14:12:16.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.655+0000 7f459e160700 1 -- 192.168.123.103:0/4257180209 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 msgr2=0x7f4598103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.655+0000 7f459e160700 1 --2- 192.168.123.103:0/4257180209 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 0x7f4598103e70 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f4588009b00 tx=0x7f4588009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:16.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.656+0000 7f459e160700 1 -- 192.168.123.103:0/4257180209 shutdown_connections 2026-03-10T14:12:16.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.656+0000 7f459e160700 1 --2- 192.168.123.103:0/4257180209 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 0x7f4598103e70 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.656+0000 7f459e160700 1 --2- 192.168.123.103:0/4257180209 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4598102760 0x7f4598102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.656+0000 7f459e160700 1 -- 192.168.123.103:0/4257180209 >> 192.168.123.103:0/4257180209 conn(0x7f45980fddb0 msgr2=0x7f45981001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.656+0000 7f459e160700 1 -- 192.168.123.103:0/4257180209 shutdown_connections 
2026-03-10T14:12:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.656+0000 7f459e160700 1 -- 192.168.123.103:0/4257180209 wait complete. 2026-03-10T14:12:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f459e160700 1 Processor -- start 2026-03-10T14:12:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f459e160700 1 -- start start 2026-03-10T14:12:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f459e160700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4598102760 0x7f4598197fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f459e160700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 0x7f4598198500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f459e160700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4598198b20 con 0x7f4598102760 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f459e160700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4598198c60 con 0x7f4598103a00 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f4596ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 0x7f4598198500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f4596ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 
0x7f4598198500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:34270/0 (socket says 192.168.123.103:34270) 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f4596ffd700 1 -- 192.168.123.103:0/2998960972 learned_addr learned my addr 192.168.123.103:0/2998960972 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.657+0000 7f45977fe700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4598102760 0x7f4598197fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f4596ffd700 1 -- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4598102760 msgr2=0x7f4598197fc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f4596ffd700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4598102760 0x7f4598197fc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f4596ffd700 1 -- 192.168.123.103:0/2998960972 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4580009710 con 0x7f4598103a00 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f45977fe700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4598102760 0x7f4598197fc0 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:12:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f4596ffd700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 0x7f4598198500 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f458800bb70 tx=0x7f4588004690 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:16.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f4594ff9700 1 -- 192.168.123.103:0/2998960972 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f458801d070 con 0x7f4598103a00 2026-03-10T14:12:16.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f45880097e0 con 0x7f4598103a00 2026-03-10T14:12:16.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f4594ff9700 1 -- 192.168.123.103:0/2998960972 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4588004d60 con 0x7f4598103a00 2026-03-10T14:12:16.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f4594ff9700 1 -- 192.168.123.103:0/2998960972 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f458800f630 con 0x7f4598103a00 2026-03-10T14:12:16.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.658+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f459819da10 con 0x7f4598103a00 2026-03-10T14:12:16.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.659+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4598066e40 con 0x7f4598103a00 2026-03-10T14:12:16.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.661+0000 7f4594ff9700 1 -- 192.168.123.103:0/2998960972 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4588004ed0 con 0x7f4598103a00 2026-03-10T14:12:16.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.661+0000 7f4594ff9700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f45840778c0 0x7f4584079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.661+0000 7f4594ff9700 1 -- 192.168.123.103:0/2998960972 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6194+0+0 (secure 0 0 0) 0x7f458809bce0 con 0x7f4598103a00 2026-03-10T14:12:16.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.661+0000 7f45977fe700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f45840778c0 0x7f4584079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.661+0000 7f45977fe700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f45840778c0 0x7f4584079d70 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f4580009f60 tx=0x7f4580009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:16.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.663+0000 7f4594ff9700 1 -- 192.168.123.103:0/2998960972 <== mon.1 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4588064550 con 0x7f4598103a00 2026-03-10T14:12:16.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.787+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f459819d5c0 con 0x7f45840778c0 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.792+0000 7f4594ff9700 1 -- 192.168.123.103:0/2998960972 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3528 (secure 0 0 0) 0x7f459819d5c0 con 0x7f45840778c0 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (8m) 2m ago 8m 22.0M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (6m) 2m ago 9m 8904k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (8m) 2m ago 8m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 2m ago 9m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (2m) 2m ago 8m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (8m) 2m ago 8m 91.2M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (6m) 2m ago 6m 158M 
- 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (6m) 2m ago 6m 93.7M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (6m) 2m ago 6m 91.2M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (6m) 2m ago 6m 166M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (3m) 2m ago 9m 614M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (3m) 2m ago 8m 496M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 2m ago 9m 56.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (3m) 2m ago 8m 49.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (8m) 2m ago 8m 15.5M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (8m) 2m ago 8m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 2m ago 8m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (7m) 2m ago 7m 368M 4096M 18.2.0 dc2bc1663786 ba323e54dbc0 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (7m) 2m ago 7m 288M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:12:16.791 
INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (7m) 2m ago 7m 434M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (7m) 2m ago 7m 397M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (7m) 2m ago 7m 352M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:12:16.791 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 2m ago 8m 53.0M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:12:16.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f45840778c0 msgr2=0x7f4584079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f45840778c0 0x7f4584079d70 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f4580009f60 tx=0x7f4580009450 comp rx=0 tx=0).stop 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 msgr2=0x7f4598198500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 0x7f4598198500 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f458800bb70 tx=0x7f4588004690 comp rx=0 tx=0).stop 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 -- 
192.168.123.103:0/2998960972 shutdown_connections 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f45840778c0 0x7f4584079d70 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4598102760 0x7f4598197fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 --2- 192.168.123.103:0/2998960972 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4598103a00 0x7f4598198500 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 >> 192.168.123.103:0/2998960972 conn(0x7f45980fddb0 msgr2=0x7f4598106c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 shutdown_connections 2026-03-10T14:12:16.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.795+0000 7f459e160700 1 -- 192.168.123.103:0/2998960972 wait complete. 
2026-03-10T14:12:16.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.865+0000 7fbfd7d87700 1 -- 192.168.123.103:0/1423704921 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 msgr2=0x7fbfd0105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.865+0000 7fbfd7d87700 1 --2- 192.168.123.103:0/1423704921 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 0x7fbfd0105ac0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fbfcc009b00 tx=0x7fbfcc009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.866+0000 7fbfd7d87700 1 -- 192.168.123.103:0/1423704921 shutdown_connections 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.866+0000 7fbfd7d87700 1 --2- 192.168.123.103:0/1423704921 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 0x7fbfd0105ac0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.866+0000 7fbfd7d87700 1 --2- 192.168.123.103:0/1423704921 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbfd0069160 0x7fbfd0103160 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.866+0000 7fbfd7d87700 1 -- 192.168.123.103:0/1423704921 >> 192.168.123.103:0/1423704921 conn(0x7fbfd00faa70 msgr2=0x7fbfd00fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.866+0000 7fbfd7d87700 1 -- 192.168.123.103:0/1423704921 shutdown_connections 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.866+0000 7fbfd7d87700 1 -- 192.168.123.103:0/1423704921 
wait complete. 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.866+0000 7fbfd7d87700 1 Processor -- start 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.866+0000 7fbfd7d87700 1 -- start start 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd7d87700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbfd0069160 0x7fbfd0192170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd7d87700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 0x7fbfd01926b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd7d87700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbfd018ec60 con 0x7fbfd01036a0 2026-03-10T14:12:16.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd7d87700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbfd018edd0 con 0x7fbfd0069160 2026-03-10T14:12:16.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd5b23700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbfd0069160 0x7fbfd0192170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd5b23700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbfd0069160 0x7fbfd0192170 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.103:34280/0 (socket says 192.168.123.103:34280) 2026-03-10T14:12:16.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd5322700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 0x7fbfd01926b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd5b23700 1 -- 192.168.123.103:0/592798613 learned_addr learned my addr 192.168.123.103:0/592798613 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:16.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd5322700 1 -- 192.168.123.103:0/592798613 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbfd0069160 msgr2=0x7fbfd0192170 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:16.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd5322700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbfd0069160 0x7fbfd0192170 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:16.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.867+0000 7fbfd5322700 1 -- 192.168.123.103:0/592798613 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbfcc0097e0 con 0x7fbfd01036a0 2026-03-10T14:12:16.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.868+0000 7fbfd5322700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 0x7fbfd01926b0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fbfcc009ad0 tx=0x7fbfcc004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:16.867 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.868+0000 7fbfc6ffd700 1 -- 192.168.123.103:0/592798613 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbfcc01d070 con 0x7fbfd01036a0 2026-03-10T14:12:16.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.868+0000 7fbfc6ffd700 1 -- 192.168.123.103:0/592798613 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbfcc00bc50 con 0x7fbfd01036a0 2026-03-10T14:12:16.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.868+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbfd018f050 con 0x7fbfd01036a0 2026-03-10T14:12:16.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.868+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbfd018f540 con 0x7fbfd01036a0 2026-03-10T14:12:16.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.869+0000 7fbfc6ffd700 1 -- 192.168.123.103:0/592798613 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbfcc00f7f0 con 0x7fbfd01036a0 2026-03-10T14:12:16.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.869+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbfd00fc670 con 0x7fbfd01036a0 2026-03-10T14:12:16.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.873+0000 7fbfc6ffd700 1 -- 192.168.123.103:0/592798613 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbfcc00f950 con 0x7fbfd01036a0 2026-03-10T14:12:16.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.874+0000 7fbfc6ffd700 1 --2- 
192.168.123.103:0/592798613 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbfbc077720 0x7fbfbc079bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:16.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.874+0000 7fbfc6ffd700 1 -- 192.168.123.103:0/592798613 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6194+0+0 (secure 0 0 0) 0x7fbfcc09ba70 con 0x7fbfd01036a0 2026-03-10T14:12:16.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.874+0000 7fbfd5b23700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbfbc077720 0x7fbfbc079bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:16.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.874+0000 7fbfd5b23700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbfbc077720 0x7fbfbc079bd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fbfc000abe0 tx=0x7fbfc000a5c0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:16.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:16.874+0000 7fbfc6ffd700 1 -- 192.168.123.103:0/592798613 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbfcc0cba90 con 0x7fbfd01036a0 2026-03-10T14:12:17.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.043+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fbfd0066e40 con 0x7fbfd01036a0 2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.046+0000 
7fbfc6ffd700 1 -- 192.168.123.103:0/592798613 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fbfcc0642e0 con 0x7fbfd01036a0
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5,
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-10T14:12:17.045 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T14:12:17.046 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-10T14:12:17.046 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-10T14:12:17.046 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9,
2026-03-10T14:12:17.046 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-10T14:12:17.046 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-10T14:12:17.046 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.049+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbfbc077720 msgr2=0x7fbfbc079bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.049+0000 7fbfd7d87700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbfbc077720 0x7fbfbc079bd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fbfc000abe0 tx=0x7fbfc000a5c0 comp rx=0 tx=0).stop
2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.049+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 msgr2=0x7fbfd01926b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.049+0000 7fbfd7d87700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 0x7fbfd01926b0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fbfcc009ad0 tx=0x7fbfcc004ab0 comp rx=0 tx=0).stop
2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.050+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 shutdown_connections
2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.050+0000 7fbfd7d87700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fbfbc077720 0x7fbfbc079bd0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.050+0000 7fbfd7d87700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fbfd0069160 0x7fbfd0192170 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.050+0000 7fbfd7d87700 1 --2- 192.168.123.103:0/592798613 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfd01036a0 0x7fbfd01926b0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.050+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 >> 192.168.123.103:0/592798613 conn(0x7fbfd00faa70 msgr2=0x7fbfd00ff7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:17.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.050+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 shutdown_connections 2026-03-10T14:12:17.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.050+0000 7fbfd7d87700 1 -- 192.168.123.103:0/592798613 wait complete. 
2026-03-10T14:12:17.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.122+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2016383197 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68069180 msgr2=0x7fca68103140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.122+0000 7fca6ff5c700 1 --2- 192.168.123.103:0/2016383197 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68069180 0x7fca68103140 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fca5c009b50 tx=0x7fca5c009e60 comp rx=0 tx=0).stop 2026-03-10T14:12:17.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.122+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2016383197 shutdown_connections 2026-03-10T14:12:17.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.122+0000 7fca6ff5c700 1 --2- 192.168.123.103:0/2016383197 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca68103680 0x7fca68105ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.122+0000 7fca6ff5c700 1 --2- 192.168.123.103:0/2016383197 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68069180 0x7fca68103140 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.122+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2016383197 >> 192.168.123.103:0/2016383197 conn(0x7fca680faa70 msgr2=0x7fca680fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:17.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.122+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2016383197 shutdown_connections 2026-03-10T14:12:17.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.122+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2016383197 
wait complete. 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.123+0000 7fca6ff5c700 1 Processor -- start 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.123+0000 7fca6ff5c700 1 -- start start 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.123+0000 7fca6ff5c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca68069180 0x7fca68193c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.123+0000 7fca6ff5c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68103680 0x7fca681941c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.123+0000 7fca6ff5c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca681947e0 con 0x7fca68103680 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.123+0000 7fca6ff5c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca68194920 con 0x7fca68069180 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.124+0000 7fca6d4f7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68103680 0x7fca681941c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.124+0000 7fca6d4f7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68103680 0x7fca681941c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:40420/0 (socket says 192.168.123.103:40420) 2026-03-10T14:12:17.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.124+0000 7fca6d4f7700 1 -- 192.168.123.103:0/2085249984 learned_addr learned my addr 192.168.123.103:0/2085249984 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:17.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.124+0000 7fca6d4f7700 1 -- 192.168.123.103:0/2085249984 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca68069180 msgr2=0x7fca68193c80 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:12:17.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.124+0000 7fca6d4f7700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca68069180 0x7fca68193c80 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.124+0000 7fca6d4f7700 1 -- 192.168.123.103:0/2085249984 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca5c0097e0 con 0x7fca68103680 2026-03-10T14:12:17.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.125+0000 7fca6d4f7700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68103680 0x7fca681941c0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fca6400eb10 tx=0x7fca6400eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:17.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.125+0000 7fca5affd700 1 -- 192.168.123.103:0/2085249984 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca6400cca0 con 0x7fca68103680 2026-03-10T14:12:17.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.125+0000 7fca5affd700 1 -- 
192.168.123.103:0/2085249984 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fca6400ce00 con 0x7fca68103680 2026-03-10T14:12:17.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.125+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca681993d0 con 0x7fca68103680 2026-03-10T14:12:17.124 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.125+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca68199920 con 0x7fca68103680 2026-03-10T14:12:17.125 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.126+0000 7fca5affd700 1 -- 192.168.123.103:0/2085249984 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca640105e0 con 0x7fca68103680 2026-03-10T14:12:17.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.126+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fca68066e40 con 0x7fca68103680 2026-03-10T14:12:17.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.131+0000 7fca5affd700 1 -- 192.168.123.103:0/2085249984 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fca64018700 con 0x7fca68103680 2026-03-10T14:12:17.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.132+0000 7fca5affd700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca54077870 0x7fca54079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.132+0000 7fca5affd700 1 -- 192.168.123.103:0/2085249984 
<== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6194+0+0 (secure 0 0 0) 0x7fca64014070 con 0x7fca68103680 2026-03-10T14:12:17.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.132+0000 7fca5affd700 1 -- 192.168.123.103:0/2085249984 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fca6409a8a0 con 0x7fca68103680 2026-03-10T14:12:17.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.133+0000 7fca6dcf8700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca54077870 0x7fca54079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:17.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.133+0000 7fca6dcf8700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca54077870 0x7fca54079d20 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fca5c009b20 tx=0x7fca5c005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:17.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.293+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fca6802cf30 con 0x7fca68103680 2026-03-10T14:12:17.292 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:17 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 117/336 objects degraded (34.821%), 1 pg degraded, 3 pgs undersized (PG_DEGRADED) 2026-03-10T14:12:17.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.297+0000 7fca5affd700 1 -- 192.168.123.103:0/2085249984 <== mon.0 v2:192.168.123.103:3300/0 7 ==== 
mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1973 (secure 0 0 0) 0x7fca64062dc0 con 0x7fca68103680
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:e14
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:epoch 14
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:07:48.854532+0000
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470}
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:12:17.296 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:12:17.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.300+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca54077870 msgr2=0x7fca54079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:12:17.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.300+0000 7fca6ff5c700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca54077870 0x7fca54079d20 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fca5c009b20 tx=0x7fca5c005fb0 comp rx=0 tx=0).stop
2026-03-10T14:12:17.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.300+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68103680 msgr2=0x7fca681941c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:12:17.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.300+0000 7fca6ff5c700 1 --2- 192.168.123.103:0/2085249984 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68103680 0x7fca681941c0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7fca6400eb10 tx=0x7fca6400eed0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.301+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 shutdown_connections 2026-03-10T14:12:17.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.301+0000 7fca6ff5c700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca54077870 0x7fca54079d20 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.301+0000 7fca6ff5c700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca68069180 0x7fca68193c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.301+0000 7fca6ff5c700 1 --2- 192.168.123.103:0/2085249984 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca68103680 0x7fca681941c0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.301+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 >> 192.168.123.103:0/2085249984 conn(0x7fca680faa70 msgr2=0x7fca681009a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:17.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.301+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 shutdown_connections 2026-03-10T14:12:17.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.301+0000 7fca6ff5c700 1 -- 192.168.123.103:0/2085249984 wait complete. 
2026-03-10T14:12:17.301 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:12:17.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:17 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 117/336 objects degraded (34.821%), 1 pg degraded, 3 pgs undersized (PG_DEGRADED) 2026-03-10T14:12:17.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.385+0000 7fab760c8700 1 -- 192.168.123.103:0/1332802132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab700737b0 msgr2=0x7fab70073c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.385+0000 7fab760c8700 1 --2- 192.168.123.103:0/1332802132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab700737b0 0x7fab70073c20 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fab58009b50 tx=0x7fab58009e60 comp rx=0 tx=0).stop 2026-03-10T14:12:17.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.385+0000 7fab760c8700 1 -- 192.168.123.103:0/1332802132 shutdown_connections 2026-03-10T14:12:17.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.385+0000 7fab760c8700 1 --2- 192.168.123.103:0/1332802132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab700737b0 0x7fab70073c20 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.385+0000 7fab760c8700 1 --2- 192.168.123.103:0/1332802132 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab70074d80 0x7fab700731e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.385+0000 7fab760c8700 1 -- 192.168.123.103:0/1332802132 >> 192.168.123.103:0/1332802132 conn(0x7fab700fbaa0 msgr2=0x7fab700fdf10 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T14:12:17.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.385+0000 7fab760c8700 1 -- 192.168.123.103:0/1332802132 shutdown_connections 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.386+0000 7fab760c8700 1 -- 192.168.123.103:0/1332802132 wait complete. 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.386+0000 7fab760c8700 1 Processor -- start 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.386+0000 7fab760c8700 1 -- start start 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.386+0000 7fab760c8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab700737b0 0x7fab70071d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.386+0000 7fab760c8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab70074d80 0x7fab70072260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.386+0000 7fab760c8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab70072880 con 0x7fab70074d80 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.386+0000 7fab760c8700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab700729c0 con 0x7fab700737b0 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab6f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab700737b0 0x7fab70071d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab6f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab700737b0 0x7fab70071d20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:34306/0 (socket says 192.168.123.103:34306) 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab6f7fe700 1 -- 192.168.123.103:0/1848718338 learned_addr learned my addr 192.168.123.103:0/1848718338 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:17.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab66dff700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab70074d80 0x7fab70072260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:17.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab6f7fe700 1 -- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab70074d80 msgr2=0x7fab70072260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab6f7fe700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab70074d80 0x7fab70072260 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab6f7fe700 1 -- 192.168.123.103:0/1848718338 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab580097e0 con 0x7fab700737b0 2026-03-10T14:12:17.386 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab66dff700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab70074d80 0x7fab70072260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T14:12:17.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.387+0000 7fab6f7fe700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab700737b0 0x7fab70071d20 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fab6000ebf0 tx=0x7fab6000ef00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:17.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.388+0000 7fab6d7fa700 1 -- 192.168.123.103:0/1848718338 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab6000cc30 con 0x7fab700737b0 2026-03-10T14:12:17.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.388+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fab701087f0 con 0x7fab700737b0 2026-03-10T14:12:17.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.388+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fab70108d40 con 0x7fab700737b0 2026-03-10T14:12:17.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.388+0000 7fab6d7fa700 1 -- 192.168.123.103:0/1848718338 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fab6000cd90 con 0x7fab700737b0 2026-03-10T14:12:17.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.388+0000 7fab6d7fa700 1 -- 192.168.123.103:0/1848718338 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab60010640 con 0x7fab700737b0 2026-03-10T14:12:17.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.389+0000 7fab6d7fa700 1 -- 192.168.123.103:0/1848718338 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fab60004830 con 0x7fab700737b0 2026-03-10T14:12:17.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.390+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fab50005320 con 0x7fab700737b0 2026-03-10T14:12:17.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.390+0000 7fab6d7fa700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fab5c0778c0 0x7fab5c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.390+0000 7fab66dff700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fab5c0778c0 0x7fab5c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:17.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.391+0000 7fab6d7fa700 1 -- 192.168.123.103:0/1848718338 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6194+0+0 (secure 0 0 0) 0x7fab60014070 con 0x7fab700737b0 2026-03-10T14:12:17.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.391+0000 7fab66dff700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fab5c0778c0 0x7fab5c079d70 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fab58005b40 tx=0x7fab58005a90 comp rx=0 tx=0).ready 
entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:17.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.393+0000 7fab6d7fa700 1 -- 192.168.123.103:0/1848718338 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fab60062620 con 0x7fab700737b0 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.534+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fab50000bf0 con 0x7fab5c0778c0 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.535+0000 7fab6d7fa700 1 -- 192.168.123.103:0/1848718338 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fab50000bf0 con 0x7fab5c0778c0 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:12:17.534 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:12:17.535 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T14:12:17.535 
INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:12:17.535 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:12:17.535 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.538+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fab5c0778c0 msgr2=0x7fab5c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.538+0000 7fab760c8700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fab5c0778c0 0x7fab5c079d70 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fab58005b40 tx=0x7fab58005a90 comp rx=0 tx=0).stop 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.538+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab700737b0 msgr2=0x7fab70071d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.538+0000 7fab760c8700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab700737b0 0x7fab70071d20 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fab6000ebf0 tx=0x7fab6000ef00 comp rx=0 tx=0).stop 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.539+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 shutdown_connections 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.539+0000 7fab760c8700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fab5c0778c0 0x7fab5c079d70 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.539+0000 7fab760c8700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fab700737b0 0x7fab70071d20 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.539+0000 7fab760c8700 1 --2- 192.168.123.103:0/1848718338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fab70074d80 0x7fab70072260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.539+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 >> 192.168.123.103:0/1848718338 conn(0x7fab700fbaa0 msgr2=0x7fab70101ec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:17.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.539+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 shutdown_connections 2026-03-10T14:12:17.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.539+0000 7fab760c8700 1 -- 192.168.123.103:0/1848718338 wait complete. 
2026-03-10T14:12:17.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.616+0000 7f718136f700 1 -- 192.168.123.103:0/3028501778 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c101540 msgr2=0x7f717c1039c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.616+0000 7f718136f700 1 --2- 192.168.123.103:0/3028501778 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c101540 0x7f717c1039c0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f716c009b00 tx=0x7f716c009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:17.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.616+0000 7f718136f700 1 -- 192.168.123.103:0/3028501778 shutdown_connections 2026-03-10T14:12:17.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.616+0000 7f718136f700 1 --2- 192.168.123.103:0/3028501778 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c101540 0x7f717c1039c0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.616+0000 7f718136f700 1 --2- 192.168.123.103:0/3028501778 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f717c0febe0 0x7f717c101000 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.616+0000 7f718136f700 1 -- 192.168.123.103:0/3028501778 >> 192.168.123.103:0/3028501778 conn(0x7f717c0fa7f0 msgr2=0x7f717c0fcc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:17.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.617+0000 7f718136f700 1 -- 192.168.123.103:0/3028501778 shutdown_connections 2026-03-10T14:12:17.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.617+0000 7f718136f700 1 -- 192.168.123.103:0/3028501778 
wait complete. 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.617+0000 7f718136f700 1 Processor -- start 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.617+0000 7f718136f700 1 -- start start 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.617+0000 7f718136f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c0febe0 0x7f717c19c170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f718136f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f717c101540 0x7f717c19c6b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f718136f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f717c19ccd0 con 0x7f717c0febe0 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f718136f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f717c19ce10 con 0x7f717c101540 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f717a7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f717c101540 0x7f717c19c6b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f717a7fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f717c101540 0x7f717c19c6b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.103:34318/0 (socket says 192.168.123.103:34318) 2026-03-10T14:12:17.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f717a7fc700 1 -- 192.168.123.103:0/858652358 learned_addr learned my addr 192.168.123.103:0/858652358 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:17.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f717affd700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c0febe0 0x7f717c19c170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:17.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f717a7fc700 1 -- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c0febe0 msgr2=0x7f717c19c170 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f717a7fc700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c0febe0 0x7f717c19c170 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f717a7fc700 1 -- 192.168.123.103:0/858652358 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f716c0097e0 con 0x7f717c101540 2026-03-10T14:12:17.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.618+0000 7f717affd700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c0febe0 0x7f717c19c170 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:12:17.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.619+0000 7f717a7fc700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f717c101540 0x7f717c19c6b0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f716c00b5c0 tx=0x7f716c004a40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:17.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.619+0000 7f7173fff700 1 -- 192.168.123.103:0/858652358 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f716c01d070 con 0x7f717c101540 2026-03-10T14:12:17.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.619+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f717c1a1860 con 0x7f717c101540 2026-03-10T14:12:17.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.619+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f717c1a1d50 con 0x7f717c101540 2026-03-10T14:12:17.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.619+0000 7f7173fff700 1 -- 192.168.123.103:0/858652358 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f716c00bc50 con 0x7f717c101540 2026-03-10T14:12:17.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.619+0000 7f7173fff700 1 -- 192.168.123.103:0/858652358 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f716c00f740 con 0x7f717c101540 2026-03-10T14:12:17.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.620+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f717c1962f0 con 0x7f717c101540 2026-03-10T14:12:17.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.621+0000 7f7173fff700 1 -- 192.168.123.103:0/858652358 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f716c022a50 con 0x7f717c101540 2026-03-10T14:12:17.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.621+0000 7f7173fff700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7168077910 0x7f7168079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:17.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.621+0000 7f7173fff700 1 -- 192.168.123.103:0/858652358 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(71..71 src has 1..71) v4 ==== 6194+0+0 (secure 0 0 0) 0x7f716c09b2c0 con 0x7f717c101540 2026-03-10T14:12:17.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.621+0000 7f717affd700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7168077910 0x7f7168079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:17.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.622+0000 7f717affd700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7168077910 0x7f7168079dc0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f7164005fd0 tx=0x7f7164005f00 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:17.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.623+0000 7f7173fff700 1 -- 192.168.123.103:0/858652358 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f716c063be0 con 0x7f717c101540 2026-03-10T14:12:17.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.799+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f717c02d0b0 con 0x7f717c101540 2026-03-10T14:12:17.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.800+0000 7f7173fff700 1 -- 192.168.123.103:0/858652358 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+563 (secure 0 0 0) 0x7f717c02d0b0 con 0x7f717c101540 2026-03-10T14:12:17.799 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 117/336 objects degraded (34.821%), 1 pg degraded, 3 pgs undersized 2026-03-10T14:12:17.799 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 117/336 objects degraded (34.821%), 1 pg degraded, 3 pgs undersized 2026-03-10T14:12:17.799 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-10T14:12:17.799 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.10 is stuck undersized for 2m, current state active+recovering+undersized+remapped, last acting [5,1] 2026-03-10T14:12:17.799 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is stuck undersized for 2m, current state active+recovering+undersized+remapped, last acting [3,4] 2026-03-10T14:12:17.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.803+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7168077910 msgr2=0x7f7168079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.803+0000 7f718136f700 1 --2- 
192.168.123.103:0/858652358 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7168077910 0x7f7168079dc0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f7164005fd0 tx=0x7f7164005f00 comp rx=0 tx=0).stop 2026-03-10T14:12:17.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.803+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f717c101540 msgr2=0x7f717c19c6b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:17.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.803+0000 7f718136f700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f717c101540 0x7f717c19c6b0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f716c00b5c0 tx=0x7f716c004a40 comp rx=0 tx=0).stop 2026-03-10T14:12:17.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.804+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 shutdown_connections 2026-03-10T14:12:17.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.804+0000 7f718136f700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7168077910 0x7f7168079dc0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.804+0000 7f718136f700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f717c0febe0 0x7f717c19c170 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:17.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.804+0000 7f718136f700 1 --2- 192.168.123.103:0/858652358 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f717c101540 0x7f717c19c6b0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:12:17.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.804+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 >> 192.168.123.103:0/858652358 conn(0x7f717c0fa7f0 msgr2=0x7f717c0fcc40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:17.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.804+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 shutdown_connections 2026-03-10T14:12:17.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:17.804+0000 7f718136f700 1 -- 192.168.123.103:0/858652358 wait complete. 2026-03-10T14:12:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:18 vm04.local ceph-mon[92084]: from='client.34290 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:18 vm04.local ceph-mon[92084]: from='client.44203 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:18 vm04.local ceph-mon[92084]: pgmap v129: 65 pgs: 2 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 117/336 objects degraded (34.821%); 0 B/s, 10 objects/s recovering 2026-03-10T14:12:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:18 vm04.local ceph-mon[92084]: from='client.44207 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:18 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/592798613' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:18 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2085249984' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:12:18.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:18 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/858652358' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:12:18.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:18 vm03.local ceph-mon[103098]: from='client.34290 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:18.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:18 vm03.local ceph-mon[103098]: from='client.44203 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:18.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:18 vm03.local ceph-mon[103098]: pgmap v129: 65 pgs: 2 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 117/336 objects degraded (34.821%); 0 B/s, 10 objects/s recovering 2026-03-10T14:12:18.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:18 vm03.local ceph-mon[103098]: from='client.44207 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:18.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:18 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/592798613' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:18.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:18 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2085249984' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:12:18.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:18 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/858652358' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:12:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:19 vm04.local ceph-mon[92084]: from='client.44215 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:19.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:19.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:19 vm03.local ceph-mon[103098]: from='client.44215 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:19.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:20.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:20 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:20.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:20 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-10T14:12:20.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:20 vm03.local ceph-mon[103098]: pgmap v130: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 117/336 objects degraded (34.821%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:20 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:20 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline) 2026-03-10T14:12:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:20 vm04.local ceph-mon[92084]: pgmap v130: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 117/336 objects degraded (34.821%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:22.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:22 vm04.local ceph-mon[92084]: pgmap v131: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 117/336 objects degraded (34.821%); 0 B/s, 5 objects/s recovering 2026-03-10T14:12:22.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:22 vm03.local ceph-mon[103098]: pgmap v131: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 117/336 objects degraded (34.821%); 0 B/s, 5 objects/s recovering 2026-03-10T14:12:24.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:24 vm03.local ceph-mon[103098]: pgmap v132: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s; 117/336 objects degraded (34.821%); 0 B/s, 8 objects/s recovering 2026-03-10T14:12:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:24 vm04.local ceph-mon[92084]: pgmap v132: 65 pgs: 1 active+recovering+undersized+remapped, 1 
active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s; 117/336 objects degraded (34.821%); 0 B/s, 8 objects/s recovering 2026-03-10T14:12:26.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:26 vm03.local ceph-mon[103098]: pgmap v133: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 117/336 objects degraded (34.821%); 0 B/s, 8 objects/s recovering 2026-03-10T14:12:26.365 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:26 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 117/336 objects degraded (34.821%), 1 pg degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-10T14:12:26.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:26 vm04.local ceph-mon[92084]: pgmap v133: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 117/336 objects degraded (34.821%); 0 B/s, 8 objects/s recovering 2026-03-10T14:12:26.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:26 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 117/336 objects degraded (34.821%), 1 pg degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-10T14:12:27.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:27 vm03.local ceph-mon[103098]: osdmap e72: 6 total, 6 up, 6 in 2026-03-10T14:12:27.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:27 vm04.local ceph-mon[92084]: osdmap e72: 6 total, 6 up, 6 in 2026-03-10T14:12:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:28 vm03.local ceph-mon[103098]: pgmap v135: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 
KiB/s rd, 3 op/s; 0 B/s, 9 objects/s recovering 2026-03-10T14:12:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:28 vm03.local ceph-mon[103098]: osdmap e73: 6 total, 6 up, 6 in 2026-03-10T14:12:28.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:28 vm04.local ceph-mon[92084]: pgmap v135: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 0 B/s, 9 objects/s recovering 2026-03-10T14:12:28.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:28 vm04.local ceph-mon[92084]: osdmap e73: 6 total, 6 up, 6 in 2026-03-10T14:12:30.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:30 vm03.local ceph-mon[103098]: pgmap v137: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 0 B/s, 10 objects/s recovering 2026-03-10T14:12:30.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:30 vm04.local ceph-mon[92084]: pgmap v137: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 0 B/s, 10 objects/s recovering 2026-03-10T14:12:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:31 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1 pg undersized (PG_DEGRADED) 2026-03-10T14:12:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:12:31.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:31 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1 pg undersized (PG_DEGRADED) 2026-03-10T14:12:31.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:31 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:12:32.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:32 vm04.local ceph-mon[92084]: pgmap v138: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 0 B/s, 5 objects/s recovering 2026-03-10T14:12:32.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:32 vm03.local ceph-mon[103098]: pgmap v138: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.2 KiB/s rd, 2 op/s; 0 B/s, 5 objects/s recovering 2026-03-10T14:12:33.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:33 vm03.local ceph-mon[103098]: pgmap v139: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 0 B/s, 11 objects/s recovering 2026-03-10T14:12:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:33 vm04.local ceph-mon[92084]: pgmap v139: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 0 B/s, 11 objects/s recovering 2026-03-10T14:12:34.403 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:34.403 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:34.404 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-mon[103098]: Upgrade: osd.1 is safe to restart 2026-03-10T14:12:34.404 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-mon[103098]: Upgrade: Updating osd.1 2026-03-10T14:12:34.404 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:34.404 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T14:12:34.404 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:12:34.404 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-mon[103098]: Deploying daemon osd.1 on vm03 2026-03-10T14:12:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:34 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:34 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T14:12:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:34 vm04.local ceph-mon[92084]: Upgrade: osd.1 is safe to restart 2026-03-10T14:12:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:34 vm04.local ceph-mon[92084]: Upgrade: Updating osd.1 2026-03-10T14:12:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:34 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:34 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T14:12:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:34 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:12:34.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:34 vm04.local ceph-mon[92084]: Deploying daemon osd.1 on vm03 2026-03-10T14:12:34.858 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:34 vm03.local systemd[1]: Stopping Ceph osd.1 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:12:34.858 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[73415]: 2026-03-10T14:12:34.517+0000 7ff010e3b700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:12:34.858 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[73415]: 2026-03-10T14:12:34.517+0000 7ff010e3b700 -1 osd.1 73 *** Got signal Terminated *** 2026-03-10T14:12:34.858 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:34 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[73415]: 2026-03-10T14:12:34.517+0000 7ff010e3b700 -1 osd.1 73 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:12:35.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:35 vm04.local ceph-mon[92084]: osd.1 marked itself down and dead 2026-03-10T14:12:35.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:35 vm04.local ceph-mon[92084]: pgmap v140: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.3 KiB/s rd, 4 op/s; 0 B/s, 10 objects/s recovering 2026-03-10T14:12:35.564 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:35 vm03.local ceph-mon[103098]: osd.1 marked itself down and dead 2026-03-10T14:12:35.564 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:35 vm03.local ceph-mon[103098]: pgmap v140: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 2.3 KiB/s rd, 4 op/s; 0 B/s, 10 objects/s recovering 2026-03-10T14:12:35.564 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[113979]: 2026-03-10 14:12:35.354496713 +0000 UTC m=+0.853244775 container died 
ba323e54dbc0c55deabce00d21564a461df64efae84fb22637f114f0b38e6cd5 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1, GIT_BRANCH=HEAD, io.buildah.version=1.29.1, CEPH_POINT_RELEASE=-18.2.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, ceph=True, org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, GIT_CLEAN=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, maintainer=Guillaume Abrioux ) 2026-03-10T14:12:35.564 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[113979]: 2026-03-10 14:12:35.382920652 +0000 UTC m=+0.881668705 container remove ba323e54dbc0c55deabce00d21564a461df64efae84fb22637f114f0b38e6cd5 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, org.label-schema.license=GPLv2, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1) 2026-03-10T14:12:35.564 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local bash[113979]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[114048]: 2026-03-10 14:12:35.566112921 +0000 UTC m=+0.022358687 container create 5d2f8363718eb6aafeae9281dcf85c5303ecff3e0f84a656ece01332f230053e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[114048]: 2026-03-10 14:12:35.62445672 +0000 UTC m=+0.080702476 container init 5d2f8363718eb6aafeae9281dcf85c5303ecff3e0f84a656ece01332f230053e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[114048]: 2026-03-10 14:12:35.627937317 +0000 UTC m=+0.084183083 container start 5d2f8363718eb6aafeae9281dcf85c5303ecff3e0f84a656ece01332f230053e 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[114048]: 2026-03-10 14:12:35.628888858 +0000 UTC m=+0.085134624 container attach 5d2f8363718eb6aafeae9281dcf85c5303ecff3e0f84a656ece01332f230053e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[114048]: 2026-03-10 14:12:35.557344135 +0000 UTC 
m=+0.013589901 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[114048]: 2026-03-10 14:12:35.765958814 +0000 UTC m=+0.222204580 container died 5d2f8363718eb6aafeae9281dcf85c5303ecff3e0f84a656ece01332f230053e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local podman[114048]: 2026-03-10 14:12:35.797084531 +0000 UTC m=+0.253330287 container remove 5d2f8363718eb6aafeae9281dcf85c5303ecff3e0f84a656ece01332f230053e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.1.service: Deactivated successfully. 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local systemd[1]: Stopped Ceph osd.1 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:12:35.824 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:35 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.1.service: Consumed 49.773s CPU time. 2026-03-10T14:12:36.108 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local systemd[1]: Starting Ceph osd.1 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:12:36.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:36 vm04.local ceph-mon[92084]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:12:36.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:36 vm04.local ceph-mon[92084]: osdmap e74: 6 total, 5 up, 6 in 2026-03-10T14:12:36.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:36 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:12:36.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-mon[103098]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:12:36.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-mon[103098]: osdmap e74: 6 total, 5 up, 6 in 2026-03-10T14:12:36.609 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 
2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local podman[114164]: 2026-03-10 14:12:36.14100781 +0000 UTC m=+0.024572261 container create 5a4e84e192448be6c0917b493127f97c907737e0bcf938808b39a3a2d9e231f9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local podman[114164]: 2026-03-10 14:12:36.193780389 +0000 UTC m=+0.077344851 container init 5a4e84e192448be6c0917b493127f97c907737e0bcf938808b39a3a2d9e231f9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, 
org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local podman[114164]: 2026-03-10 14:12:36.197472161 +0000 UTC m=+0.081036612 container start 5a4e84e192448be6c0917b493127f97c907737e0bcf938808b39a3a2d9e231f9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local podman[114164]: 2026-03-10 14:12:36.20012636 +0000 UTC m=+0.083690811 container attach 5a4e84e192448be6c0917b493127f97c907737e0bcf938808b39a3a2d9e231f9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate, ceph=True, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local podman[114164]: 2026-03-10 14:12:36.127254152 +0000 UTC m=+0.010818603 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local bash[114164]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:36.609 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local bash[114164]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local bash[114164]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:12:37.062 
INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local bash[114164]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local bash[114164]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local bash[114164]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-a04afd93-3bbe-40d3-af89-92ea0436d7b3/osd-block-0af1d772-f786-4e93-8006-866ee43f51d1 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:36 vm03.local bash[114164]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-a04afd93-3bbe-40d3-af89-92ea0436d7b3/osd-block-0af1d772-f786-4e93-8006-866ee43f51d1 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-10T14:12:37.062 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/ln -snf /dev/ceph-a04afd93-3bbe-40d3-af89-92ea0436d7b3/osd-block-0af1d772-f786-4e93-8006-866ee43f51d1 /var/lib/ceph/osd/ceph-1/block 2026-03-10T14:12:37.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 
10 14:12:37 vm03.local ceph-mon[103098]: osdmap e75: 6 total, 5 up, 6 in 2026-03-10T14:12:37.314 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:37 vm03.local ceph-mon[103098]: pgmap v143: 65 pgs: 12 active+undersized, 1 active+recovering+undersized+remapped, 9 stale+active+clean, 5 active+undersized+degraded, 38 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 2 op/s; 18/336 objects degraded (5.357%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:37.314 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local bash[114164]: Running command: /usr/bin/ln -snf /dev/ceph-a04afd93-3bbe-40d3-af89-92ea0436d7b3/osd-block-0af1d772-f786-4e93-8006-866ee43f51d1 /var/lib/ceph/osd/ceph-1/block 2026-03-10T14:12:37.314 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-10T14:12:37.314 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local bash[114164]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-10T14:12:37.314 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T14:12:37.314 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local bash[114164]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T14:12:37.315 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T14:12:37.315 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local bash[114164]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T14:12:37.315 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 
vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate[114175]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-10T14:12:37.315 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local bash[114164]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-10T14:12:37.315 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local podman[114164]: 2026-03-10 14:12:37.090683682 +0000 UTC m=+0.974248133 container died 5a4e84e192448be6c0917b493127f97c907737e0bcf938808b39a3a2d9e231f9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:12:37.315 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local podman[114164]: 2026-03-10 14:12:37.198923639 +0000 UTC m=+1.082488090 container remove 5a4e84e192448be6c0917b493127f97c907737e0bcf938808b39a3a2d9e231f9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-activate, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T14:12:37.315 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local podman[114435]: 2026-03-10 14:12:37.290759316 +0000 UTC m=+0.018348630 container create bf01c6df212002144d9e3f7fb6a019e7c690e0d146a614e0aa020a3dee497632 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T14:12:37.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:37 vm04.local ceph-mon[92084]: osdmap e75: 6 total, 5 up, 6 in 2026-03-10T14:12:37.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:37 vm04.local ceph-mon[92084]: pgmap v143: 65 pgs: 12 active+undersized, 1 active+recovering+undersized+remapped, 9 stale+active+clean, 5 active+undersized+degraded, 38 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 2 op/s; 18/336 objects degraded (5.357%); 0 B/s, 11 objects/s recovering 
2026-03-10T14:12:37.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local podman[114435]: 2026-03-10 14:12:37.332469447 +0000 UTC m=+0.060058761 container init bf01c6df212002144d9e3f7fb6a019e7c690e0d146a614e0aa020a3dee497632 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default) 2026-03-10T14:12:37.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local podman[114435]: 2026-03-10 14:12:37.336896245 +0000 UTC m=+0.064485559 container start bf01c6df212002144d9e3f7fb6a019e7c690e0d146a614e0aa020a3dee497632 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223) 2026-03-10T14:12:37.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local bash[114435]: bf01c6df212002144d9e3f7fb6a019e7c690e0d146a614e0aa020a3dee497632 2026-03-10T14:12:37.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local podman[114435]: 2026-03-10 14:12:37.283792635 +0000 UTC m=+0.011381958 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:12:37.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local systemd[1]: Started Ceph osd.1 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:12:37.973 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:37 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[114446]: 2026-03-10T14:12:37.682+0000 7f92f0d84740 -1 Falling back to public interface 2026-03-10T14:12:38.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:38 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 18/336 objects degraded (5.357%), 5 pgs degraded, 1 pg undersized (PG_DEGRADED) 2026-03-10T14:12:38.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:38.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:38.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:12:38.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:38 vm03.local ceph-mon[103098]: Health 
check update: Degraded data redundancy: 18/336 objects degraded (5.357%), 5 pgs degraded, 1 pg undersized (PG_DEGRADED) 2026-03-10T14:12:38.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:38.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:38.580 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:12:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:40 vm03.local ceph-mon[103098]: pgmap v144: 65 pgs: 15 active+undersized, 1 active+recovering+undersized+remapped, 3 stale+active+clean, 14 active+undersized+degraded, 32 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 3 op/s; 47/336 objects degraded (13.988%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:40 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:40 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:40 vm03.local ceph-mon[103098]: osdmap e76: 6 total, 5 up, 6 in 2026-03-10T14:12:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:40 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:40 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 
2026-03-10T14:12:40.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:40 vm04.local ceph-mon[92084]: pgmap v144: 65 pgs: 15 active+undersized, 1 active+recovering+undersized+remapped, 3 stale+active+clean, 14 active+undersized+degraded, 32 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 3 op/s; 47/336 objects degraded (13.988%); 0 B/s, 11 objects/s recovering 2026-03-10T14:12:40.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:40 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:40.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:40 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:40.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:40 vm04.local ceph-mon[92084]: osdmap e76: 6 total, 5 up, 6 in 2026-03-10T14:12:40.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:40 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:40.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:40 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: osdmap e77: 6 total, 5 up, 6 in 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: pgmap v147: 65 pgs: 1 peering, 18 active+undersized, 16 active+undersized+degraded, 30 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 53/336 objects degraded (15.774%) 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local 
ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 
vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T14:12:41.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (13 PGs are or would become offline) 2026-03-10T14:12:41.741 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:41 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[114446]: 2026-03-10T14:12:41.741+0000 7f92f0d84740 -1 osd.1 0 read_superblock omap replica is missing. 2026-03-10T14:12:41.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: osdmap e77: 6 total, 5 up, 6 in 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: pgmap v147: 65 pgs: 1 peering, 18 active+undersized, 16 active+undersized+degraded, 30 active+clean; 275 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 53/336 objects degraded (15.774%) 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T14:12:41.817 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:41 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (13 PGs are or would become offline) 2026-03-10T14:12:42.358 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:42 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[114446]: 2026-03-10T14:12:42.059+0000 7f92f0d84740 -1 osd.1 73 log_to_monitors true 2026-03-10T14:12:42.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:42 vm04.local ceph-mon[92084]: from='osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T14:12:42.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:42 vm03.local ceph-mon[103098]: from='osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T14:12:43.793 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:43 vm03.local ceph-mon[103098]: from='osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T14:12:43.793 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:43 vm03.local ceph-mon[103098]: osdmap e78: 6 total, 5 up, 6 in 2026-03-10T14:12:43.793 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:43 vm03.local ceph-mon[103098]: from='osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-10T14:12:43.793 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:43 vm03.local ceph-mon[103098]: pgmap v149: 65 pgs: 1 peering, 18 
active+undersized, 16 active+undersized+degraded, 30 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 2 op/s; 53/336 objects degraded (15.774%) 2026-03-10T14:12:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:43 vm04.local ceph-mon[92084]: from='osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T14:12:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:43 vm04.local ceph-mon[92084]: osdmap e78: 6 total, 5 up, 6 in 2026-03-10T14:12:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:43 vm04.local ceph-mon[92084]: from='osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-10T14:12:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:43 vm04.local ceph-mon[92084]: pgmap v149: 65 pgs: 1 peering, 18 active+undersized, 16 active+undersized+degraded, 30 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 2 op/s; 53/336 objects degraded (15.774%) 2026-03-10T14:12:44.107 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:12:43 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[114446]: 2026-03-10T14:12:43.793+0000 7f92e831d640 -1 osd.1 73 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:12:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:44 vm03.local ceph-mon[103098]: from='osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421]' entity='osd.1' 2026-03-10T14:12:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:44 vm03.local ceph-mon[103098]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T14:12:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 
10 14:12:44 vm03.local ceph-mon[103098]: osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421] boot 2026-03-10T14:12:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:44 vm03.local ceph-mon[103098]: osdmap e79: 6 total, 6 up, 6 in 2026-03-10T14:12:45.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:44 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:12:45.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:44 vm04.local ceph-mon[92084]: from='osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421]' entity='osd.1' 2026-03-10T14:12:45.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:44 vm04.local ceph-mon[92084]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T14:12:45.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:44 vm04.local ceph-mon[92084]: osd.1 [v2:192.168.123.103:6810/2060038421,v1:192.168.123.103:6811/2060038421] boot 2026-03-10T14:12:45.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:44 vm04.local ceph-mon[92084]: osdmap e79: 6 total, 6 up, 6 in 2026-03-10T14:12:45.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:44 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T14:12:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:45 vm03.local ceph-mon[103098]: pgmap v150: 65 pgs: 1 peering, 18 active+undersized, 16 active+undersized+degraded, 30 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 1 op/s; 53/336 objects degraded (15.774%) 2026-03-10T14:12:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:45 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 53/336 objects degraded (15.774%), 16 pgs degraded (PG_DEGRADED) 
2026-03-10T14:12:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:12:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:45 vm03.local ceph-mon[103098]: osdmap e80: 6 total, 6 up, 6 in 2026-03-10T14:12:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:45 vm04.local ceph-mon[92084]: pgmap v150: 65 pgs: 1 peering, 18 active+undersized, 16 active+undersized+degraded, 30 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 0 B/s rd, 1 op/s; 53/336 objects degraded (15.774%) 2026-03-10T14:12:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:45 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 53/336 objects degraded (15.774%), 16 pgs degraded (PG_DEGRADED) 2026-03-10T14:12:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:12:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:45 vm04.local ceph-mon[92084]: osdmap e80: 6 total, 6 up, 6 in 2026-03-10T14:12:47.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.882+0000 7f035dfa7700 1 -- 192.168.123.103:0/4081122372 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f03581036d0 msgr2=0x7f0358103b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:47.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.882+0000 7f035dfa7700 1 --2- 192.168.123.103:0/4081122372 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f03581036d0 0x7f0358103b20 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f034c009b00 tx=0x7f034c009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:47.882 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.883+0000 7f035dfa7700 1 -- 192.168.123.103:0/4081122372 shutdown_connections 2026-03-10T14:12:47.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.883+0000 7f035dfa7700 1 --2- 192.168.123.103:0/4081122372 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f03581036d0 0x7f0358103b20 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:47.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.883+0000 7f035dfa7700 1 --2- 192.168.123.103:0/4081122372 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f03581024d0 0x7f03581028e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:47.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.883+0000 7f035dfa7700 1 -- 192.168.123.103:0/4081122372 >> 192.168.123.103:0/4081122372 conn(0x7f03580fda80 msgr2=0x7f03580ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:47.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.884+0000 7f035dfa7700 1 -- 192.168.123.103:0/4081122372 shutdown_connections 2026-03-10T14:12:47.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.884+0000 7f035dfa7700 1 -- 192.168.123.103:0/4081122372 wait complete. 
2026-03-10T14:12:47.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.884+0000 7f035dfa7700 1 Processor -- start 2026-03-10T14:12:47.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.884+0000 7f035dfa7700 1 -- start start 2026-03-10T14:12:47.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.885+0000 7f035dfa7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f03581024d0 0x7f0358078b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:47.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.885+0000 7f035dfa7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f03581036d0 0x7f0358079040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:47.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.885+0000 7f035dfa7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f03580755f0 con 0x7f03581024d0 2026-03-10T14:12:47.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.885+0000 7f035dfa7700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0358075760 con 0x7f03581036d0 2026-03-10T14:12:47.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.885+0000 7f0357fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f03581036d0 0x7f0358079040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:47.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.885+0000 7f0357fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f03581036d0 0x7f0358079040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:41254/0 (socket says 192.168.123.103:41254) 2026-03-10T14:12:47.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.885+0000 7f0357fff700 1 -- 192.168.123.103:0/1075373288 learned_addr learned my addr 192.168.123.103:0/1075373288 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:47.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.886+0000 7f0357fff700 1 -- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f03581024d0 msgr2=0x7f0358078b00 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:12:47.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.886+0000 7f035cfa5700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f03581024d0 0x7f0358078b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:47.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.886+0000 7f0357fff700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f03581024d0 0x7f0358078b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:47.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.886+0000 7f0357fff700 1 -- 192.168.123.103:0/1075373288 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0348009710 con 0x7f03581036d0 2026-03-10T14:12:47.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.886+0000 7f0357fff700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f03581036d0 0x7f0358079040 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f034c009fd0 tx=0x7f034c004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:12:47.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.887+0000 7f0355ffb700 1 -- 192.168.123.103:0/1075373288 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f034c01d070 con 0x7f03581036d0 2026-03-10T14:12:47.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.887+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f034c0097e0 con 0x7f03581036d0 2026-03-10T14:12:47.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.887+0000 7f0355ffb700 1 -- 192.168.123.103:0/1075373288 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f034c00bb40 con 0x7f03581036d0 2026-03-10T14:12:47.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.887+0000 7f0355ffb700 1 -- 192.168.123.103:0/1075373288 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f034c00f670 con 0x7f03581036d0 2026-03-10T14:12:47.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.887+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0358075d40 con 0x7f03581036d0 2026-03-10T14:12:47.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.889+0000 7f0355ffb700 1 -- 192.168.123.103:0/1075373288 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f034c00f8e0 con 0x7f03581036d0 2026-03-10T14:12:47.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.889+0000 7f0355ffb700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f03400778e0 0x7f0340079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:47.888 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.889+0000 7f0355ffb700 1 -- 192.168.123.103:0/1075373288 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f034c09b180 con 0x7f03581036d0 2026-03-10T14:12:47.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.889+0000 7f035cfa5700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f03400778e0 0x7f0340079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:47.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.889+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f035804ea50 con 0x7f03581036d0 2026-03-10T14:12:47.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.890+0000 7f035cfa5700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f03400778e0 0x7f0340079d90 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f0358103530 tx=0x7f0348009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:47.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:47.893+0000 7f0355ffb700 1 -- 192.168.123.103:0/1075373288 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f034c027080 con 0x7f03581036d0 2026-03-10T14:12:48.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.026+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f0358108020 con 0x7f03400778e0 2026-03-10T14:12:48.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.027+0000 7f0355ffb700 1 -- 192.168.123.103:0/1075373288 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f0358108020 con 0x7f03400778e0 2026-03-10T14:12:48.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f03400778e0 msgr2=0x7f0340079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f03400778e0 0x7f0340079d90 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f0358103530 tx=0x7f0348009450 comp rx=0 tx=0).stop 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f03581036d0 msgr2=0x7f0358079040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f03581036d0 0x7f0358079040 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f034c009fd0 tx=0x7f034c004970 comp rx=0 tx=0).stop 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 shutdown_connections 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 --2- 192.168.123.103:0/1075373288 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f03400778e0 0x7f0340079d90 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f03581024d0 0x7f0358078b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 --2- 192.168.123.103:0/1075373288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f03581036d0 0x7f0358079040 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 >> 192.168.123.103:0/1075373288 conn(0x7f03580fda80 msgr2=0x7f0358106900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 shutdown_connections 2026-03-10T14:12:48.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.030+0000 7f035dfa7700 1 -- 192.168.123.103:0/1075373288 wait complete. 
2026-03-10T14:12:48.038 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:12:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:47 vm03.local ceph-mon[103098]: pgmap v153: 65 pgs: 2 peering, 7 active+undersized, 11 active+undersized+degraded, 45 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.2 KiB/s rd, 3 op/s; 35/336 objects degraded (10.417%) 2026-03-10T14:12:48.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.114+0000 7f5efe7eb700 1 -- 192.168.123.103:0/3278450057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef80fee00 msgr2=0x7f5ef8101220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.114+0000 7f5efe7eb700 1 --2- 192.168.123.103:0/3278450057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef80fee00 0x7f5ef8101220 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f5eec009b00 tx=0x7f5eec009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:48.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.114+0000 7f5efe7eb700 1 -- 192.168.123.103:0/3278450057 shutdown_connections 2026-03-10T14:12:48.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.114+0000 7f5efe7eb700 1 --2- 192.168.123.103:0/3278450057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef8101760 0x7f5ef8103be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.114+0000 7f5efe7eb700 1 --2- 192.168.123.103:0/3278450057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef80fee00 0x7f5ef8101220 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.114+0000 7f5efe7eb700 1 -- 192.168.123.103:0/3278450057 >> 
192.168.123.103:0/3278450057 conn(0x7f5ef80faa10 msgr2=0x7f5ef80fce60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:48.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.114+0000 7f5efe7eb700 1 -- 192.168.123.103:0/3278450057 shutdown_connections 2026-03-10T14:12:48.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.114+0000 7f5efe7eb700 1 -- 192.168.123.103:0/3278450057 wait complete. 2026-03-10T14:12:48.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.115+0000 7f5efe7eb700 1 Processor -- start 2026-03-10T14:12:48.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.115+0000 7f5efe7eb700 1 -- start start 2026-03-10T14:12:48.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.115+0000 7f5efe7eb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef80fee00 0x7f5ef819c3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5efe7eb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef8101760 0x7f5ef819c930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5efe7eb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ef819cf50 con 0x7f5ef8101760 2026-03-10T14:12:48.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5efe7eb700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ef819d090 con 0x7f5ef80fee00 2026-03-10T14:12:48.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5ef7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef80fee00 0x7f5ef819c3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5ef7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef80fee00 0x7f5ef819c3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:41274/0 (socket says 192.168.123.103:41274) 2026-03-10T14:12:48.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5ef7fff700 1 -- 192.168.123.103:0/88767401 learned_addr learned my addr 192.168.123.103:0/88767401 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:48.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5ef77fe700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef8101760 0x7f5ef819c930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5ef77fe700 1 -- 192.168.123.103:0/88767401 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef80fee00 msgr2=0x7f5ef819c3f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.116+0000 7f5ef77fe700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef80fee00 0x7f5ef819c3f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.117+0000 7f5ef77fe700 1 -- 192.168.123.103:0/88767401 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5eec0097e0 con 0x7f5ef8101760 
2026-03-10T14:12:48.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.117+0000 7f5ef77fe700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef8101760 0x7f5ef819c930 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f5ee800d940 tx=0x7f5ee800dc50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:48.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.117+0000 7f5ef57fa700 1 -- 192.168.123.103:0/88767401 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ee80098e0 con 0x7f5ef8101760 2026-03-10T14:12:48.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.117+0000 7f5ef57fa700 1 -- 192.168.123.103:0/88767401 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5ee800de90 con 0x7f5ef8101760 2026-03-10T14:12:48.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.117+0000 7f5ef57fa700 1 -- 192.168.123.103:0/88767401 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ee800f5d0 con 0x7f5ef8101760 2026-03-10T14:12:48.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.117+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ef81a1b40 con 0x7f5ef8101760 2026-03-10T14:12:48.116 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.117+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ef81a2090 con 0x7f5ef8101760 2026-03-10T14:12:48.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.119+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f5ef8196640 con 0x7f5ef8101760 2026-03-10T14:12:48.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.120+0000 7f5ef57fa700 1 -- 192.168.123.103:0/88767401 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5ee8010460 con 0x7f5ef8101760 2026-03-10T14:12:48.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.121+0000 7f5ef57fa700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5ee00778c0 0x7f5ee0079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.121+0000 7f5ef57fa700 1 -- 192.168.123.103:0/88767401 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f5ee8099170 con 0x7f5ef8101760 2026-03-10T14:12:48.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.123+0000 7f5ef7fff700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5ee00778c0 0x7f5ee0079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.123+0000 7f5ef57fa700 1 -- 192.168.123.103:0/88767401 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5ee8061ae0 con 0x7f5ef8101760 2026-03-10T14:12:48.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.123+0000 7f5ef7fff700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5ee00778c0 0x7f5ee0079d70 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f5eec0052d0 tx=0x7f5eec005fd0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:48.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.266+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5ef81a1cd0 con 0x7f5ee00778c0 2026-03-10T14:12:48.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.268+0000 7f5ef57fa700 1 -- 192.168.123.103:0/88767401 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f5ef81a1cd0 con 0x7f5ee00778c0 2026-03-10T14:12:48.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.271+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5ee00778c0 msgr2=0x7f5ee0079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.271+0000 7f5efe7eb700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5ee00778c0 0x7f5ee0079d70 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f5eec0052d0 tx=0x7f5eec005fd0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.271+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef8101760 msgr2=0x7f5ef819c930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.271+0000 7f5efe7eb700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef8101760 0x7f5ef819c930 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f5ee800d940 tx=0x7f5ee800dc50 comp rx=0 tx=0).stop 2026-03-10T14:12:48.270 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.272+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 shutdown_connections 2026-03-10T14:12:48.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.272+0000 7f5efe7eb700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5ee00778c0 0x7f5ee0079d70 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.272+0000 7f5efe7eb700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5ef80fee00 0x7f5ef819c3f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.272+0000 7f5efe7eb700 1 --2- 192.168.123.103:0/88767401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ef8101760 0x7f5ef819c930 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.272+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 >> 192.168.123.103:0/88767401 conn(0x7f5ef80faa10 msgr2=0x7f5ef80fce60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:48.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.272+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 shutdown_connections 2026-03-10T14:12:48.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.272+0000 7f5efe7eb700 1 -- 192.168.123.103:0/88767401 wait complete. 
2026-03-10T14:12:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:47 vm04.local ceph-mon[92084]: pgmap v153: 65 pgs: 2 peering, 7 active+undersized, 11 active+undersized+degraded, 45 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.2 KiB/s rd, 3 op/s; 35/336 objects degraded (10.417%) 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.343+0000 7f60ff0f9700 1 -- 192.168.123.103:0/1906648295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 msgr2=0x7f60f8102e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.343+0000 7f60ff0f9700 1 --2- 192.168.123.103:0/1906648295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 0x7f60f8102e80 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f60e8009b00 tx=0x7f60e8009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.343+0000 7f60ff0f9700 1 -- 192.168.123.103:0/1906648295 shutdown_connections 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.343+0000 7f60ff0f9700 1 --2- 192.168.123.103:0/1906648295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60f81033c0 0x7f60f8105800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.343+0000 7f60ff0f9700 1 --2- 192.168.123.103:0/1906648295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 0x7f60f8102e80 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.343+0000 7f60ff0f9700 1 -- 192.168.123.103:0/1906648295 >> 192.168.123.103:0/1906648295 conn(0x7f60f80fa7b0 msgr2=0x7f60f80fcc20 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.343+0000 7f60ff0f9700 1 -- 192.168.123.103:0/1906648295 shutdown_connections 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.343+0000 7f60ff0f9700 1 -- 192.168.123.103:0/1906648295 wait complete. 2026-03-10T14:12:48.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.344+0000 7f60ff0f9700 1 Processor -- start 2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.344+0000 7f60ff0f9700 1 -- start start 2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.344+0000 7f60ff0f9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 0x7f60f8193960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.344+0000 7f60ff0f9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60f81033c0 0x7f60f8193ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.344+0000 7f60ff0f9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60f81944c0 con 0x7f60f8069180 2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.344+0000 7f60ff0f9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60f8194600 con 0x7f60f81033c0 2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.344+0000 7f60fd8f6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60f81033c0 0x7f60f8193ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.345+0000 7f60fd8f6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60f81033c0 0x7f60f8193ea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:41296/0 (socket says 192.168.123.103:41296) 2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.345+0000 7f60fd8f6700 1 -- 192.168.123.103:0/3537268993 learned_addr learned my addr 192.168.123.103:0/3537268993 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:48.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.345+0000 7f60fe0f7700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 0x7f60f8193960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.345+0000 7f60fe0f7700 1 -- 192.168.123.103:0/3537268993 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60f81033c0 msgr2=0x7f60f8193ea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.345+0000 7f60fe0f7700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60f81033c0 0x7f60f8193ea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.345+0000 7f60fe0f7700 1 -- 192.168.123.103:0/3537268993 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f60e80097e0 con 0x7f60f8069180 2026-03-10T14:12:48.344 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.345+0000 7f60fe0f7700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 0x7f60f8193960 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f60e8009fd0 tx=0x7f60e8004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:48.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.346+0000 7f60ef7fe700 1 -- 192.168.123.103:0/3537268993 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f60e801d070 con 0x7f60f8069180 2026-03-10T14:12:48.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.346+0000 7f60ef7fe700 1 -- 192.168.123.103:0/3537268993 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f60e800bc50 con 0x7f60f8069180 2026-03-10T14:12:48.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.346+0000 7f60ef7fe700 1 -- 192.168.123.103:0/3537268993 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f60e800f650 con 0x7f60f8069180 2026-03-10T14:12:48.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.346+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f60f8199050 con 0x7f60f8069180 2026-03-10T14:12:48.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.346+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f60f8199510 con 0x7f60f8069180 2026-03-10T14:12:48.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.347+0000 7f60ef7fe700 1 -- 192.168.123.103:0/3537268993 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f60e800f870 con 0x7f60f8069180 
2026-03-10T14:12:48.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.347+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f60f818dad0 con 0x7f60f8069180 2026-03-10T14:12:48.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.348+0000 7f60ef7fe700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f60e407bc80 0x7f60e407e130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.348+0000 7f60ef7fe700 1 -- 192.168.123.103:0/3537268993 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f60e809b210 con 0x7f60f8069180 2026-03-10T14:12:48.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.348+0000 7f60fd8f6700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f60e407bc80 0x7f60e407e130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.349+0000 7f60fd8f6700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f60e407bc80 0x7f60e407e130 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f60f4005950 tx=0x7f60f400b500 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:48.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.352+0000 7f60ef7fe700 1 -- 192.168.123.103:0/3537268993 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f60e8063af0 con 0x7f60f8069180 2026-03-10T14:12:48.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.480+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f60f8061190 con 0x7f60e407bc80 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.489+0000 7f60ef7fe700 1 -- 192.168.123.103:0/3537268993 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f60f8061190 con 0x7f60e407bc80 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (8m) 9s ago 9m 22.0M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (7m) 9s ago 9m 9365k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (9m) 3m ago 9m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 9s ago 9m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (3m) 3m ago 8m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (8m) 9s ago 9m 91.5M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (7m) 9s ago 7m 136M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:12:48.488 
INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (7m) 9s ago 7m 72.7M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (7m) 3m ago 7m 91.2M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (7m) 3m ago 7m 166M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (4m) 9s ago 10m 619M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (4m) 3m ago 8m 496M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 9s ago 10m 60.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (3m) 3m ago 8m 49.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (9m) 9s ago 9m 15.3M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (8m) 3m ago 8m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 9s ago 8m 204M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (11s) 9s ago 8m 15.5M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (8m) 9s ago 8m 348M 4096M 18.2.0 dc2bc1663786 7c08a01b8fe1 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 
running (8m) 3m ago 8m 434M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (7m) 3m ago 7m 397M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (7m) 3m ago 7m 352M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:12:48.488 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (4m) 9s ago 9m 68.9M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:12:48.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.492+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f60e407bc80 msgr2=0x7f60e407e130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.492+0000 7f60ff0f9700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f60e407bc80 0x7f60e407e130 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f60f4005950 tx=0x7f60f400b500 comp rx=0 tx=0).stop 2026-03-10T14:12:48.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.492+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 msgr2=0x7f60f8193960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.492+0000 7f60ff0f9700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 0x7f60f8193960 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f60e8009fd0 tx=0x7f60e8004930 comp rx=0 tx=0).stop 2026-03-10T14:12:48.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.493+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 shutdown_connections 
2026-03-10T14:12:48.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.493+0000 7f60ff0f9700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f60e407bc80 0x7f60e407e130 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.493+0000 7f60ff0f9700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f60f8069180 0x7f60f8193960 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.493+0000 7f60ff0f9700 1 --2- 192.168.123.103:0/3537268993 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f60f81033c0 0x7f60f8193ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.493+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 >> 192.168.123.103:0/3537268993 conn(0x7f60f80fa7b0 msgr2=0x7f60f80fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:48.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.493+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 shutdown_connections 2026-03-10T14:12:48.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.493+0000 7f60ff0f9700 1 -- 192.168.123.103:0/3537268993 wait complete. 
2026-03-10T14:12:48.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.570+0000 7fc262b13700 1 -- 192.168.123.103:0/1338287261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 msgr2=0x7fc25c073c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.570+0000 7fc262b13700 1 --2- 192.168.123.103:0/1338287261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 0x7fc25c073c20 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fc250009b30 tx=0x7fc250009e40 comp rx=0 tx=0).stop 2026-03-10T14:12:48.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.570+0000 7fc262b13700 1 -- 192.168.123.103:0/1338287261 shutdown_connections 2026-03-10T14:12:48.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.570+0000 7fc262b13700 1 --2- 192.168.123.103:0/1338287261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 0x7fc25c073c20 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.570+0000 7fc262b13700 1 --2- 192.168.123.103:0/1338287261 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc25c074d80 0x7fc25c0731e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.570+0000 7fc262b13700 1 -- 192.168.123.103:0/1338287261 >> 192.168.123.103:0/1338287261 conn(0x7fc25c0fbaa0 msgr2=0x7fc25c0fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:48.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.571+0000 7fc262b13700 1 -- 192.168.123.103:0/1338287261 shutdown_connections 2026-03-10T14:12:48.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.571+0000 7fc262b13700 1 -- 192.168.123.103:0/1338287261 
wait complete. 2026-03-10T14:12:48.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.571+0000 7fc262b13700 1 Processor -- start 2026-03-10T14:12:48.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.571+0000 7fc262b13700 1 -- start start 2026-03-10T14:12:48.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc262b13700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 0x7fc25c103860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc262b13700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc25c074d80 0x7fc25c103da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc262b13700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc25c100350 con 0x7fc25c0737b0 2026-03-10T14:12:48.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc262b13700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc25c1004c0 con 0x7fc25c074d80 2026-03-10T14:12:48.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc2608af700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 0x7fc25c103860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc2608af700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 0x7fc25c103860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:52268/0 (socket says 192.168.123.103:52268) 2026-03-10T14:12:48.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc2608af700 1 -- 192.168.123.103:0/3277495743 learned_addr learned my addr 192.168.123.103:0/3277495743 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:48.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc25bfff700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc25c074d80 0x7fc25c103da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc2608af700 1 -- 192.168.123.103:0/3277495743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc25c074d80 msgr2=0x7fc25c103da0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc2608af700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc25c074d80 0x7fc25c103da0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.572+0000 7fc2608af700 1 -- 192.168.123.103:0/3277495743 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc2500097e0 con 0x7fc25c0737b0 2026-03-10T14:12:48.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.573+0000 7fc2608af700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 0x7fc25c103860 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc24800d8d0 tx=0x7fc24800dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T14:12:48.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.573+0000 7fc259ffb700 1 -- 192.168.123.103:0/3277495743 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc248009880 con 0x7fc25c0737b0 2026-03-10T14:12:48.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.573+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc25c1007a0 con 0x7fc25c0737b0 2026-03-10T14:12:48.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.573+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc25c100cf0 con 0x7fc25c0737b0 2026-03-10T14:12:48.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.574+0000 7fc259ffb700 1 -- 192.168.123.103:0/3277495743 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc248010460 con 0x7fc25c0737b0 2026-03-10T14:12:48.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.574+0000 7fc259ffb700 1 -- 192.168.123.103:0/3277495743 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc24800f5d0 con 0x7fc25c0737b0 2026-03-10T14:12:48.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.575+0000 7fc259ffb700 1 -- 192.168.123.103:0/3277495743 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc24800f7b0 con 0x7fc25c0737b0 2026-03-10T14:12:48.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.576+0000 7fc259ffb700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc24c077990 0x7fc24c079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.575 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.576+0000 7fc25bfff700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc24c077990 0x7fc24c079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.577+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc240005320 con 0x7fc25c0737b0 2026-03-10T14:12:48.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.577+0000 7fc259ffb700 1 -- 192.168.123.103:0/3277495743 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc24809a830 con 0x7fc25c0737b0 2026-03-10T14:12:48.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.577+0000 7fc25bfff700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc24c077990 0x7fc24c079e40 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fc250006010 tx=0x7fc250005ab0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:48.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.580+0000 7fc259ffb700 1 -- 192.168.123.103:0/3277495743 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc2480630e0 con 0x7fc25c0737b0 2026-03-10T14:12:48.752 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.753+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc240006200 con 0x7fc25c0737b0 2026-03-10T14:12:48.756 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.758+0000 7fc259ffb700 1 -- 192.168.123.103:0/3277495743 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fc248020330 con 0x7fc25c0737b0 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4, 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 8, 2026-03-10T14:12:48.757 
INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:12:48.757 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:12:48.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.760+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc24c077990 msgr2=0x7fc24c079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.760+0000 7fc262b13700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc24c077990 0x7fc24c079e40 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fc250006010 tx=0x7fc250005ab0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 msgr2=0x7fc25c103860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 0x7fc25c103860 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc24800d8d0 tx=0x7fc24800dbe0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 shutdown_connections 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc24c077990 0x7fc24c079e40 
unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc25c0737b0 0x7fc25c103860 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 --2- 192.168.123.103:0/3277495743 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc25c074d80 0x7fc25c103da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 >> 192.168.123.103:0/3277495743 conn(0x7fc25c0fbaa0 msgr2=0x7fc25c106310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 shutdown_connections 2026-03-10T14:12:48.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.761+0000 7fc262b13700 1 -- 192.168.123.103:0/3277495743 wait complete. 
2026-03-10T14:12:48.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.840+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/3567692822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4103940 msgr2=0x7fc5d4103d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.840+0000 7fc5d8fa7700 1 --2- 192.168.123.103:0/3567692822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4103940 0x7fc5d4103d90 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fc5c4009b00 tx=0x7fc5c4009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:48.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.840+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/3567692822 shutdown_connections 2026-03-10T14:12:48.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.840+0000 7fc5d8fa7700 1 --2- 192.168.123.103:0/3567692822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4103940 0x7fc5d4103d90 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.840+0000 7fc5d8fa7700 1 --2- 192.168.123.103:0/3567692822 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc5d4102740 0x7fc5d4102b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.840+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/3567692822 >> 192.168.123.103:0/3567692822 conn(0x7fc5d40fdcf0 msgr2=0x7fc5d4100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:48.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.840+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/3567692822 shutdown_connections 2026-03-10T14:12:48.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.840+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/3567692822 
wait complete. 2026-03-10T14:12:48.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.841+0000 7fc5d8fa7700 1 Processor -- start 2026-03-10T14:12:48.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.841+0000 7fc5d8fa7700 1 -- start start 2026-03-10T14:12:48.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d8fa7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4102740 0x7fc5d4197f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d8fa7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc5d4103940 0x7fc5d4198480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d8fa7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5d4198ac0 con 0x7fc5d4102740 2026-03-10T14:12:48.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d8fa7700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5d4198c30 con 0x7fc5d4103940 2026-03-10T14:12:48.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4102740 0x7fc5d4197f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4102740 0x7fc5d4197f40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:52288/0 (socket says 192.168.123.103:52288) 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d259c700 1 -- 192.168.123.103:0/366215059 learned_addr learned my addr 192.168.123.103:0/366215059 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d1d9b700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc5d4103940 0x7fc5d4198480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d259c700 1 -- 192.168.123.103:0/366215059 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc5d4103940 msgr2=0x7fc5d4198480 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d259c700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc5d4103940 0x7fc5d4198480 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d259c700 1 -- 192.168.123.103:0/366215059 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc5c40097e0 con 0x7fc5d4102740 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.842+0000 7fc5d259c700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4102740 0x7fc5d4197f40 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fc5c800d900 tx=0x7fc5c800dc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.843+0000 7fc5c37fe700 1 -- 192.168.123.103:0/366215059 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc5c80041d0 con 0x7fc5d4102740 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.843+0000 7fc5c37fe700 1 -- 192.168.123.103:0/366215059 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc5c8004330 con 0x7fc5d4102740 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.843+0000 7fc5c37fe700 1 -- 192.168.123.103:0/366215059 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc5c8003d70 con 0x7fc5d4102740 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.843+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5d4079350 con 0x7fc5d4102740 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.843+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5d4199820 con 0x7fc5d4102740 2026-03-10T14:12:48.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.844+0000 7fc5c37fe700 1 -- 192.168.123.103:0/366215059 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc5c8010460 con 0x7fc5d4102740 2026-03-10T14:12:48.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.845+0000 7fc5c37fe700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc5bc0888d0 0x7fc5bc08ad80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:48.844 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.845+0000 7fc5d1d9b700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc5bc0888d0 0x7fc5bc08ad80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:48.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.846+0000 7fc5c37fe700 1 -- 192.168.123.103:0/366215059 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc5c8021030 con 0x7fc5d4102740 2026-03-10T14:12:48.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.846+0000 7fc5d1d9b700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc5bc0888d0 0x7fc5bc08ad80 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fc5d4199070 tx=0x7fc5c400b540 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:48.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.846+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc5d4066e40 con 0x7fc5d4102740 2026-03-10T14:12:48.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.849+0000 7fc5c37fe700 1 -- 192.168.123.103:0/366215059 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc5c80630c0 con 0x7fc5d4102740 2026-03-10T14:12:48.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.998+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fc5d4199bf0 con 0x7fc5d4102740 2026-03-10T14:12:48.997 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:48.998+0000 7fc5c37fe700 1 -- 192.168.123.103:0/366215059 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 16 v16) v1 ==== 76+0+1954 (secure 0 0 0) 0x7fc5c8062810 con 0x7fc5d4102740 2026-03-10T14:12:48.998 INFO:teuthology.orchestra.run.vm03.stdout:e16 2026-03-10T14:12:48.998 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-10T14:12:48:855184+0000 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:epoch 16 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:12:48.656761+0000 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 
2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 24263 members: 24263,14470 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 join_fscid=1 addr 
[v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:12:48.999 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:12:49.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.002+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc5bc0888d0 msgr2=0x7fc5bc08ad80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.002+0000 7fc5d8fa7700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc5bc0888d0 0x7fc5bc08ad80 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fc5d4199070 tx=0x7fc5c400b540 comp rx=0 tx=0).stop 2026-03-10T14:12:49.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4102740 msgr2=0x7fc5d4197f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.001 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4102740 0x7fc5d4197f40 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fc5c800d900 tx=0x7fc5c800dc10 comp rx=0 tx=0).stop 2026-03-10T14:12:49.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 shutdown_connections 2026-03-10T14:12:49.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc5bc0888d0 0x7fc5bc08ad80 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc5d4102740 0x7fc5d4197f40 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 --2- 192.168.123.103:0/366215059 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc5d4103940 0x7fc5d4198480 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 >> 192.168.123.103:0/366215059 conn(0x7fc5d40fdcf0 msgr2=0x7fc5d4106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:49.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 -- 192.168.123.103:0/366215059 shutdown_connections 2026-03-10T14:12:49.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.003+0000 7fc5d8fa7700 1 -- 
192.168.123.103:0/366215059 wait complete. 2026-03-10T14:12:49.002 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 16 2026-03-10T14:12:49.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:48 vm03.local ceph-mon[103098]: from='client.44227 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:49.003 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:48 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3277495743' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:49.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.075+0000 7f7d807eb700 1 -- 192.168.123.103:0/4133463898 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d780ff710 msgr2=0x7f7d780ffb20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.075+0000 7f7d807eb700 1 --2- 192.168.123.103:0/4133463898 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d780ff710 0x7f7d780ffb20 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f7d74009b00 tx=0x7f7d74009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:49.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.076+0000 7f7d807eb700 1 -- 192.168.123.103:0/4133463898 shutdown_connections 2026-03-10T14:12:49.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.076+0000 7f7d807eb700 1 --2- 192.168.123.103:0/4133463898 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7d781000f0 0x7f7d780fe290 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.076+0000 7f7d807eb700 1 --2- 192.168.123.103:0/4133463898 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d780ff710 0x7f7d780ffb20 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T14:12:49.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.076+0000 7f7d807eb700 1 -- 192.168.123.103:0/4133463898 >> 192.168.123.103:0/4133463898 conn(0x7f7d780f9ce0 msgr2=0x7f7d780fc130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:49.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.076+0000 7f7d807eb700 1 -- 192.168.123.103:0/4133463898 shutdown_connections 2026-03-10T14:12:49.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.076+0000 7f7d807eb700 1 -- 192.168.123.103:0/4133463898 wait complete. 2026-03-10T14:12:49.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 7f7d807eb700 1 Processor -- start 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 7f7d807eb700 1 -- start start 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 7f7d807eb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7d780ff710 0x7f7d78105630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 7f7d807eb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d781000f0 0x7f7d78105b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 7f7d807eb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d78102120 con 0x7f7d780ff710 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 7f7d807eb700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d78102290 con 0x7f7d781000f0 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 
7f7d7dd86700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d781000f0 0x7f7d78105b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 7f7d7e587700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7d780ff710 0x7f7d78105630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.078+0000 7f7d7dd86700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d781000f0 0x7f7d78105b70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:41350/0 (socket says 192.168.123.103:41350) 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.078+0000 7f7d7dd86700 1 -- 192.168.123.103:0/2449399899 learned_addr learned my addr 192.168.123.103:0/2449399899 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.077+0000 7f7d7e587700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7d780ff710 0x7f7d78105630 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52308/0 (socket says 192.168.123.103:52308) 2026-03-10T14:12:49.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.078+0000 7f7d7dd86700 1 -- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7d780ff710 msgr2=0x7f7d78105630 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.077 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.078+0000 7f7d7dd86700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7d780ff710 0x7f7d78105630 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.078+0000 7f7d7dd86700 1 -- 192.168.123.103:0/2449399899 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7d68009710 con 0x7f7d781000f0 2026-03-10T14:12:49.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.078+0000 7f7d7e587700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7d780ff710 0x7f7d78105630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:12:49.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.078+0000 7f7d7dd86700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d781000f0 0x7f7d78105b70 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f7d6800ec80 tx=0x7f7d6800ef90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:49.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.078+0000 7f7d6f7fe700 1 -- 192.168.123.103:0/2449399899 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d6800ccd0 con 0x7f7d781000f0 2026-03-10T14:12:49.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.079+0000 7f7d6f7fe700 1 -- 192.168.123.103:0/2449399899 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7d68004500 con 0x7f7d781000f0 2026-03-10T14:12:49.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.079+0000 7f7d807eb700 1 -- 
192.168.123.103:0/2449399899 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7d740097e0 con 0x7f7d781000f0 2026-03-10T14:12:49.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.079+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7d78102930 con 0x7f7d781000f0 2026-03-10T14:12:49.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.079+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7d78102c10 con 0x7f7d781000f0 2026-03-10T14:12:49.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.079+0000 7f7d6f7fe700 1 -- 192.168.123.103:0/2449399899 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d68005350 con 0x7f7d781000f0 2026-03-10T14:12:49.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.080+0000 7f7d6f7fe700 1 -- 192.168.123.103:0/2449399899 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7d68005580 con 0x7f7d781000f0 2026-03-10T14:12:49.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.081+0000 7f7d6f7fe700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7d640778c0 0x7f7d64079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:49.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.081+0000 7f7d7e587700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7d640778c0 0x7f7d64079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:12:49.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.082+0000 7f7d6f7fe700 1 -- 192.168.123.103:0/2449399899 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f7d68014070 con 0x7f7d781000f0 2026-03-10T14:12:49.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.082+0000 7f7d7e587700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7d640778c0 0x7f7d64079d70 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f7d74005190 tx=0x7f7d7401a040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:49.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.083+0000 7f7d6f7fe700 1 -- 192.168.123.103:0/2449399899 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7d68062920 con 0x7f7d781000f0 2026-03-10T14:12:49.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.211+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7d781a2420 con 0x7f7d640778c0 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.212+0000 7f7d6f7fe700 1 -- 192.168.123.103:0/2449399899 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f7d781a2420 con 0x7f7d640778c0 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 
2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "8/23 daemons upgraded", 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:12:49.211 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7d640778c0 msgr2=0x7f7d64079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7d640778c0 0x7f7d64079d70 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f7d74005190 tx=0x7f7d7401a040 comp rx=0 tx=0).stop 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d781000f0 msgr2=0x7f7d78105b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 --2- 192.168.123.103:0/2449399899 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d781000f0 0x7f7d78105b70 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f7d6800ec80 tx=0x7f7d6800ef90 comp rx=0 tx=0).stop 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 shutdown_connections 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7d640778c0 0x7f7d64079d70 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7d780ff710 0x7f7d78105630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 --2- 192.168.123.103:0/2449399899 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7d781000f0 0x7f7d78105b70 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 >> 192.168.123.103:0/2449399899 conn(0x7f7d780f9ce0 msgr2=0x7f7d781080e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 shutdown_connections 2026-03-10T14:12:49.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.215+0000 7f7d807eb700 1 -- 192.168.123.103:0/2449399899 wait complete. 
2026-03-10T14:12:49.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.286+0000 7fd6048fa700 1 -- 192.168.123.103:0/2107712777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc103a00 msgr2=0x7fd5fc103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.286+0000 7fd6048fa700 1 --2- 192.168.123.103:0/2107712777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc103a00 0x7fd5fc103e70 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7fd5f0009b00 tx=0x7fd5f0009e10 comp rx=0 tx=0).stop 2026-03-10T14:12:49.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.289+0000 7fd6048fa700 1 -- 192.168.123.103:0/2107712777 shutdown_connections 2026-03-10T14:12:49.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.289+0000 7fd6048fa700 1 --2- 192.168.123.103:0/2107712777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc103a00 0x7fd5fc103e70 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.289+0000 7fd6048fa700 1 --2- 192.168.123.103:0/2107712777 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd5fc102760 0x7fd5fc102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.289+0000 7fd6048fa700 1 -- 192.168.123.103:0/2107712777 >> 192.168.123.103:0/2107712777 conn(0x7fd5fc0fddb0 msgr2=0x7fd5fc1001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:49.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.289+0000 7fd6048fa700 1 -- 192.168.123.103:0/2107712777 shutdown_connections 2026-03-10T14:12:49.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.289+0000 7fd6048fa700 1 -- 192.168.123.103:0/2107712777 
wait complete. 2026-03-10T14:12:49.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd6048fa700 1 Processor -- start 2026-03-10T14:12:49.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd6048fa700 1 -- start start 2026-03-10T14:12:49.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd6048fa700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc102760 0x7fd5fc197fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:49.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd6048fa700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd5fc103a00 0x7fd5fc198500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:49.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd6048fa700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5fc198a90 con 0x7fd5fc102760 2026-03-10T14:12:49.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd6048fa700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5fc198bd0 con 0x7fd5fc103a00 2026-03-10T14:12:49.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd602696700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc102760 0x7fd5fc197fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:49.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd602696700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc102760 0x7fd5fc197fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 
says I am v2:192.168.123.103:52318/0 (socket says 192.168.123.103:52318) 2026-03-10T14:12:49.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd602696700 1 -- 192.168.123.103:0/866767345 learned_addr learned my addr 192.168.123.103:0/866767345 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:12:49.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd602696700 1 -- 192.168.123.103:0/866767345 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd5fc103a00 msgr2=0x7fd5fc198500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.290+0000 7fd601e95700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd5fc103a00 0x7fd5fc198500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:49.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd602696700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd5fc103a00 0x7fd5fc198500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd602696700 1 -- 192.168.123.103:0/866767345 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5f00097e0 con 0x7fd5fc102760 2026-03-10T14:12:49.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd602696700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc102760 0x7fd5fc197fc0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fd5e800c930 tx=0x7fd5e800ccf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:12:49.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd5f77fe700 1 -- 192.168.123.103:0/866767345 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5e8007ab0 con 0x7fd5fc102760 2026-03-10T14:12:49.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd5f77fe700 1 -- 192.168.123.103:0/866767345 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd5e8007c10 con 0x7fd5fc102760 2026-03-10T14:12:49.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd601e95700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd5fc103a00 0x7fd5fc198500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:12:49.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd5fc19d690 con 0x7fd5fc102760 2026-03-10T14:12:49.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd5f77fe700 1 -- 192.168.123.103:0/866767345 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd5e80186e0 con 0x7fd5fc102760 2026-03-10T14:12:49.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.291+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd5fc19dbb0 con 0x7fd5fc102760 2026-03-10T14:12:49.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.293+0000 7fd5f77fe700 1 -- 192.168.123.103:0/866767345 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd5e8018860 con 0x7fd5fc102760 2026-03-10T14:12:49.293 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.293+0000 7fd5f77fe700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd5ec0779e0 0x7fd5ec079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:12:49.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.293+0000 7fd5f77fe700 1 -- 192.168.123.103:0/866767345 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd5e8099da0 con 0x7fd5fc102760 2026-03-10T14:12:49.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.294+0000 7fd601e95700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd5ec0779e0 0x7fd5ec079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:12:49.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.295+0000 7fd601e95700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd5ec0779e0 0x7fd5ec079e90 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fd5f0000c00 tx=0x7fd5f0005c00 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:12:49.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.295+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd5fc04ea50 con 0x7fd5fc102760 2026-03-10T14:12:49.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.298+0000 7fd5f77fe700 1 -- 192.168.123.103:0/866767345 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd5e8062650 con 
0x7fd5fc102760 2026-03-10T14:12:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:48 vm04.local ceph-mon[92084]: from='client.44227 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:48 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/3277495743' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:12:49.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.468+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fd5fc19de90 con 0x7fd5fc102760 2026-03-10T14:12:49.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.469+0000 7fd5f77fe700 1 -- 192.168.123.103:0/866767345 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fd5e8061da0 con 0x7fd5fc102760 2026-03-10T14:12:49.467 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_OK 2026-03-10T14:12:49.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.471+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd5ec0779e0 msgr2=0x7fd5ec079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.471+0000 7fd6048fa700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd5ec0779e0 0x7fd5ec079e90 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fd5f0000c00 tx=0x7fd5f0005c00 comp rx=0 tx=0).stop 2026-03-10T14:12:49.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.471+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc102760 msgr2=0x7fd5fc197fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:12:49.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.471+0000 7fd6048fa700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc102760 0x7fd5fc197fc0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fd5e800c930 tx=0x7fd5e800ccf0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.472+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 shutdown_connections 2026-03-10T14:12:49.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.472+0000 7fd6048fa700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd5ec0779e0 0x7fd5ec079e90 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.472+0000 7fd6048fa700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5fc102760 0x7fd5fc197fc0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.472+0000 7fd6048fa700 1 --2- 192.168.123.103:0/866767345 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd5fc103a00 0x7fd5fc198500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:12:49.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.472+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 >> 192.168.123.103:0/866767345 conn(0x7fd5fc0fddb0 msgr2=0x7fd5fc100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:12:49.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.472+0000 7fd6048fa700 1 -- 
192.168.123.103:0/866767345 shutdown_connections 2026-03-10T14:12:49.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:12:49.472+0000 7fd6048fa700 1 -- 192.168.123.103:0/866767345 wait complete. 2026-03-10T14:12:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:49 vm04.local ceph-mon[92084]: from='client.34318 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:49 vm04.local ceph-mon[92084]: from='client.34322 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:49 vm04.local ceph-mon[92084]: pgmap v154: 65 pgs: 2 peering, 63 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.1 KiB/s rd, 2 op/s 2026-03-10T14:12:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:49 vm04.local ceph-mon[92084]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 35/336 objects degraded (10.417%), 11 pgs degraded) 2026-03-10T14:12:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:49 vm04.local ceph-mon[92084]: Cluster is now healthy 2026-03-10T14:12:50.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:49 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:12:50.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:49 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/366215059' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:12:50.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:49 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/866767345' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:12:50.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:49 vm03.local ceph-mon[103098]: from='client.34318 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:49 vm03.local ceph-mon[103098]: from='client.34322 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:49 vm03.local ceph-mon[103098]: pgmap v154: 65 pgs: 2 peering, 63 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.1 KiB/s rd, 2 op/s 2026-03-10T14:12:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:49 vm03.local ceph-mon[103098]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 35/336 objects degraded (10.417%), 11 pgs degraded) 2026-03-10T14:12:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:49 vm03.local ceph-mon[103098]: Cluster is now healthy 2026-03-10T14:12:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:49 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby-replay 2026-03-10T14:12:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:49 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/366215059' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:12:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:49 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/866767345' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:12:51.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:50 vm03.local ceph-mon[103098]: from='client.44247 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:51.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:50 vm04.local ceph-mon[92084]: from='client.44247 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:12:52.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:51 vm03.local ceph-mon[103098]: pgmap v155: 65 pgs: 65 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 895 B/s rd, 1 op/s 2026-03-10T14:12:52.215 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:51 vm04.local ceph-mon[92084]: pgmap v155: 65 pgs: 65 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 895 B/s rd, 1 op/s 2026-03-10T14:12:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:53 vm04.local ceph-mon[92084]: pgmap v156: 65 pgs: 65 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:12:54.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:53 vm03.local ceph-mon[103098]: pgmap v156: 65 pgs: 65 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:12:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:55 vm04.local ceph-mon[92084]: pgmap v157: 65 pgs: 65 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:12:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:55 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:56.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:55 vm04.local ceph-mon[92084]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T14:12:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:55 vm03.local ceph-mon[103098]: pgmap v157: 65 pgs: 65 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:12:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:55 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:55 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T14:12:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:56 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T14:12:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:56 vm04.local ceph-mon[92084]: Upgrade: osd.2 is safe to restart 2026-03-10T14:12:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:56 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:57.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:56 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T14:12:57.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:56 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:12:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:56 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T14:12:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:56 vm03.local ceph-mon[103098]: Upgrade: osd.2 is safe to restart 2026-03-10T14:12:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:56 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:12:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:56 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T14:12:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:56 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:12:57.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:57 vm03.local systemd[1]: Stopping Ceph osd.2 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:12:57.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:57 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[79637]: 2026-03-10T14:12:57.116+0000 7fcd89d7c700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:12:57.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:57 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[79637]: 2026-03-10T14:12:57.116+0000 7fcd89d7c700 -1 osd.2 80 *** Got signal Terminated *** 2026-03-10T14:12:57.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:57 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[79637]: 2026-03-10T14:12:57.116+0000 7fcd89d7c700 -1 osd.2 80 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:12:58.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:57 vm03.local ceph-mon[103098]: Upgrade: Updating osd.2 2026-03-10T14:12:58.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:57 vm03.local ceph-mon[103098]: Deploying daemon osd.2 on vm03 2026-03-10T14:12:58.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:57 vm03.local ceph-mon[103098]: pgmap v158: 65 pgs: 65 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s 2026-03-10T14:12:58.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:57 vm03.local ceph-mon[103098]: osd.2 marked itself down and dead 2026-03-10T14:12:58.203 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118075]: 2026-03-10 14:12:58.013478628 +0000 UTC m=+0.912095426 container died 7c08a01b8fe138c9c14fe2956a3ec7a281eb018b6947bc1342867d8e68629e86 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base 
Image, org.label-schema.schema-version=1.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, GIT_BRANCH=HEAD) 2026-03-10T14:12:58.203 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118075]: 2026-03-10 14:12:58.032033733 +0000 UTC m=+0.930650532 container remove 7c08a01b8fe138c9c14fe2956a3ec7a281eb018b6947bc1342867d8e68629e86 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2, org.label-schema.build-date=20231212, GIT_CLEAN=True, RELEASE=HEAD, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) 2026-03-10T14:12:58.203 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local bash[118075]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2 2026-03-10T14:12:58.203 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118141]: 2026-03-10 14:12:58.163941225 +0000 UTC m=+0.018006408 container create eb1230d2f354510a6f43eafd2c1e499cc14c990983556ea6e840d752219332c8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.41.3) 2026-03-10T14:12:58.203 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118141]: 2026-03-10 14:12:58.20379326 +0000 UTC m=+0.057858452 container init eb1230d2f354510a6f43eafd2c1e499cc14c990983556ea6e840d752219332c8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True) 2026-03-10T14:12:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:57 vm04.local ceph-mon[92084]: Upgrade: Updating osd.2 2026-03-10T14:12:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:57 vm04.local ceph-mon[92084]: Deploying daemon osd.2 on vm03 2026-03-10T14:12:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:57 vm04.local ceph-mon[92084]: pgmap v158: 65 pgs: 65 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 
120 GiB avail; 2.0 KiB/s rd, 3 op/s 2026-03-10T14:12:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:57 vm04.local ceph-mon[92084]: osd.2 marked itself down and dead 2026-03-10T14:12:58.492 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118141]: 2026-03-10 14:12:58.20941839 +0000 UTC m=+0.063483573 container start eb1230d2f354510a6f43eafd2c1e499cc14c990983556ea6e840d752219332c8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid) 2026-03-10T14:12:58.493 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118141]: 2026-03-10 14:12:58.210458447 +0000 UTC m=+0.064523630 container attach eb1230d2f354510a6f43eafd2c1e499cc14c990983556ea6e840d752219332c8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T14:12:58.493 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118141]: 2026-03-10 14:12:58.15587214 +0000 UTC m=+0.009937334 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:12:58.493 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118160]: 2026-03-10 14:12:58.374044031 +0000 UTC m=+0.011925396 container died eb1230d2f354510a6f43eafd2c1e499cc14c990983556ea6e840d752219332c8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True) 2026-03-10T14:12:58.493 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118160]: 2026-03-10 14:12:58.390577382 +0000 UTC m=+0.028458747 container remove eb1230d2f354510a6f43eafd2c1e499cc14c990983556ea6e840d752219332c8 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-10T14:12:58.493 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.2.service: Deactivated successfully. 2026-03-10T14:12:58.493 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local systemd[1]: Stopped Ceph osd.2 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:12:58.493 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.2.service: Consumed 40.729s CPU time. 2026-03-10T14:12:58.858 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local systemd[1]: Starting Ceph osd.2 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:12:58.858 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118244]: 2026-03-10 14:12:58.772515975 +0000 UTC m=+0.082973187 container create a192d6af02993016584007ded7a26db10f5c12fdd2eee97affed6f42cb5a5352 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T14:12:58.858 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118244]: 2026-03-10 14:12:58.698854209 +0000 UTC m=+0.009311432 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:12:58.858 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118244]: 2026-03-10 14:12:58.822851594 +0000 UTC m=+0.133308806 container init a192d6af02993016584007ded7a26db10f5c12fdd2eee97affed6f42cb5a5352 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223) 2026-03-10T14:12:58.858 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118244]: 2026-03-10 14:12:58.826435643 +0000 UTC m=+0.136892856 container start a192d6af02993016584007ded7a26db10f5c12fdd2eee97affed6f42cb5a5352 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:12:58.858 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local podman[118244]: 2026-03-10 14:12:58.832713716 +0000 UTC m=+0.143170928 container attach a192d6af02993016584007ded7a26db10f5c12fdd2eee97affed6f42cb5a5352 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0) 2026-03-10T14:12:59.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:58 vm04.local ceph-mon[92084]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:12:59.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:58 vm04.local ceph-mon[92084]: osdmap e81: 6 total, 5 up, 6 in 2026-03-10T14:12:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:58 vm03.local ceph-mon[103098]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:12:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:58 vm03.local ceph-mon[103098]: osdmap e81: 6 total, 5 up, 6 in 2026-03-10T14:12:59.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:59.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local bash[118244]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:59.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:59.358 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:58 vm03.local bash[118244]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 
vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-0a45a47d-7589-41b2-be15-c1e61c959311/osd-block-03fb18a9-019c-440f-9b09-702297165b29 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-0a45a47d-7589-41b2-be15-c1e61c959311/osd-block-03fb18a9-019c-440f-9b09-702297165b29 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
2026-03-10T14:12:59.712 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/ln -snf /dev/ceph-0a45a47d-7589-41b2-be15-c1e61c959311/osd-block-03fb18a9-019c-440f-9b09-702297165b29 /var/lib/ceph/osd/ceph-2/block
2026-03-10T14:12:59.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-mon[103098]: pgmap v160: 65 pgs: 9 stale+active+clean, 56 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:12:59.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-mon[103098]: osdmap e82: 6 total, 5 up, 6 in
2026-03-10T14:12:59.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:12:59.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:12:59.966 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: Running command: /usr/bin/ln -snf /dev/ceph-0a45a47d-7589-41b2-be15-c1e61c959311/osd-block-03fb18a9-019c-440f-9b09-702297165b29 /var/lib/ceph/osd/ceph-2/block
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate[118256]: --> ceph-volume lvm activate successful for osd ID: 2
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118244]: --> ceph-volume lvm activate successful for osd ID: 2
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local conmon[118256]: conmon a192d6af029930165840 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a192d6af02993016584007ded7a26db10f5c12fdd2eee97affed6f42cb5a5352.scope/container/memory.events
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local podman[118244]: 2026-03-10 14:12:59.745413282 +0000 UTC m=+1.055870504 container died a192d6af02993016584007ded7a26db10f5c12fdd2eee97affed6f42cb5a5352 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team )
2026-03-10T14:12:59.966 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local podman[118244]: 2026-03-10 14:12:59.76240444 +0000 UTC m=+1.072861652 container remove a192d6af02993016584007ded7a26db10f5c12fdd2eee97affed6f42cb5a5352 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team )
2026-03-10T14:12:59.967 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local podman[118510]: 2026-03-10 14:12:59.85254876 +0000 UTC m=+0.015995084 container create e0d768b7046892884878c67e1381d02cdf55a824347f3dedd3b317c45ba4ad05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True)
2026-03-10T14:12:59.967 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local podman[118510]: 2026-03-10 14:12:59.893115973 +0000 UTC m=+0.056562297 container init e0d768b7046892884878c67e1381d02cdf55a824347f3dedd3b317c45ba4ad05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
2026-03-10T14:12:59.967 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local podman[118510]: 2026-03-10 14:12:59.897116331 +0000 UTC m=+0.060562646 container start e0d768b7046892884878c67e1381d02cdf55a824347f3dedd3b317c45ba4ad05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_REF=squid)
2026-03-10T14:12:59.967 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local bash[118510]: e0d768b7046892884878c67e1381d02cdf55a824347f3dedd3b317c45ba4ad05
2026-03-10T14:12:59.967 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local podman[118510]: 2026-03-10 14:12:59.846524463 +0000 UTC m=+0.009970797 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T14:12:59.967 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:12:59 vm03.local systemd[1]: Started Ceph osd.2 for b81bf660-1c89-11f1-b612-27d302cdb124.
2026-03-10T14:13:00.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:59 vm04.local ceph-mon[92084]: pgmap v160: 65 pgs: 9 stale+active+clean, 56 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:13:00.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:59 vm04.local ceph-mon[92084]: osdmap e82: 6 total, 5 up, 6 in
2026-03-10T14:13:00.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:00.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:00.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:12:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:13:00.973 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:13:00 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[118520]: 2026-03-10T14:13:00.973+0000 7f2ac2700740 -1 Falling back to public interface
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.4", "id": [0, 2]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.c", "id": [3, 5]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.16", "id": [3, 1]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.1d", "id": [3, 2]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.1", "id": [4, 5]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.7", "id": [1, 0]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.a", "id": [3, 2, 1, 0]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.11", "id": [3, 2]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.19", "id": [4, 5]}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:13:01.237 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:01 vm03.local ceph-mon[103098]: pgmap v162: 65 pgs: 4 active+undersized, 6 stale+active+clean, 4 active+undersized+degraded, 51 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 13/336 objects degraded (3.869%)
2026-03-10T14:13:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.4", "id": [0, 2]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.c", "id": [3, 5]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.16", "id": [3, 1]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.1d", "id": [3, 2]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.1", "id": [4, 5]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.7", "id": [1, 0]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.a", "id": [3, 2, 1, 0]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.11", "id": [3, 2]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.19", "id": [4, 5]}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:13:01.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:01 vm04.local ceph-mon[92084]: pgmap v162: 65 pgs: 4 active+undersized, 6 stale+active+clean, 4 active+undersized+degraded, 51 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 13/336 objects degraded (3.869%)
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.4", "id": [0, 2]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.c", "id": [3, 5]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.16", "id": [3, 1]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.1d", "id": [3, 2]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.1", "id": [4, 5]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.7", "id": [1, 0]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.a", "id": [3, 2, 1, 0]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.11", "id": [3, 2]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.19", "id": [4, 5]}]': finished
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: osdmap e83: 6 total, 5 up, 6 in
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: Health check failed: Degraded data redundancy: 13/336 objects degraded (3.869%), 4 pgs degraded (PG_DEGRADED)
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:02.469 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.4", "id": [0, 2]}]': finished
2026-03-10T14:13:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.c", "id": [3, 5]}]': finished
2026-03-10T14:13:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.16", "id": [3, 1]}]': finished
2026-03-10T14:13:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "2.1d", "id": [3, 2]}]': finished
2026-03-10T14:13:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.1", "id": [4, 5]}]': finished
2026-03-10T14:13:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.7", "id": [1, 0]}]': finished
2026-03-10T14:13:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.a", "id": [3, 2, 1, 0]}]': finished
2026-03-10T14:13:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.11", "id": [3, 2]}]': finished
2026-03-10T14:13:02.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd pg-upmap-items", "format": "json", "pgid": "3.19", "id": [4, 5]}]': finished
2026-03-10T14:13:02.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: osdmap e83: 6 total, 5 up, 6 in
2026-03-10T14:13:02.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: Health check failed: Degraded data redundancy: 13/336 objects degraded (3.869%), 4 pgs degraded (PG_DEGRADED)
2026-03-10T14:13:02.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:02.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:03.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:03 vm03.local ceph-mon[103098]: osdmap e84: 6 total, 5 up, 6 in
2026-03-10T14:13:03.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:03.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:03.501 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:03 vm03.local ceph-mon[103098]: pgmap v165: 65 pgs: 1 peering, 3 remapped+peering, 4 unknown, 13 active+undersized, 11 active+undersized+degraded, 33 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 37/336 objects degraded (11.012%)
2026-03-10T14:13:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:03 vm04.local ceph-mon[92084]: osdmap e84: 6 total, 5 up, 6 in
2026-03-10T14:13:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:03 vm04.local ceph-mon[92084]: pgmap v165: 65 pgs: 1 peering, 3 remapped+peering, 4 unknown, 13 active+undersized, 11 active+undersized+degraded, 33 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 37/336 objects degraded (11.012%)
2026-03-10T14:13:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: osdmap e85: 6 total, 5 up, 6 in
2026-03-10T14:13:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:13:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:13:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:13:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:13:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:13:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:13:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch
2026-03-10T14:13:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch
2026-03-10T14:13:04.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:04 vm04.local ceph-mon[92084]: Upgrade: 4 pgs have unknown state; cannot draw any conclusions
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: osdmap e85: 6 total, 5 up, 6 in
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch
2026-03-10T14:13:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-mon[103098]: Upgrade: 4 pgs have unknown state; cannot draw any conclusions
2026-03-10T14:13:04.963 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[118520]: 2026-03-10T14:13:04.691+0000 7f2ac2700740 -1 osd.2 0 read_superblock omap replica is missing.
2026-03-10T14:13:05.228 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:13:04 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[118520]: 2026-03-10T14:13:04.964+0000 7f2ac2700740 -1 osd.2 80 log_to_monitors true
2026-03-10T14:13:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:05 vm04.local ceph-mon[92084]: osdmap e86: 6 total, 5 up, 6 in
2026-03-10T14:13:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:05 vm04.local ceph-mon[92084]: pgmap v168: 65 pgs: 1 peering, 3 remapped+peering, 4 unknown, 13 active+undersized, 11 active+undersized+degraded, 33 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 37/336 objects degraded (11.012%)
2026-03-10T14:13:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:05 vm04.local ceph-mon[92084]: from='osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-10T14:13:05.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:05 vm03.local ceph-mon[103098]: osdmap e86: 6 total, 5 up, 6 in
2026-03-10T14:13:05.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:05 vm03.local ceph-mon[103098]: pgmap v168: 65 pgs: 1 peering, 3 remapped+peering, 4 unknown, 13 active+undersized, 11 active+undersized+degraded, 33 active+clean; 275 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 37/336 objects degraded (11.012%)
2026-03-10T14:13:05.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:05 vm03.local ceph-mon[103098]: from='osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-10T14:13:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:06 vm04.local ceph-mon[92084]: from='osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
2026-03-10T14:13:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:06 vm04.local ceph-mon[92084]: osdmap e87: 6 total, 5 up, 6 in
2026-03-10T14:13:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:06 vm04.local ceph-mon[92084]: from='osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch
2026-03-10T14:13:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:06 vm04.local ceph-mon[92084]: from='osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777]' entity='osd.2'
2026-03-10T14:13:06.607 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:13:06 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[118520]: 2026-03-10T14:13:06.176+0000 7f2ab9c99640 -1 osd.2 80 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-10T14:13:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:06 vm03.local ceph-mon[103098]: from='osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
2026-03-10T14:13:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:06 vm03.local ceph-mon[103098]: osdmap e87: 6 total, 5 up, 6 in
2026-03-10T14:13:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:06 vm03.local ceph-mon[103098]: from='osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch
2026-03-10T14:13:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:06 vm03.local ceph-mon[103098]: from='osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777]' entity='osd.2'
2026-03-10T14:13:07.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:07 vm04.local ceph-mon[92084]: pgmap v170: 65 pgs: 3 active+clean+remapped, 1 active+recovering+undersized+remapped, 3 remapped+peering, 2 unknown, 12 active+undersized, 11 active+undersized+degraded, 33 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 37/336 objects degraded (11.012%); 9/336 objects misplaced (2.679%); 2.7 MiB/s, 1 objects/s recovering
2026-03-10T14:13:07.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:07 vm04.local ceph-mon[92084]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-10T14:13:07.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:07 vm04.local ceph-mon[92084]: osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777] boot
2026-03-10T14:13:07.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:07 vm04.local ceph-mon[92084]: osdmap e88: 6 total, 6 up, 6 in
2026-03-10T14:13:07.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T14:13:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:07 vm03.local ceph-mon[103098]: pgmap v170: 65 pgs: 3 active+clean+remapped, 1 active+recovering+undersized+remapped, 3 remapped+peering, 2 unknown, 12 active+undersized, 11 active+undersized+degraded, 33 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 37/336 objects degraded (11.012%); 9/336 objects misplaced (2.679%); 2.7 MiB/s, 1 objects/s recovering
2026-03-10T14:13:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:07 vm03.local ceph-mon[103098]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-10T14:13:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:07 vm03.local ceph-mon[103098]: osd.2 [v2:192.168.123.103:6818/4167735777,v1:192.168.123.103:6819/4167735777] boot
2026-03-10T14:13:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:07 vm03.local ceph-mon[103098]: osdmap e88: 6 total, 6 up, 6 in
2026-03-10T14:13:07.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T14:13:09.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:09 vm04.local ceph-mon[92084]: osdmap e89: 6 total, 6 up, 6 in
2026-03-10T14:13:09.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:09 vm04.local ceph-mon[92084]: pgmap v173: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+undersized+remapped, 5 active+clean+remapped, 2 active+recovering+undersized+remapped, 12 active+undersized, 11 active+undersized+degraded, 33 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2.3 KiB/s rd, 4 op/s; 774/336 objects degraded (230.357%); 728/336 objects misplaced (216.667%); 2.7 MiB/s, 6 objects/s recovering
2026-03-10T14:13:09.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:09 vm03.local ceph-mon[103098]: osdmap e89: 6 total, 6 up, 6 in
2026-03-10T14:13:09.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:09 vm03.local ceph-mon[103098]: pgmap v173: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+undersized+remapped, 5 active+clean+remapped, 2 active+recovering+undersized+remapped, 12 active+undersized, 11 active+undersized+degraded, 33 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2.3 KiB/s rd, 4 op/s; 774/336 objects degraded (230.357%); 728/336 objects misplaced (216.667%); 2.7 MiB/s, 6 objects/s recovering
2026-03-10T14:13:10.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:10 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy:
774/336 objects degraded (230.357%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:10.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:10 vm04.local ceph-mon[92084]: osdmap e90: 6 total, 6 up, 6 in 2026-03-10T14:13:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:10 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 774/336 objects degraded (230.357%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:10.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:10 vm03.local ceph-mon[103098]: osdmap e90: 6 total, 6 up, 6 in 2026-03-10T14:13:11.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:11 vm04.local ceph-mon[92084]: osdmap e91: 6 total, 6 up, 6 in 2026-03-10T14:13:11.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:11 vm04.local ceph-mon[92084]: pgmap v176: 65 pgs: 3 active+recovery_wait+undersized+degraded+remapped, 4 active+clean+remapped, 1 active+recovering+undersized+remapped, 7 active+undersized, 5 active+undersized+degraded, 45 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 3 op/s; 2171/336 objects degraded (646.131%); 11/336 objects misplaced (3.274%); 0 B/s, 12 objects/s recovering 2026-03-10T14:13:11.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:11 vm03.local ceph-mon[103098]: osdmap e91: 6 total, 6 up, 6 in 2026-03-10T14:13:11.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:11 vm03.local ceph-mon[103098]: pgmap v176: 65 pgs: 3 active+recovery_wait+undersized+degraded+remapped, 4 active+clean+remapped, 1 active+recovering+undersized+remapped, 7 active+undersized, 5 active+undersized+degraded, 45 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 3 op/s; 2171/336 objects degraded (646.131%); 11/336 objects misplaced (3.274%); 0 B/s, 12 objects/s recovering 2026-03-10T14:13:12.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:12 vm04.local ceph-mon[92084]: osdmap e92: 6 total, 6 up, 6 in 
2026-03-10T14:13:12.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:12 vm03.local ceph-mon[103098]: osdmap e92: 6 total, 6 up, 6 in 2026-03-10T14:13:13.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:13 vm04.local ceph-mon[92084]: pgmap v178: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 1 active+clean+remapped, 60 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2.0 KiB/s rd, 4 op/s; 3622/336 objects degraded (1077.976%); 5/336 objects misplaced (1.488%); 1.8 MiB/s, 175 objects/s recovering 2026-03-10T14:13:13.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:13 vm03.local ceph-mon[103098]: pgmap v178: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 1 active+clean+remapped, 60 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2.0 KiB/s rd, 4 op/s; 3622/336 objects degraded (1077.976%); 5/336 objects misplaced (1.488%); 1.8 MiB/s, 175 objects/s recovering 2026-03-10T14:13:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:15 vm04.local ceph-mon[92084]: pgmap v179: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 1 active+clean+remapped, 60 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.5 KiB/s rd, 3 op/s; 3622/336 objects degraded (1077.976%); 5/336 objects misplaced (1.488%); 1.3 MiB/s, 130 objects/s recovering 2026-03-10T14:13:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:15 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 3622/336 objects degraded (1077.976%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", 
"format": "json"}]: dispatch 2026-03-10T14:13:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:15 vm03.local ceph-mon[103098]: pgmap v179: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 1 active+clean+remapped, 60 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 1.5 KiB/s rd, 3 op/s; 3622/336 objects degraded (1077.976%); 5/336 objects misplaced (1.488%); 1.3 MiB/s, 130 objects/s recovering 2026-03-10T14:13:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:15 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 3622/336 objects degraded (1077.976%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:13:17.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:17 vm04.local ceph-mon[92084]: pgmap v180: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 84 KiB/s rd, 3 op/s; 2908/336 objects degraded (865.476%); 1.1 MiB/s, 110 objects/s recovering 2026-03-10T14:13:18.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:17 vm03.local ceph-mon[103098]: pgmap v180: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 84 KiB/s rd, 3 op/s; 2908/336 objects degraded (865.476%); 1.1 MiB/s, 110 objects/s recovering 2026-03-10T14:13:18.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:18 vm04.local ceph-mon[92084]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T14:13:18.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:18.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T14:13:18.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:18 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:13:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T14:13:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T14:13:19.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:18 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.548+0000 7f48a4adf700 1 -- 192.168.123.103:0/2665488226 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 msgr2=0x7f48a0103b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.548+0000 7f48a4adf700 1 --2- 192.168.123.103:0/2665488226 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 0x7f48a0103b20 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f4890009b00 tx=0x7f4890009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.549+0000 7f48a4adf700 1 -- 192.168.123.103:0/2665488226 shutdown_connections 2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.549+0000 7f48a4adf700 1 --2- 192.168.123.103:0/2665488226 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 0x7f48a0103b20 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.549+0000 7f48a4adf700 1 --2- 192.168.123.103:0/2665488226 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a01024d0 0x7f48a01028e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.549+0000 7f48a4adf700 1 -- 192.168.123.103:0/2665488226 >> 192.168.123.103:0/2665488226 conn(0x7f48a00fda80 msgr2=0x7f48a00ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.549+0000 7f48a4adf700 1 -- 192.168.123.103:0/2665488226 shutdown_connections 2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.549+0000 7f48a4adf700 1 -- 192.168.123.103:0/2665488226 wait complete. 
2026-03-10T14:13:19.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f48a4adf700 1 Processor -- start 2026-03-10T14:13:19.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f48a4adf700 1 -- start start 2026-03-10T14:13:19.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f48a4adf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a01024d0 0x7f48a0193960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:19.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f48a4adf700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 0x7f48a0193ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:19.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f48a4adf700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48a01943e0 con 0x7f48a01036d0 2026-03-10T14:13:19.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f48a4adf700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f48a0194520 con 0x7f48a01024d0 2026-03-10T14:13:19.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f489effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 0x7f48a0193ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:19.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f489effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 0x7f48a0193ea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:57994/0 (socket says 192.168.123.103:57994) 2026-03-10T14:13:19.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f489effd700 1 -- 192.168.123.103:0/1078335289 learned_addr learned my addr 192.168.123.103:0/1078335289 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:19.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.550+0000 7f489f7fe700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a01024d0 0x7f48a0193960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:19.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f489effd700 1 -- 192.168.123.103:0/1078335289 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a01024d0 msgr2=0x7f48a0193960 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:19.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f489effd700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a01024d0 0x7f48a0193960 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f489effd700 1 -- 192.168.123.103:0/1078335289 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f48900097e0 con 0x7f48a01036d0 2026-03-10T14:13:19.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f489effd700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 0x7f48a0193ea0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f48900049c0 tx=0x7f48900049f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:13:19.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f489cff9700 1 -- 192.168.123.103:0/1078335289 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f489001d070 con 0x7f48a01036d0 2026-03-10T14:13:19.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f48a006a770 con 0x7f48a01036d0 2026-03-10T14:13:19.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f48a006ac30 con 0x7f48a01036d0 2026-03-10T14:13:19.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f489cff9700 1 -- 192.168.123.103:0/1078335289 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f489000bc50 con 0x7f48a01036d0 2026-03-10T14:13:19.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.551+0000 7f489cff9700 1 -- 192.168.123.103:0/1078335289 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f489000f910 con 0x7f48a01036d0 2026-03-10T14:13:19.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.552+0000 7f489cff9700 1 -- 192.168.123.103:0/1078335289 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f489000fa70 con 0x7f48a01036d0 2026-03-10T14:13:19.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.553+0000 7f489cff9700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4888077990 0x7f4888079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:19.552 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.553+0000 7f489cff9700 1 -- 192.168.123.103:0/1078335289 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(92..92 src has 1..92) v4 ==== 6521+0+0 (secure 0 0 0) 0x7f489009b340 con 0x7f48a01036d0 2026-03-10T14:13:19.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.553+0000 7f489f7fe700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4888077990 0x7f4888079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:19.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.553+0000 7f489f7fe700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4888077990 0x7f4888079e40 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f48a0103530 tx=0x7f489400a3b0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:19.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.554+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f488c005320 con 0x7f48a01036d0 2026-03-10T14:13:19.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.556+0000 7f489cff9700 1 -- 192.168.123.103:0/1078335289 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4890063be0 con 0x7f48a01036d0 2026-03-10T14:13:19.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.685+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f488c000bf0 con 0x7f4888077990 2026-03-10T14:13:19.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.687+0000 7f489cff9700 1 -- 192.168.123.103:0/1078335289 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f488c000bf0 con 0x7f4888077990 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4888077990 msgr2=0x7f4888079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4888077990 0x7f4888079e40 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f48a0103530 tx=0x7f489400a3b0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 msgr2=0x7f48a0193ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 0x7f48a0193ea0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f48900049c0 tx=0x7f48900049f0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 shutdown_connections 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 --2- 192.168.123.103:0/1078335289 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4888077990 0x7f4888079e40 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f48a01024d0 0x7f48a0193960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 --2- 192.168.123.103:0/1078335289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48a01036d0 0x7f48a0193ea0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.689+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 >> 192.168.123.103:0/1078335289 conn(0x7f48a00fda80 msgr2=0x7f48a0106900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.690+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 shutdown_connections 2026-03-10T14:13:19.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.690+0000 7f48a4adf700 1 -- 192.168.123.103:0/1078335289 wait complete. 2026-03-10T14:13:19.698 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:13:19.737 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local systemd[1]: Stopping Ceph osd.3 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:13:19.738 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[61840]: 2026-03-10T14:13:19.508+0000 7f26b2077700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:13:19.738 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[61840]: 2026-03-10T14:13:19.508+0000 7f26b2077700 -1 osd.3 92 *** Got signal Terminated *** 2026-03-10T14:13:19.738 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[61840]: 2026-03-10T14:13:19.508+0000 7f26b2077700 -1 osd.3 92 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:13:19.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.779+0000 7f4cf6f77700 1 -- 192.168.123.103:0/163583713 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0102740 msgr2=0x7f4cf0102b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:19.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.779+0000 7f4cf6f77700 1 --2- 192.168.123.103:0/163583713 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0102740 0x7f4cf0102b50 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f4ce4009b00 tx=0x7f4ce4009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:19.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.779+0000 7f4cf6f77700 1 -- 192.168.123.103:0/163583713 shutdown_connections 2026-03-10T14:13:19.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.779+0000 7f4cf6f77700 1 --2- 192.168.123.103:0/163583713 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cf0103940 0x7f4cf0103d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.779+0000 7f4cf6f77700 1 --2- 192.168.123.103:0/163583713 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0102740 0x7f4cf0102b50 secure :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f4ce4009b00 tx=0x7f4ce4009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.779+0000 7f4cf6f77700 1 -- 192.168.123.103:0/163583713 >> 192.168.123.103:0/163583713 conn(0x7f4cf00fdcf0 msgr2=0x7f4cf0100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.780+0000 7f4cf6f77700 1 -- 192.168.123.103:0/163583713 shutdown_connections 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.780+0000 7f4cf6f77700 1 -- 192.168.123.103:0/163583713 wait complete. 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.780+0000 7f4cf6f77700 1 Processor -- start 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.780+0000 7f4cf6f77700 1 -- start start 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf6f77700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0103940 0x7f4cf01982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf6f77700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cf0198830 0x7f4cf019d8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf6f77700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cf0198d30 con 0x7f4cf0103940 
2026-03-10T14:13:19.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf6f77700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cf0198ea0 con 0x7f4cf0198830 2026-03-10T14:13:19.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf4d13700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0103940 0x7f4cf01982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:19.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf4d13700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0103940 0x7f4cf01982f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58012/0 (socket says 192.168.123.103:58012) 2026-03-10T14:13:19.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf4d13700 1 -- 192.168.123.103:0/1408779771 learned_addr learned my addr 192.168.123.103:0/1408779771 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:19.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf4d13700 1 -- 192.168.123.103:0/1408779771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cf0198830 msgr2=0x7f4cf019d8a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:13:19.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf4d13700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cf0198830 0x7f4cf019d8a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf4d13700 1 -- 192.168.123.103:0/1408779771 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4ce40097e0 con 0x7f4cf0103940 2026-03-10T14:13:19.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.781+0000 7f4cf4d13700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0103940 0x7f4cf01982f0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f4ce4000c00 tx=0x7f4ce400bb70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:19.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.782+0000 7f4cedffb700 1 -- 192.168.123.103:0/1408779771 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ce401d070 con 0x7f4cf0103940 2026-03-10T14:13:19.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.782+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4cf019dde0 con 0x7f4cf0103940 2026-03-10T14:13:19.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.783+0000 7f4cedffb700 1 -- 192.168.123.103:0/1408779771 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4ce400bc90 con 0x7f4cf0103940 2026-03-10T14:13:19.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.783+0000 7f4cedffb700 1 -- 192.168.123.103:0/1408779771 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ce4021850 con 0x7f4cf0103940 2026-03-10T14:13:19.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.783+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4cf019e240 con 0x7f4cf0103940 2026-03-10T14:13:19.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.784+0000 7f4cedffb700 1 -- 
192.168.123.103:0/1408779771 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4ce4021a70 con 0x7f4cf0103940 2026-03-10T14:13:19.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.784+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4cf0066e40 con 0x7f4cf0103940 2026-03-10T14:13:19.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.785+0000 7f4cedffb700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ce00777d0 0x7f4ce0079c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:19.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.785+0000 7f4cedffb700 1 -- 192.168.123.103:0/1408779771 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(93..93 src has 1..93) v4 ==== 6521+0+0 (secure 0 0 0) 0x7f4ce400f460 con 0x7f4cf0103940 2026-03-10T14:13:19.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.787+0000 7f4ceffff700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ce00777d0 0x7f4ce0079c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:19.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.787+0000 7f4ceffff700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ce00777d0 0x7f4ce0079c80 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f4cdc009730 tx=0x7f4cdc006cb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:19.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.788+0000 7f4cedffb700 1 -- 
192.168.123.103:0/1408779771 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4ce40a1050 con 0x7f4cf0103940 2026-03-10T14:13:19.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:19 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T14:13:19.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:19 vm03.local ceph-mon[103098]: Upgrade: osd.3 is safe to restart 2026-03-10T14:13:19.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:19 vm03.local ceph-mon[103098]: Upgrade: Updating osd.3 2026-03-10T14:13:19.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:19 vm03.local ceph-mon[103098]: pgmap v181: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 76 KiB/s rd, 4 op/s; 2908/336 objects degraded (865.476%); 977 KiB/s, 94 objects/s recovering 2026-03-10T14:13:19.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:19 vm03.local ceph-mon[103098]: Deploying daemon osd.3 on vm04 2026-03-10T14:13:19.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:19 vm03.local ceph-mon[103098]: osd.3 marked itself down and dead 2026-03-10T14:13:19.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.920+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4cf0108290 con 0x7f4ce00777d0 2026-03-10T14:13:19.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.921+0000 7f4cedffb700 1 -- 192.168.123.103:0/1408779771 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f4cf0108290 con 0x7f4ce00777d0 
2026-03-10T14:13:19.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ce00777d0 msgr2=0x7f4ce0079c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:19.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ce00777d0 0x7f4ce0079c80 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f4cdc009730 tx=0x7f4cdc006cb0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0103940 msgr2=0x7f4cf01982f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:19.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0103940 0x7f4cf01982f0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f4ce4000c00 tx=0x7f4ce400bb70 comp rx=0 tx=0).stop 2026-03-10T14:13:19.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 shutdown_connections 2026-03-10T14:13:19.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ce00777d0 0x7f4ce0079c80 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 --2- 192.168.123.103:0/1408779771 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cf0103940 0x7f4cf01982f0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 --2- 192.168.123.103:0/1408779771 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cf0198830 0x7f4cf019d8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.924+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 >> 192.168.123.103:0/1408779771 conn(0x7f4cf00fdcf0 msgr2=0x7f4cf0106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:19.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.925+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 shutdown_connections 2026-03-10T14:13:19.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.925+0000 7f4cf6f77700 1 -- 192.168.123.103:0/1408779771 wait complete. 2026-03-10T14:13:19.991 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T14:13:19.991 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-mon[92084]: Upgrade: osd.3 is safe to restart 2026-03-10T14:13:19.991 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-mon[92084]: Upgrade: Updating osd.3 2026-03-10T14:13:19.991 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-mon[92084]: pgmap v181: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 76 KiB/s rd, 4 op/s; 2908/336 objects degraded (865.476%); 977 KiB/s, 94 objects/s recovering 2026-03-10T14:13:19.991 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-mon[92084]: Deploying daemon osd.3 on vm04 2026-03-10T14:13:19.991 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:19 vm04.local ceph-mon[92084]: osd.3 marked itself down and dead 2026-03-10T14:13:19.991 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local podman[97994]: 2026-03-10 14:13:19.792772988 +0000 UTC m=+0.299256049 container died 99f4c3155942a2a5198a9373bfff5fc55f70d1851838bcc89aa8a3c36283c2b8 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, ceph=True, GIT_BRANCH=HEAD) 2026-03-10T14:13:19.991 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local podman[97994]: 2026-03-10 14:13:19.813799638 +0000 UTC 
m=+0.320282701 container remove 99f4c3155942a2a5198a9373bfff5fc55f70d1851838bcc89aa8a3c36283c2b8 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3, GIT_BRANCH=HEAD, GIT_CLEAN=True, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.0, RELEASE=HEAD, ceph=True, org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-10T14:13:19.991 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local bash[97994]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3 2026-03-10T14:13:19.991 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local podman[98059]: 2026-03-10 14:13:19.950937693 +0000 UTC m=+0.015508021 container create 1653afd3fb7af5e9aacebee4e2455c64ac6cfefa247b17cf3946ab3cefce72f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223) 2026-03-10T14:13:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.994+0000 7f47a34e7700 1 -- 
192.168.123.103:0/3342912974 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 msgr2=0x7f479c0731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.994+0000 7f47a34e7700 1 --2- 192.168.123.103:0/3342912974 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 0x7f479c0731e0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f4790009b50 tx=0x7f4790009e60 comp rx=0 tx=0).stop 2026-03-10T14:13:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.995+0000 7f47a34e7700 1 -- 192.168.123.103:0/3342912974 shutdown_connections 2026-03-10T14:13:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.995+0000 7f47a34e7700 1 --2- 192.168.123.103:0/3342912974 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f479c0737b0 0x7f479c073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.995+0000 7f47a34e7700 1 --2- 192.168.123.103:0/3342912974 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 0x7f479c0731e0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.995+0000 7f47a34e7700 1 -- 192.168.123.103:0/3342912974 >> 192.168.123.103:0/3342912974 conn(0x7f479c0fbae0 msgr2=0x7f479c0fdf30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.995+0000 7f47a34e7700 1 -- 192.168.123.103:0/3342912974 shutdown_connections 2026-03-10T14:13:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.995+0000 7f47a34e7700 1 -- 192.168.123.103:0/3342912974 wait complete. 
2026-03-10T14:13:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.995+0000 7f47a34e7700 1 Processor -- start 2026-03-10T14:13:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.995+0000 7f47a34e7700 1 -- start start 2026-03-10T14:13:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a34e7700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f479c0737b0 0x7f479c10ba10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a34e7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 0x7f479c10bf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a34e7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f479c10c570 con 0x7f479c074d80 2026-03-10T14:13:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a34e7700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f479c10c6b0 con 0x7f479c0737b0 2026-03-10T14:13:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a0a82700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 0x7f479c10bf50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a0a82700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 0x7f479c10bf50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:58034/0 (socket says 192.168.123.103:58034) 2026-03-10T14:13:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a0a82700 1 -- 192.168.123.103:0/1550403458 learned_addr learned my addr 192.168.123.103:0/1550403458 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a0a82700 1 -- 192.168.123.103:0/1550403458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f479c0737b0 msgr2=0x7f479c10ba10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a0a82700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f479c0737b0 0x7f479c10ba10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a0a82700 1 -- 192.168.123.103:0/1550403458 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f47900097e0 con 0x7f479c074d80 2026-03-10T14:13:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.996+0000 7f47a0a82700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 0x7f479c10bf50 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f479800eb10 tx=0x7f479800eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:19.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.997+0000 7f478e7fc700 1 -- 192.168.123.103:0/1550403458 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f479800cca0 con 0x7f479c074d80 2026-03-10T14:13:19.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.997+0000 7f478e7fc700 1 -- 
192.168.123.103:0/1550403458 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f479800ce00 con 0x7f479c074d80 2026-03-10T14:13:19.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.998+0000 7f478e7fc700 1 -- 192.168.123.103:0/1550403458 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4798018910 con 0x7f479c074d80 2026-03-10T14:13:19.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.998+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f479c06eda0 con 0x7f479c074d80 2026-03-10T14:13:19.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.998+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f479c06f270 con 0x7f479c074d80 2026-03-10T14:13:19.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:19.999+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f479c066e40 con 0x7f479c074d80 2026-03-10T14:13:20.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.000+0000 7f478e7fc700 1 -- 192.168.123.103:0/1550403458 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4798018a70 con 0x7f479c074d80 2026-03-10T14:13:20.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.000+0000 7f478e7fc700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f47880776c0 0x7f4788079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.000+0000 7f478e7fc700 1 -- 192.168.123.103:0/1550403458 
<== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(93..93 src has 1..93) v4 ==== 6521+0+0 (secure 0 0 0) 0x7f4798014070 con 0x7f479c074d80 2026-03-10T14:13:20.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.003+0000 7f478e7fc700 1 -- 192.168.123.103:0/1550403458 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4798062850 con 0x7f479c074d80 2026-03-10T14:13:20.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.005+0000 7f47a1283700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f47880776c0 0x7f4788079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.005+0000 7f47a1283700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f47880776c0 0x7f4788079b70 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f479000b5c0 tx=0x7f47900058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.126 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.126+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f479c103600 con 0x7f47880776c0 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (9m) 18s ago 9m 22.1M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 
running (7m) 18s ago 10m 9575k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (9m) 3m ago 9m 11.2M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 18s ago 10m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (3m) 3m ago 9m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (9m) 18s ago 9m 92.0M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (7m) 18s ago 7m 136M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (7m) 18s ago 7m 72.7M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (7m) 3m ago 7m 91.2M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (7m) 3m ago 7m 166M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (5m) 18s ago 10m 620M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (4m) 3m ago 9m 496M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 18s ago 10m 61.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (4m) 3m ago 9m 49.4M 2048M 19.2.3-678-ge911bdeb 
654f31e6858e 111e22858279 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (10m) 18s ago 10m 15.4M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (9m) 3m ago 9m 15.0M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 18s ago 9m 206M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (42s) 18s ago 8m 127M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (20s) 18s ago 8m 13.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e0d768b70468 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (8m) 3m ago 8m 434M 4096M 18.2.0 dc2bc1663786 99f4c3155942 2026-03-10T14:13:20.131 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (8m) 3m ago 8m 397M 4096M 18.2.0 dc2bc1663786 127d95fabe23 2026-03-10T14:13:20.132 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (8m) 3m ago 8m 352M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:13:20.132 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (4m) 18s ago 9m 69.1M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:13:20.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.132+0000 7f478e7fc700 1 -- 192.168.123.103:0/1550403458 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f479c103600 con 0x7f47880776c0 2026-03-10T14:13:20.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.135+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f47880776c0 msgr2=0x7f4788079b70 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.135+0000 7f47a34e7700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f47880776c0 0x7f4788079b70 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f479000b5c0 tx=0x7f47900058e0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.135+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 msgr2=0x7f479c10bf50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.135+0000 7f47a34e7700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 0x7f479c10bf50 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f479800eb10 tx=0x7f479800eed0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.135+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 shutdown_connections 2026-03-10T14:13:20.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.135+0000 7f47a34e7700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f47880776c0 0x7f4788079b70 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.135+0000 7f47a34e7700 1 --2- 192.168.123.103:0/1550403458 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f479c0737b0 0x7f479c10ba10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.136+0000 7f47a34e7700 1 --2- 192.168.123.103:0/1550403458 
>> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f479c074d80 0x7f479c10bf50 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.136+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 >> 192.168.123.103:0/1550403458 conn(0x7f479c0fbae0 msgr2=0x7f479c101ee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:20.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.136+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 shutdown_connections 2026-03-10T14:13:20.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.136+0000 7f47a34e7700 1 -- 192.168.123.103:0/1550403458 wait complete. 2026-03-10T14:13:20.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.204+0000 7f2ca72eb700 1 -- 192.168.123.103:0/1785632076 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 msgr2=0x7f2ca0102b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.204+0000 7f2ca72eb700 1 --2- 192.168.123.103:0/1785632076 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 0x7f2ca0102b90 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f2c94009b00 tx=0x7f2c94009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:20.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.205+0000 7f2ca72eb700 1 -- 192.168.123.103:0/1785632076 shutdown_connections 2026-03-10T14:13:20.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.205+0000 7f2ca72eb700 1 --2- 192.168.123.103:0/1785632076 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ca0103980 0x7f2ca0103dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.205+0000 7f2ca72eb700 1 --2- 
192.168.123.103:0/1785632076 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 0x7f2ca0102b90 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.205+0000 7f2ca72eb700 1 -- 192.168.123.103:0/1785632076 >> 192.168.123.103:0/1785632076 conn(0x7f2ca00fdd50 msgr2=0x7f2ca0100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:20.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.208+0000 7f2ca72eb700 1 -- 192.168.123.103:0/1785632076 shutdown_connections 2026-03-10T14:13:20.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.208+0000 7f2ca72eb700 1 -- 192.168.123.103:0/1785632076 wait complete. 2026-03-10T14:13:20.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.209+0000 7f2ca72eb700 1 Processor -- start 2026-03-10T14:13:20.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.209+0000 7f2ca72eb700 1 -- start start 2026-03-10T14:13:20.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.209+0000 7f2ca72eb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 0x7f2ca019c430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.209+0000 7f2ca72eb700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ca0103980 0x7f2ca019c970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.209+0000 7f2ca72eb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ca019cf90 con 0x7f2ca0102780 2026-03-10T14:13:20.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.209+0000 7f2ca72eb700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ca019d0d0 con 0x7f2ca0103980 2026-03-10T14:13:20.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.209+0000 7f2ca5087700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 0x7f2ca019c430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.210+0000 7f2ca5087700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 0x7f2ca019c430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58050/0 (socket says 192.168.123.103:58050) 2026-03-10T14:13:20.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.210+0000 7f2ca5087700 1 -- 192.168.123.103:0/3332277949 learned_addr learned my addr 192.168.123.103:0/3332277949 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:20.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.210+0000 7f2ca5087700 1 -- 192.168.123.103:0/3332277949 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ca0103980 msgr2=0x7f2ca019c970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:13:20.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.210+0000 7f2ca5087700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ca0103980 0x7f2ca019c970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.210+0000 7f2ca5087700 1 -- 192.168.123.103:0/3332277949 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2c940097e0 con 
0x7f2ca0102780 2026-03-10T14:13:20.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.210+0000 7f2ca5087700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 0x7f2ca019c430 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f2c94009fd0 tx=0x7f2c94004c80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.211+0000 7f2c927fc700 1 -- 192.168.123.103:0/3332277949 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2c9401d070 con 0x7f2ca0102780 2026-03-10T14:13:20.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.211+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2ca006a8b0 con 0x7f2ca0102780 2026-03-10T14:13:20.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.211+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2ca006ada0 con 0x7f2ca0102780 2026-03-10T14:13:20.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.213+0000 7f2c927fc700 1 -- 192.168.123.103:0/3332277949 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2c9400bc30 con 0x7f2ca0102780 2026-03-10T14:13:20.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.213+0000 7f2c927fc700 1 -- 192.168.123.103:0/3332277949 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2c9400f810 con 0x7f2ca0102780 2026-03-10T14:13:20.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.213+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f2c84005320 con 0x7f2ca0102780 2026-03-10T14:13:20.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.215+0000 7f2c927fc700 1 -- 192.168.123.103:0/3332277949 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2c9400fa30 con 0x7f2ca0102780 2026-03-10T14:13:20.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.215+0000 7f2c927fc700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2c8c077750 0x7f2c8c079c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.215+0000 7f2c927fc700 1 -- 192.168.123.103:0/3332277949 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(93..93 src has 1..93) v4 ==== 6521+0+0 (secure 0 0 0) 0x7f2c9409c570 con 0x7f2ca0102780 2026-03-10T14:13:20.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.216+0000 7f2ca4886700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2c8c077750 0x7f2c8c079c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.217+0000 7f2ca4886700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2c8c077750 0x7f2c8c079c00 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f2c9c00ba10 tx=0x7f2c9c00b3f0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.217+0000 7f2c927fc700 1 -- 192.168.123.103:0/3332277949 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2c94064ca0 con 0x7f2ca0102780 2026-03-10T14:13:20.280 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local podman[98059]: 2026-03-10 14:13:19.993379492 +0000 UTC m=+0.057949829 container init 1653afd3fb7af5e9aacebee4e2455c64ac6cfefa247b17cf3946ab3cefce72f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223) 2026-03-10T14:13:20.280 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local podman[98059]: 2026-03-10 14:13:19.99762574 +0000 UTC m=+0.062196068 container start 1653afd3fb7af5e9aacebee4e2455c64ac6cfefa247b17cf3946ab3cefce72f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True) 2026-03-10T14:13:20.280 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:19 vm04.local podman[98059]: 2026-03-10 14:13:19.998781092 +0000 UTC m=+0.063351420 container attach 1653afd3fb7af5e9aacebee4e2455c64ac6cfefa247b17cf3946ab3cefce72f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3) 2026-03-10T14:13:20.280 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:20 vm04.local podman[98059]: 2026-03-10 14:13:19.944751352 +0000 UTC m=+0.009321671 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:13:20.280 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:20 vm04.local podman[98059]: 2026-03-10 14:13:20.12935558 +0000 UTC m=+0.193925908 container died 1653afd3fb7af5e9aacebee4e2455c64ac6cfefa247b17cf3946ab3cefce72f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-10T14:13:20.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.390+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f2c84006160 con 0x7f2ca0102780 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.391+0000 7f2c927fc700 1 -- 192.168.123.103:0/3332277949 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f2c940643f0 con 0x7f2ca0102780 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:13:20.390 
INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2, 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6, 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 7 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:13:20.390 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:13:20.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2c8c077750 msgr2=0x7f2c8c079c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2c8c077750 0x7f2c8c079c00 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f2c9c00ba10 tx=0x7f2c9c00b3f0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.396 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 msgr2=0x7f2ca019c430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 0x7f2ca019c430 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f2c94009fd0 tx=0x7f2c94004c80 comp rx=0 tx=0).stop 2026-03-10T14:13:20.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 shutdown_connections 2026-03-10T14:13:20.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2c8c077750 0x7f2c8c079c00 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ca0102780 0x7f2ca019c430 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 --2- 192.168.123.103:0/3332277949 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2ca0103980 0x7f2ca019c970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.397+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 >> 192.168.123.103:0/3332277949 conn(0x7f2ca00fdd50 msgr2=0x7f2ca0106bb0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:13:20.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.398+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 shutdown_connections 2026-03-10T14:13:20.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.398+0000 7f2ca72eb700 1 -- 192.168.123.103:0/3332277949 wait complete. 2026-03-10T14:13:20.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.469+0000 7f6519039700 1 -- 192.168.123.103:0/2850943663 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 msgr2=0x7f6514102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.469+0000 7f6519039700 1 --2- 192.168.123.103:0/2850943663 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 0x7f6514102b70 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f64fc009b00 tx=0x7f64fc009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:20.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.472+0000 7f6519039700 1 -- 192.168.123.103:0/2850943663 shutdown_connections 2026-03-10T14:13:20.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.472+0000 7f6519039700 1 --2- 192.168.123.103:0/2850943663 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6514103a00 0x7f6514103e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.472+0000 7f6519039700 1 --2- 192.168.123.103:0/2850943663 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 0x7f6514102b70 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.472+0000 7f6519039700 1 -- 192.168.123.103:0/2850943663 >> 192.168.123.103:0/2850943663 conn(0x7f65140fddb0 msgr2=0x7f65141001e0 unknown 
:-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:20.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.472+0000 7f6519039700 1 -- 192.168.123.103:0/2850943663 shutdown_connections 2026-03-10T14:13:20.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.472+0000 7f6519039700 1 -- 192.168.123.103:0/2850943663 wait complete. 2026-03-10T14:13:20.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6519039700 1 Processor -- start 2026-03-10T14:13:20.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6519039700 1 -- start start 2026-03-10T14:13:20.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6519039700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 0x7f6514198060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6519039700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6514103a00 0x7f65141985a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6519039700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6514198bc0 con 0x7f6514102760 2026-03-10T14:13:20.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6519039700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6514198d00 con 0x7f6514103a00 2026-03-10T14:13:20.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6512d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 0x7f6514198060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:13:20.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6512d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 0x7f6514198060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58070/0 (socket says 192.168.123.103:58070) 2026-03-10T14:13:20.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.473+0000 7f6512d9d700 1 -- 192.168.123.103:0/1614405149 learned_addr learned my addr 192.168.123.103:0/1614405149 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:20.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f6512d9d700 1 -- 192.168.123.103:0/1614405149 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6514103a00 msgr2=0x7f65141985a0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:13:20.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f6512d9d700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6514103a00 0x7f65141985a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f6512d9d700 1 -- 192.168.123.103:0/1614405149 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64fc0097e0 con 0x7f6514102760 2026-03-10T14:13:20.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f6512d9d700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 0x7f6514198060 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f64fc0049c0 tx=0x7f64fc004aa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.474 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f650bfff700 1 -- 192.168.123.103:0/1614405149 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f64fc00ba40 con 0x7f6514102760 2026-03-10T14:13:20.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f650bfff700 1 -- 192.168.123.103:0/1614405149 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f64fc027070 con 0x7f6514102760 2026-03-10T14:13:20.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f650bfff700 1 -- 192.168.123.103:0/1614405149 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f64fc00fb50 con 0x7f6514102760 2026-03-10T14:13:20.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6514075190 con 0x7f6514102760 2026-03-10T14:13:20.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.474+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6514075680 con 0x7f6514102760 2026-03-10T14:13:20.475 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.476+0000 7f650bfff700 1 -- 192.168.123.103:0/1614405149 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f64fc029030 con 0x7f6514102760 2026-03-10T14:13:20.475 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.477+0000 7f650bfff700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6500077990 0x7f6500079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.476 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.477+0000 7f651259c700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6500077990 0x7f6500079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.478+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6514066e40 con 0x7f6514102760 2026-03-10T14:13:20.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.478+0000 7f650bfff700 1 -- 192.168.123.103:0/1614405149 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(93..93 src has 1..93) v4 ==== 6521+0+0 (secure 0 0 0) 0x7f64fc0a46e0 con 0x7f6514102760 2026-03-10T14:13:20.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.478+0000 7f651259c700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6500077990 0x7f6500079e40 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f6504009730 tx=0x7f6504006cb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.481+0000 7f650bfff700 1 -- 192.168.123.103:0/1614405149 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f64fc06ce10 con 0x7f6514102760 2026-03-10T14:13:20.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.622+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f6514075de0 con 0x7f6514102760 2026-03-10T14:13:20.622 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.623+0000 7f650bfff700 1 -- 192.168.123.103:0/1614405149 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 16 v16) v1 ==== 76+0+1954 (secure 0 0 0) 0x7f64fc02c430 con 0x7f6514102760 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:e16 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-10T14:12:48:855184+0000 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:epoch 16 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:12:48.656761+0000 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 
2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:13:20.623 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 24263 members: 24263,14470 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 join_fscid=1 addr 
[v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:13:20.624 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:13:20.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.627+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6500077990 msgr2=0x7f6500079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.627+0000 7f6519039700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6500077990 0x7f6500079e40 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f6504009730 tx=0x7f6504006cb0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.627+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 msgr2=0x7f6514198060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.626 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.627+0000 7f6519039700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 0x7f6514198060 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f64fc0049c0 tx=0x7f64fc004aa0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.628+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 shutdown_connections 2026-03-10T14:13:20.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.628+0000 7f6519039700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6500077990 0x7f6500079e40 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.628+0000 7f6519039700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6514102760 0x7f6514198060 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.628+0000 7f6519039700 1 --2- 192.168.123.103:0/1614405149 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6514103a00 0x7f65141985a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.628+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 >> 192.168.123.103:0/1614405149 conn(0x7f65140fddb0 msgr2=0x7f6514100170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:20.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.628+0000 7f6519039700 1 -- 192.168.123.103:0/1614405149 shutdown_connections 2026-03-10T14:13:20.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.628+0000 7f6519039700 1 -- 
192.168.123.103:0/1614405149 wait complete. 2026-03-10T14:13:20.628 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 16 2026-03-10T14:13:20.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.704+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/1741002738 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 msgr2=0x7fe0b8103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.704+0000 7fe0bd5ea700 1 --2- 192.168.123.103:0/1741002738 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 0x7fe0b8103e70 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fe0a8009b00 tx=0x7fe0a8009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.705+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/1741002738 shutdown_connections 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.705+0000 7fe0bd5ea700 1 --2- 192.168.123.103:0/1741002738 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 0x7fe0b8103e70 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.705+0000 7fe0bd5ea700 1 --2- 192.168.123.103:0/1741002738 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe0b8102760 0x7fe0b8102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.705+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/1741002738 >> 192.168.123.103:0/1741002738 conn(0x7fe0b80fddb0 msgr2=0x7fe0b81001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.705+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/1741002738 shutdown_connections 
2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.705+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/1741002738 wait complete. 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.705+0000 7fe0bd5ea700 1 Processor -- start 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0bd5ea700 1 -- start start 2026-03-10T14:13:20.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0bd5ea700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe0b8102760 0x7fe0b8198040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0bd5ea700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 0x7fe0b8198580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0bd5ea700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0b8198ba0 con 0x7fe0b8103a00 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0bd5ea700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0b8198ce0 con 0x7fe0b8102760 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0b67fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 0x7fe0b8198580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0b67fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 
0x7fe0b8198580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58082/0 (socket says 192.168.123.103:58082) 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0b67fc700 1 -- 192.168.123.103:0/3220554950 learned_addr learned my addr 192.168.123.103:0/3220554950 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0b67fc700 1 -- 192.168.123.103:0/3220554950 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe0b8102760 msgr2=0x7fe0b8198040 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0b6ffd700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe0b8102760 0x7fe0b8198040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0b67fc700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe0b8102760 0x7fe0b8198040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.706+0000 7fe0b67fc700 1 -- 192.168.123.103:0/3220554950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe0a80097e0 con 0x7fe0b8103a00 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.707+0000 7fe0b67fc700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 0x7fe0b8198580 secure :-1 s=READY pgs=132 cs=0 l=1 
rev1=1 crypto rx=0x7fe0a8005230 tx=0x7fe0a80056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.707+0000 7fe0affff700 1 -- 192.168.123.103:0/3220554950 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0a801d070 con 0x7fe0b8103a00 2026-03-10T14:13:20.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.707+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe0b819d730 con 0x7fe0b8103a00 2026-03-10T14:13:20.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.707+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe0b819dc20 con 0x7fe0b8103a00 2026-03-10T14:13:20.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.707+0000 7fe0affff700 1 -- 192.168.123.103:0/3220554950 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe0a800bc50 con 0x7fe0b8103a00 2026-03-10T14:13:20.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.707+0000 7fe0affff700 1 -- 192.168.123.103:0/3220554950 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0a800f850 con 0x7fe0b8103a00 2026-03-10T14:13:20.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.708+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe098005320 con 0x7fe0b8103a00 2026-03-10T14:13:20.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.709+0000 7fe0affff700 1 -- 192.168.123.103:0/3220554950 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 
0 0 0) 0x7fe0a800fa90 con 0x7fe0b8103a00 2026-03-10T14:13:20.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.709+0000 7fe0affff700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe0a4077910 0x7fe0a4079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.709+0000 7fe0b6ffd700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe0a4077910 0x7fe0a4079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.710+0000 7fe0b6ffd700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe0a4077910 0x7fe0a4079dc0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fe0a0006fd0 tx=0x7fe0a0008040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.712+0000 7fe0affff700 1 -- 192.168.123.103:0/3220554950 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(93..93 src has 1..93) v4 ==== 6521+0+0 (secure 0 0 0) 0x7fe0a800bdc0 con 0x7fe0b8103a00 2026-03-10T14:13:20.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.712+0000 7fe0affff700 1 -- 192.168.123.103:0/3220554950 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe0a80a0050 con 0x7fe0b8103a00 2026-03-10T14:13:20.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.836+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe098000bf0 con 0x7fe0a4077910 2026-03-10T14:13:20.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.838+0000 7fe0affff700 1 -- 192.168.123.103:0/3220554950 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fe098000bf0 con 0x7fe0a4077910 2026-03-10T14:13:20.838 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "9/23 daemons upgraded", 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:13:20.839 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:13:20.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe0a4077910 msgr2=0x7fe0a4079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.841 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe0a4077910 0x7fe0a4079dc0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fe0a0006fd0 tx=0x7fe0a0008040 comp rx=0 tx=0).stop 2026-03-10T14:13:20.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 msgr2=0x7fe0b8198580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 0x7fe0b8198580 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fe0a8005230 tx=0x7fe0a80056c0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 shutdown_connections 2026-03-10T14:13:20.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe0a4077910 0x7fe0a4079dc0 secure :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fe0a0006fd0 tx=0x7fe0a0008040 comp rx=0 tx=0).stop 2026-03-10T14:13:20.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 --2- 192.168.123.103:0/3220554950 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe0b8102760 0x7fe0b8198040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 --2- 192.168.123.103:0/3220554950 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0b8103a00 0x7fe0b8198580 secure :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fe0a8005230 tx=0x7fe0a80056c0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.842+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 >> 192.168.123.103:0/3220554950 conn(0x7fe0b80fddb0 msgr2=0x7fe0b8100150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:20.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.843+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 shutdown_connections 2026-03-10T14:13:20.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.843+0000 7fe0bd5ea700 1 -- 192.168.123.103:0/3220554950 wait complete. 2026-03-10T14:13:20.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.908+0000 7ff11c92c700 1 -- 192.168.123.103:0/3292592740 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1140ff5a0 msgr2=0x7ff1140ffa10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:20.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.908+0000 7ff11c92c700 1 --2- 192.168.123.103:0/3292592740 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1140ff5a0 0x7ff1140ffa10 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7ff108009b50 tx=0x7ff108009e60 comp rx=0 tx=0).stop 2026-03-10T14:13:20.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.909+0000 7ff11c92c700 1 -- 192.168.123.103:0/3292592740 shutdown_connections 2026-03-10T14:13:20.978 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.909+0000 7ff11c92c700 1 --2- 192.168.123.103:0/3292592740 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1140ff5a0 0x7ff1140ffa10 secure :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7ff108009b50 tx=0x7ff108009e60 comp rx=0 tx=0).stop 2026-03-10T14:13:20.979 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.909+0000 7ff11c92c700 1 --2- 192.168.123.103:0/3292592740 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff1140fe830 0x7ff1140fec40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.909+0000 7ff11c92c700 1 -- 192.168.123.103:0/3292592740 >> 192.168.123.103:0/3292592740 conn(0x7ff1140fa180 msgr2=0x7ff1140fc5d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.909+0000 7ff11c92c700 1 -- 192.168.123.103:0/3292592740 shutdown_connections 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.909+0000 7ff11c92c700 1 -- 192.168.123.103:0/3292592740 wait complete. 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11c92c700 1 Processor -- start 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11c92c700 1 -- start start 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11c92c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff1140fe830 0x7ff114198250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11c92c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff114198790 0x7ff1141a1ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11c92c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1141058c0 con 0x7ff114198790 2026-03-10T14:13:20.979 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11c92c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff114105a30 con 0x7ff1140fe830 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11a6c8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff1140fe830 0x7ff114198250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11a6c8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff1140fe830 0x7ff114198250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:41918/0 (socket says 192.168.123.103:41918) 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.910+0000 7ff11a6c8700 1 -- 192.168.123.103:0/2308635372 learned_addr learned my addr 192.168.123.103:0/2308635372 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.911+0000 7ff11a6c8700 1 -- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff114198790 msgr2=0x7ff1141a1ff0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.911+0000 7ff119ec7700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff114198790 0x7ff1141a1ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.911+0000 7ff11a6c8700 1 --2- 
192.168.123.103:0/2308635372 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff114198790 0x7ff1141a1ff0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.911+0000 7ff11a6c8700 1 -- 192.168.123.103:0/2308635372 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff1080097e0 con 0x7ff1140fe830 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.912+0000 7ff11a6c8700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff1140fe830 0x7ff114198250 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7ff10c00eb10 tx=0x7ff10c00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.979 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.912+0000 7ff119ec7700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff114198790 0x7ff1141a1ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:13:20.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.981+0000 7ff1077fe700 1 -- 192.168.123.103:0/2308635372 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff10c00cca0 con 0x7ff1140fe830 2026-03-10T14:13:20.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.981+0000 7ff1077fe700 1 -- 192.168.123.103:0/2308635372 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff10c00ce00 con 0x7ff1140fe830 2026-03-10T14:13:20.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.982+0000 7ff1077fe700 1 -- 192.168.123.103:0/2308635372 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff10c018990 con 0x7ff1140fe830 2026-03-10T14:13:20.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.982+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff114198cc0 con 0x7ff1140fe830 2026-03-10T14:13:20.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.982+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff1141a2700 con 0x7ff1140fe830 2026-03-10T14:13:20.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.983+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff114066e40 con 0x7ff1140fe830 2026-03-10T14:13:20.982 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.983+0000 7ff1077fe700 1 -- 192.168.123.103:0/2308635372 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff10c018af0 con 0x7ff1140fe830 2026-03-10T14:13:20.982 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.984+0000 
7ff1077fe700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff1000776c0 0x7ff100079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:20.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.984+0000 7ff119ec7700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff1000776c0 0x7ff100079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:20.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.985+0000 7ff1077fe700 1 -- 192.168.123.103:0/2308635372 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(93..93 src has 1..93) v4 ==== 6521+0+0 (secure 0 0 0) 0x7ff10c014070 con 0x7ff1140fe830 2026-03-10T14:13:20.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.985+0000 7ff119ec7700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff1000776c0 0x7ff100079b70 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7ff108009b20 tx=0x7ff108005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:20.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:20.987+0000 7ff1077fe700 1 -- 192.168.123.103:0/2308635372 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff10c0625e0 con 0x7ff1140fe830 2026-03-10T14:13:21.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.158+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ff1141a29b0 con 0x7ff1140fe830 2026-03-10T14:13:21.356 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.357+0000 7ff1077fe700 1 -- 192.168.123.103:0/2308635372 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+508 (secure 0 0 0) 0x7ff10c061d30 con 0x7ff1140fe830 2026-03-10T14:13:21.379 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 osds down; Degraded data redundancy: 2908/336 objects degraded (865.476%), 3 pgs degraded 2026-03-10T14:13:21.379 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-10T14:13:21.379 INFO:teuthology.orchestra.run.vm03.stdout: osd.3 (root=default,host=vm04) is down 2026-03-10T14:13:21.379 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 2908/336 objects degraded (865.476%), 3 pgs degraded 2026-03-10T14:13:21.379 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.7 is active+recovery_wait+undersized+degraded+remapped, acting [2,4] 2026-03-10T14:13:21.379 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.a is active+recovery_wait+undersized+degraded+remapped, acting [4,0] 2026-03-10T14:13:21.379 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [4,0] 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff1000776c0 msgr2=0x7ff100079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff1000776c0 0x7ff100079b70 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7ff108009b20 tx=0x7ff108005fb0 comp rx=0 tx=0).stop 2026-03-10T14:13:21.380 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff1140fe830 msgr2=0x7ff114198250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff1140fe830 0x7ff114198250 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7ff10c00eb10 tx=0x7ff10c00eed0 comp rx=0 tx=0).stop 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 shutdown_connections 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff1000776c0 0x7ff100079b70 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff1140fe830 0x7ff114198250 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 --2- 192.168.123.103:0/2308635372 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff114198790 0x7ff1141a1ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.361+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 >> 192.168.123.103:0/2308635372 conn(0x7ff1140fa180 msgr2=0x7ff114107c30 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.362+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 shutdown_connections 2026-03-10T14:13:21.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:21.362+0000 7ff11c92c700 1 -- 192.168.123.103:0/2308635372 wait complete. 2026-03-10T14:13:21.564 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local podman[98059]: 2026-03-10 14:13:21.187810465 +0000 UTC m=+1.252380793 container remove 1653afd3fb7af5e9aacebee4e2455c64ac6cfefa247b17cf3946ab3cefce72f2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T14:13:21.564 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.3.service: Deactivated successfully. 2026-03-10T14:13:21.564 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.3.service: Unit process 98070 (conmon) remains running after unit stopped. 2026-03-10T14:13:21.564 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.3.service: Unit process 98078 (podman) remains running after unit stopped. 
2026-03-10T14:13:21.564 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local systemd[1]: Stopped Ceph osd.3 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:13:21.564 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.3.service: Consumed 54.952s CPU time, 883.7M memory peak. 2026-03-10T14:13:21.564 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local systemd[1]: Starting Ceph osd.3 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:13:21.830 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:21 vm03.local ceph-mon[103098]: from='client.34346 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:21.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:21 vm03.local ceph-mon[103098]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:13:21.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:21 vm03.local ceph-mon[103098]: osdmap e93: 6 total, 5 up, 6 in 2026-03-10T14:13:21.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:21 vm03.local ceph-mon[103098]: from='client.34350 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:21.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:21 vm03.local ceph-mon[103098]: from='client.34354 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:21.831 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:21 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/3332277949' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:21.832 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:21 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 2908/336 objects degraded (865.476%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:21.832 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:21 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1614405149' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:13:21.995 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:21 vm04.local ceph-mon[92084]: from='client.34346 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:21.995 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:21 vm04.local ceph-mon[92084]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:13:21.995 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:21 vm04.local ceph-mon[92084]: osdmap e93: 6 total, 5 up, 6 in 2026-03-10T14:13:21.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:21 vm04.local ceph-mon[92084]: from='client.34350 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:21.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:21 vm04.local ceph-mon[92084]: from='client.34354 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:21.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:21 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/3332277949' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:21.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:21 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 2908/336 objects degraded (865.476%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:21.996 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:21 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1614405149' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:13:21.996 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local podman[98171]: 2026-03-10 14:13:21.512211383 +0000 UTC m=+0.010649857 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:13:21.996 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:21 vm04.local podman[98171]: 2026-03-10 14:13:21.85616845 +0000 UTC m=+0.354606913 container create f71847aad3e0f34091a36351ca861966532b65b5133a62d67b1d717a9287dc8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T14:13:22.314 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local 
podman[98171]: 2026-03-10 14:13:22.127591649 +0000 UTC m=+0.626030112 container init f71847aad3e0f34091a36351ca861966532b65b5133a62d67b1d717a9287dc8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True) 2026-03-10T14:13:22.314 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local podman[98171]: 2026-03-10 14:13:22.130901064 +0000 UTC m=+0.629339527 container start f71847aad3e0f34091a36351ca861966532b65b5133a62d67b1d717a9287dc8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.build-date=20260223) 
2026-03-10T14:13:22.314 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local podman[98171]: 2026-03-10 14:13:22.198686724 +0000 UTC m=+0.697125187 container attach f71847aad3e0f34091a36351ca861966532b65b5133a62d67b1d717a9287dc8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_REF=squid) 2026-03-10T14:13:22.314 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:22.314 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local bash[98171]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:22.314 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:22.314 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local bash[98171]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:22.672 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-mon[92084]: pgmap v183: 65 pgs: 10 peering, 1 active+recovering+undersized+remapped, 9 stale+active+clean, 3 
active+recovery_wait+undersized+degraded+remapped, 42 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 68 KiB/s rd, 2 op/s; 2908/336 objects degraded (865.476%); 895 KiB/s, 79 objects/s recovering 2026-03-10T14:13:22.672 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-mon[92084]: from='client.34366 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:22.672 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-mon[92084]: Health check failed: Reduced data availability: 1 pg inactive, 4 pgs peering (PG_AVAILABILITY) 2026-03-10T14:13:22.672 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-mon[92084]: osdmap e94: 6 total, 5 up, 6 in 2026-03-10T14:13:22.672 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/2308635372' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:13:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:22 vm03.local ceph-mon[103098]: pgmap v183: 65 pgs: 10 peering, 1 active+recovering+undersized+remapped, 9 stale+active+clean, 3 active+recovery_wait+undersized+degraded+remapped, 42 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 68 KiB/s rd, 2 op/s; 2908/336 objects degraded (865.476%); 895 KiB/s, 79 objects/s recovering 2026-03-10T14:13:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:22 vm03.local ceph-mon[103098]: from='client.34366 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:22 vm03.local ceph-mon[103098]: Health check failed: Reduced data availability: 1 pg inactive, 4 pgs peering (PG_AVAILABILITY) 2026-03-10T14:13:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:22 vm03.local ceph-mon[103098]: 
osdmap e94: 6 total, 5 up, 6 in 2026-03-10T14:13:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:22 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2308635372' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:13:22.947 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local bash[98171]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local bash[98171]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local bash[98171]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local bash[98171]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: 
/usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6c717c2c-3d0f-4a5c-9080-ac9874339fd5/osd-block-9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T14:13:22.972 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:22 vm04.local bash[98171]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6c717c2c-3d0f-4a5c-9080-ac9874339fd5/osd-block-9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T14:13:23.204 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/ln -snf /dev/ceph-6c717c2c-3d0f-4a5c-9080-ac9874339fd5/osd-block-9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb /var/lib/ceph/osd/ceph-3/block 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local bash[98171]: Running command: /usr/bin/ln -snf /dev/ceph-6c717c2c-3d0f-4a5c-9080-ac9874339fd5/osd-block-9dd6f41a-a55f-4ec5-8953-3f3a0d66b9fb /var/lib/ceph/osd/ceph-3/block 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local bash[98171]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local bash[98171]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local 
ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local bash[98171]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate[98184]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local bash[98171]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local conmon[98184]: conmon f71847aad3e0f34091a3 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f71847aad3e0f34091a36351ca861966532b65b5133a62d67b1d717a9287dc8a.scope/container/memory.events 2026-03-10T14:13:23.205 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local podman[98171]: 2026-03-10 14:13:23.060720028 +0000 UTC m=+1.559158481 container died f71847aad3e0f34091a36351ca861966532b65b5133a62d67b1d717a9287dc8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph 
Release Team ) 2026-03-10T14:13:23.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:23 vm04.local ceph-mon[92084]: pgmap v185: 65 pgs: 10 active+undersized, 1 activating+undersized+degraded, 10 peering, 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 12 active+undersized+degraded, 28 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2959/336 objects degraded (880.655%) 2026-03-10T14:13:23.813 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local podman[98171]: 2026-03-10 14:13:23.558580203 +0000 UTC m=+2.057018656 container remove f71847aad3e0f34091a36351ca861966532b65b5133a62d67b1d717a9287dc8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-activate, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:13:23.814 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local podman[98429]: 2026-03-10 14:13:23.646299869 +0000 UTC m=+0.012966444 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:13:23.814 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local podman[98429]: 2026-03-10 14:13:23.749990401 +0000 UTC m=+0.116656967 container create 
f7fc2aafa9d94bca3b636d8f1ea6262f9c4afe89932b2a8faa2e9ccf992163be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default) 2026-03-10T14:13:24.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:23 vm03.local ceph-mon[103098]: pgmap v185: 65 pgs: 10 active+undersized, 1 activating+undersized+degraded, 10 peering, 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 12 active+undersized+degraded, 28 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2959/336 objects degraded (880.655%) 2026-03-10T14:13:24.256 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local podman[98429]: 2026-03-10 14:13:23.95412014 +0000 UTC m=+0.320786715 container init f7fc2aafa9d94bca3b636d8f1ea6262f9c4afe89932b2a8faa2e9ccf992163be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) 2026-03-10T14:13:24.256 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local podman[98429]: 2026-03-10 14:13:23.956993488 +0000 UTC m=+0.323660053 container start f7fc2aafa9d94bca3b636d8f1ea6262f9c4afe89932b2a8faa2e9ccf992163be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T14:13:24.256 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local bash[98429]: f7fc2aafa9d94bca3b636d8f1ea6262f9c4afe89932b2a8faa2e9ccf992163be 2026-03-10T14:13:24.256 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:23 vm04.local systemd[1]: Started Ceph osd.3 for b81bf660-1c89-11f1-b612-27d302cdb124. 
2026-03-10T14:13:24.980 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:24 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[98440]: 2026-03-10T14:13:24.816+0000 7f43538c4740 -1 Falling back to public interface 2026-03-10T14:13:25.259 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:25.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:25.260 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:13:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:25.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:13:26.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:26 vm03.local ceph-mon[103098]: pgmap v186: 65 pgs: 10 active+undersized, 1 activating+undersized+degraded, 10 peering, 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 12 active+undersized+degraded, 28 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2959/336 objects degraded (880.655%) 2026-03-10T14:13:26.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:26 vm03.local ceph-mon[103098]: Health check update: 
Degraded data redundancy: 2959/336 objects degraded (880.655%), 16 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:26.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:26.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:26.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:26.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:26.379 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:26 vm04.local ceph-mon[92084]: pgmap v186: 65 pgs: 10 active+undersized, 1 activating+undersized+degraded, 10 peering, 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 12 active+undersized+degraded, 28 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2959/336 objects degraded (880.655%) 2026-03-10T14:13:26.379 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:26 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 2959/336 objects degraded (880.655%), 16 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:26.379 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:26.379 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:26.379 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:13:26.379 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:28.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: pgmap v187: 65 pgs: 1 active+recovering+undersized+remapped, 16 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 17 active+undersized+degraded, 28 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2969/336 objects degraded (883.631%) 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg inactive, 4 pgs peering) 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T14:13:28.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:28 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (17 PGs are or would become offline) 2026-03-10T14:13:28.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: pgmap v187: 65 pgs: 1 active+recovering+undersized+remapped, 16 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 17 active+undersized+degraded, 28 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2969/336 objects degraded (883.631%) 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg inactive, 4 pgs peering) 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local 
ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T14:13:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:28 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (17 PGs are or would become offline) 2026-03-10T14:13:29.503 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:29 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[98440]: 2026-03-10T14:13:29.188+0000 7f43538c4740 -1 osd.3 0 read_superblock omap replica is missing. 
2026-03-10T14:13:29.814 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:29 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[98440]: 2026-03-10T14:13:29.504+0000 7f43538c4740 -1 osd.3 92 log_to_monitors true 2026-03-10T14:13:30.312 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:30 vm03.local ceph-mon[103098]: pgmap v188: 65 pgs: 1 active+recovering+undersized+remapped, 16 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 17 active+undersized+degraded, 28 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2969/336 objects degraded (883.631%) 2026-03-10T14:13:30.312 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:30 vm03.local ceph-mon[103098]: from='osd.3 [v2:192.168.123.104:6800/2621104757,v1:192.168.123.104:6801/2621104757]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T14:13:30.312 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:30 vm03.local ceph-mon[103098]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T14:13:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:30 vm04.local ceph-mon[92084]: pgmap v188: 65 pgs: 1 active+recovering+undersized+remapped, 16 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 17 active+undersized+degraded, 28 active+clean; 275 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 2969/336 objects degraded (883.631%) 2026-03-10T14:13:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:30 vm04.local ceph-mon[92084]: from='osd.3 [v2:192.168.123.104:6800/2621104757,v1:192.168.123.104:6801/2621104757]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T14:13:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:30 vm04.local ceph-mon[92084]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", 
"class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T14:13:30.313 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:13:30 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[98440]: 2026-03-10T14:13:30.079+0000 7f434b65e640 -1 osd.3 92 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:13:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:31 vm03.local ceph-mon[103098]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T14:13:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:31 vm03.local ceph-mon[103098]: osdmap e95: 6 total, 5 up, 6 in 2026-03-10T14:13:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:31 vm03.local ceph-mon[103098]: from='osd.3 [v2:192.168.123.104:6800/2621104757,v1:192.168.123.104:6801/2621104757]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:13:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:31 vm03.local ceph-mon[103098]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:13:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:31 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 2969/336 objects degraded (883.631%), 20 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:13:31.563 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:31 vm04.local ceph-mon[92084]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T14:13:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:31 vm04.local ceph-mon[92084]: osdmap e95: 6 total, 5 up, 6 in 2026-03-10T14:13:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:31 vm04.local ceph-mon[92084]: from='osd.3 [v2:192.168.123.104:6800/2621104757,v1:192.168.123.104:6801/2621104757]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:13:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:31 vm04.local ceph-mon[92084]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:13:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:31 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 2969/336 objects degraded (883.631%), 20 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:31.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:13:32.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:32 vm03.local ceph-mon[103098]: pgmap v190: 65 pgs: 1 active+recovering+undersized+remapped, 16 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 17 active+undersized+degraded, 28 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 2969/336 objects degraded (883.631%); 0 B/s, 7 objects/s 
recovering 2026-03-10T14:13:32.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:32 vm03.local ceph-mon[103098]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T14:13:32.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:32 vm03.local ceph-mon[103098]: osd.3 [v2:192.168.123.104:6800/2621104757,v1:192.168.123.104:6801/2621104757] boot 2026-03-10T14:13:32.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:32 vm03.local ceph-mon[103098]: osdmap e96: 6 total, 6 up, 6 in 2026-03-10T14:13:32.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:32 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:13:32.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:32 vm04.local ceph-mon[92084]: pgmap v190: 65 pgs: 1 active+recovering+undersized+remapped, 16 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 17 active+undersized+degraded, 28 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 2969/336 objects degraded (883.631%); 0 B/s, 7 objects/s recovering 2026-03-10T14:13:32.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:32 vm04.local ceph-mon[92084]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T14:13:32.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:32 vm04.local ceph-mon[92084]: osd.3 [v2:192.168.123.104:6800/2621104757,v1:192.168.123.104:6801/2621104757] boot 2026-03-10T14:13:32.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:32 vm04.local ceph-mon[92084]: osdmap e96: 6 total, 6 up, 6 in 2026-03-10T14:13:32.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:32 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T14:13:33.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:33 vm04.local ceph-mon[92084]: 
osdmap e97: 6 total, 6 up, 6 in 2026-03-10T14:13:33.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:33 vm03.local ceph-mon[103098]: osdmap e97: 6 total, 6 up, 6 in 2026-03-10T14:13:34.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:34 vm04.local ceph-mon[92084]: pgmap v193: 65 pgs: 7 peering, 1 active+recovering+undersized+remapped, 8 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 11 active+undersized+degraded, 35 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 2951/336 objects degraded (878.274%); 0 B/s, 7 objects/s recovering 2026-03-10T14:13:34.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:34 vm03.local ceph-mon[103098]: pgmap v193: 65 pgs: 7 peering, 1 active+recovering+undersized+remapped, 8 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 11 active+undersized+degraded, 35 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 2951/336 objects degraded (878.274%); 0 B/s, 7 objects/s recovering 2026-03-10T14:13:36.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:36 vm04.local ceph-mon[92084]: pgmap v194: 65 pgs: 7 peering, 1 active+recovering+undersized+remapped, 8 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 11 active+undersized+degraded, 35 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 1 op/s; 2951/336 objects degraded (878.274%); 0 B/s, 7 objects/s recovering 2026-03-10T14:13:36.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:36 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 2951/336 objects degraded (878.274%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:36.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:36 vm03.local ceph-mon[103098]: pgmap v194: 65 pgs: 7 peering, 1 active+recovering+undersized+remapped, 8 active+undersized, 3 active+recovery_wait+undersized+degraded+remapped, 11 
active+undersized+degraded, 35 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 853 B/s rd, 1 op/s; 2951/336 objects degraded (878.274%); 0 B/s, 7 objects/s recovering 2026-03-10T14:13:36.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:36 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 2951/336 objects degraded (878.274%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:38.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:38 vm04.local ceph-mon[92084]: pgmap v195: 65 pgs: 1 active+recovering+undersized+remapped, 2 peering, 3 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 416 KiB/s rd, 6 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 6 objects/s recovering 2026-03-10T14:13:38.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:38 vm03.local ceph-mon[103098]: pgmap v195: 65 pgs: 1 active+recovering+undersized+remapped, 2 peering, 3 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 416 KiB/s rd, 6 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 6 objects/s recovering 2026-03-10T14:13:39.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:39 vm04.local ceph-mon[92084]: pgmap v196: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 344 KiB/s rd, 6 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 5 objects/s recovering 2026-03-10T14:13:39.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:39 vm03.local ceph-mon[103098]: pgmap v196: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 344 KiB/s rd, 6 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 5 objects/s recovering 2026-03-10T14:13:40.697 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:40 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 2908/336 objects degraded (865.476%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:40.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:40 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 2908/336 objects degraded (865.476%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:41.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:41 vm04.local ceph-mon[92084]: pgmap v197: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 288 KiB/s rd, 6 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 9 objects/s recovering 2026-03-10T14:13:41.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:41 vm03.local ceph-mon[103098]: pgmap v197: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 288 KiB/s rd, 6 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 9 objects/s recovering 2026-03-10T14:13:42.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:42 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T14:13:42.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:42 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:42.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:42 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T14:13:42.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:42 vm04.local ceph-mon[92084]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:13:42.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:42 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T14:13:42.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:42 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:42.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:42 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T14:13:42.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:42 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:13:43.533 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local systemd[1]: Stopping Ceph osd.4 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:13:43.533 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[67504]: 2026-03-10T14:13:43.281+0000 7f16e892d700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:13:43.533 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[67504]: 2026-03-10T14:13:43.281+0000 7f16e892d700 -1 osd.4 97 *** Got signal Terminated *** 2026-03-10T14:13:43.533 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[67504]: 2026-03-10T14:13:43.281+0000 7f16e892d700 -1 osd.4 97 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:13:43.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T14:13:43.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-mon[92084]: Upgrade: osd.4 is safe to restart 2026-03-10T14:13:43.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-mon[92084]: Upgrade: Updating osd.4 2026-03-10T14:13:43.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-mon[92084]: Deploying daemon osd.4 on vm04 2026-03-10T14:13:43.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-mon[92084]: pgmap v198: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 261 KiB/s rd, 5 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 8 objects/s recovering 2026-03-10T14:13:43.791 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:43 vm04.local ceph-mon[92084]: osd.4 marked itself down and dead 2026-03-10T14:13:43.791 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local podman[101764]: 2026-03-10 14:13:43.611232542 +0000 UTC m=+0.346240333 container died 127d95fabe233b4de3bc31df58789546837e2500791c66a5fc9b30e2327ad153 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1) 2026-03-10T14:13:43.791 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local podman[101764]: 2026-03-10 14:13:43.631008471 +0000 UTC 
m=+0.366016272 container remove 127d95fabe233b4de3bc31df58789546837e2500791c66a5fc9b30e2327ad153 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4, RELEASE=HEAD, org.label-schema.schema-version=1.0, io.buildah.version=1.29.1, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS) 2026-03-10T14:13:43.791 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local bash[101764]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4 2026-03-10T14:13:43.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:43 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T14:13:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:43 vm03.local ceph-mon[103098]: Upgrade: osd.4 is safe to restart 2026-03-10T14:13:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:43 vm03.local ceph-mon[103098]: Upgrade: Updating osd.4 2026-03-10T14:13:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:43 vm03.local ceph-mon[103098]: Deploying daemon osd.4 on vm04 2026-03-10T14:13:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:43 vm03.local ceph-mon[103098]: pgmap v198: 65 pgs: 1 active+recovering+undersized+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 261 KiB/s rd, 5 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 8 objects/s recovering 2026-03-10T14:13:43.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:43 vm03.local ceph-mon[103098]: osd.4 marked itself down and 
dead 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local podman[101832]: 2026-03-10 14:13:43.793484376 +0000 UTC m=+0.019859846 container create a1f86dd22d4cbc7167ba30855797010e4a8f7ca3f7889c6f15597e8761b419f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local podman[101832]: 2026-03-10 14:13:43.838227212 +0000 UTC m=+0.064602682 container init a1f86dd22d4cbc7167ba30855797010e4a8f7ca3f7889c6f15597e8761b419f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, io.buildah.version=1.41.3) 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local podman[101832]: 2026-03-10 14:13:43.841782728 +0000 UTC m=+0.068158209 container start a1f86dd22d4cbc7167ba30855797010e4a8f7ca3f7889c6f15597e8761b419f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3) 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local podman[101832]: 2026-03-10 14:13:43.846941284 +0000 UTC m=+0.073316765 container attach a1f86dd22d4cbc7167ba30855797010e4a8f7ca3f7889c6f15597e8761b419f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0) 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:43 vm04.local podman[101832]: 2026-03-10 14:13:43.786666353 +0000 UTC m=+0.013041844 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local podman[101851]: 2026-03-10 14:13:44.007452461 +0000 UTC m=+0.013952769 container died a1f86dd22d4cbc7167ba30855797010e4a8f7ca3f7889c6f15597e8761b419f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local podman[101851]: 2026-03-10 14:13:44.022699193 +0000 UTC m=+0.029199501 container remove a1f86dd22d4cbc7167ba30855797010e4a8f7ca3f7889c6f15597e8761b419f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, 
org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid) 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.4.service: Deactivated successfully. 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local systemd[1]: Stopped Ceph osd.4 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:13:44.052 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.4.service: Consumed 46.996s CPU time. 2026-03-10T14:13:44.314 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local systemd[1]: Starting Ceph osd.4 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:13:44.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:44 vm04.local ceph-mon[92084]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:13:44.815 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:44 vm04.local ceph-mon[92084]: osdmap e98: 6 total, 5 up, 6 in 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local podman[101937]: 2026-03-10 14:13:44.321911349 +0000 UTC m=+0.019079566 container create e2b569934f45cf0faab2c0106a417753e86a76a53323d4d33102b5377343f91b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True) 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local podman[101937]: 2026-03-10 14:13:44.359841853 +0000 UTC m=+0.057010081 container init e2b569934f45cf0faab2c0106a417753e86a76a53323d4d33102b5377343f91b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, 
org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local podman[101937]: 2026-03-10 14:13:44.363149785 +0000 UTC m=+0.060317993 container start e2b569934f45cf0faab2c0106a417753e86a76a53323d4d33102b5377343f91b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid) 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local podman[101937]: 2026-03-10 14:13:44.36593568 +0000 UTC m=+0.063103897 container attach e2b569934f45cf0faab2c0106a417753e86a76a53323d4d33102b5377343f91b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS) 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local podman[101937]: 2026-03-10 14:13:44.313288539 +0000 UTC m=+0.010456765 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local bash[101937]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:44.815 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local bash[101937]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:44.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:44 vm03.local ceph-mon[103098]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:13:44.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:44 vm03.local ceph-mon[103098]: osdmap e98: 6 total, 5 up, 6 in 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local 
ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local bash[101937]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local bash[101937]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local bash[101937]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local bash[101937]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-37ef833a-0065-4d2b-9c5a-52d8d73f1c37/osd-block-9d0ae930-1cc2-4815-bd24-e9129038b319 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:44 vm04.local bash[101937]: Running command: /usr/bin/ceph-bluestore-tool 
--cluster=ceph prime-osd-dir --dev /dev/ceph-37ef833a-0065-4d2b-9c5a-52d8d73f1c37/osd-block-9d0ae930-1cc2-4815-bd24-e9129038b319 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/ln -snf /dev/ceph-37ef833a-0065-4d2b-9c5a-52d8d73f1c37/osd-block-9d0ae930-1cc2-4815-bd24-e9129038b319 /var/lib/ceph/osd/ceph-4/block 2026-03-10T14:13:45.259 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local bash[101937]: Running command: /usr/bin/ln -snf /dev/ceph-37ef833a-0065-4d2b-9c5a-52d8d73f1c37/osd-block-9d0ae930-1cc2-4815-bd24-e9129038b319 /var/lib/ceph/osd/ceph-4/block 2026-03-10T14:13:45.556 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local bash[101937]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local bash[101937]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local bash[101937]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T14:13:45.557 
INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate[101948]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local bash[101937]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local podman[101937]: 2026-03-10 14:13:45.291081437 +0000 UTC m=+0.988249654 container died e2b569934f45cf0faab2c0106a417753e86a76a53323d4d33102b5377343f91b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3) 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local podman[101937]: 2026-03-10 14:13:45.308679991 +0000 UTC m=+1.005848198 container remove e2b569934f45cf0faab2c0106a417753e86a76a53323d4d33102b5377343f91b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local podman[102199]: 2026-03-10 14:13:45.404194589 +0000 UTC m=+0.016262163 container create 89f9225212d4f6f2c196a5d4f8cf9312d316f9b19bd283499954ba54538cdbcd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local podman[102199]: 2026-03-10 14:13:45.437715908 +0000 UTC m=+0.049783472 container init 89f9225212d4f6f2c196a5d4f8cf9312d316f9b19bd283499954ba54538cdbcd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local podman[102199]: 2026-03-10 14:13:45.44191496 +0000 UTC m=+0.053982523 container start 89f9225212d4f6f2c196a5d4f8cf9312d316f9b19bd283499954ba54538cdbcd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20260223) 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local bash[102199]: 89f9225212d4f6f2c196a5d4f8cf9312d316f9b19bd283499954ba54538cdbcd 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local podman[102199]: 2026-03-10 14:13:45.39763011 +0000 UTC m=+0.009697694 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:13:45.557 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:45 vm04.local systemd[1]: Started Ceph osd.4 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:13:45.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-mon[92084]: osdmap e99: 6 total, 5 up, 6 in 2026-03-10T14:13:45.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-mon[92084]: pgmap v201: 65 pgs: 2 stale+active+recovery_wait+undersized+degraded+remapped, 6 stale+active+clean, 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 3 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 6 objects/s recovering 2026-03-10T14:13:45.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:45.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:45.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:13:45.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:45 vm04.local ceph-mon[92084]: osdmap e100: 6 total, 5 up, 6 in 2026-03-10T14:13:45.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:45 vm03.local ceph-mon[103098]: osdmap e99: 6 total, 5 up, 6 in 2026-03-10T14:13:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:45 vm03.local ceph-mon[103098]: pgmap v201: 65 pgs: 2 stale+active+recovery_wait+undersized+degraded+remapped, 6 stale+active+clean, 1 
active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 1.2 KiB/s rd, 3 op/s; 2908/336 objects degraded (865.476%); 0 B/s, 6 objects/s recovering 2026-03-10T14:13:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:13:45.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:45 vm03.local ceph-mon[103098]: osdmap e100: 6 total, 5 up, 6 in 2026-03-10T14:13:46.563 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:46 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[102211]: 2026-03-10T14:13:46.271+0000 7f2937d78740 -1 Falling back to public interface 2026-03-10T14:13:46.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:13:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:46.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:47.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:13:47.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:47.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:47.876 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:47.876 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:47.876 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:47 vm04.local ceph-mon[92084]: pgmap v203: 65 pgs: 3 remapped+peering, 10 active+undersized, 1 active+recovering+undersized+remapped, 1 stale+active+clean, 12 active+undersized+degraded, 38 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 44/336 objects degraded (13.095%) 2026-03-10T14:13:47.876 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:47 vm04.local ceph-mon[92084]: osdmap e101: 6 total, 5 up, 6 in 2026-03-10T14:13:47.876 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:47.876 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:48.060 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:48.060 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:47 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:48.060 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:47 vm03.local ceph-mon[103098]: pgmap v203: 65 pgs: 3 remapped+peering, 10 active+undersized, 1 active+recovering+undersized+remapped, 1 stale+active+clean, 12 active+undersized+degraded, 38 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 44/336 objects degraded (13.095%) 2026-03-10T14:13:48.060 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:47 vm03.local ceph-mon[103098]: osdmap e101: 6 total, 5 up, 6 in 2026-03-10T14:13:48.060 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:48.060 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:49.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: Health check failed: Reduced data availability: 2 pgs peering (PG_AVAILABILITY) 2026-03-10T14:13:49.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 44/336 objects degraded (13.095%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:49.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: osdmap e102: 6 total, 5 up, 6 in 2026-03-10T14:13:49.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T14:13:49.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:48 vm04.local ceph-mon[92084]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-10T14:13:49.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: Health check failed: Reduced data availability: 2 pgs peering (PG_AVAILABILITY) 2026-03-10T14:13:49.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 44/336 objects degraded (13.095%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:49.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: osdmap e102: 6 total, 5 up, 6 in 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T14:13:49.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:48 vm03.local ceph-mon[103098]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-10T14:13:50.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:49 vm04.local ceph-mon[92084]: pgmap v206: 65 pgs: 3 remapped+peering, 13 active+undersized, 1 active+recovering+undersized+remapped, 16 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 55/336 objects degraded (16.369%) 2026-03-10T14:13:50.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:49 vm03.local ceph-mon[103098]: pgmap v206: 65 pgs: 3 remapped+peering, 13 active+undersized, 1 active+recovering+undersized+remapped, 16 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 55/336 objects degraded (16.369%) 2026-03-10T14:13:51.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:50 vm04.local ceph-mon[92084]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 2 pgs peering) 2026-03-10T14:13:51.063 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:50 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[102211]: 2026-03-10T14:13:50.901+0000 7f2937d78740 -1 osd.4 0 read_superblock omap replica is missing. 
2026-03-10T14:13:51.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:50 vm03.local ceph-mon[103098]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 2 pgs peering) 2026-03-10T14:13:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.443+0000 7fc0b4544700 1 -- 192.168.123.103:0/267159441 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 msgr2=0x7fc0ac0721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.443+0000 7fc0b4544700 1 --2- 192.168.123.103:0/267159441 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 0x7fc0ac0721c0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fc0a8009b00 tx=0x7fc0a8009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.443+0000 7fc0b4544700 1 -- 192.168.123.103:0/267159441 shutdown_connections 2026-03-10T14:13:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.443+0000 7fc0b4544700 1 --2- 192.168.123.103:0/267159441 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0ac107d50 0x7fc0ac1081c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.443+0000 7fc0b4544700 1 --2- 192.168.123.103:0/267159441 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 0x7fc0ac0721c0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.443+0000 7fc0b4544700 1 -- 192.168.123.103:0/267159441 >> 192.168.123.103:0/267159441 conn(0x7fc0ac06d3e0 msgr2=0x7fc0ac06f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b4544700 1 -- 
192.168.123.103:0/267159441 shutdown_connections 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b4544700 1 -- 192.168.123.103:0/267159441 wait complete. 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b4544700 1 Processor -- start 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b4544700 1 -- start start 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b4544700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 0x7fc0ac1169a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b4544700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0ac107d50 0x7fc0ac116ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b4544700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0ac117500 con 0x7fc0ac107d50 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b4544700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0ac1b28f0 con 0x7fc0ac071db0 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.444+0000 7fc0b1adf700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0ac107d50 0x7fc0ac116ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b22e0700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 0x7fc0ac1169a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b22e0700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 0x7fc0ac1169a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:38470/0 (socket says 192.168.123.103:38470) 2026-03-10T14:13:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b1adf700 1 -- 192.168.123.103:0/2014762177 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 msgr2=0x7fc0ac1169a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b1adf700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 0x7fc0ac1169a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b1adf700 1 -- 192.168.123.103:0/2014762177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc09c009710 con 0x7fc0ac107d50 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b22e0700 1 -- 192.168.123.103:0/2014762177 learned_addr learned my addr 192.168.123.103:0/2014762177 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b22e0700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fc0ac071db0 0x7fc0ac1169a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello state changed while learned_addr, mark_down or replacing must be happened just now 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b1adf700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0ac107d50 0x7fc0ac116ee0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fc09c00eb40 tx=0x7fc09c00ef00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0a37fe700 1 -- 192.168.123.103:0/2014762177 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc09c00ccf0 con 0x7fc0ac107d50 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0a37fe700 1 -- 192.168.123.103:0/2014762177 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc09c00ce50 con 0x7fc0ac107d50 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0a37fe700 1 -- 192.168.123.103:0/2014762177 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc09c003ea0 con 0x7fc0ac107d50 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0a80097e0 con 0x7fc0ac107d50 2026-03-10T14:13:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.445+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0ac1b2e60 con 0x7fc0ac107d50 2026-03-10T14:13:51.445 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.446+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc0ac066e40 con 0x7fc0ac107d50 2026-03-10T14:13:51.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.450+0000 7fc0a37fe700 1 -- 192.168.123.103:0/2014762177 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc09c004000 con 0x7fc0ac107d50 2026-03-10T14:13:51.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.450+0000 7fc0a37fe700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc0980779e0 0x7fc098079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.450+0000 7fc0a37fe700 1 -- 192.168.123.103:0/2014762177 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6525+0+0 (secure 0 0 0) 0x7fc09c014070 con 0x7fc0ac107d50 2026-03-10T14:13:51.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.450+0000 7fc0a37fe700 1 -- 192.168.123.103:0/2014762177 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc09c09a530 con 0x7fc0ac107d50 2026-03-10T14:13:51.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.450+0000 7fc0b22e0700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc0980779e0 0x7fc098079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:51.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.451+0000 7fc0b22e0700 1 --2- 
192.168.123.103:0/2014762177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc0980779e0 0x7fc098079e90 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fc0a800b5c0 tx=0x7fc0a8005a90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:51.563 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:51 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[102211]: 2026-03-10T14:13:51.161+0000 7f2937d78740 -1 osd.4 97 log_to_monitors true 2026-03-10T14:13:51.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.582+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc0ac1b3140 con 0x7fc0980779e0 2026-03-10T14:13:51.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.584+0000 7fc0a37fe700 1 -- 192.168.123.103:0/2014762177 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7fc0ac1b3140 con 0x7fc0980779e0 2026-03-10T14:13:51.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.586+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc0980779e0 msgr2=0x7fc098079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.587+0000 7fc0b4544700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc0980779e0 0x7fc098079e90 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fc0a800b5c0 tx=0x7fc0a8005a90 comp rx=0 tx=0).stop 2026-03-10T14:13:51.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.587+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0ac107d50 msgr2=0x7fc0ac116ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.587+0000 7fc0b4544700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0ac107d50 0x7fc0ac116ee0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fc09c00eb40 tx=0x7fc09c00ef00 comp rx=0 tx=0).stop 2026-03-10T14:13:51.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.587+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 shutdown_connections 2026-03-10T14:13:51.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.588+0000 7fc0b4544700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc0980779e0 0x7fc098079e90 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.588+0000 7fc0b4544700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc0ac071db0 0x7fc0ac1169a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.588+0000 7fc0b4544700 1 --2- 192.168.123.103:0/2014762177 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0ac107d50 0x7fc0ac116ee0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.588+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 >> 192.168.123.103:0/2014762177 conn(0x7fc0ac06d3e0 msgr2=0x7fc0ac10af80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:51.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.588+0000 7fc0b4544700 1 -- 
192.168.123.103:0/2014762177 shutdown_connections 2026-03-10T14:13:51.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.588+0000 7fc0b4544700 1 -- 192.168.123.103:0/2014762177 wait complete. 2026-03-10T14:13:51.596 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:13:51.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.657+0000 7f3ef2036700 1 -- 192.168.123.103:0/4102854362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec102740 msgr2=0x7f3eec102b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.657+0000 7f3ef2036700 1 --2- 192.168.123.103:0/4102854362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec102740 0x7f3eec102b50 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f3ed4009b00 tx=0x7f3ed4009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:51.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.657+0000 7f3ef2036700 1 -- 192.168.123.103:0/4102854362 shutdown_connections 2026-03-10T14:13:51.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.657+0000 7f3ef2036700 1 --2- 192.168.123.103:0/4102854362 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3eec103940 0x7f3eec103d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.657+0000 7f3ef2036700 1 --2- 192.168.123.103:0/4102854362 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec102740 0x7f3eec102b50 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.657+0000 7f3ef2036700 1 -- 192.168.123.103:0/4102854362 >> 192.168.123.103:0/4102854362 conn(0x7f3eec0fdcf0 msgr2=0x7f3eec100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:51.656 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.658+0000 7f3ef2036700 1 -- 192.168.123.103:0/4102854362 shutdown_connections 2026-03-10T14:13:51.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.658+0000 7f3ef2036700 1 -- 192.168.123.103:0/4102854362 wait complete. 2026-03-10T14:13:51.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.658+0000 7f3ef2036700 1 Processor -- start 2026-03-10T14:13:51.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.658+0000 7f3ef2036700 1 -- start start 2026-03-10T14:13:51.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ef2036700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3eec103940 0x7f3eec198240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ef2036700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec198780 0x7f3eec19d7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ef2036700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3eec198c80 con 0x7f3eec198780 2026-03-10T14:13:51.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ef2036700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3eec198df0 con 0x7f3eec103940 2026-03-10T14:13:51.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3eeaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec198780 0x7f3eec19d7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:51.658 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3eeaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec198780 0x7f3eec19d7f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:50040/0 (socket says 192.168.123.103:50040) 2026-03-10T14:13:51.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3eeaffd700 1 -- 192.168.123.103:0/3927898428 learned_addr learned my addr 192.168.123.103:0/3927898428 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:51.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3eeaffd700 1 -- 192.168.123.103:0/3927898428 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3eec103940 msgr2=0x7f3eec198240 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:13:51.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3eeaffd700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3eec103940 0x7f3eec198240 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3eeaffd700 1 -- 192.168.123.103:0/3927898428 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3ed40097e0 con 0x7f3eec198780 2026-03-10T14:13:51.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3eeaffd700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec198780 0x7f3eec19d7f0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f3edc00d8d0 tx=0x7f3edc00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:51.659 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ee8ff9700 1 -- 192.168.123.103:0/3927898428 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3edc009940 con 0x7f3eec198780 2026-03-10T14:13:51.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ee8ff9700 1 -- 192.168.123.103:0/3927898428 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3edc010460 con 0x7f3eec198780 2026-03-10T14:13:51.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ee8ff9700 1 -- 192.168.123.103:0/3927898428 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3edc00f5d0 con 0x7f3eec198780 2026-03-10T14:13:51.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3eec19dd90 con 0x7f3eec198780 2026-03-10T14:13:51.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.659+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3eec0754c0 con 0x7f3eec198780 2026-03-10T14:13:51.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.660+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3eec066e40 con 0x7f3eec198780 2026-03-10T14:13:51.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.664+0000 7f3ee8ff9700 1 -- 192.168.123.103:0/3927898428 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3edc0105d0 con 0x7f3eec198780 2026-03-10T14:13:51.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.664+0000 7f3ee8ff9700 1 --2- 
192.168.123.103:0/3927898428 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3ed80779e0 0x7f3ed8079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.664+0000 7f3ee8ff9700 1 -- 192.168.123.103:0/3927898428 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(102..102 src has 1..102) v4 ==== 6525+0+0 (secure 0 0 0) 0x7f3edc09a080 con 0x7f3eec198780 2026-03-10T14:13:51.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.664+0000 7f3ee8ff9700 1 -- 192.168.123.103:0/3927898428 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3edc09a500 con 0x7f3eec198780 2026-03-10T14:13:51.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.665+0000 7f3eeb7fe700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3ed80779e0 0x7f3ed8079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:51.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.665+0000 7f3eeb7fe700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3ed80779e0 0x7f3ed8079e90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f3ed4000c00 tx=0x7f3ed4009f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:51.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.789+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3eec108290 con 0x7f3ed80779e0 2026-03-10T14:13:51.790 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.791+0000 7f3ee8ff9700 1 -- 192.168.123.103:0/3927898428 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f3eec108290 con 0x7f3ed80779e0 2026-03-10T14:13:51.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3ed80779e0 msgr2=0x7f3ed8079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3ed80779e0 0x7f3ed8079e90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f3ed4000c00 tx=0x7f3ed4009f90 comp rx=0 tx=0).stop 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec198780 msgr2=0x7f3eec19d7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec198780 0x7f3eec19d7f0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f3edc00d8d0 tx=0x7f3edc00dc90 comp rx=0 tx=0).stop 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 shutdown_connections 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3ed80779e0 0x7f3ed8079e90 unknown :-1 
s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3eec103940 0x7f3eec198240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 --2- 192.168.123.103:0/3927898428 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3eec198780 0x7f3eec19d7f0 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 >> 192.168.123.103:0/3927898428 conn(0x7f3eec0fdcf0 msgr2=0x7f3eec106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.796+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 shutdown_connections 2026-03-10T14:13:51.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.797+0000 7f3ef2036700 1 -- 192.168.123.103:0/3927898428 wait complete. 
2026-03-10T14:13:51.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.871+0000 7f534ffff700 1 -- 192.168.123.103:0/2274346251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350100d40 msgr2=0x7f5350103120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.871+0000 7f534ffff700 1 --2- 192.168.123.103:0/2274346251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350100d40 0x7f5350103120 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f5338009b00 tx=0x7f5338009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:51.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.872+0000 7f534ffff700 1 -- 192.168.123.103:0/2274346251 shutdown_connections 2026-03-10T14:13:51.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.872+0000 7f534ffff700 1 --2- 192.168.123.103:0/2274346251 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5350103660 0x7f5350105a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.872+0000 7f534ffff700 1 --2- 192.168.123.103:0/2274346251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350100d40 0x7f5350103120 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.872+0000 7f534ffff700 1 -- 192.168.123.103:0/2274346251 >> 192.168.123.103:0/2274346251 conn(0x7f53500fa740 msgr2=0x7f53500fcb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:51.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.872+0000 7f534ffff700 1 -- 192.168.123.103:0/2274346251 shutdown_connections 2026-03-10T14:13:51.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.872+0000 7f534ffff700 1 -- 192.168.123.103:0/2274346251 
wait complete. 2026-03-10T14:13:51.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534ffff700 1 Processor -- start 2026-03-10T14:13:51.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534ffff700 1 -- start start 2026-03-10T14:13:51.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534ffff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5350100d40 0x7f5350193940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350103660 0x7f5350193e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534ffff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53501944a0 con 0x7f5350103660 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534ffff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53501945e0 con 0x7f5350100d40 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5350100d40 0x7f5350193940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534effd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5350100d40 0x7f5350193940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.103:38494/0 (socket says 192.168.123.103:38494) 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534effd700 1 -- 192.168.123.103:0/459277366 learned_addr learned my addr 192.168.123.103:0/459277366 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534e7fc700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350103660 0x7f5350193e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.873+0000 7f534effd700 1 -- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350103660 msgr2=0x7f5350193e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f534effd700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350103660 0x7f5350193e80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f534effd700 1 -- 192.168.123.103:0/459277366 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f53380097e0 con 0x7f5350100d40 2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f534e7fc700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350103660 0x7f5350193e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:13:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f534effd700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5350100d40 0x7f5350193940 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f533800b5c0 tx=0x7f5338004a40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:51.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f53548a1700 1 -- 192.168.123.103:0/459277366 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f533801d070 con 0x7f5350100d40 2026-03-10T14:13:51.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5350199030 con 0x7f5350100d40 2026-03-10T14:13:51.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5350199520 con 0x7f5350100d40 2026-03-10T14:13:51.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f53548a1700 1 -- 192.168.123.103:0/459277366 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f533800bc50 con 0x7f5350100d40 2026-03-10T14:13:51.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.874+0000 7f53548a1700 1 -- 192.168.123.103:0/459277366 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f533800f670 con 0x7f5350100d40 2026-03-10T14:13:51.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.875+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f5330005320 con 0x7f5350100d40 2026-03-10T14:13:51.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.876+0000 7f53548a1700 1 -- 192.168.123.103:0/459277366 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5338049920 con 0x7f5350100d40 2026-03-10T14:13:51.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.876+0000 7f53548a1700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f533c07bd20 0x7f533c07e1d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:51.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.876+0000 7f534e7fc700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f533c07bd20 0x7f533c07e1d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:51.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.876+0000 7f53548a1700 1 -- 192.168.123.103:0/459277366 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6525+0+0 (secure 0 0 0) 0x7f533809b320 con 0x7f5350100d40 2026-03-10T14:13:51.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.877+0000 7f534e7fc700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f533c07bd20 0x7f533c07e1d0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f5340005fd0 tx=0x7f5340005ee0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:51.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.878+0000 7f53548a1700 1 -- 192.168.123.103:0/459277366 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f5338063b00 con 0x7f5350100d40 2026-03-10T14:13:51.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:51.996+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f5330000bf0 con 0x7f533c07bd20 2026-03-10T14:13:51.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.001+0000 7f53548a1700 1 -- 192.168.123.103:0/459277366 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f5330000bf0 con 0x7f533c07bd20 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (9m) 50s ago 10m 22.1M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (8m) 50s ago 10m 9575k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (10m) 5s ago 10m 11.8M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 50s ago 10m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (4m) 5s ago 10m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (9m) 50s ago 10m 92.0M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (8m) 50s ago 8m 136M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:13:52.000 
INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (8m) 50s ago 8m 72.7M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (8m) 5s ago 8m 63.1M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (8m) 5s ago 8m 144M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (5m) 50s ago 11m 620M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (5m) 5s ago 9m 499M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 50s ago 11m 61.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (4m) 5s ago 9m 54.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (10m) 50s ago 10m 15.4M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (10m) 5s ago 10m 15.5M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 50s ago 9m 206M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (74s) 50s ago 9m 127M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (52s) 50s ago 9m 13.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e0d768b70468 2026-03-10T14:13:52.000 
INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (28s) 5s ago 9m 135M 4096M 19.2.3-678-ge911bdeb 654f31e6858e f7fc2aafa9d9 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (6s) 5s ago 8m 13.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 89f9225212d4 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (8m) 5s ago 8m 396M 4096M 18.2.0 dc2bc1663786 63def67884f8 2026-03-10T14:13:52.000 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (5m) 50s ago 10m 69.1M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f533c07bd20 msgr2=0x7f533c07e1d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 7f534ffff700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f533c07bd20 0x7f533c07e1d0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f5340005fd0 tx=0x7f5340005ee0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5350100d40 msgr2=0x7f5350193940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 7f534ffff700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5350100d40 0x7f5350193940 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f533800b5c0 tx=0x7f5338004a40 comp rx=0 tx=0).stop 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 
7f534ffff700 1 -- 192.168.123.103:0/459277366 shutdown_connections 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 7f534ffff700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f533c07bd20 0x7f533c07e1d0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 7f534ffff700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5350100d40 0x7f5350193940 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 7f534ffff700 1 --2- 192.168.123.103:0/459277366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5350103660 0x7f5350193e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.003+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 >> 192.168.123.103:0/459277366 conn(0x7f53500fa740 msgr2=0x7f53500fcb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.004+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 shutdown_connections 2026-03-10T14:13:52.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.004+0000 7f534ffff700 1 -- 192.168.123.103:0/459277366 wait complete. 
2026-03-10T14:13:52.065 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:51 vm03.local ceph-mon[103098]: pgmap v207: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+undersized+remapped, 1 active+clean+remapped, 1 active+recovering+undersized+remapped, 13 active+undersized, 16 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 774/336 objects degraded (230.357%); 1483/336 objects misplaced (441.369%); 0 B/s, 110 objects/s recovering 2026-03-10T14:13:52.065 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:51 vm03.local ceph-mon[103098]: from='osd.4 [v2:192.168.123.104:6808/1485333437,v1:192.168.123.104:6809/1485333437]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T14:13:52.065 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:51 vm03.local ceph-mon[103098]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T14:13:52.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.066+0000 7fb4be51b700 1 -- 192.168.123.103:0/2278455892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4b81036a0 msgr2=0x7fb4b8105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.066+0000 7fb4be51b700 1 --2- 192.168.123.103:0/2278455892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4b81036a0 0x7fb4b8105ac0 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fb4a800b3a0 tx=0x7fb4a800b6b0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.070+0000 7fb4be51b700 1 -- 192.168.123.103:0/2278455892 shutdown_connections 2026-03-10T14:13:52.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.070+0000 7fb4be51b700 1 --2- 192.168.123.103:0/2278455892 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4b81036a0 0x7fb4b8105ac0 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.070+0000 7fb4be51b700 1 --2- 192.168.123.103:0/2278455892 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4b8069160 0x7fb4b8103160 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.070+0000 7fb4be51b700 1 -- 192.168.123.103:0/2278455892 >> 192.168.123.103:0/2278455892 conn(0x7fb4b80faa70 msgr2=0x7fb4b80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.070+0000 7fb4be51b700 1 -- 192.168.123.103:0/2278455892 shutdown_connections 2026-03-10T14:13:52.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.070+0000 7fb4be51b700 1 -- 192.168.123.103:0/2278455892 wait complete. 
2026-03-10T14:13:52.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.070+0000 7fb4be51b700 1 Processor -- start 2026-03-10T14:13:52.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4be51b700 1 -- start start 2026-03-10T14:13:52.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4be51b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4b8069160 0x7fb4b8071da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4be51b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4b81036a0 0x7fb4b80722e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4be51b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb4b8072900 con 0x7fb4b81036a0 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4be51b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb4b8072a40 con 0x7fb4b8069160 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4b7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4b8069160 0x7fb4b8071da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4b7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4b8069160 0x7fb4b8071da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:38512/0 (socket says 192.168.123.103:38512) 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4b7fff700 1 -- 192.168.123.103:0/102546646 learned_addr learned my addr 192.168.123.103:0/102546646 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4b7fff700 1 -- 192.168.123.103:0/102546646 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4b81036a0 msgr2=0x7fb4b80722e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4b7fff700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4b81036a0 0x7fb4b80722e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4b7fff700 1 -- 192.168.123.103:0/102546646 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb4a800b050 con 0x7fb4b8069160 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.071+0000 7fb4b7fff700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4b8069160 0x7fb4b8071da0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fb4b0009bc0 tx=0x7fb4b0009f80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.072+0000 7fb4b57fa700 1 -- 192.168.123.103:0/102546646 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb4b000cbe0 con 0x7fb4b8069160 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.072+0000 7fb4be51b700 1 -- 
192.168.123.103:0/102546646 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb4b819d7a0 con 0x7fb4b8069160 2026-03-10T14:13:52.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.072+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb4b819dcf0 con 0x7fb4b8069160 2026-03-10T14:13:52.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.072+0000 7fb4b57fa700 1 -- 192.168.123.103:0/102546646 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb4b0004500 con 0x7fb4b8069160 2026-03-10T14:13:52.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.072+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb4b806bf10 con 0x7fb4b8069160 2026-03-10T14:13:52.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.073+0000 7fb4b57fa700 1 -- 192.168.123.103:0/102546646 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb4b0017610 con 0x7fb4b8069160 2026-03-10T14:13:52.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.073+0000 7fb4b57fa700 1 -- 192.168.123.103:0/102546646 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb4b003e9a0 con 0x7fb4b8069160 2026-03-10T14:13:52.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.074+0000 7fb4b57fa700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb4a00779f0 0x7fb4a0079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.074+0000 7fb4b77fe700 1 --2- 192.168.123.103:0/102546646 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb4a00779f0 0x7fb4a0079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.074+0000 7fb4b57fa700 1 -- 192.168.123.103:0/102546646 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6525+0+0 (secure 0 0 0) 0x7fb4b009b010 con 0x7fb4b8069160 2026-03-10T14:13:52.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.074+0000 7fb4b77fe700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb4a00779f0 0x7fb4a0079ea0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fb4a8015040 tx=0x7fb4a80090d0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:52.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.075+0000 7fb4b57fa700 1 -- 192.168.123.103:0/102546646 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb4b0063740 con 0x7fb4b8069160 2026-03-10T14:13:52.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.233+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fb4b802cdc0 con 0x7fb4b8069160 2026-03-10T14:13:52.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.234+0000 7fb4b57fa700 1 -- 192.168.123.103:0/102546646 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fb4b0062e90 con 0x7fb4b8069160 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 
2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 1, 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 8 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:13:52.233 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb4a00779f0 msgr2=0x7fb4a0079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb4a00779f0 0x7fb4a0079ea0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fb4a8015040 tx=0x7fb4a80090d0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4b8069160 msgr2=0x7fb4b8071da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4b8069160 0x7fb4b8071da0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fb4b0009bc0 tx=0x7fb4b0009f80 comp rx=0 tx=0).stop 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 shutdown_connections 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb4a00779f0 0x7fb4a0079ea0 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb4b8069160 0x7fb4b8071da0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.235 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 --2- 192.168.123.103:0/102546646 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4b81036a0 0x7fb4b80722e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.236+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 >> 192.168.123.103:0/102546646 conn(0x7fb4b80faa70 msgr2=0x7fb4b80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.237+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 shutdown_connections 2026-03-10T14:13:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.237+0000 7fb4be51b700 1 -- 192.168.123.103:0/102546646 wait complete. 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.302+0000 7f1ad9230700 1 -- 192.168.123.103:0/2760740563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 msgr2=0x7f1ad4105cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.302+0000 7f1ad9230700 1 --2- 192.168.123.103:0/2760740563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 0x7f1ad4105cb0 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f1ac4009b00 tx=0x7f1ac4009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.302+0000 7f1ad9230700 1 -- 192.168.123.103:0/2760740563 shutdown_connections 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.302+0000 7f1ad9230700 1 --2- 192.168.123.103:0/2760740563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 0x7f1ad4105cb0 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.302+0000 7f1ad9230700 1 --2- 192.168.123.103:0/2760740563 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ad4100fb0 0x7f1ad4103390 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.302+0000 7f1ad9230700 1 -- 192.168.123.103:0/2760740563 >> 192.168.123.103:0/2760740563 conn(0x7f1ad40fa9b0 msgr2=0x7f1ad40fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.302+0000 7f1ad9230700 1 -- 192.168.123.103:0/2760740563 shutdown_connections 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.302+0000 7f1ad9230700 1 -- 192.168.123.103:0/2760740563 wait complete. 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad9230700 1 Processor -- start 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad9230700 1 -- start start 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad9230700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ad4100fb0 0x7f1ad4071c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad9230700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 0x7f1ad40721b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad9230700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ad40727b0 con 0x7f1ad41038d0 2026-03-10T14:13:52.302 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad9230700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ad4072920 con 0x7f1ad4100fb0 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad2d9d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ad4100fb0 0x7f1ad4071c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 0x7f1ad40721b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 0x7f1ad40721b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:50096/0 (socket says 192.168.123.103:50096) 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad259c700 1 -- 192.168.123.103:0/938344958 learned_addr learned my addr 192.168.123.103:0/938344958 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad259c700 1 -- 192.168.123.103:0/938344958 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ad4100fb0 msgr2=0x7f1ad4071c70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad259c700 1 --2- 192.168.123.103:0/938344958 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ad4100fb0 0x7f1ad4071c70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad259c700 1 -- 192.168.123.103:0/938344958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1ac40097e0 con 0x7f1ad41038d0 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.303+0000 7f1ad2d9d700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ad4100fb0 0x7f1ad4071c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.304+0000 7f1ad259c700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 0x7f1ad40721b0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f1ac4005f50 tx=0x7f1ac4005090 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.304+0000 7f1acbfff700 1 -- 192.168.123.103:0/938344958 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1ac401d070 con 0x7f1ad41038d0 2026-03-10T14:13:52.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.304+0000 7f1acbfff700 1 -- 192.168.123.103:0/938344958 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1ac400bcd0 con 0x7f1ad41038d0 2026-03-10T14:13:52.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.304+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1ad41a3c20 con 
0x7f1ad41038d0 2026-03-10T14:13:52.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.304+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1ad41a4110 con 0x7f1ad41038d0 2026-03-10T14:13:52.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.304+0000 7f1acbfff700 1 -- 192.168.123.103:0/938344958 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1ac4017880 con 0x7f1ad41038d0 2026-03-10T14:13:52.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.305+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1ad406bf10 con 0x7f1ad41038d0 2026-03-10T14:13:52.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.307+0000 7f1acbfff700 1 -- 192.168.123.103:0/938344958 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1ac4029030 con 0x7f1ad41038d0 2026-03-10T14:13:52.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.307+0000 7f1acbfff700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ac00779e0 0x7f1ac0079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.307+0000 7f1acbfff700 1 -- 192.168.123.103:0/938344958 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6525+0+0 (secure 0 0 0) 0x7f1ac409b770 con 0x7f1ad41038d0 2026-03-10T14:13:52.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.308+0000 7f1ad2d9d700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ac00779e0 0x7f1ac0079e90 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.309+0000 7f1ad2d9d700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ac00779e0 0x7f1ac0079e90 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f1abc005fd0 tx=0x7f1abc005de0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:52.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.309+0000 7f1acbfff700 1 -- 192.168.123.103:0/938344958 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1ac4063f60 con 0x7f1ad41038d0 2026-03-10T14:13:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:51 vm04.local ceph-mon[92084]: pgmap v207: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+undersized+remapped, 1 active+clean+remapped, 1 active+recovering+undersized+remapped, 13 active+undersized, 16 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 774/336 objects degraded (230.357%); 1483/336 objects misplaced (441.369%); 0 B/s, 110 objects/s recovering 2026-03-10T14:13:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:51 vm04.local ceph-mon[92084]: from='osd.4 [v2:192.168.123.104:6808/1485333437,v1:192.168.123.104:6809/1485333437]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T14:13:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:51 vm04.local ceph-mon[92084]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T14:13:52.313 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:13:51 
vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[102211]: 2026-03-10T14:13:51.846+0000 7f292fb12640 -1 osd.4 97 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:13:52.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.453+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f1ad4066e40 con 0x7f1ad41038d0 2026-03-10T14:13:52.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.453+0000 7f1acbfff700 1 -- 192.168.123.103:0/938344958 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 16 v16) v1 ==== 76+0+1954 (secure 0 0 0) 0x7f1ac40636b0 con 0x7f1ad41038d0 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:e16 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-10T14:12:48:855184+0000 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:epoch 16 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:13:52.453 
INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:12:48.656761+0000 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:inline_data 
disabled 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 24263 members: 24263,14470 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{0:14486} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/1684840896,v1:192.168.123.103:6829/1684840896] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:24273} state up:standby-replay seq 2 join_fscid=1 addr [v2:192.168.123.104:6826/3934202413,v1:192.168.123.104:6827/3934202413] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:13:52.453 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.456+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ac00779e0 msgr2=0x7f1ac0079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.456+0000 7f1ad9230700 1 --2- 192.168.123.103:0/938344958 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ac00779e0 0x7f1ac0079e90 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f1abc005fd0 tx=0x7f1abc005de0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.456+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 msgr2=0x7f1ad40721b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.456+0000 7f1ad9230700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 0x7f1ad40721b0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f1ac4005f50 tx=0x7f1ac4005090 comp rx=0 tx=0).stop 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.457+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 shutdown_connections 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.457+0000 7f1ad9230700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ac00779e0 0x7f1ac0079e90 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.457+0000 7f1ad9230700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ad4100fb0 0x7f1ad4071c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.457+0000 7f1ad9230700 1 --2- 192.168.123.103:0/938344958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ad41038d0 0x7f1ad40721b0 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.456 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.457+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 >> 192.168.123.103:0/938344958 conn(0x7f1ad40fa9b0 msgr2=0x7f1ad40fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.457+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 shutdown_connections 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.457+0000 7f1ad9230700 1 -- 192.168.123.103:0/938344958 wait complete. 2026-03-10T14:13:52.456 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 16 2026-03-10T14:13:52.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.525+0000 7f1d1fb67700 1 -- 192.168.123.103:0/3887818303 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d180fee80 msgr2=0x7f1d181012a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.525+0000 7f1d1fb67700 1 --2- 192.168.123.103:0/3887818303 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d180fee80 0x7f1d181012a0 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f1d0c009b00 tx=0x7f1d0c009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:52.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.525+0000 7f1d1fb67700 1 -- 192.168.123.103:0/3887818303 shutdown_connections 2026-03-10T14:13:52.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.525+0000 7f1d1fb67700 1 --2- 192.168.123.103:0/3887818303 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1d181017e0 0x7f1d18103c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.525+0000 7f1d1fb67700 1 --2- 192.168.123.103:0/3887818303 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d180fee80 0x7f1d181012a0 unknown 
:-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.525+0000 7f1d1fb67700 1 -- 192.168.123.103:0/3887818303 >> 192.168.123.103:0/3887818303 conn(0x7f1d180faa70 msgr2=0x7f1d180fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.525+0000 7f1d1fb67700 1 -- 192.168.123.103:0/3887818303 shutdown_connections 2026-03-10T14:13:52.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.525+0000 7f1d1fb67700 1 -- 192.168.123.103:0/3887818303 wait complete. 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.526+0000 7f1d1fb67700 1 Processor -- start 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1fb67700 1 -- start start 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1fb67700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1d180fee80 0x7f1d1819c400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1fb67700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d181017e0 0x7f1d1819c940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1fb67700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d1819ced0 con 0x7f1d181017e0 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1fb67700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d1819d010 con 0x7f1d180fee80 2026-03-10T14:13:52.526 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1d903700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1d180fee80 0x7f1d1819c400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1d903700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1d180fee80 0x7f1d1819c400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:38540/0 (socket says 192.168.123.103:38540) 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1d903700 1 -- 192.168.123.103:0/2627609196 learned_addr learned my addr 192.168.123.103:0/2627609196 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1d102700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d181017e0 0x7f1d1819c940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1d102700 1 -- 192.168.123.103:0/2627609196 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1d180fee80 msgr2=0x7f1d1819c400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1d102700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1d180fee80 0x7f1d1819c400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.526 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.527+0000 7f1d1d102700 1 -- 192.168.123.103:0/2627609196 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1d0c0097e0 con 0x7f1d181017e0 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.528+0000 7f1d1d102700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d181017e0 0x7f1d1819c940 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f1d1400c8f0 tx=0x7f1d1400ccb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.528+0000 7f1d0affd700 1 -- 192.168.123.103:0/2627609196 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d140043f0 con 0x7f1d181017e0 2026-03-10T14:13:52.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.528+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1d181055c0 con 0x7f1d181017e0 2026-03-10T14:13:52.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.528+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1d18105ae0 con 0x7f1d181017e0 2026-03-10T14:13:52.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.528+0000 7f1d0affd700 1 -- 192.168.123.103:0/2627609196 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1d1400ce90 con 0x7f1d181017e0 2026-03-10T14:13:52.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.528+0000 7f1d0affd700 1 -- 192.168.123.103:0/2627609196 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d14003de0 con 
0x7f1d181017e0 2026-03-10T14:13:52.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.529+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1d18196590 con 0x7f1d181017e0 2026-03-10T14:13:52.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.532+0000 7f1d0affd700 1 -- 192.168.123.103:0/2627609196 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1d140076f0 con 0x7f1d181017e0 2026-03-10T14:13:52.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.533+0000 7f1d0affd700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1d040779e0 0x7f1d04079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.533+0000 7f1d1d903700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1d040779e0 0x7f1d04079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.534+0000 7f1d0affd700 1 -- 192.168.123.103:0/2627609196 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6525+0+0 (secure 0 0 0) 0x7f1d14099bf0 con 0x7f1d181017e0 2026-03-10T14:13:52.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.534+0000 7f1d1d903700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1d040779e0 0x7f1d04079e90 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f1d0c00b5c0 tx=0x7f1d0c005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T14:13:52.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.534+0000 7f1d0affd700 1 -- 192.168.123.103:0/2627609196 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1d140c9a90 con 0x7f1d181017e0 2026-03-10T14:13:52.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.671+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1d18061190 con 0x7f1d040779e0 2026-03-10T14:13:52.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.672+0000 7f1d0affd700 1 -- 192.168.123.103:0/2627609196 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f1d18061190 con 0x7f1d040779e0 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "11/23 daemons upgraded", 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently 
upgrading osd daemons", 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:13:52.671 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:13:52.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.674+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1d040779e0 msgr2=0x7f1d04079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.674+0000 7f1d1fb67700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1d040779e0 0x7f1d04079e90 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f1d0c00b5c0 tx=0x7f1d0c005fb0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d181017e0 msgr2=0x7f1d1819c940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d181017e0 0x7f1d1819c940 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f1d1400c8f0 tx=0x7f1d1400ccb0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 shutdown_connections 2026-03-10T14:13:52.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1d040779e0 0x7f1d04079e90 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:13:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1d180fee80 0x7f1d1819c400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 --2- 192.168.123.103:0/2627609196 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d181017e0 0x7f1d1819c940 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 >> 192.168.123.103:0/2627609196 conn(0x7f1d180faa70 msgr2=0x7f1d180fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 shutdown_connections 2026-03-10T14:13:52.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.675+0000 7f1d1fb67700 1 -- 192.168.123.103:0/2627609196 wait complete. 
2026-03-10T14:13:52.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.740+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4154133543 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f24105a60 msgr2=0x7f4f24107e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.740+0000 7f4f2c66c700 1 --2- 192.168.123.103:0/4154133543 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f24105a60 0x7f4f24107e40 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f4f20009b00 tx=0x7f4f20009e10 comp rx=0 tx=0).stop 2026-03-10T14:13:52.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.743+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4154133543 shutdown_connections 2026-03-10T14:13:52.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.743+0000 7f4f2c66c700 1 --2- 192.168.123.103:0/4154133543 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f24105a60 0x7f4f24107e40 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.743+0000 7f4f2c66c700 1 --2- 192.168.123.103:0/4154133543 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f240691a0 0x7f4f24105520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.743+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4154133543 >> 192.168.123.103:0/4154133543 conn(0x7f4f240faa70 msgr2=0x7f4f240fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.743 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.745+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4154133543 shutdown_connections 2026-03-10T14:13:52.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.745+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4154133543 
wait complete. 2026-03-10T14:13:52.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.745+0000 7f4f2c66c700 1 Processor -- start 2026-03-10T14:13:52.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f2c66c700 1 -- start start 2026-03-10T14:13:52.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f2c66c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f240691a0 0x7f4f24193c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f2c66c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f24105a60 0x7f4f24194180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f2c66c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4f24194750 con 0x7f4f240691a0 2026-03-10T14:13:52.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f2c66c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4f24194890 con 0x7f4f24105a60 2026-03-10T14:13:52.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f2a408700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f240691a0 0x7f4f24193c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f29c07700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f24105a60 0x7f4f24194180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T14:13:52.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f29c07700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f24105a60 0x7f4f24194180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:38550/0 (socket says 192.168.123.103:38550) 2026-03-10T14:13:52.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.746+0000 7f4f29c07700 1 -- 192.168.123.103:0/4006410109 learned_addr learned my addr 192.168.123.103:0/4006410109 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:13:52.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.747+0000 7f4f29c07700 1 -- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f240691a0 msgr2=0x7f4f24193c40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.747+0000 7f4f29c07700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f240691a0 0x7f4f24193c40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.747+0000 7f4f29c07700 1 -- 192.168.123.103:0/4006410109 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4f200097e0 con 0x7f4f24105a60 2026-03-10T14:13:52.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.747+0000 7f4f2a408700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f240691a0 0x7f4f24193c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T14:13:52.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.747+0000 7f4f29c07700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f24105a60 0x7f4f24194180 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f4f20005850 tx=0x7f4f20004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:52.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.747+0000 7f4f177fe700 1 -- 192.168.123.103:0/4006410109 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4f2001d070 con 0x7f4f24105a60 2026-03-10T14:13:52.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.747+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4f2406a860 con 0x7f4f24105a60 2026-03-10T14:13:52.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.748+0000 7f4f177fe700 1 -- 192.168.123.103:0/4006410109 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4f2000bc50 con 0x7f4f24105a60 2026-03-10T14:13:52.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.748+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4f2406ad50 con 0x7f4f24105a60 2026-03-10T14:13:52.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.748+0000 7f4f177fe700 1 -- 192.168.123.103:0/4006410109 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4f2000f790 con 0x7f4f24105a60 2026-03-10T14:13:52.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.749+0000 7f4f177fe700 1 -- 192.168.123.103:0/4006410109 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4f20022ac0 con 
0x7f4f24105a60 2026-03-10T14:13:52.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.749+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4f241028b0 con 0x7f4f24105a60 2026-03-10T14:13:52.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.749+0000 7f4f177fe700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4f10077870 0x7f4f10079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:13:52.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.749+0000 7f4f177fe700 1 -- 192.168.123.103:0/4006410109 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(103..103 src has 1..103) v4 ==== 6525+0+0 (secure 0 0 0) 0x7f4f2009b300 con 0x7f4f24105a60 2026-03-10T14:13:52.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.750+0000 7f4f2a408700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4f10077870 0x7f4f10079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:13:52.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.750+0000 7f4f2a408700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4f10077870 0x7f4f10079d20 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f4f18005950 tx=0x7f4f180058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:13:52.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.752+0000 7f4f177fe700 1 -- 192.168.123.103:0/4006410109 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f4f200639e0 con 0x7f4f24105a60 2026-03-10T14:13:52.925 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.926+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f4f2402d0b0 con 0x7f4f24105a60 2026-03-10T14:13:52.925 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.927+0000 7f4f177fe700 1 -- 192.168.123.103:0/4006410109 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1155 (secure 0 0 0) 0x7f4f2402d0b0 con 0x7f4f24105a60 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 774/336 objects degraded (230.357%), 17 pgs degraded 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 774/336 objects degraded (230.357%), 17 pgs degraded 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.4 is active+undersized+degraded, acting [1,2] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.5 is active+undersized+degraded, acting [3,0] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.6 is active+undersized+degraded, acting [1,3] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.7 is active+undersized+degraded, acting [3,2] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.9 is active+undersized+degraded, acting [1,0] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.a is active+undersized+degraded, acting [1,3] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.b is active+undersized+degraded, acting [3,5] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.f is active+undersized+degraded, acting [0,5] 2026-03-10T14:13:52.926 
INFO:teuthology.orchestra.run.vm03.stdout: pg 2.11 is active+undersized+degraded, acting [3,1] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.13 is active+undersized+degraded, acting [0,2] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.14 is active+undersized+degraded, acting [3,5] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.18 is active+undersized+degraded, acting [5,3] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.19 is active+undersized+degraded, acting [0,2] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1a is active+undersized+degraded, acting [3,5] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1c is active+undersized+degraded, acting [5,2] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1f is active+undersized+degraded, acting [0,3] 2026-03-10T14:13:52.926 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.a is active+recovery_wait+undersized+degraded+remapped, acting [0,1] 2026-03-10T14:13:52.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4f10077870 msgr2=0x7f4f10079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4f10077870 0x7f4f10079d20 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f4f18005950 tx=0x7f4f180058e0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f24105a60 msgr2=0x7f4f24194180 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f24105a60 0x7f4f24194180 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f4f20005850 tx=0x7f4f20004970 comp rx=0 tx=0).stop 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 shutdown_connections 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4f10077870 0x7f4f10079d20 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4f240691a0 0x7f4f24193c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 --2- 192.168.123.103:0/4006410109 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4f24105a60 0x7f4f24194180 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.929+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 >> 192.168.123.103:0/4006410109 conn(0x7f4f240faa70 msgr2=0x7f4f240fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:13:52.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.930+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 shutdown_connections 2026-03-10T14:13:52.928 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:13:52.930+0000 7f4f2c66c700 1 -- 192.168.123.103:0/4006410109 wait complete. 2026-03-10T14:13:53.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: from='client.34380 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: from='client.34384 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T14:13:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: osdmap e103: 6 total, 5 up, 6 in 2026-03-10T14:13:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: from='osd.4 [v2:192.168.123.104:6808/1485333437,v1:192.168.123.104:6809/1485333437]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:13:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:13:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: from='client.44275 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/102546646' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:53.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:52 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/938344958' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:13:53.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: from='client.34380 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:53.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: from='client.34384 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T14:13:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: osdmap e103: 6 total, 5 up, 6 in 2026-03-10T14:13:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: from='osd.4 [v2:192.168.123.104:6808/1485333437,v1:192.168.123.104:6809/1485333437]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:13:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:13:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: from='client.44275 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:53.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/102546646' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:13:53.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:52 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/938344958' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:13:54.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:53 vm03.local ceph-mon[103098]: pgmap v209: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+undersized+remapped, 1 active+clean+remapped, 1 active+recovering+undersized+remapped, 13 active+undersized, 16 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 774/336 objects degraded (230.357%); 1483/336 objects misplaced (441.369%); 0 B/s, 121 objects/s recovering 2026-03-10T14:13:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:53 vm03.local ceph-mon[103098]: from='client.34396 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:53 vm03.local ceph-mon[103098]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T14:13:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:53 vm03.local ceph-mon[103098]: osd.4 [v2:192.168.123.104:6808/1485333437,v1:192.168.123.104:6809/1485333437] boot 2026-03-10T14:13:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:53 vm03.local ceph-mon[103098]: osdmap e104: 6 total, 6 up, 6 in 2026-03-10T14:13:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:53 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:13:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:53 
vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/4006410109' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:13:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:53 vm04.local ceph-mon[92084]: pgmap v209: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+undersized+remapped, 1 active+clean+remapped, 1 active+recovering+undersized+remapped, 13 active+undersized, 16 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 774/336 objects degraded (230.357%); 1483/336 objects misplaced (441.369%); 0 B/s, 121 objects/s recovering 2026-03-10T14:13:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:53 vm04.local ceph-mon[92084]: from='client.34396 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:13:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:53 vm04.local ceph-mon[92084]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T14:13:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:53 vm04.local ceph-mon[92084]: osd.4 [v2:192.168.123.104:6808/1485333437,v1:192.168.123.104:6809/1485333437] boot 2026-03-10T14:13:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:53 vm04.local ceph-mon[92084]: osdmap e104: 6 total, 6 up, 6 in 2026-03-10T14:13:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:53 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T14:13:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:53 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/4006410109' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:13:55.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:54 vm03.local ceph-mon[103098]: osdmap e105: 6 total, 6 up, 6 in 2026-03-10T14:13:55.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:54 vm04.local ceph-mon[92084]: osdmap e105: 6 total, 6 up, 6 in 2026-03-10T14:13:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:55 vm04.local ceph-mon[92084]: pgmap v212: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+undersized+remapped, 1 active+clean+remapped, 1 active+recovering+undersized+remapped, 13 active+undersized, 16 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 774/336 objects degraded (230.357%); 1483/336 objects misplaced (441.369%); 0 B/s, 121 objects/s recovering 2026-03-10T14:13:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:55 vm04.local ceph-mon[92084]: osdmap e106: 6 total, 6 up, 6 in 2026-03-10T14:13:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:55 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 774/336 objects degraded (230.357%), 17 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:56.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:55 vm03.local ceph-mon[103098]: pgmap v212: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+undersized+remapped, 1 active+clean+remapped, 1 active+recovering+undersized+remapped, 13 active+undersized, 16 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 774/336 objects degraded (230.357%); 1483/336 objects misplaced (441.369%); 0 B/s, 121 objects/s recovering 2026-03-10T14:13:56.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:55 vm03.local ceph-mon[103098]: osdmap e106: 6 total, 6 up, 6 in 
2026-03-10T14:13:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:55 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 774/336 objects degraded (230.357%), 17 pgs degraded (PG_DEGRADED) 2026-03-10T14:13:58.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:57 vm04.local ceph-mon[92084]: pgmap v214: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 5 active+undersized, 7 active+undersized+degraded, 50 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 2 op/s; 1471/336 objects degraded (437.798%); 749/336 objects misplaced (222.917%); 0 B/s, 8 objects/s recovering 2026-03-10T14:13:58.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:57 vm03.local ceph-mon[103098]: pgmap v214: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 5 active+undersized, 7 active+undersized+degraded, 50 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 2 op/s; 1471/336 objects degraded (437.798%); 749/336 objects misplaced (222.917%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:00.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:13:59 vm04.local ceph-mon[92084]: pgmap v215: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.3 KiB/s rd, 1 op/s; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%); 0 B/s, 6 objects/s recovering 2026-03-10T14:14:00.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:13:59 vm03.local ceph-mon[103098]: pgmap v215: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.3 KiB/s rd, 1 op/s; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%); 0 B/s, 6 
objects/s recovering 2026-03-10T14:14:01.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:00 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1453/336 objects degraded (432.440%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T14:14:01.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:01.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:14:01.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:00 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1453/336 objects degraded (432.440%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T14:14:01.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:01.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:14:02.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:01 vm04.local ceph-mon[92084]: pgmap v216: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 2 op/s; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%); 0 B/s, 11 objects/s recovering 2026-03-10T14:14:02.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:01 vm03.local ceph-mon[103098]: pgmap v216: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB 
data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 2 op/s; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%); 0 B/s, 11 objects/s recovering 2026-03-10T14:14:04.121 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:03 vm04.local ceph-mon[92084]: pgmap v217: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:04.121 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T14:14:04.121 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:03 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T14:14:04.121 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:03 vm04.local ceph-mon[92084]: Upgrade: osd.5 is safe to restart 2026-03-10T14:14:04.121 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:04.121 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T14:14:04.121 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:04.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:03 vm03.local ceph-mon[103098]: pgmap v217: 65 pgs: 1 
active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T14:14:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:03 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T14:14:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:03 vm03.local ceph-mon[103098]: Upgrade: osd.5 is safe to restart 2026-03-10T14:14:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T14:14:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:04.563 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:04 vm04.local systemd[1]: Stopping Ceph osd.5 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:14:04.563 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:04 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[73087]: 2026-03-10T14:14:04.192+0000 7f4267fde700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:14:04.563 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:04 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[73087]: 2026-03-10T14:14:04.192+0000 7f4267fde700 -1 osd.5 106 *** Got signal Terminated *** 2026-03-10T14:14:04.563 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:04 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[73087]: 2026-03-10T14:14:04.192+0000 7f4267fde700 -1 osd.5 106 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:14:05.152 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:04 vm04.local ceph-mon[92084]: Upgrade: Updating osd.5 2026-03-10T14:14:05.152 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:04 vm04.local ceph-mon[92084]: Deploying daemon osd.5 on vm04 2026-03-10T14:14:05.152 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:04 vm04.local ceph-mon[92084]: osd.5 marked itself down and dead 2026-03-10T14:14:05.152 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:04 vm04.local podman[105372]: 2026-03-10 14:14:04.972484753 +0000 UTC m=+0.797110030 container died 63def67884f8b01b5a08f85fe4cc8db6c227fa5c9bd05bb32a437e928b92df07 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, RELEASE=HEAD, io.buildah.version=1.29.1, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, 
ceph=True, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) 2026-03-10T14:14:05.152 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:04 vm04.local podman[105372]: 2026-03-10 14:14:04.985998009 +0000 UTC m=+0.810623286 container remove 63def67884f8b01b5a08f85fe4cc8db6c227fa5c9bd05bb32a437e928b92df07 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5, CEPH_POINT_RELEASE=-18.2.0, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , RELEASE=HEAD, org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-10T14:14:05.152 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:04 vm04.local bash[105372]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5 2026-03-10T14:14:05.152 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105436]: 2026-03-10 14:14:05.114758359 +0000 UTC m=+0.014707813 container create f53be4b2c92ec18d164980dad11dfc74821a74d813373256e48a06405e9d0de0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T14:14:05.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:04 vm03.local ceph-mon[103098]: Upgrade: Updating osd.5 2026-03-10T14:14:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:04 vm03.local ceph-mon[103098]: Deploying daemon osd.5 on vm04 2026-03-10T14:14:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:04 vm03.local ceph-mon[103098]: osd.5 marked itself down and dead 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105436]: 2026-03-10 14:14:05.157489016 +0000 UTC m=+0.057438470 container init f53be4b2c92ec18d164980dad11dfc74821a74d813373256e48a06405e9d0de0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105436]: 2026-03-10 14:14:05.160373046 +0000 UTC m=+0.060322500 container start f53be4b2c92ec18d164980dad11dfc74821a74d813373256e48a06405e9d0de0 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105436]: 2026-03-10 14:14:05.167849461 +0000 UTC m=+0.067798915 container attach f53be4b2c92ec18d164980dad11dfc74821a74d813373256e48a06405e9d0de0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105436]: 2026-03-10 14:14:05.108616531 +0000 UTC 
m=+0.008565995 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local conmon[105447]: conmon f53be4b2c92ec18d1649 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f53be4b2c92ec18d164980dad11dfc74821a74d813373256e48a06405e9d0de0.scope/container/memory.events 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105436]: 2026-03-10 14:14:05.298219413 +0000 UTC m=+0.198168867 container died f53be4b2c92ec18d164980dad11dfc74821a74d813373256e48a06405e9d0de0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105436]: 2026-03-10 14:14:05.320786276 +0000 UTC m=+0.220735721 container remove f53be4b2c92ec18d164980dad11dfc74821a74d813373256e48a06405e9d0de0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, org.label-schema.license=GPLv2, CEPH_REF=squid, 
org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0) 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.5.service: Deactivated successfully. 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.5.service: Unit process 105447 (conmon) remains running after unit stopped. 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.5.service: Unit process 105456 (podman) remains running after unit stopped. 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local systemd[1]: Stopped Ceph osd.5 for b81bf660-1c89-11f1-b612-27d302cdb124. 2026-03-10T14:14:05.416 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local systemd[1]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.5.service: Consumed 47.802s CPU time, 741.0M memory peak. 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local systemd[1]: Starting Ceph osd.5 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105545]: 2026-03-10 14:14:05.641146527 +0000 UTC m=+0.017259599 container create 3c94cac8e3c13b952c3f72b10755f6e00808d71d85788c73c0c5b148b5bb0f9d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105545]: 2026-03-10 14:14:05.680746586 +0000 UTC m=+0.056859667 container init 3c94cac8e3c13b952c3f72b10755f6e00808d71d85788c73c0c5b148b5bb0f9d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , 
org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, OSD_FLAVOR=default) 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105545]: 2026-03-10 14:14:05.683902543 +0000 UTC m=+0.060015615 container start 3c94cac8e3c13b952c3f72b10755f6e00808d71d85788c73c0c5b148b5bb0f9d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105545]: 2026-03-10 14:14:05.699976082 +0000 UTC m=+0.076089154 container attach 3c94cac8e3c13b952c3f72b10755f6e00808d71d85788c73c0c5b148b5bb0f9d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223) 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local podman[105545]: 2026-03-10 14:14:05.633153174 +0000 UTC m=+0.009266256 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local bash[105545]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:14:05.815 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:05 vm04.local bash[105545]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:14:06.287 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:05 vm04.local ceph-mon[92084]: pgmap v218: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.8 KiB/s rd, 2 op/s; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:06.287 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:05 vm04.local ceph-mon[92084]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:14:06.287 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:05 vm04.local ceph-mon[92084]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T14:14:06.287 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:05 vm04.local ceph-mon[92084]: osdmap e107: 6 total, 5 up, 6 in 2026-03-10T14:14:06.287 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:05 vm04.local ceph-mon[92084]: osdmap e108: 6 total, 5 up, 6 in 2026-03-10T14:14:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:05 vm03.local ceph-mon[103098]: pgmap v218: 65 pgs: 1 active+recovering+undersized+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1.8 KiB/s rd, 2 op/s; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:05 vm03.local ceph-mon[103098]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T14:14:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:05 vm03.local ceph-mon[103098]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T14:14:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:05 vm03.local ceph-mon[103098]: osdmap e107: 6 total, 5 up, 6 in 2026-03-10T14:14:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:05 vm03.local ceph-mon[103098]: osdmap e108: 6 total, 5 up, 6 in 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: --> Failed to activate via raw: did not find any matching OSD to activate 
2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-185584f5-c07c-476f-a0a1-acd271c22428/osd-block-5c4fe084-183b-43ce-9ebe-daeadbb4f59a --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T14:14:06.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-185584f5-c07c-476f-a0a1-acd271c22428/osd-block-5c4fe084-183b-43ce-9ebe-daeadbb4f59a --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local 
ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/ln -snf /dev/ceph-185584f5-c07c-476f-a0a1-acd271c22428/osd-block-5c4fe084-183b-43ce-9ebe-daeadbb4f59a /var/lib/ceph/osd/ceph-5/block 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: Running command: /usr/bin/ln -snf /dev/ceph-185584f5-c07c-476f-a0a1-acd271c22428/osd-block-5c4fe084-183b-43ce-9ebe-daeadbb4f59a /var/lib/ceph/osd/ceph-5/block 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate[105556]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105545]: --> ceph-volume 
lvm activate successful for osd ID: 5 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local podman[105777]: 2026-03-10 14:14:06.680815207 +0000 UTC m=+0.012720763 container died 3c94cac8e3c13b952c3f72b10755f6e00808d71d85788c73c0c5b148b5bb0f9d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local podman[105777]: 2026-03-10 14:14:06.698757345 +0000 UTC m=+0.030662890 container remove 3c94cac8e3c13b952c3f72b10755f6e00808d71d85788c73c0c5b148b5bb0f9d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, 
org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0) 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local podman[105816]: 2026-03-10 14:14:06.801620676 +0000 UTC m=+0.022673523 container create 6c7573f5f3fabd30cf294937d83ad4c9cebbc6c2b9d4740d891e2a13356758ee (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local podman[105816]: 2026-03-10 14:14:06.843607482 +0000 UTC m=+0.064660329 container init 6c7573f5f3fabd30cf294937d83ad4c9cebbc6c2b9d4740d891e2a13356758ee (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default) 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local podman[105816]: 2026-03-10 14:14:06.846822781 +0000 UTC m=+0.067875628 container start 6c7573f5f3fabd30cf294937d83ad4c9cebbc6c2b9d4740d891e2a13356758ee (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS) 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local bash[105816]: 6c7573f5f3fabd30cf294937d83ad4c9cebbc6c2b9d4740d891e2a13356758ee 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local podman[105816]: 2026-03-10 14:14:06.78866214 +0000 UTC m=+0.009714996 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:14:07.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:06 vm04.local systemd[1]: Started Ceph osd.5 for b81bf660-1c89-11f1-b612-27d302cdb124. 
2026-03-10T14:14:07.564 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:07 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:14:07.183+0000 7fe249a7e740 -1 Falling back to public interface 2026-03-10T14:14:07.860 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:07 vm04.local ceph-mon[92084]: osdmap e109: 6 total, 5 up, 6 in 2026-03-10T14:14:07.860 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:07 vm04.local ceph-mon[92084]: pgmap v222: 65 pgs: 19 peering, 6 stale+active+clean, 2 active+recovery_wait+undersized+degraded+remapped, 38 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%) 2026-03-10T14:14:07.860 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:07.860 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:07.860 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:08.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:07 vm03.local ceph-mon[103098]: osdmap e109: 6 total, 5 up, 6 in 2026-03-10T14:14:08.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:07 vm03.local ceph-mon[103098]: pgmap v222: 65 pgs: 19 peering, 6 stale+active+clean, 2 active+recovery_wait+undersized+degraded+remapped, 38 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 1453/336 objects degraded (432.440%); 749/336 objects misplaced (222.917%) 2026-03-10T14:14:08.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:14:08.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:08.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:08.951 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:08 vm04.local ceph-mon[92084]: Health check failed: Reduced data availability: 7 pgs peering (PG_AVAILABILITY) 2026-03-10T14:14:08.951 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:08 vm04.local ceph-mon[92084]: osdmap e110: 6 total, 5 up, 6 in 2026-03-10T14:14:08.951 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:08 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:08.951 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:08 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:08.951 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:08 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:08.951 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:08 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:08 vm03.local ceph-mon[103098]: Health check failed: Reduced data availability: 7 pgs peering (PG_AVAILABILITY) 2026-03-10T14:14:09.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:08 vm03.local ceph-mon[103098]: osdmap e110: 6 total, 5 up, 6 in 2026-03-10T14:14:09.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:08 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:08 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:08 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:08 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: osdmap e111: 6 total, 5 up, 6 in 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: pgmap v225: 65 pgs: 1 active+undersized+remapped, 6 active+undersized, 19 peering, 1 active+recovery_wait+undersized+degraded+remapped, 6 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 752/336 objects degraded (223.810%); 0 B/s, 148 objects/s recovering 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": 
"osd.0"}]': finished 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-10T14:14:09.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:09 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: osdmap e111: 
6 total, 5 up, 6 in 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: pgmap v225: 65 pgs: 1 active+undersized+remapped, 6 active+undersized, 19 peering, 1 active+recovery_wait+undersized+degraded+remapped, 6 active+undersized+degraded, 32 active+clean; 275 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 752/336 objects degraded (223.810%); 0 B/s, 148 objects/s recovering 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:10.108 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: 
from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-10T14:14:10.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:09 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-10T14:14:10.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:10 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all osd 2026-03-10T14:14:10.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:10 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 752/336 objects degraded (223.810%), 7 pgs degraded (PG_DEGRADED) 2026-03-10T14:14:10.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:10 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-10T14:14:10.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:10 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-10T14:14:10.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:10 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-10T14:14:10.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:10 vm03.local ceph-mon[103098]: Upgrade: Setting require_osd_release to 19 squid 2026-03-10T14:14:10.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:10 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-10T14:14:10.903 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:10 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all osd 2026-03-10T14:14:10.903 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:10 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 752/336 objects degraded (223.810%), 7 pgs degraded (PG_DEGRADED) 2026-03-10T14:14:10.904 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:10 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-10T14:14:10.904 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:10 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-10T14:14:10.904 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:10 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 
2026-03-10T14:14:10.904 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:10 vm04.local ceph-mon[92084]: Upgrade: Setting require_osd_release to 19 squid 2026-03-10T14:14:10.904 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:10 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-10T14:14:11.314 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:10 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:14:10.904+0000 7fe249a7e740 -1 osd.5 0 read_superblock omap replica is missing. 2026-03-10T14:14:11.314 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:14:11.126+0000 7fe249a7e740 -1 osd.5 106 log_to_monitors true 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-mon[92084]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-mon[92084]: osdmap e112: 6 total, 5 up, 6 in 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-mon[92084]: pgmap v227: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 16 active+undersized, 14 active+undersized+degraded, 33 active+clean; 275 MiB data, 1.5 GiB used, 119 GiB / 120 GiB avail; 51/336 objects degraded (15.179%); 0 B/s, 184 objects/s recovering 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local 
ceph-mon[92084]: Upgrade: Disabling standby-replay for filesystem cephfs 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]: dispatch 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-mon[92084]: from='osd.5 [v2:192.168.123.104:6816/62954821,v1:192.168.123.104:6817/62954821]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T14:14:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:11 vm04.local ceph-mon[92084]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: osdmap e112: 6 total, 5 up, 6 in 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: pgmap v227: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 16 active+undersized, 14 active+undersized+degraded, 33 active+clean; 275 MiB data, 1.5 GiB used, 119 GiB / 120 GiB 
avail; 51/336 objects degraded (15.179%); 0 B/s, 184 objects/s recovering 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: Upgrade: Disabling standby-replay for filesystem cephfs 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]: dispatch 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: from='osd.5 [v2:192.168.123.104:6816/62954821,v1:192.168.123.104:6817/62954821]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T14:14:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:11 vm03.local ceph-mon[103098]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T14:14:13.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 7 pgs peering) 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: Health check failed: insufficient standby MDS daemons available (MDS_INSUFFICIENT_STANDBY) 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: osdmap e113: 6 total, 5 up, 6 in 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs set", "fs_name": "cephfs", 
"var": "allow_standby_replay", "val": "0"}]': finished 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: Upgrade: Scaling down filesystem cephfs 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-10T14:14:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 7 pgs peering) 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: Health check failed: insufficient standby MDS daemons available (MDS_INSUFFICIENT_STANDBY) 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: osdmap e113: 6 total, 5 up, 6 in 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "0"}]': finished 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local 
ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:13.108 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: Upgrade: Scaling down filesystem cephfs 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-10T14:14:14.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: pgmap v229: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 16 active+undersized, 14 active+undersized+degraded, 33 active+clean; 275 MiB data, 1.5 GiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 51/336 objects degraded (15.179%); 0 B/s, 4 objects/s recovering 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: Health check cleared: MDS_INSUFFICIENT_STANDBY (was: insufficient standby MDS daemons available) 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: osdmap e114: 6 total, 5 up, 6 in 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: mds.? 
[v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] up:boot 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: mds.? [v2:192.168.123.103:6828/1608434114,v1:192.168.123.103:6829/1608434114] up:boot 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: stopping daemon mds.cephfs.vm03.aqaspa 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='osd.5 [v2:192.168.123.104:6816/62954821,v1:192.168.123.104:6817/62954821]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:14.064 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:stopping} 2 up:standby 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T14:14:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:13 vm04.local ceph-mon[92084]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: pgmap v229: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 16 active+undersized, 14 active+undersized+degraded, 33 active+clean; 275 MiB data, 1.5 GiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 51/336 objects degraded (15.179%); 0 B/s, 4 objects/s recovering 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: Health check cleared: MDS_INSUFFICIENT_STANDBY (was: insufficient standby MDS daemons available) 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: osdmap e114: 6 total, 5 up, 6 in 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: mds.? [v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] up:boot 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: mds.? 
[v2:192.168.123.103:6828/1608434114,v1:192.168.123.103:6829/1608434114] up:boot 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: stopping daemon mds.cephfs.vm03.aqaspa 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:active} 2 up:standby 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='osd.5 [v2:192.168.123.104:6816/62954821,v1:192.168.123.104:6817/62954821]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:stopping} 2 up:standby 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm04", "root=default"]}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:13 vm03.local ceph-mon[103098]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T14:14:14.563 
INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:14:14 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:14:14.120+0000 7fe241017640 -1 osd.5 106 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T14:14:15.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:15 vm04.local ceph-mon[92084]: from='osd.5 ' entity='osd.5' 2026-03-10T14:14:15.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:15 vm03.local ceph-mon[103098]: from='osd.5 ' entity='osd.5' 2026-03-10T14:14:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:16 vm04.local ceph-mon[92084]: pgmap v231: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 16 active+undersized, 14 active+undersized+degraded, 33 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 51/336 objects degraded (15.179%); 0 B/s, 3 objects/s recovering 2026-03-10T14:14:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:16 vm04.local ceph-mon[92084]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T14:14:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:16 vm04.local ceph-mon[92084]: osd.5 [v2:192.168.123.104:6816/62954821,v1:192.168.123.104:6817/62954821] boot 2026-03-10T14:14:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:16 vm04.local ceph-mon[92084]: osdmap e115: 6 total, 6 up, 6 in 2026-03-10T14:14:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:16 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:14:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:16 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 51/336 objects degraded (15.179%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T14:14:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:16 
vm04.local ceph-mon[92084]: osdmap e116: 6 total, 6 up, 6 in 2026-03-10T14:14:16.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:16 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:14:16.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:16 vm03.local ceph-mon[103098]: pgmap v231: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 16 active+undersized, 14 active+undersized+degraded, 33 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 51/336 objects degraded (15.179%); 0 B/s, 3 objects/s recovering 2026-03-10T14:14:16.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:16 vm03.local ceph-mon[103098]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T14:14:16.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:16 vm03.local ceph-mon[103098]: osd.5 [v2:192.168.123.104:6816/62954821,v1:192.168.123.104:6817/62954821] boot 2026-03-10T14:14:16.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:16 vm03.local ceph-mon[103098]: osdmap e115: 6 total, 6 up, 6 in 2026-03-10T14:14:16.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:16 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T14:14:16.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:16 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 51/336 objects degraded (15.179%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T14:14:16.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:16 vm03.local ceph-mon[103098]: osdmap e116: 6 total, 6 up, 6 in 2026-03-10T14:14:16.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:16 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:14:17.952 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:17 vm04.local ceph-mon[92084]: osdmap e117: 6 total, 6 up, 6 in 2026-03-10T14:14:17.952 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:17 vm04.local ceph-mon[92084]: pgmap v235: 65 pgs: 4 peering, 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 14 active+undersized, 12 active+undersized+degraded, 33 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 511 B/s rd, 2.7 KiB/s wr, 2 op/s; 44/336 objects degraded (13.095%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:18.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:17 vm03.local ceph-mon[103098]: osdmap e117: 6 total, 6 up, 6 in 2026-03-10T14:14:18.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:17 vm03.local ceph-mon[103098]: pgmap v235: 65 pgs: 4 peering, 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 14 active+undersized, 12 active+undersized+degraded, 33 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 511 B/s rd, 2.7 KiB/s wr, 2 op/s; 44/336 objects degraded (13.095%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:20.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:19 vm04.local ceph-mon[92084]: pgmap v236: 65 pgs: 4 peering, 1 active+recovering+undersized+remapped, 1 active+clean+remapped, 3 active+undersized, 4 active+undersized+degraded, 52 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 14 KiB/s rd, 5.5 KiB/s wr, 5 op/s; 21/336 objects degraded (6.250%); 15 B/s, 6 objects/s recovering 2026-03-10T14:14:20.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:19 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:stopping} 2 up:standby 2026-03-10T14:14:20.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:19 vm03.local ceph-mon[103098]: pgmap v236: 65 pgs: 4 peering, 1 
active+recovering+undersized+remapped, 1 active+clean+remapped, 3 active+undersized, 4 active+undersized+degraded, 52 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 14 KiB/s rd, 5.5 KiB/s wr, 5 op/s; 21/336 objects degraded (6.250%); 15 B/s, 6 objects/s recovering 2026-03-10T14:14:20.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:19 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm04.sslxuq=up:active,1=cephfs.vm03.aqaspa=up:stopping} 2 up:standby 2026-03-10T14:14:21.010 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:20 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 21/336 objects degraded (6.250%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T14:14:21.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:20 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 21/336 objects degraded (6.250%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T14:14:22.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:21 vm04.local ceph-mon[92084]: pgmap v237: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 14 KiB/s rd, 6.8 KiB/s wr, 6 op/s; 247/336 objects degraded (73.512%); 29 B/s, 12 objects/s recovering 2026-03-10T14:14:22.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:21 vm03.local ceph-mon[103098]: pgmap v237: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 14 KiB/s rd, 6.8 KiB/s wr, 6 op/s; 247/336 objects degraded (73.512%); 29 B/s, 12 objects/s recovering 2026-03-10T14:14:23.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.009+0000 7fb9b022f700 1 -- 192.168.123.103:0/2935474043 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8102760 msgr2=0x7fb9a8102b70 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.009+0000 7fb9b022f700 1 --2- 192.168.123.103:0/2935474043 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8102760 0x7fb9a8102b70 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fb99c009b00 tx=0x7fb99c009e10 comp rx=0 tx=0).stop 2026-03-10T14:14:23.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.010+0000 7fb9b022f700 1 -- 192.168.123.103:0/2935474043 shutdown_connections 2026-03-10T14:14:23.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.010+0000 7fb9b022f700 1 --2- 192.168.123.103:0/2935474043 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb9a8103960 0x7fb9a8103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.010+0000 7fb9b022f700 1 --2- 192.168.123.103:0/2935474043 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8102760 0x7fb9a8102b70 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.010+0000 7fb9b022f700 1 -- 192.168.123.103:0/2935474043 >> 192.168.123.103:0/2935474043 conn(0x7fb9a80fdcf0 msgr2=0x7fb9a8100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:23.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.010+0000 7fb9b022f700 1 -- 192.168.123.103:0/2935474043 shutdown_connections 2026-03-10T14:14:23.009 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.010+0000 7fb9b022f700 1 -- 192.168.123.103:0/2935474043 wait complete. 
2026-03-10T14:14:23.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.011+0000 7fb9b022f700 1 Processor -- start 2026-03-10T14:14:23.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.011+0000 7fb9b022f700 1 -- start start 2026-03-10T14:14:23.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.012+0000 7fb9b022f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb9a8102760 0x7fb9a81980a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.012+0000 7fb9b022f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8103960 0x7fb9a81985e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.012+0000 7fb9ad7ca700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8103960 0x7fb9a81985e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.012+0000 7fb9ad7ca700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8103960 0x7fb9a81985e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38072/0 (socket says 192.168.123.103:38072) 2026-03-10T14:14:23.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.012+0000 7fb9ad7ca700 1 -- 192.168.123.103:0/1598830099 learned_addr learned my addr 192.168.123.103:0/1598830099 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:23.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.012+0000 7fb9b022f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
-- mon_getmap magic: 0 v1 -- 0x7fb9a8198c00 con 0x7fb9a8103960 2026-03-10T14:14:23.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.012+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9a8198d40 con 0x7fb9a8102760 2026-03-10T14:14:23.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.013+0000 7fb9ad7ca700 1 -- 192.168.123.103:0/1598830099 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb9a8102760 msgr2=0x7fb9a81980a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:14:23.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.013+0000 7fb9adfcb700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb9a8102760 0x7fb9a81980a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.013+0000 7fb9ad7ca700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb9a8102760 0x7fb9a81980a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.013+0000 7fb9ad7ca700 1 -- 192.168.123.103:0/1598830099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb99c0097e0 con 0x7fb9a8103960 2026-03-10T14:14:23.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.013+0000 7fb9ad7ca700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8103960 0x7fb9a81985e0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fb9a400b700 tx=0x7fb9a400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T14:14:23.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.013+0000 7fb99affd700 1 -- 192.168.123.103:0/1598830099 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb9a4010840 con 0x7fb9a8103960 2026-03-10T14:14:23.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.013+0000 7fb99affd700 1 -- 192.168.123.103:0/1598830099 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb9a4010e80 con 0x7fb9a8103960 2026-03-10T14:14:23.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.014+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9a819d7f0 con 0x7fb9a8103960 2026-03-10T14:14:23.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.014+0000 7fb99affd700 1 -- 192.168.123.103:0/1598830099 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb9a400d590 con 0x7fb9a8103960 2026-03-10T14:14:23.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.014+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9a819dd40 con 0x7fb9a8103960 2026-03-10T14:14:23.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.015+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb9a8066e40 con 0x7fb9a8103960 2026-03-10T14:14:23.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.018+0000 7fb99affd700 1 -- 192.168.123.103:0/1598830099 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb9a400f3e0 con 0x7fb9a8103960 2026-03-10T14:14:23.017 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.018+0000 7fb99affd700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb994077910 0x7fb994079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.018+0000 7fb99affd700 1 -- 192.168.123.103:0/1598830099 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(117..117 src has 1..117) v4 ==== 6463+0+0 (secure 0 0 0) 0x7fb9a4099d60 con 0x7fb9a8103960 2026-03-10T14:14:23.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.019+0000 7fb9adfcb700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb994077910 0x7fb994079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.020+0000 7fb9adfcb700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb994077910 0x7fb994079dc0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fb99c00b5c0 tx=0x7fb99c005c00 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:23.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.020+0000 7fb99affd700 1 -- 192.168.123.103:0/1598830099 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb9a4062450 con 0x7fb9a8103960 2026-03-10T14:14:23.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.152+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7fb9a819e020 con 0x7fb994077910 2026-03-10T14:14:23.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.154+0000 7fb99affd700 1 -- 192.168.123.103:0/1598830099 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fb9a819e020 con 0x7fb994077910 2026-03-10T14:14:23.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.157+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb994077910 msgr2=0x7fb994079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.157+0000 7fb9b022f700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb994077910 0x7fb994079dc0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fb99c00b5c0 tx=0x7fb99c005c00 comp rx=0 tx=0).stop 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.157+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8103960 msgr2=0x7fb9a81985e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.157+0000 7fb9b022f700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8103960 0x7fb9a81985e0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fb9a400b700 tx=0x7fb9a400bac0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.157+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 shutdown_connections 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.157+0000 7fb9b022f700 1 --2- 192.168.123.103:0/1598830099 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb994077910 0x7fb994079dc0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.157+0000 7fb9b022f700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb9a8102760 0x7fb9a81980a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.158+0000 7fb9b022f700 1 --2- 192.168.123.103:0/1598830099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb9a8103960 0x7fb9a81985e0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.158+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 >> 192.168.123.103:0/1598830099 conn(0x7fb9a80fdcf0 msgr2=0x7fb9a8106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.158+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 shutdown_connections 2026-03-10T14:14:23.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.158+0000 7fb9b022f700 1 -- 192.168.123.103:0/1598830099 wait complete. 
2026-03-10T14:14:23.166 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-10T14:14:23.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.235+0000 7f92c7300700 1 -- 192.168.123.103:0/2051159261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c00ff7d0 msgr2=0x7f92c00ffc40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:14:23.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.235+0000 7f92c7300700 1 --2- 192.168.123.103:0/2051159261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c00ff7d0 0x7f92c00ffc40 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f92bc009b50 tx=0x7f92bc009e60 comp rx=0 tx=0).stop
2026-03-10T14:14:23.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.235+0000 7f92c7300700 1 -- 192.168.123.103:0/2051159261 shutdown_connections
2026-03-10T14:14:23.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.235+0000 7f92c7300700 1 --2- 192.168.123.103:0/2051159261 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c00ff7d0 0x7f92c00ffc40 secure :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f92bc009b50 tx=0x7f92bc009e60 comp rx=0 tx=0).stop
2026-03-10T14:14:23.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.235+0000 7f92c7300700 1 --2- 192.168.123.103:0/2051159261 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c00fee80 0x7f92c00ff290 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:14:23.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.235+0000 7f92c7300700 1 -- 192.168.123.103:0/2051159261 >> 192.168.123.103:0/2051159261 conn(0x7f92c00faa70 msgr2=0x7f92c00fcee0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:14:23.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.236+0000 7f92c7300700 1 -- 192.168.123.103:0/2051159261 shutdown_connections 2026-03-10T14:14:23.234
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.236+0000 7f92c7300700 1 -- 192.168.123.103:0/2051159261 wait complete. 2026-03-10T14:14:23.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.236+0000 7f92c7300700 1 Processor -- start 2026-03-10T14:14:23.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.236+0000 7f92c7300700 1 -- start start 2026-03-10T14:14:23.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.236+0000 7f92c7300700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c00fee80 0x7f92c0198230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.236+0000 7f92c7300700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c0198770 0x7f92c019d7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.236+0000 7f92c7300700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92c0198c70 con 0x7f92c0198770 2026-03-10T14:14:23.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.236+0000 7f92c7300700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92c0198de0 con 0x7f92c00fee80 2026-03-10T14:14:23.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c509c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c00fee80 0x7f92c0198230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c509c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c00fee80 0x7f92c0198230 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:47446/0 (socket says 192.168.123.103:47446) 2026-03-10T14:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c509c700 1 -- 192.168.123.103:0/1653606673 learned_addr learned my addr 192.168.123.103:0/1653606673 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c489b700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c0198770 0x7f92c019d7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c509c700 1 -- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c0198770 msgr2=0x7f92c019d7e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c509c700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c0198770 0x7f92c019d7e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c509c700 1 -- 192.168.123.103:0/1653606673 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f92bc0097e0 con 0x7f92c00fee80 2026-03-10T14:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c489b700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c0198770 0x7f92c019d7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T14:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.237+0000 7f92c509c700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c00fee80 0x7f92c0198230 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f92b0009fd0 tx=0x7f92b000eea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:23.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.238+0000 7f92b67fc700 1 -- 192.168.123.103:0/1653606673 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92b0009980 con 0x7f92c00fee80 2026-03-10T14:14:23.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.238+0000 7f92b67fc700 1 -- 192.168.123.103:0/1653606673 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f92b0004500 con 0x7f92c00fee80 2026-03-10T14:14:23.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.238+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f92c019dd80 con 0x7f92c00fee80 2026-03-10T14:14:23.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.238+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f92c019e240 con 0x7f92c00fee80 2026-03-10T14:14:23.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.238+0000 7f92b67fc700 1 -- 192.168.123.103:0/1653606673 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92b0010450 con 0x7f92c00fee80 2026-03-10T14:14:23.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.240+0000 7f92b67fc700 1 -- 192.168.123.103:0/1653606673 <== mon.1 v2:192.168.123.104:3300/0 4 ==== 
mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f92b000cca0 con 0x7f92c00fee80 2026-03-10T14:14:23.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.240+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f92a4005320 con 0x7f92c00fee80 2026-03-10T14:14:23.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.240+0000 7f92b67fc700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f92ac077910 0x7f92ac079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.240+0000 7f92b67fc700 1 -- 192.168.123.103:0/1653606673 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(117..117 src has 1..117) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f92b0014070 con 0x7f92c00fee80 2026-03-10T14:14:23.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.241+0000 7f92c489b700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f92ac077910 0x7f92ac079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.241+0000 7f92c489b700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f92ac077910 0x7f92ac079dc0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f92bc009b20 tx=0x7f92bc0058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:23.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.243+0000 7f92b67fc700 1 -- 192.168.123.103:0/1653606673 <== mon.1 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f92b0062710 con 0x7f92c00fee80 2026-03-10T14:14:23.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.373+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f92a4000bf0 con 0x7f92ac077910 2026-03-10T14:14:23.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.374+0000 7f92b67fc700 1 -- 192.168.123.103:0/1653606673 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f92a4000bf0 con 0x7f92ac077910 2026-03-10T14:14:23.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.376+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f92ac077910 msgr2=0x7f92ac079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.376+0000 7f92c7300700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f92ac077910 0x7f92ac079dc0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f92bc009b20 tx=0x7f92bc0058e0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.376+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c00fee80 msgr2=0x7f92c0198230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.376+0000 7f92c7300700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c00fee80 0x7f92c0198230 secure :-1 s=READY pgs=51 cs=0 l=1 
rev1=1 crypto rx=0x7f92b0009fd0 tx=0x7f92b000eea0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.377+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 shutdown_connections 2026-03-10T14:14:23.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.377+0000 7f92c7300700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f92ac077910 0x7f92ac079dc0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.377+0000 7f92c7300700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f92c00fee80 0x7f92c0198230 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.377+0000 7f92c7300700 1 --2- 192.168.123.103:0/1653606673 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92c0198770 0x7f92c019d7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.377+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 >> 192.168.123.103:0/1653606673 conn(0x7f92c00faa70 msgr2=0x7f92c01074b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:23.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.377+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 shutdown_connections 2026-03-10T14:14:23.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.377+0000 7f92c7300700 1 -- 192.168.123.103:0/1653606673 wait complete. 
2026-03-10T14:14:23.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.445+0000 7f2098a95700 1 -- 192.168.123.103:0/248042881 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20941039b0 msgr2=0x7f2094105d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.445+0000 7f2098a95700 1 --2- 192.168.123.103:0/248042881 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20941039b0 0x7f2094105d90 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f207c009b00 tx=0x7f207c009e10 comp rx=0 tx=0).stop 2026-03-10T14:14:23.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.445+0000 7f2098a95700 1 -- 192.168.123.103:0/248042881 shutdown_connections 2026-03-10T14:14:23.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.445+0000 7f2098a95700 1 --2- 192.168.123.103:0/248042881 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f20941039b0 0x7f2094105d90 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.445+0000 7f2098a95700 1 --2- 192.168.123.103:0/248042881 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2094101090 0x7f2094103470 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.445+0000 7f2098a95700 1 -- 192.168.123.103:0/248042881 >> 192.168.123.103:0/248042881 conn(0x7f20940faa10 msgr2=0x7f20940fce40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:23.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.445+0000 7f2098a95700 1 -- 192.168.123.103:0/248042881 shutdown_connections 2026-03-10T14:14:23.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.445+0000 7f2098a95700 1 -- 192.168.123.103:0/248042881 wait 
complete. 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f2098a95700 1 Processor -- start 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f2098a95700 1 -- start start 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f2098a95700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2094101090 0x7f2094198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f2098a95700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20941039b0 0x7f2094198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f2098a95700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2094198b80 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f2098a95700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2094198cc0 con 0x7f20941039b0 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f209259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2094101090 0x7f2094198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f209259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2094101090 0x7f2094198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I 
am v2:192.168.123.103:38098/0 (socket says 192.168.123.103:38098) 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.446+0000 7f209259c700 1 -- 192.168.123.103:0/3316039788 learned_addr learned my addr 192.168.123.103:0/3316039788 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f209259c700 1 -- 192.168.123.103:0/3316039788 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20941039b0 msgr2=0x7f2094198560 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f209259c700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20941039b0 0x7f2094198560 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f209259c700 1 -- 192.168.123.103:0/3316039788 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f207c0097e0 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f209259c700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2094101090 0x7f2094198020 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f208400ee50 tx=0x7f208400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f208b7fe700 1 -- 192.168.123.103:0/3316039788 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f208400cd70 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f208b7fe700 1 -- 
192.168.123.103:0/3316039788 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f208400ced0 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f208b7fe700 1 -- 192.168.123.103:0/3316039788 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2084010640 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f209419d770 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.447+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f209419dcc0 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.449+0000 7f208b7fe700 1 -- 192.168.123.103:0/3316039788 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f20840107a0 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.449+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f20940fc5d0 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.449+0000 7f208b7fe700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2080077870 0x7f2080079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.449+0000 7f208b7fe700 1 -- 192.168.123.103:0/3316039788 
<== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(117..117 src has 1..117) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f2084014070 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.452+0000 7f208bfff700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2080077870 0x7f2080079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.452+0000 7f208b7fe700 1 -- 192.168.123.103:0/3316039788 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f20840626a0 con 0x7f2094101090 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.452+0000 7f208bfff700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2080077870 0x7f2080079d20 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f207c005f50 tx=0x7f207c005dc0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.579+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f2094061190 con 0x7f2080077870 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.585+0000 7f208b7fe700 1 -- 192.168.123.103:0/3316039788 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f2094061190 con 0x7f2080077870 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM 
USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (10m) 81s ago 11m 22.1M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (8m) 81s ago 11m 9575k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:14:23.646 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (10m) 15s ago 10m 12.0M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 81s ago 11m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (4m) 15s ago 10m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (10m) 81s ago 10m 92.0M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (9m) 81s ago 9m 136M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (8m) 81s ago 8m 72.7M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (8m) 15s ago 8m 63.4M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (8m) 15s ago 8m 144M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (6m) 81s ago 11m 620M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (5m) 15s ago 10m 499M - 19.2.3-678-ge911bdeb 654f31e6858e 
d43ddeefc7d3 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 81s ago 11m 61.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (5m) 15s ago 10m 56.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (11m) 81s ago 11m 15.4M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (10m) 15s ago 10m 15.6M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 81s ago 10m 206M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (106s) 81s ago 10m 127M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (83s) 81s ago 9m 13.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e0d768b70468 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (59s) 15s ago 9m 137M 4096M 19.2.3-678-ge911bdeb 654f31e6858e f7fc2aafa9d9 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (38s) 15s ago 9m 114M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 89f9225212d4 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (16s) 15s ago 9m 15.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6c7573f5f3fa 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (5m) 81s ago 10m 69.1M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2080077870 msgr2=0x7f2080079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2080077870 0x7f2080079d20 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f207c005f50 tx=0x7f207c005dc0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2094101090 msgr2=0x7f2094198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2094101090 0x7f2094198020 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f208400ee50 tx=0x7f208400c5b0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 shutdown_connections 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2080077870 0x7f2080079d20 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2094101090 0x7f2094198020 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 --2- 192.168.123.103:0/3316039788 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f20941039b0 0x7f2094198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 >> 192.168.123.103:0/3316039788 conn(0x7f20940faa10 msgr2=0x7f20940ff6f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 shutdown_connections 2026-03-10T14:14:23.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.588+0000 7f2098a95700 1 -- 192.168.123.103:0/3316039788 wait complete. 2026-03-10T14:14:23.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.655+0000 7f90e43ca700 1 -- 192.168.123.103:0/2524181959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc102780 msgr2=0x7f90dc102b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.655+0000 7f90e43ca700 1 --2- 192.168.123.103:0/2524181959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc102780 0x7f90dc102b90 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f90cc009b00 tx=0x7f90cc009e10 comp rx=0 tx=0).stop 2026-03-10T14:14:23.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.655+0000 7f90e43ca700 1 -- 192.168.123.103:0/2524181959 shutdown_connections 2026-03-10T14:14:23.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.655+0000 7f90e43ca700 1 --2- 192.168.123.103:0/2524181959 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f90dc103980 0x7f90dc103dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T14:14:23.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.655+0000 7f90e43ca700 1 --2- 192.168.123.103:0/2524181959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc102780 0x7f90dc102b90 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.655+0000 7f90e43ca700 1 -- 192.168.123.103:0/2524181959 >> 192.168.123.103:0/2524181959 conn(0x7f90dc0fdd10 msgr2=0x7f90dc100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:23.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.661+0000 7f90e43ca700 1 -- 192.168.123.103:0/2524181959 shutdown_connections 2026-03-10T14:14:23.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.662+0000 7f90e43ca700 1 -- 192.168.123.103:0/2524181959 wait complete. 2026-03-10T14:14:23.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.662+0000 7f90e43ca700 1 Processor -- start 2026-03-10T14:14:23.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e43ca700 1 -- start start 2026-03-10T14:14:23.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e43ca700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f90dc102780 0x7f90dc071ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e43ca700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc103980 0x7f90dc072220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e43ca700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90dc072840 con 0x7f90dc103980 2026-03-10T14:14:23.662 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e43ca700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90dc1a5da0 con 0x7f90dc102780 2026-03-10T14:14:23.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e1965700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc103980 0x7f90dc072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e1965700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc103980 0x7f90dc072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48838/0 (socket says 192.168.123.103:48838) 2026-03-10T14:14:23.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e1965700 1 -- 192.168.123.103:0/1772670377 learned_addr learned my addr 192.168.123.103:0/1772670377 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:23.662 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.663+0000 7f90e2166700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f90dc102780 0x7f90dc071ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.664+0000 7f90e2166700 1 -- 192.168.123.103:0/1772670377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc103980 msgr2=0x7f90dc072220 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:23.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.664+0000 7f90e2166700 1 --2- 
192.168.123.103:0/1772670377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc103980 0x7f90dc072220 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:23.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.664+0000 7f90e2166700 1 -- 192.168.123.103:0/1772670377 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f90cc0097e0 con 0x7f90dc102780 2026-03-10T14:14:23.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.664+0000 7f90e2166700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f90dc102780 0x7f90dc071ce0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f90cc000c00 tx=0x7f90cc004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:23.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.665+0000 7f90d37fe700 1 -- 192.168.123.103:0/1772670377 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90cc01d070 con 0x7f90dc102780 2026-03-10T14:14:23.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.665+0000 7f90e43ca700 1 -- 192.168.123.103:0/1772670377 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f90dc1a5f40 con 0x7f90dc102780 2026-03-10T14:14:23.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.665+0000 7f90d37fe700 1 -- 192.168.123.103:0/1772670377 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f90cc00bc50 con 0x7f90dc102780 2026-03-10T14:14:23.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.665+0000 7f90d37fe700 1 -- 192.168.123.103:0/1772670377 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90cc00f670 con 0x7f90dc102780 2026-03-10T14:14:23.664 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.665+0000 7f90e43ca700 1 -- 192.168.123.103:0/1772670377 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f90dc1a6350 con 0x7f90dc102780 2026-03-10T14:14:23.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.666+0000 7f90d17fa700 1 -- 192.168.123.103:0/1772670377 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f90dc066e40 con 0x7f90dc102780 2026-03-10T14:14:23.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.667+0000 7f90d37fe700 1 -- 192.168.123.103:0/1772670377 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f90cc022470 con 0x7f90dc102780 2026-03-10T14:14:23.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.667+0000 7f90d37fe700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f90c80778c0 0x7f90c8079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:23.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.667+0000 7f90e1965700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f90c80778c0 0x7f90c8079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:23.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.668+0000 7f90d37fe700 1 -- 192.168.123.103:0/1772670377 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(117..117 src has 1..117) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f90cc09ba30 con 0x7f90dc102780 2026-03-10T14:14:23.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.668+0000 7f90e1965700 1 --2- 192.168.123.103:0/1772670377 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f90c80778c0 0x7f90c8079d70 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f90d8007d10 tx=0x7f90d8007480 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:23.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.669+0000 7f90d37fe700 1 -- 192.168.123.103:0/1772670377 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f90cc0641a0 con 0x7f90dc102780 2026-03-10T14:14:23.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:23.858+0000 7f90d17fa700 1 -- 192.168.123.103:0/1772670377 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f90dc061960 con 0x7f90dc102780 2026-03-10T14:14:24.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.001+0000 7f90d37fe700 1 -- 192.168.123.103:0/1772670377 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+709 (secure 0 0 0) 0x7f90cc0638f0 con 0x7f90dc102780 2026-03-10T14:14:24.003 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:14:24.022 
INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4, 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 10 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:14:24.022 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 -- 192.168.123.103:0/1772670377 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f90c80778c0 msgr2=0x7f90c8079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f90c80778c0 0x7f90c8079d70 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f90d8007d10 tx=0x7f90d8007480 comp rx=0 tx=0).stop 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 -- 192.168.123.103:0/1772670377 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f90dc102780 msgr2=0x7f90dc071ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f90dc102780 0x7f90dc071ce0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f90cc000c00 tx=0x7f90cc004930 comp rx=0 tx=0).stop 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 -- 192.168.123.103:0/1772670377 shutdown_connections 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f90c80778c0 0x7f90c8079d70 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f90dc102780 0x7f90dc071ce0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 --2- 192.168.123.103:0/1772670377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90dc103980 0x7f90dc072220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.004+0000 7f90e43ca700 1 -- 192.168.123.103:0/1772670377 >> 192.168.123.103:0/1772670377 conn(0x7f90dc0fdd10 msgr2=0x7f90dc106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:24.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.005+0000 7f90e43ca700 1 -- 192.168.123.103:0/1772670377 shutdown_connections 2026-03-10T14:14:24.023 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.005+0000 7f90e43ca700 1 -- 192.168.123.103:0/1772670377 wait complete. 2026-03-10T14:14:24.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.098+0000 7f8048876700 1 -- 192.168.123.103:0/3547617398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8040072360 msgr2=0x7f80400770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.098+0000 7f8048876700 1 --2- 192.168.123.103:0/3547617398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8040072360 0x7f80400770e0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f803c00d3f0 tx=0x7f803c00d700 comp rx=0 tx=0).stop 2026-03-10T14:14:24.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.098+0000 7f8048876700 1 -- 192.168.123.103:0/3547617398 shutdown_connections 2026-03-10T14:14:24.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.098+0000 7f8048876700 1 --2- 192.168.123.103:0/3547617398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8040072360 0x7f80400770e0 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.098+0000 7f8048876700 1 --2- 192.168.123.103:0/3547617398 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8040071980 0x7f8040071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.098+0000 7f8048876700 1 -- 192.168.123.103:0/3547617398 >> 192.168.123.103:0/3547617398 conn(0x7f804006d1a0 msgr2=0x7f804006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.098+0000 7f8048876700 1 -- 192.168.123.103:0/3547617398 shutdown_connections 
2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.098+0000 7f8048876700 1 -- 192.168.123.103:0/3547617398 wait complete. 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8048876700 1 Processor -- start 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8048876700 1 -- start start 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8048876700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8040071980 0x7f8040131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8048876700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8040131890 0x7f804007f520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8048876700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8040131d90 con 0x7f8040131890 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8048876700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8040131ed0 con 0x7f8040071980 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8046612700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8040071980 0x7f8040131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8046612700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8040071980 
0x7f8040131350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:37734/0 (socket says 192.168.123.103:37734) 2026-03-10T14:14:24.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.099+0000 7f8046612700 1 -- 192.168.123.103:0/960468727 learned_addr learned my addr 192.168.123.103:0/960468727 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:24.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.100+0000 7f8045e11700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8040131890 0x7f804007f520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.100+0000 7f8046612700 1 -- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8040131890 msgr2=0x7f804007f520 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.100+0000 7f8046612700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8040131890 0x7f804007f520 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.100+0000 7f8046612700 1 -- 192.168.123.103:0/960468727 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f803c007ed0 con 0x7f8040071980 2026-03-10T14:14:24.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.100+0000 7f8046612700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8040071980 0x7f8040131350 secure :-1 s=READY pgs=53 cs=0 
l=1 rev1=1 crypto rx=0x7f803400b770 tx=0x7f803400bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:24.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.101+0000 7f80337fe700 1 -- 192.168.123.103:0/960468727 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f803400f820 con 0x7f8040071980 2026-03-10T14:14:24.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.101+0000 7f8048876700 1 -- 192.168.123.103:0/960468727 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f804007fac0 con 0x7f8040071980 2026-03-10T14:14:24.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.101+0000 7f8048876700 1 -- 192.168.123.103:0/960468727 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f804007ffc0 con 0x7f8040071980 2026-03-10T14:14:24.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.102+0000 7f80337fe700 1 -- 192.168.123.103:0/960468727 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f803400fe60 con 0x7f8040071980 2026-03-10T14:14:24.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.102+0000 7f80337fe700 1 -- 192.168.123.103:0/960468727 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f803400d610 con 0x7f8040071980 2026-03-10T14:14:24.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.102+0000 7f8048876700 1 -- 192.168.123.103:0/960468727 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8024005320 con 0x7f8040071980 2026-03-10T14:14:24.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.103+0000 7f80337fe700 1 -- 192.168.123.103:0/960468727 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 
0 0) 0x7f803400f980 con 0x7f8040071980 2026-03-10T14:14:24.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.104+0000 7f80337fe700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f802c0779e0 0x7f802c079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.104+0000 7f80337fe700 1 -- 192.168.123.103:0/960468727 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(117..117 src has 1..117) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f8034099960 con 0x7f8040071980 2026-03-10T14:14:24.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.104+0000 7f8045e11700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f802c0779e0 0x7f802c079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.105+0000 7f8045e11700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f802c0779e0 0x7f802c079e90 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f803c000f80 tx=0x7f803c00db00 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:24.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.108+0000 7f80337fe700 1 -- 192.168.123.103:0/960468727 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f80340620d0 con 0x7f8040071980 2026-03-10T14:14:24.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.277+0000 7f8048876700 1 -- 192.168.123.103:0/960468727 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8024005cc0 con 0x7f8040071980 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: pgmap v238: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 11 KiB/s rd, 6.1 KiB/s wr, 4 op/s; 247/336 objects degraded (73.512%); 23 B/s, 5 objects/s recovering 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:24.276 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:24 vm03.local ceph-mon[103098]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.278+0000 7f80337fe700 1 -- 192.168.123.103:0/960468727 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 20 v20) v1 ==== 76+0+1953 (secure 0 0 0) 0x7f8034061820 con 0x7f8040071980 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:e20 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-10T14:14:18:710963+0000 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:14:24.277 
INFO:teuthology.orchestra.run.vm03.stdout:epoch 20 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:14:18.660815+0000 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263,1=14470} 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:14:24.277 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:14:24.277 
INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 24263 members: 24263 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{1:14470} state up:stopping seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{-1:34408} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:14:24.278 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{-1:34412} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.103:6828/1608434114,v1:192.168.123.103:6829/1608434114] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.282+0000 7f80317fa700 1 -- 192.168.123.103:0/960468727 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f802c0779e0 msgr2=0x7f802c079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.282+0000 7f80317fa700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f802c0779e0 0x7f802c079e90 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f803c000f80 tx=0x7f803c00db00 comp rx=0 tx=0).stop 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.282+0000 7f80317fa700 1 -- 192.168.123.103:0/960468727 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8040071980 msgr2=0x7f8040131350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.282+0000 7f80317fa700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8040071980 0x7f8040131350 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f803400b770 tx=0x7f803400bb30 comp rx=0 tx=0).stop 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.283+0000 7f80317fa700 1 -- 192.168.123.103:0/960468727 shutdown_connections 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.283+0000 7f80317fa700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f802c0779e0 0x7f802c079e90 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.283+0000 7f80317fa700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8040071980 0x7f8040131350 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.282 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.283+0000 7f80317fa700 1 --2- 192.168.123.103:0/960468727 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8040131890 0x7f804007f520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.283+0000 7f80317fa700 1 -- 192.168.123.103:0/960468727 >> 192.168.123.103:0/960468727 conn(0x7f804006d1a0 msgr2=0x7f8040076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.283+0000 7f80317fa700 1 -- 192.168.123.103:0/960468727 shutdown_connections 2026-03-10T14:14:24.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.283+0000 7f80317fa700 1 -- 192.168.123.103:0/960468727 wait complete. 2026-03-10T14:14:24.283 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 20 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.379+0000 7f87b9da3700 1 -- 192.168.123.103:0/2092367225 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 msgr2=0x7f87b410be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.379+0000 7f87b9da3700 1 --2- 192.168.123.103:0/2092367225 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 0x7f87b410be90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f87a0009b00 tx=0x7f87a0009e10 comp rx=0 tx=0).stop 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.380+0000 7f87b9da3700 1 -- 192.168.123.103:0/2092367225 shutdown_connections 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.380+0000 7f87b9da3700 1 --2- 192.168.123.103:0/2092367225 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 0x7f87b410be90 unknown 
:-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.380+0000 7f87b9da3700 1 --2- 192.168.123.103:0/2092367225 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87b4071a60 0x7f87b4071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.380+0000 7f87b9da3700 1 -- 192.168.123.103:0/2092367225 >> 192.168.123.103:0/2092367225 conn(0x7f87b406d1a0 msgr2=0x7f87b406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.380+0000 7f87b9da3700 1 -- 192.168.123.103:0/2092367225 shutdown_connections 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.380+0000 7f87b9da3700 1 -- 192.168.123.103:0/2092367225 wait complete. 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.380+0000 7f87b9da3700 1 Processor -- start 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.380+0000 7f87b9da3700 1 -- start start 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.381+0000 7f87b9da3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87b4071a60 0x7f87b4116c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.379 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.381+0000 7f87b9da3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 0x7f87b41171d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.381+0000 7f87b9da3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f87b41177a0 con 0x7f87b4071a60 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.381+0000 7f87b9da3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87b4117910 con 0x7f87b4072440 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.381+0000 7f87b3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 0x7f87b41171d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.381+0000 7f87b3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 0x7f87b41171d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:37744/0 (socket says 192.168.123.103:37744) 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.381+0000 7f87b3fff700 1 -- 192.168.123.103:0/828168551 learned_addr learned my addr 192.168.123.103:0/828168551 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b8da1700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87b4071a60 0x7f87b4116c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b3fff700 1 -- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87b4071a60 msgr2=0x7f87b4116c90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.380 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b3fff700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87b4071a60 0x7f87b4116c90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b3fff700 1 -- 192.168.123.103:0/828168551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f87a00097e0 con 0x7f87b4072440 2026-03-10T14:14:24.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b3fff700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 0x7f87b41171d0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f87a0005310 tx=0x7f87a0003730 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:24.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b1ffb700 1 -- 192.168.123.103:0/828168551 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87a001d070 con 0x7f87b4072440 2026-03-10T14:14:24.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b1ffb700 1 -- 192.168.123.103:0/828168551 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f87a0003d10 con 0x7f87b4072440 2026-03-10T14:14:24.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b1ffb700 1 -- 192.168.123.103:0/828168551 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87a0021980 con 0x7f87b4072440 2026-03-10T14:14:24.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b9da3700 1 -- 192.168.123.103:0/828168551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f87b41b2b20 con 0x7f87b4072440 2026-03-10T14:14:24.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.382+0000 7f87b9da3700 1 -- 192.168.123.103:0/828168551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f87b41b3010 con 0x7f87b4072440 2026-03-10T14:14:24.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.383+0000 7f879f7fe700 1 -- 192.168.123.103:0/828168551 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87b404ea50 con 0x7f87b4072440 2026-03-10T14:14:24.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.386+0000 7f87b1ffb700 1 -- 192.168.123.103:0/828168551 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f87a000f530 con 0x7f87b4072440 2026-03-10T14:14:24.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.386+0000 7f87b1ffb700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f87a40778c0 0x7f87a4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.386+0000 7f87b1ffb700 1 -- 192.168.123.103:0/828168551 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(117..117 src has 1..117) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f87a009b3c0 con 0x7f87b4072440 2026-03-10T14:14:24.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.387+0000 7f87b1ffb700 1 -- 192.168.123.103:0/828168551 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f87a0063b30 con 0x7f87b4072440 2026-03-10T14:14:24.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.390+0000 7f87b8da1700 1 --2- 192.168.123.103:0/828168551 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f87a40778c0 0x7f87a4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.403+0000 7f87b8da1700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f87a40778c0 0x7f87a4079d70 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f87b41ae690 tx=0x7f87a8008040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:24.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.527+0000 7f879f7fe700 1 -- 192.168.123.103:0/828168551 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f87b4061190 con 0x7f87a40778c0 2026-03-10T14:14:24.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.528+0000 7f87b1ffb700 1 -- 192.168.123.103:0/828168551 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f87b4061190 con 0x7f87a40778c0 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 
2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "osd", 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "12/23 daemons upgraded", 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:14:24.529 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:14:24.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 1 -- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f87a40778c0 msgr2=0x7f87a4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f87a40778c0 0x7f87a4079d70 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f87b41ae690 tx=0x7f87a8008040 comp rx=0 tx=0).stop 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 1 -- 192.168.123.103:0/828168551 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 msgr2=0x7f87b41171d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 0x7f87b41171d0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f87a0005310 tx=0x7f87a0003730 comp rx=0 tx=0).stop 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 
1 -- 192.168.123.103:0/828168551 shutdown_connections 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f87a40778c0 0x7f87a4079d70 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87b4071a60 0x7f87b4116c90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 1 --2- 192.168.123.103:0/828168551 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f87b4072440 0x7f87b41171d0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.533+0000 7f879f7fe700 1 -- 192.168.123.103:0/828168551 >> 192.168.123.103:0/828168551 conn(0x7f87b406d1a0 msgr2=0x7f87b410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.534+0000 7f879f7fe700 1 -- 192.168.123.103:0/828168551 shutdown_connections 2026-03-10T14:14:24.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.534+0000 7f879f7fe700 1 -- 192.168.123.103:0/828168551 wait complete. 
2026-03-10T14:14:24.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: pgmap v238: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 11 KiB/s rd, 6.1 KiB/s wr, 4 op/s; 247/336 objects degraded (73.512%); 23 B/s, 5 objects/s recovering 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:24.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:24 vm04.local ceph-mon[92084]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T14:14:24.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.604+0000 7fe8be536700 1 -- 192.168.123.103:0/1790928116 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 msgr2=0x7fe8b8105d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.604+0000 7fe8be536700 1 --2- 192.168.123.103:0/1790928116 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 0x7fe8b8105d60 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7fe8a8009b00 tx=0x7fe8a8009e10 comp rx=0 tx=0).stop 2026-03-10T14:14:24.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.604+0000 7fe8be536700 1 -- 192.168.123.103:0/1790928116 shutdown_connections 2026-03-10T14:14:24.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.604+0000 7fe8be536700 1 --2- 192.168.123.103:0/1790928116 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 0x7fe8b8105d60 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.604+0000 7fe8be536700 1 --2- 192.168.123.103:0/1790928116 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8b8101060 0x7fe8b8103440 unknown :-1 s=CLOSED pgs=0 cs=0 
l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.604+0000 7fe8be536700 1 -- 192.168.123.103:0/1790928116 >> 192.168.123.103:0/1790928116 conn(0x7fe8b80fa9e0 msgr2=0x7fe8b80fce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:24.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.605+0000 7fe8be536700 1 -- 192.168.123.103:0/1790928116 shutdown_connections 2026-03-10T14:14:24.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.605+0000 7fe8be536700 1 -- 192.168.123.103:0/1790928116 wait complete. 2026-03-10T14:14:24.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.605+0000 7fe8be536700 1 Processor -- start 2026-03-10T14:14:24.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.605+0000 7fe8be536700 1 -- start start 2026-03-10T14:14:24.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.605+0000 7fe8be536700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8b8101060 0x7fe8b8195e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.605+0000 7fe8be536700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 0x7fe8b8196340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.605+0000 7fe8be536700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe8b8196960 con 0x7fe8b8103980 2026-03-10T14:14:24.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.605+0000 7fe8be536700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe8b8196aa0 con 0x7fe8b8101060 2026-03-10T14:14:24.604 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.606+0000 7fe8b7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8b8101060 0x7fe8b8195e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.606+0000 7fe8b7fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8b8101060 0x7fe8b8195e00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:37770/0 (socket says 192.168.123.103:37770) 2026-03-10T14:14:24.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.606+0000 7fe8b7fff700 1 -- 192.168.123.103:0/1780449897 learned_addr learned my addr 192.168.123.103:0/1780449897 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:24.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.606+0000 7fe8b77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 0x7fe8b8196340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.606+0000 7fe8b77fe700 1 -- 192.168.123.103:0/1780449897 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8b8101060 msgr2=0x7fe8b8195e00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.606+0000 7fe8b77fe700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8b8101060 0x7fe8b8195e00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.605 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.606+0000 7fe8b77fe700 1 -- 192.168.123.103:0/1780449897 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe8a80097e0 con 0x7fe8b8103980 2026-03-10T14:14:24.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.607+0000 7fe8b77fe700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 0x7fe8b8196340 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7fe8a8009fd0 tx=0x7fe8a8004a00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:24.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.607+0000 7fe8b57fa700 1 -- 192.168.123.103:0/1780449897 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe8a801d070 con 0x7fe8b8103980 2026-03-10T14:14:24.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.607+0000 7fe8b57fa700 1 -- 192.168.123.103:0/1780449897 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe8a800bc50 con 0x7fe8b8103980 2026-03-10T14:14:24.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.607+0000 7fe8b57fa700 1 -- 192.168.123.103:0/1780449897 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe8a800f7c0 con 0x7fe8b8103980 2026-03-10T14:14:24.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.607+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe8b80fee50 con 0x7fe8b8103980 2026-03-10T14:14:24.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.607+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe8b80ff3c0 con 
0x7fe8b8103980 2026-03-10T14:14:24.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.609+0000 7fe8b57fa700 1 -- 192.168.123.103:0/1780449897 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe8a8022ae0 con 0x7fe8b8103980 2026-03-10T14:14:24.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.609+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe8b8190090 con 0x7fe8b8103980 2026-03-10T14:14:24.611 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.612+0000 7fe8b57fa700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe8a40778c0 0x7fe8a4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:24.611 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.612+0000 7fe8b57fa700 1 -- 192.168.123.103:0/1780449897 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(117..117 src has 1..117) v4 ==== 6463+0+0 (secure 0 0 0) 0x7fe8a809b370 con 0x7fe8b8103980 2026-03-10T14:14:24.611 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.612+0000 7fe8b7fff700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe8a40778c0 0x7fe8a4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:24.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.613+0000 7fe8b7fff700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe8a40778c0 0x7fe8a4079d70 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fe8a0009d10 tx=0x7fe8a0009480 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T14:14:24.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.613+0000 7fe8b57fa700 1 -- 192.168.123.103:0/1780449897 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe8a8063ae0 con 0x7fe8b8103980 2026-03-10T14:14:24.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.774+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fe8b8066e40 con 0x7fe8b8103980 2026-03-10T14:14:24.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.775+0000 7fe8b57fa700 1 -- 192.168.123.103:0/1780449897 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+261 (secure 0 0 0) 0x7fe8a8027120 con 0x7fe8b8103980 2026-03-10T14:14:24.774 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 247/336 objects degraded (73.512%), 1 pg degraded 2026-03-10T14:14:24.774 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 247/336 objects degraded (73.512%), 1 pg degraded 2026-03-10T14:14:24.774 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [0,2] 2026-03-10T14:14:24.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.777+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe8a40778c0 msgr2=0x7fe8a4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.777+0000 7fe8be536700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe8a40778c0 0x7fe8a4079d70 secure :-1 s=READY pgs=104 cs=0 l=1 
rev1=1 crypto rx=0x7fe8a0009d10 tx=0x7fe8a0009480 comp rx=0 tx=0).stop 2026-03-10T14:14:24.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.778+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 msgr2=0x7fe8b8196340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:24.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.778+0000 7fe8be536700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 0x7fe8b8196340 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7fe8a8009fd0 tx=0x7fe8a8004a00 comp rx=0 tx=0).stop 2026-03-10T14:14:24.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.778+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 shutdown_connections 2026-03-10T14:14:24.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.778+0000 7fe8be536700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe8a40778c0 0x7fe8a4079d70 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.778+0000 7fe8be536700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe8b8101060 0x7fe8b8195e00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.778+0000 7fe8be536700 1 --2- 192.168.123.103:0/1780449897 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe8b8103980 0x7fe8b8196340 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:24.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.778+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 >> 
192.168.123.103:0/1780449897 conn(0x7fe8b80fa9e0 msgr2=0x7fe8b80fce30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:24.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.781+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 shutdown_connections 2026-03-10T14:14:24.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:24.782+0000 7fe8be536700 1 -- 192.168.123.103:0/1780449897 wait complete. 2026-03-10T14:14:25.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:25 vm04.local ceph-mon[92084]: from='client.34416 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:25 vm04.local ceph-mon[92084]: from='client.44299 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:25 vm04.local ceph-mon[92084]: from='client.34424 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:25 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1772670377' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:25 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/960468727' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:14:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:25 vm04.local ceph-mon[92084]: from='client.44311 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:25 vm04.local ceph-mon[92084]: pgmap v239: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 9.0 KiB/s rd, 5.0 KiB/s wr, 3 op/s; 247/336 objects degraded (73.512%); 19 B/s, 4 objects/s recovering 2026-03-10T14:14:25.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:25 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1780449897' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:14:25.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:25 vm03.local ceph-mon[103098]: from='client.34416 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:25.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:25 vm03.local ceph-mon[103098]: from='client.44299 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:25.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:25 vm03.local ceph-mon[103098]: from='client.34424 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:25.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:25 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1772670377' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:25.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:25 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/960468727' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:14:25.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:25 vm03.local ceph-mon[103098]: from='client.44311 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:25.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:25 vm03.local ceph-mon[103098]: pgmap v239: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 9.0 KiB/s rd, 5.0 KiB/s wr, 3 op/s; 247/336 objects degraded (73.512%); 19 B/s, 4 objects/s recovering 2026-03-10T14:14:25.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:25 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1780449897' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:14:26.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:26 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 247/336 objects degraded (73.512%), 1 pg degraded (PG_DEGRADED) 2026-03-10T14:14:26.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:26 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 247/336 objects degraded (73.512%), 1 pg degraded (PG_DEGRADED) 2026-03-10T14:14:27.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:27 vm03.local ceph-mon[103098]: pgmap v240: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 8.1 KiB/s rd, 4.5 KiB/s wr, 3 op/s; 247/336 objects degraded (73.512%); 17 B/s, 7 objects/s recovering 2026-03-10T14:14:27.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:27 vm04.local ceph-mon[92084]: pgmap v240: 65 pgs: 1 active+recovering+undersized+remapped, 1 
active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 8.1 KiB/s rd, 4.5 KiB/s wr, 3 op/s; 247/336 objects degraded (73.512%); 17 B/s, 7 objects/s recovering 2026-03-10T14:14:30.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:29 vm04.local ceph-mon[92084]: pgmap v241: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 6.9 KiB/s rd, 4.3 KiB/s wr, 3 op/s; 247/336 objects degraded (73.512%); 14 B/s, 6 objects/s recovering 2026-03-10T14:14:30.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:29 vm03.local ceph-mon[103098]: pgmap v241: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 6.9 KiB/s rd, 4.3 KiB/s wr, 3 op/s; 247/336 objects degraded (73.512%); 14 B/s, 6 objects/s recovering 2026-03-10T14:14:31.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:14:31.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:14:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:31 vm04.local ceph-mon[92084]: pgmap v242: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 85 B/s rd, 2.5 KiB/s wr, 4 op/s; 247/336 objects degraded (73.512%); 7 B/s, 9 objects/s recovering 2026-03-10T14:14:32.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:31 vm03.local 
ceph-mon[103098]: pgmap v242: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 85 B/s rd, 2.5 KiB/s wr, 4 op/s; 247/336 objects degraded (73.512%); 7 B/s, 9 objects/s recovering 2026-03-10T14:14:34.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: pgmap v243: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 2.0 KiB/s wr, 4 op/s; 247/336 objects degraded (73.512%); 0 B/s, 5 objects/s recovering 2026-03-10T14:14:34.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:34.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:33 vm04.local ceph-mon[92084]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: pgmap v243: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 2.0 KiB/s wr, 4 op/s; 247/336 objects degraded (73.512%); 0 B/s, 5 objects/s recovering 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": 
"client.admin"}]: dispatch 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:34.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:33 vm03.local ceph-mon[103098]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T14:14:36.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:35 vm04.local ceph-mon[92084]: pgmap v244: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.2 KiB/s wr, 5 op/s; 247/336 objects degraded (73.512%); 0 B/s, 5 objects/s recovering 2026-03-10T14:14:36.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:35 vm03.local 
ceph-mon[103098]: pgmap v244: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 275 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.2 KiB/s wr, 5 op/s; 247/336 objects degraded (73.512%); 0 B/s, 5 objects/s recovering 2026-03-10T14:14:37.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:36 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 247/288 objects degraded (85.764%), 1 pg degraded (PG_DEGRADED) 2026-03-10T14:14:37.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:36 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 247/288 objects degraded (85.764%), 1 pg degraded (PG_DEGRADED) 2026-03-10T14:14:38.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:37 vm04.local ceph-mon[92084]: pgmap v245: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 211 MiB data, 928 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.2 KiB/s wr, 6 op/s; 247/288 objects degraded (85.764%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:38.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:37 vm03.local ceph-mon[103098]: pgmap v245: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 211 MiB data, 928 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.2 KiB/s wr, 6 op/s; 247/288 objects degraded (85.764%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:40.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:39 vm04.local ceph-mon[92084]: pgmap v246: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.3 KiB/s wr, 8 op/s; 247/243 objects degraded (101.646%); 0 B/s, 6 objects/s recovering 2026-03-10T14:14:40.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:14:39 vm03.local ceph-mon[103098]: pgmap v246: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.3 KiB/s wr, 8 op/s; 247/243 objects degraded (101.646%); 0 B/s, 6 objects/s recovering 2026-03-10T14:14:41.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:41 vm04.local ceph-mon[92084]: pgmap v247: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 853 B/s wr, 7 op/s; 247/243 objects degraded (101.646%); 0 B/s, 11 objects/s recovering 2026-03-10T14:14:41.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:41 vm03.local ceph-mon[103098]: pgmap v247: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 853 B/s wr, 7 op/s; 247/243 objects degraded (101.646%); 0 B/s, 11 objects/s recovering 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: pgmap v248: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 853 B/s wr, 4 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T14:14:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:43 vm04.local ceph-mon[92084]: daemon mds.cephfs.vm03.aqaspa finished stopping rank 1 in filesystem cephfs (now has 1 ranks) 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:14:43 vm03.local ceph-mon[103098]: pgmap v248: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 853 B/s wr, 4 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 
vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T14:14:44.074 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:43 vm03.local ceph-mon[103098]: daemon mds.cephfs.vm03.aqaspa finished stopping rank 1 in filesystem cephfs (now has 1 ranks) 2026-03-10T14:14:45.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:45 vm04.local ceph-mon[92084]: mds.1 [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] down:stopped 2026-03-10T14:14:45.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:45 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 2 up:standby 2026-03-10T14:14:45.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:45 vm03.local ceph-mon[103098]: mds.1 [v2:192.168.123.103:6826/3503287793,v1:192.168.123.103:6827/3503287793] down:stopped 2026-03-10T14:14:45.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:45 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 2 up:standby 2026-03-10T14:14:46.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:46 vm03.local ceph-mon[103098]: pgmap v249: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 938 B/s wr, 4 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:46.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:46 vm03.local ceph-mon[103098]: mds.? 
[v2:192.168.123.103:6826/3588884656,v1:192.168.123.103:6827/3588884656] up:boot 2026-03-10T14:14:46.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:46 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 3 up:standby 2026-03-10T14:14:46.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:14:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:46 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded (PG_DEGRADED) 2026-03-10T14:14:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:14:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:46 vm04.local ceph-mon[92084]: pgmap v249: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 938 B/s wr, 4 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:46 vm04.local ceph-mon[92084]: mds.? 
[v2:192.168.123.103:6826/3588884656,v1:192.168.123.103:6827/3588884656] up:boot 2026-03-10T14:14:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:46 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 3 up:standby 2026-03-10T14:14:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:14:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:46 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded (PG_DEGRADED) 2026-03-10T14:14:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:14:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:48 vm04.local ceph-mon[92084]: pgmap v250: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 426 B/s wr, 2 op/s; 247/243 objects degraded (101.646%); 0 B/s, 12 objects/s recovering 2026-03-10T14:14:48.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:48 vm03.local ceph-mon[103098]: pgmap v250: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 426 B/s wr, 2 op/s; 247/243 objects degraded (101.646%); 0 B/s, 12 objects/s recovering 2026-03-10T14:14:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:50 vm04.local ceph-mon[92084]: pgmap v251: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB 
used, 119 GiB / 120 GiB avail; 426 B/s wr, 1 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:50.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:50 vm03.local ceph-mon[103098]: pgmap v251: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 426 B/s wr, 1 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:52.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:52 vm03.local ceph-mon[103098]: pgmap v252: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s; 247/243 objects degraded (101.646%); 0 B/s, 12 objects/s recovering 2026-03-10T14:14:52.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:52 vm04.local ceph-mon[92084]: pgmap v252: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s; 247/243 objects degraded (101.646%); 0 B/s, 12 objects/s recovering 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: pgmap v253: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: 
from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm03.aqaspa"]}]: dispatch 2026-03-10T14:14:54.358 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.aqaspa", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:14:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:54 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: pgmap v253: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm03.aqaspa"]}]: dispatch 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.aqaspa", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:14:54.564 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:54 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:54.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.856+0000 7f958c69c700 1 -- 192.168.123.103:0/3152999246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9584074d80 msgr2=0x7f95840731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:54.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.856+0000 7f958c69c700 1 --2- 192.168.123.103:0/3152999246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9584074d80 0x7f95840731e0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f9574009a60 tx=0x7f9574009d70 comp rx=0 tx=0).stop 2026-03-10T14:14:54.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.858+0000 7f958c69c700 1 -- 192.168.123.103:0/3152999246 shutdown_connections 2026-03-10T14:14:54.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.858+0000 7f958c69c700 1 --2- 192.168.123.103:0/3152999246 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95840737b0 0x7f9584073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:54.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.858+0000 7f958c69c700 1 --2- 192.168.123.103:0/3152999246 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9584074d80 0x7f95840731e0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:54.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.858+0000 7f958c69c700 1 -- 192.168.123.103:0/3152999246 >> 192.168.123.103:0/3152999246 conn(0x7f95840fba80 msgr2=0x7f95840fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:54.857 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.858+0000 7f958c69c700 1 -- 192.168.123.103:0/3152999246 shutdown_connections 2026-03-10T14:14:54.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.858+0000 7f958c69c700 1 -- 192.168.123.103:0/3152999246 wait complete. 2026-03-10T14:14:54.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.859+0000 7f958c69c700 1 Processor -- start 2026-03-10T14:14:54.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.859+0000 7f958c69c700 1 -- start start 2026-03-10T14:14:54.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.859+0000 7f958c69c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95840737b0 0x7f958419c3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:54.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.859+0000 7f958c69c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9584074d80 0x7f958419c8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:54.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.859+0000 7f958c69c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f958419cf10 con 0x7f95840737b0 2026-03-10T14:14:54.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.859+0000 7f958c69c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f958419d050 con 0x7f9584074d80 2026-03-10T14:14:54.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f958a438700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95840737b0 0x7f958419c3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:54.858 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f958a438700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95840737b0 0x7f958419c3b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52976/0 (socket says 192.168.123.103:52976) 2026-03-10T14:14:54.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f958a438700 1 -- 192.168.123.103:0/267795257 learned_addr learned my addr 192.168.123.103:0/267795257 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:54.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f958a438700 1 -- 192.168.123.103:0/267795257 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9584074d80 msgr2=0x7f958419c8f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:14:54.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f958a438700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9584074d80 0x7f958419c8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:54.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f958a438700 1 -- 192.168.123.103:0/267795257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95800097e0 con 0x7f95840737b0 2026-03-10T14:14:54.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f958a438700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95840737b0 0x7f958419c3b0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f9574009420 tx=0x7f957400fb20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:54.859 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f957b7fe700 1 -- 192.168.123.103:0/267795257 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f957401d070 con 0x7f95840737b0 2026-03-10T14:14:54.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f957b7fe700 1 -- 192.168.123.103:0/267795257 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f95740218f0 con 0x7f95840737b0 2026-03-10T14:14:54.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.860+0000 7f957b7fe700 1 -- 192.168.123.103:0/267795257 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9574017870 con 0x7f95840737b0 2026-03-10T14:14:54.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.861+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9574009710 con 0x7f95840737b0 2026-03-10T14:14:54.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.861+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95841a1e00 con 0x7f95840737b0 2026-03-10T14:14:54.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.862+0000 7f957b7fe700 1 -- 192.168.123.103:0/267795257 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9574021a60 con 0x7f95840737b0 2026-03-10T14:14:54.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.862+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9584066e40 con 0x7f95840737b0 2026-03-10T14:14:54.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.862+0000 7f957b7fe700 1 --2- 
192.168.123.103:0/267795257 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9570077870 0x7f9570079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:54.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.862+0000 7f957b7fe700 1 -- 192.168.123.103:0/267795257 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(118..118 src has 1..118) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f957409af10 con 0x7f95840737b0 2026-03-10T14:14:54.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.864+0000 7f9589c37700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9570077870 0x7f9570079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:54.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.864+0000 7f9589c37700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9570077870 0x7f9570079d20 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f9580005fd0 tx=0x7f9580009500 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:54.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.866+0000 7f957b7fe700 1 -- 192.168.123.103:0/267795257 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f95740636b0 con 0x7f95840737b0 2026-03-10T14:14:54.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.998+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f95841035c0 con 0x7f9570077870 2026-03-10T14:14:54.998 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:54.999+0000 7f957b7fe700 1 -- 192.168.123.103:0/267795257 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f95841035c0 con 0x7f9570077870 2026-03-10T14:14:55.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.001+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9570077870 msgr2=0x7f9570079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.001+0000 7f958c69c700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9570077870 0x7f9570079d20 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f9580005fd0 tx=0x7f9580009500 comp rx=0 tx=0).stop 2026-03-10T14:14:55.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.001+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95840737b0 msgr2=0x7f958419c3b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.001+0000 7f958c69c700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95840737b0 0x7f958419c3b0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f9574009420 tx=0x7f957400fb20 comp rx=0 tx=0).stop 2026-03-10T14:14:55.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.002+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 shutdown_connections 2026-03-10T14:14:55.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.002+0000 7f958c69c700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9570077870 0x7f9570079d20 unknown :-1 
s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.002+0000 7f958c69c700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95840737b0 0x7f958419c3b0 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.002+0000 7f958c69c700 1 --2- 192.168.123.103:0/267795257 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9584074d80 0x7f958419c8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.002+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 >> 192.168.123.103:0/267795257 conn(0x7f95840fba80 msgr2=0x7f9584101ea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.002+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 shutdown_connections 2026-03-10T14:14:55.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.002+0000 7f958c69c700 1 -- 192.168.123.103:0/267795257 wait complete. 
2026-03-10T14:14:55.009 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:14:55.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.071+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2596956248 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64102760 msgr2=0x7f5a64102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.071+0000 7f5a6c39a700 1 --2- 192.168.123.103:0/2596956248 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64102760 0x7f5a64102b70 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f5a54009a60 tx=0x7f5a54009d70 comp rx=0 tx=0).stop 2026-03-10T14:14:55.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.072+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2596956248 shutdown_connections 2026-03-10T14:14:55.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.072+0000 7f5a6c39a700 1 --2- 192.168.123.103:0/2596956248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a64103a00 0x7f5a64103e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.072+0000 7f5a6c39a700 1 --2- 192.168.123.103:0/2596956248 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64102760 0x7f5a64102b70 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.072+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2596956248 >> 192.168.123.103:0/2596956248 conn(0x7f5a640fddb0 msgr2=0x7f5a641001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.072+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2596956248 shutdown_connections 2026-03-10T14:14:55.071 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.072+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2596956248 wait complete. 2026-03-10T14:14:55.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.073+0000 7f5a6c39a700 1 Processor -- start 2026-03-10T14:14:55.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.073+0000 7f5a6c39a700 1 -- start start 2026-03-10T14:14:55.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a6c39a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a64102760 0x7f5a64198000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a6c39a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64103a00 0x7f5a64198540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a6c39a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a64198b60 con 0x7f5a64102760 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a69935700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64103a00 0x7f5a64198540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a6c39a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a64198ca0 con 0x7f5a64103a00 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a69935700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64103a00 0x7f5a64198540 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:37256/0 (socket says 192.168.123.103:37256) 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a69935700 1 -- 192.168.123.103:0/2290919668 learned_addr learned my addr 192.168.123.103:0/2290919668 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a6a136700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a64102760 0x7f5a64198000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a6a136700 1 -- 192.168.123.103:0/2290919668 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64103a00 msgr2=0x7f5a64198540 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a6a136700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64103a00 0x7f5a64198540 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.074+0000 7f5a6a136700 1 -- 192.168.123.103:0/2290919668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a600097e0 con 0x7f5a64102760 2026-03-10T14:14:55.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.075+0000 7f5a6a136700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a64102760 0x7f5a64198000 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto 
rx=0x7f5a54009a60 tx=0x7f5a5400f690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.075+0000 7f5a5b7fe700 1 -- 192.168.123.103:0/2290919668 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a5401d070 con 0x7f5a64102760 2026-03-10T14:14:55.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.075+0000 7f5a5b7fe700 1 -- 192.168.123.103:0/2290919668 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5a5400fbf0 con 0x7f5a64102760 2026-03-10T14:14:55.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.075+0000 7f5a5b7fe700 1 -- 192.168.123.103:0/2290919668 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a540176d0 con 0x7f5a64102760 2026-03-10T14:14:55.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.075+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a54009710 con 0x7f5a64102760 2026-03-10T14:14:55.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.075+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5a6419da50 con 0x7f5a64102760 2026-03-10T14:14:55.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.076+0000 7f5a5b7fe700 1 -- 192.168.123.103:0/2290919668 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5a54017830 con 0x7f5a64102760 2026-03-10T14:14:55.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.077+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f5a64066e40 con 0x7f5a64102760 2026-03-10T14:14:55.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.077+0000 7f5a5b7fe700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5a500778c0 0x7f5a50079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.077+0000 7f5a5b7fe700 1 -- 192.168.123.103:0/2290919668 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(118..118 src has 1..118) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f5a5409aba0 con 0x7f5a64102760 2026-03-10T14:14:55.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.077+0000 7f5a69935700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5a500778c0 0x7f5a50079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.077+0000 7f5a69935700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5a500778c0 0x7f5a50079d70 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f5a60005d10 tx=0x7f5a60009500 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.080+0000 7f5a5b7fe700 1 -- 192.168.123.103:0/2290919668 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5a54063960 con 0x7f5a64102760 2026-03-10T14:14:55.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:55 vm03.local ceph-mon[103098]: Upgrade: It appears safe to stop mds.cephfs.vm03.aqaspa 2026-03-10T14:14:55.108 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:55 vm03.local ceph-mon[103098]: Upgrade: Updating mds.cephfs.vm03.aqaspa 2026-03-10T14:14:55.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:55 vm03.local ceph-mon[103098]: Deploying daemon mds.cephfs.vm03.aqaspa on vm03 2026-03-10T14:14:55.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:55 vm03.local ceph-mon[103098]: osdmap e118: 6 total, 6 up, 6 in 2026-03-10T14:14:55.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:55 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 2 up:standby 2026-03-10T14:14:55.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:55 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:14:55.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.211+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5a64108350 con 0x7f5a500778c0 2026-03-10T14:14:55.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.212+0000 7f5a5b7fe700 1 -- 192.168.123.103:0/2290919668 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f5a64108350 con 0x7f5a500778c0 2026-03-10T14:14:55.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5a500778c0 msgr2=0x7f5a50079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 --2- 192.168.123.103:0/2290919668 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5a500778c0 0x7f5a50079d70 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f5a60005d10 tx=0x7f5a60009500 comp rx=0 tx=0).stop 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a64102760 msgr2=0x7f5a64198000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a64102760 0x7f5a64198000 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f5a54009a60 tx=0x7f5a5400f690 comp rx=0 tx=0).stop 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 shutdown_connections 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5a500778c0 0x7f5a50079d70 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5a64102760 0x7f5a64198000 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 --2- 192.168.123.103:0/2290919668 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5a64103a00 0x7f5a64198540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.214+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 >> 192.168.123.103:0/2290919668 conn(0x7f5a640fddb0 msgr2=0x7f5a64106c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.215+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 shutdown_connections 2026-03-10T14:14:55.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.215+0000 7f5a6c39a700 1 -- 192.168.123.103:0/2290919668 wait complete. 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.280+0000 7fe6f8060700 1 -- 192.168.123.103:0/293654488 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 msgr2=0x7fe6f01081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.280+0000 7fe6f8060700 1 --2- 192.168.123.103:0/293654488 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 0x7fe6f01081c0 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7fe6ec009b50 tx=0x7fe6ec009e60 comp rx=0 tx=0).stop 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.280+0000 7fe6f8060700 1 -- 192.168.123.103:0/293654488 shutdown_connections 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.280+0000 7fe6f8060700 1 --2- 192.168.123.103:0/293654488 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 0x7fe6f01081c0 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.280+0000 7fe6f8060700 1 --2- 192.168.123.103:0/293654488 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f0106f60 0x7fe6f0107370 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.280+0000 7fe6f8060700 1 -- 192.168.123.103:0/293654488 >> 192.168.123.103:0/293654488 conn(0x7fe6f0075b50 msgr2=0x7fe6f0077f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.280+0000 7fe6f8060700 1 -- 192.168.123.103:0/293654488 shutdown_connections 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.280+0000 7fe6f8060700 1 -- 192.168.123.103:0/293654488 wait complete. 2026-03-10T14:14:55.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f8060700 1 Processor -- start 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f8060700 1 -- start start 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f8060700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f0106f60 0x7fe6f019c3e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f8060700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 0x7fe6f019c920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f8060700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe6f019cf40 con 0x7fe6f0107d70 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f8060700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe6f019d080 con 0x7fe6f0106f60 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f55fb700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 0x7fe6f019c920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f55fb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 0x7fe6f019c920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:53024/0 (socket says 192.168.123.103:53024) 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.281+0000 7fe6f55fb700 1 -- 192.168.123.103:0/2713672441 learned_addr learned my addr 192.168.123.103:0/2713672441 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6f5dfc700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f0106f60 0x7fe6f019c3e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6f55fb700 1 -- 192.168.123.103:0/2713672441 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f0106f60 msgr2=0x7fe6f019c3e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6f55fb700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f0106f60 0x7fe6f019c3e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6f55fb700 1 -- 
192.168.123.103:0/2713672441 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe6ec0097e0 con 0x7fe6f0107d70 2026-03-10T14:14:55.280 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6f5dfc700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f0106f60 0x7fe6f019c3e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:14:55.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6f55fb700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 0x7fe6f019c920 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fe6ec005950 tx=0x7fe6ec0057d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6e2ffd700 1 -- 192.168.123.103:0/2713672441 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe6ec01d070 con 0x7fe6f0107d70 2026-03-10T14:14:55.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6e2ffd700 1 -- 192.168.123.103:0/2713672441 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe6ec00bb50 con 0x7fe6f0107d70 2026-03-10T14:14:55.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6e2ffd700 1 -- 192.168.123.103:0/2713672441 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe6ec00f7d0 con 0x7fe6f0107d70 2026-03-10T14:14:55.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe6f01a1ad0 con 0x7fe6f0107d70 
2026-03-10T14:14:55.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.282+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe6f01a1fc0 con 0x7fe6f0107d70 2026-03-10T14:14:55.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.283+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe6f0066e40 con 0x7fe6f0107d70 2026-03-10T14:14:55.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.284+0000 7fe6e2ffd700 1 -- 192.168.123.103:0/2713672441 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe6ec00bcc0 con 0x7fe6f0107d70 2026-03-10T14:14:55.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.285+0000 7fe6e2ffd700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6dc077910 0x7fe6dc079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.285+0000 7fe6e2ffd700 1 -- 192.168.123.103:0/2713672441 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(118..118 src has 1..118) v4 ==== 6463+0+0 (secure 0 0 0) 0x7fe6ec09b3e0 con 0x7fe6f0107d70 2026-03-10T14:14:55.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.287+0000 7fe6f5dfc700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6dc077910 0x7fe6dc079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.287+0000 7fe6e2ffd700 1 -- 192.168.123.103:0/2713672441 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe6ec063b50 con 0x7fe6f0107d70 2026-03-10T14:14:55.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.287+0000 7fe6f5dfc700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6dc077910 0x7fe6dc079dc0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fe6e4005fd0 tx=0x7fe6e4005f00 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.403+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fe6f010c6c0 con 0x7fe6dc077910 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.408+0000 7fe6e2ffd700 1 -- 192.168.123.103:0/2713672441 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fe6f010c6c0 con 0x7fe6dc077910 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (10m) 113s ago 11m 22.1M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (9m) 113s ago 11m 9575k - 18.2.0 dc2bc1663786 7f20e4fc0ed9 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (11m) 47s ago 11m 12.0M - 18.2.0 dc2bc1663786 8b6b949a2f58 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 113s ago 11m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 
1a8bbbbe264a 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (5m) 47s ago 11m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (10m) 113s ago 11m 92.0M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (9m) 113s ago 9m 136M - 18.2.0 dc2bc1663786 db33bf4450b8 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (9m) 113s ago 9m 72.7M - 18.2.0 dc2bc1663786 5d05b227aa40 2026-03-10T14:14:55.406 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (9m) 47s ago 9m 63.4M - 18.2.0 dc2bc1663786 2494aff9d6c9 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (9m) 47s ago 9m 144M - 18.2.0 dc2bc1663786 e286b66e6f5a 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (6m) 113s ago 12m 620M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (6m) 47s ago 11m 499M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (6m) 113s ago 12m 61.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (5m) 47s ago 11m 56.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (11m) 113s ago 11m 15.4M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (11m) 47s ago 11m 15.6M - 1.5.0 0da6a335fe13 d2c83044f057 
2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 113s ago 10m 206M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 113s ago 10m 127M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (115s) 113s ago 10m 13.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e0d768b70468 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (91s) 47s ago 10m 137M 4096M 19.2.3-678-ge911bdeb 654f31e6858e f7fc2aafa9d9 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (70s) 47s ago 10m 114M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 89f9225212d4 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (48s) 47s ago 9m 15.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6c7573f5f3fa 2026-03-10T14:14:55.407 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (6m) 113s ago 11m 69.1M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:14:55.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6dc077910 msgr2=0x7fe6dc079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6dc077910 0x7fe6dc079dc0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fe6e4005fd0 tx=0x7fe6e4005f00 comp rx=0 tx=0).stop 2026-03-10T14:14:55.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fe6f0107d70 msgr2=0x7fe6f019c920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 0x7fe6f019c920 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fe6ec005950 tx=0x7fe6ec0057d0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 shutdown_connections 2026-03-10T14:14:55.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe6dc077910 0x7fe6dc079dc0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe6f0106f60 0x7fe6f019c3e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 --2- 192.168.123.103:0/2713672441 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe6f0107d70 0x7fe6f019c920 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 >> 192.168.123.103:0/2713672441 conn(0x7fe6f0075b50 msgr2=0x7fe6f010afa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 shutdown_connections 
2026-03-10T14:14:55.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.410+0000 7fe6f8060700 1 -- 192.168.123.103:0/2713672441 wait complete. 2026-03-10T14:14:55.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.474+0000 7f0e1dea4700 1 -- 192.168.123.103:0/3653281732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 msgr2=0x7f0e18073c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.474+0000 7f0e1dea4700 1 --2- 192.168.123.103:0/3653281732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 0x7f0e18073c20 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f0e0c009b80 tx=0x7f0e0c009e90 comp rx=0 tx=0).stop 2026-03-10T14:14:55.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.477+0000 7f0e1dea4700 1 -- 192.168.123.103:0/3653281732 shutdown_connections 2026-03-10T14:14:55.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.477+0000 7f0e1dea4700 1 --2- 192.168.123.103:0/3653281732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 0x7f0e18073c20 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.477+0000 7f0e1dea4700 1 --2- 192.168.123.103:0/3653281732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e18074d80 0x7f0e180731e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.477+0000 7f0e1dea4700 1 -- 192.168.123.103:0/3653281732 >> 192.168.123.103:0/3653281732 conn(0x7f0e180fb850 msgr2=0x7f0e180fdc80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.478+0000 7f0e1dea4700 1 -- 192.168.123.103:0/3653281732 
shutdown_connections 2026-03-10T14:14:55.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.478+0000 7f0e1dea4700 1 -- 192.168.123.103:0/3653281732 wait complete. 2026-03-10T14:14:55.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.478+0000 7f0e1dea4700 1 Processor -- start 2026-03-10T14:14:55.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.478+0000 7f0e1dea4700 1 -- start start 2026-03-10T14:14:55.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.478+0000 7f0e1dea4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 0x7f0e18071d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.478+0000 7f0e1dea4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e18074d80 0x7f0e18072270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.478+0000 7f0e1dea4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e18072840 con 0x7f0e180737b0 2026-03-10T14:14:55.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.478+0000 7f0e1dea4700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e18072980 con 0x7f0e18074d80 2026-03-10T14:14:55.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.479+0000 7f0e1cea2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 0x7f0e18071d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.479+0000 7f0e1cea2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f0e180737b0 0x7f0e18071d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:53034/0 (socket says 192.168.123.103:53034) 2026-03-10T14:14:55.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.479+0000 7f0e1cea2700 1 -- 192.168.123.103:0/4054130821 learned_addr learned my addr 192.168.123.103:0/4054130821 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:55.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.479+0000 7f0e17fff700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e18074d80 0x7f0e18072270 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.479+0000 7f0e1cea2700 1 -- 192.168.123.103:0/4054130821 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e18074d80 msgr2=0x7f0e18072270 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.479+0000 7f0e1cea2700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e18074d80 0x7f0e18072270 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.479+0000 7f0e1cea2700 1 -- 192.168.123.103:0/4054130821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e0c0097e0 con 0x7f0e180737b0 2026-03-10T14:14:55.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.480+0000 7f0e1cea2700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 0x7f0e18071d30 
secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f0e0800d8d0 tx=0x7f0e0800dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.480+0000 7f0e15ffb700 1 -- 192.168.123.103:0/4054130821 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e0800f840 con 0x7f0e180737b0 2026-03-10T14:14:55.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.480+0000 7f0e15ffb700 1 -- 192.168.123.103:0/4054130821 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0e0800fe80 con 0x7f0e180737b0 2026-03-10T14:14:55.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.481+0000 7f0e15ffb700 1 -- 192.168.123.103:0/4054130821 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e0800fb50 con 0x7f0e180737b0 2026-03-10T14:14:55.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.482+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0e181a18f0 con 0x7f0e180737b0 2026-03-10T14:14:55.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.482+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0e181a1dc0 con 0x7f0e180737b0 2026-03-10T14:14:55.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.483+0000 7f0e15ffb700 1 -- 192.168.123.103:0/4054130821 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0e08010460 con 0x7f0e180737b0 2026-03-10T14:14:55.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.483+0000 7f0e15ffb700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] 
conn(0x7f0e000778c0 0x7f0e00079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.483+0000 7f0e15ffb700 1 -- 192.168.123.103:0/4054130821 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(118..118 src has 1..118) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f0e08099810 con 0x7f0e180737b0 2026-03-10T14:14:55.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.483+0000 7f0e17fff700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e000778c0 0x7f0e00079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.483+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0e1804ea50 con 0x7f0e180737b0 2026-03-10T14:14:55.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.484+0000 7f0e17fff700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e000778c0 0x7f0e00079d70 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f0e0c009fd0 tx=0x7f0e0c005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.486+0000 7f0e15ffb700 1 -- 192.168.123.103:0/4054130821 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0e08061e80 con 0x7f0e180737b0 2026-03-10T14:14:55.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:55 vm04.local ceph-mon[92084]: Upgrade: It appears safe to stop mds.cephfs.vm03.aqaspa 
2026-03-10T14:14:55.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:55 vm04.local ceph-mon[92084]: Upgrade: Updating mds.cephfs.vm03.aqaspa 2026-03-10T14:14:55.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:55 vm04.local ceph-mon[92084]: Deploying daemon mds.cephfs.vm03.aqaspa on vm03 2026-03-10T14:14:55.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:55 vm04.local ceph-mon[92084]: osdmap e118: 6 total, 6 up, 6 in 2026-03-10T14:14:55.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:55 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 2 up:standby 2026-03-10T14:14:55.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:55 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:14:55.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.651+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0e181a2310 con 0x7f0e180737b0 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.652+0000 7f0e15ffb700 1 -- 192.168.123.103:0/4054130821 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+709 (secure 0 0 0) 0x7f0e08016070 con 0x7f0e180737b0 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 
19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 3 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 3, 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 10 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:14:55.651 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:14:55.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.654+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e000778c0 msgr2=0x7f0e00079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.654+0000 7f0e1dea4700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e000778c0 0x7f0e00079d70 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f0e0c009fd0 tx=0x7f0e0c005fb0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.653 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 msgr2=0x7f0e18071d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 0x7f0e18071d30 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f0e0800d8d0 tx=0x7f0e0800dbe0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 shutdown_connections 2026-03-10T14:14:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e000778c0 0x7f0e00079d70 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e180737b0 0x7f0e18071d30 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 --2- 192.168.123.103:0/4054130821 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e18074d80 0x7f0e18072270 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 >> 192.168.123.103:0/4054130821 conn(0x7f0e180fb850 msgr2=0x7f0e18106080 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:14:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 shutdown_connections 2026-03-10T14:14:55.654 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.655+0000 7f0e1dea4700 1 -- 192.168.123.103:0/4054130821 wait complete. 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.721+0000 7f193df0c700 1 -- 192.168.123.103:0/427686880 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381036a0 msgr2=0x7f1938105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.721+0000 7f193df0c700 1 --2- 192.168.123.103:0/427686880 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381036a0 0x7f1938105ac0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f1928009b00 tx=0x7f1928009e10 comp rx=0 tx=0).stop 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.722+0000 7f193df0c700 1 -- 192.168.123.103:0/427686880 shutdown_connections 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.722+0000 7f193df0c700 1 --2- 192.168.123.103:0/427686880 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381036a0 0x7f1938105ac0 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.722+0000 7f193df0c700 1 --2- 192.168.123.103:0/427686880 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1938069160 0x7f1938103160 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.722+0000 7f193df0c700 1 -- 192.168.123.103:0/427686880 >> 192.168.123.103:0/427686880 conn(0x7f19380faa70 msgr2=0x7f19380fcee0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.722+0000 7f193df0c700 1 -- 192.168.123.103:0/427686880 shutdown_connections 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.722+0000 7f193df0c700 1 -- 192.168.123.103:0/427686880 wait complete. 2026-03-10T14:14:55.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.723+0000 7f193df0c700 1 Processor -- start 2026-03-10T14:14:55.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.723+0000 7f193df0c700 1 -- start start 2026-03-10T14:14:55.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.723+0000 7f193df0c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19381036a0 0x7f19381981a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.723+0000 7f193df0c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381986e0 0x7f193819d750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.723+0000 7f193df0c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1938198be0 con 0x7f19381986e0 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.723+0000 7f193df0c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1938198d50 con 0x7f19381036a0 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.724+0000 7f1936ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381986e0 0x7f193819d750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.724+0000 7f1936ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381986e0 0x7f193819d750 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:53046/0 (socket says 192.168.123.103:53046) 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.724+0000 7f1936ffd700 1 -- 192.168.123.103:0/3158833258 learned_addr learned my addr 192.168.123.103:0/3158833258 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.724+0000 7f1936ffd700 1 -- 192.168.123.103:0/3158833258 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19381036a0 msgr2=0x7f19381981a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.724+0000 7f19377fe700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19381036a0 0x7f19381981a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.725+0000 7f1936ffd700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19381036a0 0x7f19381981a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.725+0000 7f1936ffd700 1 -- 192.168.123.103:0/3158833258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f19280097e0 con 0x7f19381986e0 2026-03-10T14:14:55.724 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.725+0000 7f1936ffd700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381986e0 0x7f193819d750 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f1928005f50 tx=0x7f1928004a60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.726+0000 7f19377fe700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19381036a0 0x7f19381981a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:14:55.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.727+0000 7f1934ff9700 1 -- 192.168.123.103:0/3158833258 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f192801d070 con 0x7f19381986e0 2026-03-10T14:14:55.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.727+0000 7f1934ff9700 1 -- 192.168.123.103:0/3158833258 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f192800bc50 con 0x7f19381986e0 2026-03-10T14:14:55.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.727+0000 7f1934ff9700 1 -- 192.168.123.103:0/3158833258 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f192800f910 con 0x7f19381986e0 2026-03-10T14:14:55.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.728+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f193819dc90 con 0x7f19381986e0 2026-03-10T14:14:55.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.728+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7f193819e150 con 0x7f19381986e0 2026-03-10T14:14:55.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.729+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f19380fc670 con 0x7f19381986e0 2026-03-10T14:14:55.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.730+0000 7f1934ff9700 1 -- 192.168.123.103:0/3158833258 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f192800fa70 con 0x7f19381986e0 2026-03-10T14:14:55.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.730+0000 7f1934ff9700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f19240778c0 0x7f1924079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.730+0000 7f1934ff9700 1 -- 192.168.123.103:0/3158833258 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(118..118 src has 1..118) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f192809b510 con 0x7f19381986e0 2026-03-10T14:14:55.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.732+0000 7f1934ff9700 1 -- 192.168.123.103:0/3158833258 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1928063c80 con 0x7f19381986e0 2026-03-10T14:14:55.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.733+0000 7f19377fe700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f19240778c0 0x7f1924079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.734 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.733+0000 7f19377fe700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f19240778c0 0x7f1924079d70 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f1920006fd0 tx=0x7f1920008040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.872+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f1938066e40 con 0x7f19381986e0 2026-03-10T14:14:55.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.874+0000 7f1934ff9700 1 -- 192.168.123.103:0/3158833258 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 23 v23) v1 ==== 76+0+1750 (secure 0 0 0) 0x7f19280276f0 con 0x7f19381986e0 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:e23 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-10T14:14:54:516822+0000 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:14:55.873 
INFO:teuthology.orchestra.run.vm03.stdout:epoch 21 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:14:43.661835+0000 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24263} 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:stopped 1 2026-03-10T14:14:55.873 
INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 24263 members: 24263 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:24263} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.104:6824/291713758,v1:192.168.123.104:6825/291713758] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{-1:34408} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:14:55.873 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{-1:34412} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.103:6828/1608434114,v1:192.168.123.103:6829/1608434114] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T14:14:55.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.876+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f19240778c0 msgr2=0x7f1924079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.876+0000 
7f193df0c700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f19240778c0 0x7f1924079d70 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f1920006fd0 tx=0x7f1920008040 comp rx=0 tx=0).stop 2026-03-10T14:14:55.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.876+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381986e0 msgr2=0x7f193819d750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.876+0000 7f193df0c700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381986e0 0x7f193819d750 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f1928005f50 tx=0x7f1928004a60 comp rx=0 tx=0).stop 2026-03-10T14:14:55.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.877+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 shutdown_connections 2026-03-10T14:14:55.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.877+0000 7f193df0c700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f19240778c0 0x7f1924079d70 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.877+0000 7f193df0c700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f19381036a0 0x7f19381981a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.877+0000 7f193df0c700 1 --2- 192.168.123.103:0/3158833258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f19381986e0 0x7f193819d750 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.877+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 >> 192.168.123.103:0/3158833258 conn(0x7f19380faa70 msgr2=0x7f1938104300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.877+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 shutdown_connections 2026-03-10T14:14:55.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.877+0000 7f193df0c700 1 -- 192.168.123.103:0/3158833258 wait complete. 2026-03-10T14:14:55.877 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 23 2026-03-10T14:14:55.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.951+0000 7f0838b00700 1 -- 192.168.123.103:0/1943805445 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08340feda0 msgr2=0x7f08341011c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.951+0000 7f0838b00700 1 --2- 192.168.123.103:0/1943805445 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08340feda0 0x7f08341011c0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f0824009b00 tx=0x7f0824009e10 comp rx=0 tx=0).stop 2026-03-10T14:14:55.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.952+0000 7f0838b00700 1 -- 192.168.123.103:0/1943805445 shutdown_connections 2026-03-10T14:14:55.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.952+0000 7f0838b00700 1 --2- 192.168.123.103:0/1943805445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0834101700 0x7f0834103b80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.952+0000 7f0838b00700 1 --2- 192.168.123.103:0/1943805445 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f08340feda0 0x7f08341011c0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.952+0000 7f0838b00700 1 -- 192.168.123.103:0/1943805445 >> 192.168.123.103:0/1943805445 conn(0x7f08340fa9b0 msgr2=0x7f08340fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:55.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.952+0000 7f0838b00700 1 -- 192.168.123.103:0/1943805445 shutdown_connections 2026-03-10T14:14:55.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.952+0000 7f0838b00700 1 -- 192.168.123.103:0/1943805445 wait complete. 2026-03-10T14:14:55.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.953+0000 7f0838b00700 1 Processor -- start 2026-03-10T14:14:55.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.953+0000 7f0838b00700 1 -- start start 2026-03-10T14:14:55.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.953+0000 7f0838b00700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08340feda0 0x7f083419c3a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.953+0000 7f0838b00700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0834101700 0x7f083419c8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.953+0000 7f0838b00700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f083419cf00 con 0x7f08340feda0 2026-03-10T14:14:55.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.953+0000 7f0838b00700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f083419d040 con 0x7f0834101700 2026-03-10T14:14:55.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.953+0000 7f083259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08340feda0 0x7f083419c3a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.954+0000 7f083259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08340feda0 0x7f083419c3a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:53066/0 (socket says 192.168.123.103:53066) 2026-03-10T14:14:55.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.954+0000 7f083259c700 1 -- 192.168.123.103:0/616046432 learned_addr learned my addr 192.168.123.103:0/616046432 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:55.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.954+0000 7f0831d9b700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0834101700 0x7f083419c8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:55.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.954+0000 7f083259c700 1 -- 192.168.123.103:0/616046432 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0834101700 msgr2=0x7f083419c8e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:55.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.954+0000 7f083259c700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0834101700 0x7f083419c8e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:55.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.954+0000 7f083259c700 1 -- 192.168.123.103:0/616046432 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0828009710 con 0x7f08340feda0 2026-03-10T14:14:55.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.954+0000 7f0831d9b700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0834101700 0x7f083419c8e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:14:55.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.955+0000 7f083259c700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08340feda0 0x7f083419c3a0 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f0824004980 tx=0x7f08240049b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.955+0000 7f08237fe700 1 -- 192.168.123.103:0/616046432 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f082401d070 con 0x7f08340feda0 2026-03-10T14:14:55.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.955+0000 7f08237fe700 1 -- 192.168.123.103:0/616046432 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f082400bb40 con 0x7f08340feda0 2026-03-10T14:14:55.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.955+0000 7f08237fe700 1 -- 192.168.123.103:0/616046432 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0824003db0 con 0x7f08340feda0 2026-03-10T14:14:55.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.955+0000 7f0838b00700 1 -- 
192.168.123.103:0/616046432 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f08240097e0 con 0x7f08340feda0 2026-03-10T14:14:55.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.955+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f08341a1d90 con 0x7f08340feda0 2026-03-10T14:14:55.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.957+0000 7f08237fe700 1 -- 192.168.123.103:0/616046432 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0824004db0 con 0x7f08340feda0 2026-03-10T14:14:55.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.957+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f08341965e0 con 0x7f08340feda0 2026-03-10T14:14:55.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.957+0000 7f08237fe700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f081c077910 0x7f081c079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:55.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.957+0000 7f08237fe700 1 -- 192.168.123.103:0/616046432 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(118..118 src has 1..118) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f082409b5a0 con 0x7f08340feda0 2026-03-10T14:14:55.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.957+0000 7f0831d9b700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f081c077910 0x7f081c079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T14:14:55.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.959+0000 7f0831d9b700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f081c077910 0x7f081c079dc0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f082800f7e0 tx=0x7f0828009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:55.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:55.960+0000 7f08237fe700 1 -- 192.168.123.103:0/616046432 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0824063d10 con 0x7f08340feda0 2026-03-10T14:14:56.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.085+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0834061190 con 0x7f081c077910 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.086+0000 7f08237fe700 1 -- 192.168.123.103:0/616046432 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f0834061190 con 0x7f081c077910 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 
2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "osd", 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "12/23 daemons upgraded", 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading mds daemons", 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:14:56.085 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f081c077910 msgr2=0x7f081c079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f081c077910 0x7f081c079dc0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f082800f7e0 tx=0x7f0828009450 comp rx=0 tx=0).stop 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08340feda0 msgr2=0x7f083419c3a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08340feda0 0x7f083419c3a0 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f0824004980 tx=0x7f08240049b0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.091 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 shutdown_connections 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f081c077910 0x7f081c079dc0 secure :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f082800f7e0 tx=0x7f0828009450 comp rx=0 tx=0).stop 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08340feda0 0x7f083419c3a0 secure :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f0824004980 tx=0x7f08240049b0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 --2- 192.168.123.103:0/616046432 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0834101700 0x7f083419c8e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.090+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 >> 192.168.123.103:0/616046432 conn(0x7f08340fa9b0 msgr2=0x7f08340fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.092+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 shutdown_connections 2026-03-10T14:14:56.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.092+0000 7f0838b00700 1 -- 192.168.123.103:0/616046432 wait complete. 
2026-03-10T14:14:56.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.171+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2726171845 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cb8072330 msgr2=0x7f4cb80770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:56.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.171+0000 7f4cbcca6700 1 --2- 192.168.123.103:0/2726171845 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cb8072330 0x7f4cb80770b0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f4cb000d3f0 tx=0x7f4cb000d700 comp rx=0 tx=0).stop 2026-03-10T14:14:56.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.171+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2726171845 shutdown_connections 2026-03-10T14:14:56.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.171+0000 7f4cbcca6700 1 --2- 192.168.123.103:0/2726171845 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cb8072330 0x7f4cb80770b0 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.171+0000 7f4cbcca6700 1 --2- 192.168.123.103:0/2726171845 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cb8071950 0x7f4cb8071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.170 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.171+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2726171845 >> 192.168.123.103:0/2726171845 conn(0x7f4cb806d1a0 msgr2=0x7f4cb806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:56.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.172+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2726171845 shutdown_connections 2026-03-10T14:14:56.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.172+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2726171845 
wait complete. 2026-03-10T14:14:56.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.172+0000 7f4cbcca6700 1 Processor -- start 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.172+0000 7f4cbcca6700 1 -- start start 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.172+0000 7f4cbcca6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cb8071950 0x7f4cb81313d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.172+0000 7f4cbcca6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cb8131910 0x7f4cb807f570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.172+0000 7f4cbcca6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cb8131e10 con 0x7f4cb8071950 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.172+0000 7f4cbcca6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4cb8131f80 con 0x7f4cb8131910 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.173+0000 7f4cb6ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cb8131910 0x7f4cb807f570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.173+0000 7f4cb6ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cb8131910 0x7f4cb807f570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 
says I am v2:192.168.123.103:37338/0 (socket says 192.168.123.103:37338) 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.173+0000 7f4cb6ffd700 1 -- 192.168.123.103:0/2122990827 learned_addr learned my addr 192.168.123.103:0/2122990827 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.173+0000 7f4cb77fe700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cb8071950 0x7f4cb81313d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:56.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.173+0000 7f4cb6ffd700 1 -- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cb8071950 msgr2=0x7f4cb81313d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:56.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.173+0000 7f4cb6ffd700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cb8071950 0x7f4cb81313d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.173+0000 7f4cb6ffd700 1 -- 192.168.123.103:0/2122990827 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4cb0007ed0 con 0x7f4cb8131910 2026-03-10T14:14:56.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.173+0000 7f4cb6ffd700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cb8131910 0x7f4cb807f570 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f4cb0003c30 tx=0x7f4cb0004b40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T14:14:56.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.176+0000 7f4cb4ff9700 1 -- 192.168.123.103:0/2122990827 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cb001c070 con 0x7f4cb8131910 2026-03-10T14:14:56.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.176+0000 7f4cb4ff9700 1 -- 192.168.123.103:0/2122990827 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4cb000deb0 con 0x7f4cb8131910 2026-03-10T14:14:56.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.176+0000 7f4cb4ff9700 1 -- 192.168.123.103:0/2122990827 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cb0017b10 con 0x7f4cb8131910 2026-03-10T14:14:56.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.177+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4cb807fab0 con 0x7f4cb8131910 2026-03-10T14:14:56.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.177+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4cb807ff70 con 0x7f4cb8131910 2026-03-10T14:14:56.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.177+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4cb812b500 con 0x7f4cb8131910 2026-03-10T14:14:56.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.179+0000 7f4cb4ff9700 1 -- 192.168.123.103:0/2122990827 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4cb0017c70 con 0x7f4cb8131910 2026-03-10T14:14:56.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.179+0000 
7f4cb4ff9700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ca0077910 0x7f4ca0079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:14:56.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.179+0000 7f4cb4ff9700 1 -- 192.168.123.103:0/2122990827 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(118..118 src has 1..118) v4 ==== 6463+0+0 (secure 0 0 0) 0x7f4cb0013070 con 0x7f4cb8131910 2026-03-10T14:14:56.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.179+0000 7f4cb77fe700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ca0077910 0x7f4ca0079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:14:56.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.180+0000 7f4cb77fe700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ca0077910 0x7f4ca0079dc0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f4ca8005950 tx=0x7f4ca800b410 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:14:56.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.182+0000 7f4cb4ff9700 1 -- 192.168.123.103:0/2122990827 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4cb00642a0 con 0x7f4cb8131910 2026-03-10T14:14:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:56 vm03.local ceph-mon[103098]: pgmap v255: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 
objects/s recovering 2026-03-10T14:14:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:56 vm03.local ceph-mon[103098]: from='client.34446 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:56 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/4054130821' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:56 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3158833258' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:14:56.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.363+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f4cb804ea50 con 0x7f4cb8131910 2026-03-10T14:14:56.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.363+0000 7f4cb4ff9700 1 -- 192.168.123.103:0/2122990827 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+263 (secure 0 0 0) 0x7f4cb0007480 con 0x7f4cb8131910 2026-03-10T14:14:56.363 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded 2026-03-10T14:14:56.363 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded 2026-03-10T14:14:56.363 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [0,2] 2026-03-10T14:14:56.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.366+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ca0077910 
msgr2=0x7f4ca0079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:56.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.366+0000 7f4cbcca6700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ca0077910 0x7f4ca0079dc0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f4ca8005950 tx=0x7f4ca800b410 comp rx=0 tx=0).stop 2026-03-10T14:14:56.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.367+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cb8131910 msgr2=0x7f4cb807f570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:14:56.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.367+0000 7f4cbcca6700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cb8131910 0x7f4cb807f570 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f4cb0003c30 tx=0x7f4cb0004b40 comp rx=0 tx=0).stop 2026-03-10T14:14:56.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.367+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 shutdown_connections 2026-03-10T14:14:56.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.367+0000 7f4cbcca6700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4ca0077910 0x7f4ca0079dc0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.367+0000 7f4cbcca6700 1 --2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4cb8071950 0x7f4cb81313d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.367+0000 7f4cbcca6700 1 
--2- 192.168.123.103:0/2122990827 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4cb8131910 0x7f4cb807f570 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:14:56.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.367+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 >> 192.168.123.103:0/2122990827 conn(0x7f4cb806d1a0 msgr2=0x7f4cb8076550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:14:56.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.368+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 shutdown_connections 2026-03-10T14:14:56.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:14:56.368+0000 7f4cbcca6700 1 -- 192.168.123.103:0/2122990827 wait complete. 2026-03-10T14:14:56.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:56 vm04.local ceph-mon[92084]: pgmap v255: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:56.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:56 vm04.local ceph-mon[92084]: from='client.34446 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:56.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:56 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/4054130821' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:56.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:56 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/3158833258' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:14:57.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:57 vm03.local ceph-mon[103098]: from='client.34450 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:57.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:57 vm03.local ceph-mon[103098]: from='client.34454 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:57.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:57 vm03.local ceph-mon[103098]: from='client.34466 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:57.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:57 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2122990827' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:14:57.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:57 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:57.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:57 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:57.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:57 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:57.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:57 vm04.local ceph-mon[92084]: from='client.34450 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:57.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:57 vm04.local ceph-mon[92084]: from='client.34454 -' entity='client.admin' 
cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:57.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:57 vm04.local ceph-mon[92084]: from='client.34466 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:14:57.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:57 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/2122990827' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:14:57.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:57 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:57.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:57 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:57.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:57 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:58.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:58 vm03.local ceph-mon[103098]: pgmap v256: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:58.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:58 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded, 1 pg undersized (PG_DEGRADED) 2026-03-10T14:14:58.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:58 vm03.local ceph-mon[103098]: mds.? 
[v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] up:boot 2026-03-10T14:14:58.321 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:58 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 3 up:standby 2026-03-10T14:14:58.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:58 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:14:58.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:58 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:58.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:58 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:58 vm04.local ceph-mon[92084]: pgmap v256: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:58 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded, 1 pg undersized (PG_DEGRADED) 2026-03-10T14:14:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:58 vm04.local ceph-mon[92084]: mds.? 
[v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] up:boot 2026-03-10T14:14:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:58 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 3 up:standby 2026-03-10T14:14:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:58 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:14:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:58 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:58 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: pgmap v257: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: Detected new or changed devices on vm03 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:59.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:14:59 vm03.local 
ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm03.itwezo"]}]: dispatch 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: pgmap v257: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: Detected new or changed devices on vm03 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: 
from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:14:59.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:59.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:59.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:59.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:14:59.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:14:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm03.itwezo"]}]: dispatch 2026-03-10T14:15:00.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:00 vm04.local ceph-mon[92084]: Upgrade: It appears safe to stop mds.cephfs.vm03.itwezo 2026-03-10T14:15:00.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:00 vm04.local ceph-mon[92084]: Upgrade: Updating mds.cephfs.vm03.itwezo 2026-03-10T14:15:00.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:00.813 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.itwezo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:15:00.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:00.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:00 vm04.local ceph-mon[92084]: Deploying daemon mds.cephfs.vm03.itwezo on vm03 2026-03-10T14:15:00.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:00 vm03.local ceph-mon[103098]: Upgrade: It appears safe to stop mds.cephfs.vm03.itwezo 2026-03-10T14:15:00.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:00 vm03.local ceph-mon[103098]: Upgrade: Updating mds.cephfs.vm03.itwezo 2026-03-10T14:15:00.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:00.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.itwezo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:15:00.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:00.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:00 vm03.local ceph-mon[103098]: Deploying daemon mds.cephfs.vm03.itwezo on vm03 2026-03-10T14:15:01.813 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:01 vm04.local ceph-mon[92084]: osdmap e119: 6 total, 6 up, 6 in 2026-03-10T14:15:01.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:01 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 2 up:standby 2026-03-10T14:15:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:15:01.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:01 vm04.local ceph-mon[92084]: pgmap v259: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 11 objects/s recovering 2026-03-10T14:15:01.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:01 vm03.local ceph-mon[103098]: osdmap e119: 6 total, 6 up, 6 in 2026-03-10T14:15:01.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:01 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 2 up:standby 2026-03-10T14:15:01.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:15:01.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:01 vm03.local ceph-mon[103098]: pgmap v259: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 11 objects/s recovering 2026-03-10T14:15:03.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:15:03.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:15:03.999 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:03 vm03.local ceph-mon[103098]: pgmap v260: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 11 objects/s recovering 2026-03-10T14:15:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:03 vm04.local ceph-mon[92084]: pgmap v260: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 11 objects/s recovering 2026-03-10T14:15:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:05 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:05 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:05 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:05 vm03.local ceph-mon[103098]: pgmap v261: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects 
degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:05 vm03.local ceph-mon[103098]: mds.? [v2:192.168.123.103:6828/2389872767,v1:192.168.123.103:6829/2389872767] up:boot 2026-03-10T14:15:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:05 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 3 up:standby 2026-03-10T14:15:05.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:05 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:15:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:05 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:05 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:05 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:05 vm04.local ceph-mon[92084]: pgmap v261: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 740 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:05 vm04.local ceph-mon[92084]: mds.? 
[v2:192.168.123.103:6828/2389872767,v1:192.168.123.103:6829/2389872767] up:boot 2026-03-10T14:15:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:05 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm04.sslxuq=up:active} 3 up:standby 2026-03-10T14:15:05.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:05 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:15:06.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:06.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:06.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:06.610 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:06 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:06 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:06 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:06.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:06 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.813 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: pgmap v262: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm04.sslxuq"]}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: Upgrade: It appears safe to stop mds.cephfs.vm04.sslxuq 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: Upgrade: Updating mds.cephfs.vm04.sslxuq 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.sslxuq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: Deploying daemon mds.cephfs.vm04.sslxuq on 
vm04 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: osdmap e120: 6 total, 6 up, 6 in 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: Standby daemon mds.cephfs.vm04.puavjd assigned to filesystem cephfs as rank 0 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T14:15:07.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:07 vm04.local ceph-mon[92084]: fsmap cephfs:1/1 {0=cephfs.vm04.puavjd=up:replay} 2 up:standby 2026-03-10T14:15:07.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[103094]: 2026-03-10T14:15:07.525+0000 7f3df4214640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: pgmap v262: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: 
dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm04.sslxuq"]}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: Upgrade: It appears safe to stop mds.cephfs.vm04.sslxuq 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: Upgrade: Updating mds.cephfs.vm04.sslxuq 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.sslxuq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: Deploying daemon mds.cephfs.vm04.sslxuq on vm04 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: osdmap e120: 6 total, 6 up, 6 
in 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: Standby daemon mds.cephfs.vm04.puavjd assigned to filesystem cephfs as rank 0 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T14:15:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:07 vm03.local ceph-mon[103098]: fsmap cephfs:1/1 {0=cephfs.vm04.puavjd=up:replay} 2 up:standby 2026-03-10T14:15:10.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:09 vm04.local ceph-mon[92084]: pgmap v264: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 503 KiB/s rd, 0 op/s; 247/243 objects degraded (101.646%); 0 B/s, 5 objects/s recovering 2026-03-10T14:15:10.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:09 vm03.local ceph-mon[103098]: pgmap v264: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 503 KiB/s rd, 0 op/s; 247/243 objects degraded (101.646%); 0 B/s, 5 objects/s recovering 2026-03-10T14:15:11.740 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: pgmap v265: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 10 MiB/s rd, 3 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:11.740 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' 2026-03-10T14:15:11.740 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:11.740 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:11.740 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: reconnect by client.14514 192.168.144.1:0/4117503891 after 0 2026-03-10T14:15:11.740 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: reconnect by client.24297 192.168.144.1:0/448103355 after 0.001 2026-03-10T14:15:11.740 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: mds.? [v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] up:reconnect 2026-03-10T14:15:11.741 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: mds.? 
[v2:192.168.123.104:6824/3487856112,v1:192.168.123.104:6825/3487856112] up:boot 2026-03-10T14:15:11.741 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: fsmap cephfs:1/1 {0=cephfs.vm04.puavjd=up:reconnect} 3 up:standby 2026-03-10T14:15:11.741 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:15:11.741 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:11.741 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:12.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: pgmap v265: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 10 MiB/s rd, 3 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: reconnect by client.14514 
192.168.144.1:0/4117503891 after 0 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: reconnect by client.24297 192.168.144.1:0/448103355 after 0.001 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: mds.? [v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] up:reconnect 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: mds.? [v2:192.168.123.104:6824/3487856112,v1:192.168.123.104:6825/3487856112] up:boot 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: fsmap cephfs:1/1 {0=cephfs.vm04.puavjd=up:reconnect} 3 up:standby 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:12 vm04.local ceph-mon[92084]: mds.? 
[v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] up:rejoin 2026-03-10T14:15:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:12 vm04.local ceph-mon[92084]: fsmap cephfs:1/1 {0=cephfs.vm04.puavjd=up:rejoin} 3 up:standby 2026-03-10T14:15:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:12 vm04.local ceph-mon[92084]: daemon mds.cephfs.vm04.puavjd is now active in filesystem cephfs as rank 0 2026-03-10T14:15:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:13.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:12 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:12 vm03.local ceph-mon[103098]: mds.? [v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] up:rejoin 2026-03-10T14:15:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:12 vm03.local ceph-mon[103098]: fsmap cephfs:1/1 {0=cephfs.vm04.puavjd=up:rejoin} 3 up:standby 2026-03-10T14:15:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:12 vm03.local ceph-mon[103098]: daemon mds.cephfs.vm04.puavjd is now active in filesystem cephfs as rank 0 2026-03-10T14:15:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:13.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:12 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: pgmap v266: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 
GiB / 120 GiB avail; 13 MiB/s rd, 4 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: mds.? [v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] up:active 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm04.puavjd=up:active} 3 up:standby 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: Detected new or changed devices on vm04 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm04.puavjd"]}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: Upgrade: It appears safe to stop mds.cephfs.vm04.puavjd 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.puavjd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:15:14.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:13 
vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: pgmap v266: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 4 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: mds.? [v2:192.168.123.104:6826/3074647252,v1:192.168.123.104:6827/3074647252] up:active 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm04.puavjd=up:active} 3 up:standby 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: Detected new or changed devices on vm04 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm04.puavjd"]}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: Upgrade: It appears safe to stop mds.cephfs.vm04.puavjd 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm04.puavjd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:13 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[103094]: 2026-03-10T14:15:13.808+0000 7f3df4214640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T14:15:15.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 vm04.local ceph-mon[92084]: Upgrade: Updating mds.cephfs.vm04.puavjd 2026-03-10T14:15:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 vm04.local ceph-mon[92084]: Deploying daemon mds.cephfs.vm04.puavjd on vm04 2026-03-10T14:15:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 vm04.local ceph-mon[92084]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T14:15:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 vm04.local ceph-mon[92084]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T14:15:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 vm04.local ceph-mon[92084]: osdmap e121: 6 total, 6 up, 6 in 2026-03-10T14:15:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 vm04.local ceph-mon[92084]: Standby daemon mds.cephfs.vm03.aqaspa assigned to filesystem cephfs as rank 0 2026-03-10T14:15:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 
vm04.local ceph-mon[92084]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T14:15:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 vm04.local ceph-mon[92084]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T14:15:15.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:14 vm04.local ceph-mon[92084]: fsmap cephfs:1/1 {0=cephfs.vm03.aqaspa=up:replay} 2 up:standby 2026-03-10T14:15:15.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: Upgrade: Updating mds.cephfs.vm04.puavjd 2026-03-10T14:15:15.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: Deploying daemon mds.cephfs.vm04.puavjd on vm04 2026-03-10T14:15:15.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T14:15:15.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T14:15:15.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: osdmap e121: 6 total, 6 up, 6 in 2026-03-10T14:15:15.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: Standby daemon mds.cephfs.vm03.aqaspa assigned to filesystem cephfs as rank 0 2026-03-10T14:15:15.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T14:15:15.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T14:15:15.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:14 vm03.local ceph-mon[103098]: fsmap cephfs:1/1 {0=cephfs.vm03.aqaspa=up:replay} 2 up:standby 2026-03-10T14:15:16.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:16 vm03.local 
ceph-mon[103098]: pgmap v268: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 6 op/s; 247/243 objects degraded (101.646%); 0 B/s, 5 objects/s recovering 2026-03-10T14:15:16.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:16 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:16.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:16 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:15:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:16 vm04.local ceph-mon[92084]: pgmap v268: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 6 op/s; 247/243 objects degraded (101.646%); 0 B/s, 5 objects/s recovering 2026-03-10T14:15:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:16 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:16 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:15:17.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:17 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-10T14:15:17.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:17 vm03.local ceph-mon[103098]: mds.? 
[v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] up:reconnect 2026-03-10T14:15:17.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:17 vm03.local ceph-mon[103098]: fsmap cephfs:1/1 {0=cephfs.vm03.aqaspa=up:reconnect} 2 up:standby 2026-03-10T14:15:17.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:17 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded, 2 pgs undersized (PG_DEGRADED) 2026-03-10T14:15:17.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:17 vm04.local ceph-mon[92084]: mds.? [v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] up:reconnect 2026-03-10T14:15:17.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:17 vm04.local ceph-mon[92084]: fsmap cephfs:1/1 {0=cephfs.vm03.aqaspa=up:reconnect} 2 up:standby 2026-03-10T14:15:18.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:18 vm04.local ceph-mon[92084]: pgmap v269: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 1007 B/s wr, 13 op/s; 247/243 objects degraded (101.646%); 0 B/s, 10 objects/s recovering 2026-03-10T14:15:18.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:18 vm04.local ceph-mon[92084]: reconnect by client.14514 192.168.144.1:0/4117503891 after 0 2026-03-10T14:15:18.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:18 vm04.local ceph-mon[92084]: reconnect by client.24297 192.168.144.1:0/448103355 after 0.002 2026-03-10T14:15:18.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:18 vm03.local ceph-mon[103098]: pgmap v269: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 1007 B/s wr, 13 op/s; 247/243 objects degraded (101.646%); 0 B/s, 10 
objects/s recovering 2026-03-10T14:15:18.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:18 vm03.local ceph-mon[103098]: reconnect by client.14514 192.168.144.1:0/4117503891 after 0 2026-03-10T14:15:18.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:18 vm03.local ceph-mon[103098]: reconnect by client.24297 192.168.144.1:0/448103355 after 0.002 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:19 vm04.local ceph-mon[92084]: mds.? [v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] up:rejoin 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:19 vm04.local ceph-mon[92084]: fsmap cephfs:1/1 {0=cephfs.vm03.aqaspa=up:rejoin} 2 up:standby 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:19 vm04.local ceph-mon[92084]: daemon mds.cephfs.vm03.aqaspa is now active in filesystem cephfs as rank 0 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:19 vm04.local ceph-mon[92084]: pgmap v270: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 27 MiB/s rd, 1.2 KiB/s wr, 14 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
14:15:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:19.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:19 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:19.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: mds.? [v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] up:rejoin 2026-03-10T14:15:19.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: fsmap cephfs:1/1 {0=cephfs.vm03.aqaspa=up:rejoin} 2 up:standby 2026-03-10T14:15:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: daemon mds.cephfs.vm03.aqaspa is now active in filesystem cephfs as rank 0 2026-03-10T14:15:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: pgmap v270: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 27 MiB/s rd, 1.2 KiB/s wr, 14 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:19.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:19 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:20.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:20 vm04.local ceph-mon[92084]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T14:15:20.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:20 vm04.local ceph-mon[92084]: mds.? [v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] up:active 2026-03-10T14:15:20.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:20 vm04.local ceph-mon[92084]: mds.? [v2:192.168.123.104:6826/3930678535,v1:192.168.123.104:6827/3930678535] up:boot 2026-03-10T14:15:20.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:20 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm03.aqaspa=up:active} 3 up:standby 2026-03-10T14:15:20.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:15:20.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:20.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:20 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:20.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:20 vm03.local ceph-mon[103098]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T14:15:20.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:20 vm03.local ceph-mon[103098]: mds.? 
[v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] up:active 2026-03-10T14:15:20.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:20 vm03.local ceph-mon[103098]: mds.? [v2:192.168.123.104:6826/3930678535,v1:192.168.123.104:6827/3930678535] up:boot 2026-03-10T14:15:20.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:20 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm03.aqaspa=up:active} 3 up:standby 2026-03-10T14:15:20.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:15:20.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:20.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:20 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all mds 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config 
rm", "name": "container_image", "who": "mds.cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.aqaspa"}]': finished 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.itwezo"}]': finished 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.puavjd"}]': finished 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": 
"container_image", "who": "mds.cephfs.vm04.sslxuq"}]': finished 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: Upgrade: Scaling up filesystem cephfs max_mds to 2 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T14:15:21.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:21 vm03.local ceph-mon[103098]: pgmap v271: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 2.3 KiB/s wr, 16 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all mds 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.aqaspa"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: 
from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.aqaspa"}]': finished 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.itwezo"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.itwezo"}]': finished 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.puavjd"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.puavjd"}]': finished 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.sslxuq"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm04.sslxuq"}]': finished 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: Upgrade: Scaling up 
filesystem cephfs max_mds to 2 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T14:15:21.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:21 vm04.local ceph-mon[92084]: pgmap v271: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 744 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 2.3 KiB/s wr, 16 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s recovering 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: daemon mds.cephfs.vm03.itwezo assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: fsmap cephfs:1 {0=cephfs.vm03.aqaspa=up:active} 3 up:standby 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm03.aqaspa=up:active,1=cephfs.vm03.itwezo=up:starting} 2 up:standby 2026-03-10T14:15:22.766 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: Upgrade: Enabling allow_standby_replay on filesystem cephfs 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]: dispatch 2026-03-10T14:15:22.766 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:22 vm04.local ceph-mon[92084]: daemon mds.cephfs.vm03.itwezo is now active in filesystem cephfs as rank 1 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: daemon mds.cephfs.vm03.itwezo assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: fsmap cephfs:1 {0=cephfs.vm03.aqaspa=up:active} 3 up:standby 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: fsmap cephfs:2 
{0=cephfs.vm03.aqaspa=up:active,1=cephfs.vm03.itwezo=up:starting} 2 up:standby 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: Upgrade: Enabling allow_standby_replay on filesystem cephfs 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]: dispatch 2026-03-10T14:15:22.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:22 vm03.local ceph-mon[103098]: daemon mds.cephfs.vm03.itwezo is now active in filesystem cephfs as rank 1 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]': finished 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: mds.? 
[v2:192.168.123.103:6828/2389872767,v1:192.168.123.103:6829/2389872767] up:active 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm03.aqaspa=up:active,1=cephfs.vm03.itwezo=up:active} 2 up:standby 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: fsmap cephfs:2 {0=cephfs.vm03.aqaspa=up:active,1=cephfs.vm03.itwezo=up:active} 2 up:standby-replay 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all rgw 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: pgmap v272: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 745 MiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 3.0 KiB/s wr, 17 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 
objects/s recovering 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: Upgrade: Updating ceph-exporter.vm03 (1/2) 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:24.301 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:24 vm03.local ceph-mon[103098]: Deploying daemon ceph-exporter.vm03 on vm03 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "1"}]': finished 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: mds.? 
[v2:192.168.123.103:6828/2389872767,v1:192.168.123.103:6829/2389872767] up:active 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm03.aqaspa=up:active,1=cephfs.vm03.itwezo=up:active} 2 up:standby 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: fsmap cephfs:2 {0=cephfs.vm03.aqaspa=up:active,1=cephfs.vm03.itwezo=up:active} 2 up:standby-replay 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all rgw 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: pgmap v272: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 745 MiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 3.0 KiB/s wr, 17 op/s; 247/243 objects degraded (101.646%); 0 B/s, 9 objects/s 
recovering 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:24.557 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:24.558 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: Upgrade: Updating ceph-exporter.vm03 (1/2) 2026-03-10T14:15:24.558 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:24.558 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:15:24.558 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:24.558 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:24 vm04.local ceph-mon[92084]: Deploying daemon ceph-exporter.vm03 on vm03 2026-03-10T14:15:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:25 vm03.local ceph-mon[103098]: pgmap v273: 65 pgs: 
1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 2.9 KiB/s wr, 17 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:15:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:25 vm03.local ceph-mon[103098]: Upgrade: Updating ceph-exporter.vm04 (2/2) 2026-03-10T14:15:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:15:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:25 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:25.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:25 vm03.local ceph-mon[103098]: Deploying daemon ceph-exporter.vm04 on vm04 2026-03-10T14:15:25.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:25.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:25.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:25 vm04.local ceph-mon[92084]: pgmap v273: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 
GiB / 120 GiB avail; 14 MiB/s rd, 2.9 KiB/s wr, 17 op/s; 247/243 objects degraded (101.646%); 0 B/s, 8 objects/s recovering 2026-03-10T14:15:25.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:25 vm04.local ceph-mon[92084]: Upgrade: Updating ceph-exporter.vm04 (2/2) 2026-03-10T14:15:25.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:25.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm04", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T14:15:25.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:25 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:25.866 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:25 vm04.local ceph-mon[92084]: Deploying daemon ceph-exporter.vm04 on vm04 2026-03-10T14:15:26.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.507+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/4080189760 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d78071960 msgr2=0x7f2d78071dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:26.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.507+0000 7f2d7fa3f700 1 --2- 192.168.123.103:0/4080189760 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d78071960 0x7f2d78071dd0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f2d7000b3a0 tx=0x7f2d7000b6b0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.507+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/4080189760 
shutdown_connections 2026-03-10T14:15:26.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.507+0000 7f2d7fa3f700 1 --2- 192.168.123.103:0/4080189760 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d78071960 0x7f2d78071dd0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.507+0000 7f2d7fa3f700 1 --2- 192.168.123.103:0/4080189760 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d781080e0 0x7f2d781084f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.507+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/4080189760 >> 192.168.123.103:0/4080189760 conn(0x7f2d7806d3e0 msgr2=0x7f2d7806f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:26.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.507+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/4080189760 shutdown_connections 2026-03-10T14:15:26.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.507+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/4080189760 wait complete. 
2026-03-10T14:15:26.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.509+0000 7f2d7fa3f700 1 Processor -- start 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.509+0000 7f2d7fa3f700 1 -- start start 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.509+0000 7f2d7fa3f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d78071960 0x7f2d781b72b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.509+0000 7f2d7fa3f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d781080e0 0x7f2d781b77f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.509+0000 7f2d7fa3f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d781b7e10 con 0x7f2d781080e0 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.509+0000 7f2d7fa3f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2d781b7f50 con 0x7f2d78071960 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.510+0000 7f2d7cfda700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d781080e0 0x7f2d781b77f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.510+0000 7f2d7cfda700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d781080e0 0x7f2d781b77f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:52298/0 (socket says 192.168.123.103:52298) 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.510+0000 7f2d7cfda700 1 -- 192.168.123.103:0/3352465055 learned_addr learned my addr 192.168.123.103:0/3352465055 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.510+0000 7f2d7cfda700 1 -- 192.168.123.103:0/3352465055 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d78071960 msgr2=0x7f2d781b72b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.510+0000 7f2d7cfda700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d78071960 0x7f2d781b72b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.510+0000 7f2d7cfda700 1 -- 192.168.123.103:0/3352465055 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2d74009710 con 0x7f2d781080e0 2026-03-10T14:15:26.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.510+0000 7f2d7cfda700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d781080e0 0x7f2d781b77f0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f2d70007a50 tx=0x7f2d70012ad0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:26.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.512+0000 7f2d6e7fc700 1 -- 192.168.123.103:0/3352465055 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d7000e040 con 0x7f2d781080e0 2026-03-10T14:15:26.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.512+0000 7f2d7fa3f700 1 -- 
192.168.123.103:0/3352465055 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2d7000b050 con 0x7f2d781080e0 2026-03-10T14:15:26.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.512+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2d7807e970 con 0x7f2d781080e0 2026-03-10T14:15:26.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.513+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2d78066e40 con 0x7f2d781080e0 2026-03-10T14:15:26.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.517+0000 7f2d6e7fc700 1 -- 192.168.123.103:0/3352465055 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2d700090d0 con 0x7f2d781080e0 2026-03-10T14:15:26.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.517+0000 7f2d6e7fc700 1 -- 192.168.123.103:0/3352465055 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2d70004820 con 0x7f2d781080e0 2026-03-10T14:15:26.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.517+0000 7f2d6e7fc700 1 -- 192.168.123.103:0/3352465055 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2d70026460 con 0x7f2d781080e0 2026-03-10T14:15:26.516 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.517+0000 7f2d6e7fc700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d64077ad0 0x7f2d64079f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:26.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.518+0000 7f2d6e7fc700 1 -- 192.168.123.103:0/3352465055 
<== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(121..121 src has 1..121) v4 ==== 6635+0+0 (secure 0 0 0) 0x7f2d7009c260 con 0x7f2d781080e0 2026-03-10T14:15:26.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.518+0000 7f2d7d7db700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d64077ad0 0x7f2d64079f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:26.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.518+0000 7f2d6e7fc700 1 -- 192.168.123.103:0/3352465055 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2d7009e7a0 con 0x7f2d781080e0 2026-03-10T14:15:26.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.519+0000 7f2d7d7db700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d64077ad0 0x7f2d64079f80 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f2d78072a50 tx=0x7f2d74009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:26.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.705+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2d7807e550 con 0x7f2d64077ad0 2026-03-10T14:15:26.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.706+0000 7f2d6e7fc700 1 -- 192.168.123.103:0/3352465055 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+441 (secure 0 0 0) 0x7f2d7807e550 con 0x7f2d64077ad0 2026-03-10T14:15:26.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.710+0000 
7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d64077ad0 msgr2=0x7f2d64079f80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:26.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.710+0000 7f2d7fa3f700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d64077ad0 0x7f2d64079f80 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f2d78072a50 tx=0x7f2d74009450 comp rx=0 tx=0).stop 2026-03-10T14:15:26.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d781080e0 msgr2=0x7f2d781b77f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:26.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d781080e0 0x7f2d781b77f0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f2d70007a50 tx=0x7f2d70012ad0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 shutdown_connections 2026-03-10T14:15:26.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2d64077ad0 0x7f2d64079f80 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2d78071960 0x7f2d781b72b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 --2- 192.168.123.103:0/3352465055 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2d781080e0 0x7f2d781b77f0 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 >> 192.168.123.103:0/3352465055 conn(0x7f2d7806d3e0 msgr2=0x7f2d78073040 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:26.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 shutdown_connections 2026-03-10T14:15:26.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.711+0000 7f2d7fa3f700 1 -- 192.168.123.103:0/3352465055 wait complete. 2026-03-10T14:15:26.727 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:15:26.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.790+0000 7f4017fff700 1 -- 192.168.123.103:0/2124529553 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 msgr2=0x7f40180ff5c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:26.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.790+0000 7f4017fff700 1 --2- 192.168.123.103:0/2124529553 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 0x7f40180ff5c0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f4008009a60 tx=0x7f4008009d70 comp rx=0 tx=0).stop 2026-03-10T14:15:26.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.790+0000 7f4017fff700 1 -- 192.168.123.103:0/2124529553 shutdown_connections 2026-03-10T14:15:26.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.790+0000 7f4017fff700 1 --2- 192.168.123.103:0/2124529553 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7f40181003b0 0x7f4018100800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.790+0000 7f4017fff700 1 --2- 192.168.123.103:0/2124529553 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 0x7f40180ff5c0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.790+0000 7f4017fff700 1 -- 192.168.123.103:0/2124529553 >> 192.168.123.103:0/2124529553 conn(0x7f40180fa740 msgr2=0x7f40180fcb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:26.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.791+0000 7f4017fff700 1 -- 192.168.123.103:0/2124529553 shutdown_connections 2026-03-10T14:15:26.790 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.791+0000 7f4017fff700 1 -- 192.168.123.103:0/2124529553 wait complete. 
2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4017fff700 1 Processor -- start 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4017fff700 1 -- start start 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4017fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 0x7f401810ebf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4017fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40181003b0 0x7f401810f130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4017fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4018114280 con 0x7f40181003b0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4017fff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f401810f670 con 0x7f40180ff1b0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4016ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 0x7f401810ebf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4016ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 0x7f401810ebf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:34314/0 (socket says 192.168.123.103:34314) 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4016ffd700 1 -- 192.168.123.103:0/3975811935 learned_addr learned my addr 192.168.123.103:0/3975811935 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4016ffd700 1 -- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40181003b0 msgr2=0x7f401810f130 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4016ffd700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40181003b0 0x7f401810f130 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4016ffd700 1 -- 192.168.123.103:0/3975811935 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4008009710 con 0x7f40180ff1b0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.792+0000 7f4016ffd700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 0x7f401810ebf0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f400800f690 tx=0x7f400800f770 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.793+0000 7f401c95b700 1 -- 192.168.123.103:0/3975811935 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f400801d070 con 0x7f40180ff1b0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.793+0000 7f4017fff700 1 -- 
192.168.123.103:0/3975811935 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f40181aa140 con 0x7f40180ff1b0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.793+0000 7f4017fff700 1 -- 192.168.123.103:0/3975811935 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f40181aa4f0 con 0x7f40180ff1b0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.793+0000 7f401c95b700 1 -- 192.168.123.103:0/3975811935 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f400800fdc0 con 0x7f40180ff1b0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.794+0000 7f401c95b700 1 -- 192.168.123.103:0/3975811935 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40080177a0 con 0x7f40180ff1b0 2026-03-10T14:15:26.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.794+0000 7f4017fff700 1 -- 192.168.123.103:0/3975811935 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3ff8005320 con 0x7f40180ff1b0 2026-03-10T14:15:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.795+0000 7f401c95b700 1 -- 192.168.123.103:0/3975811935 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4008017900 con 0x7f40180ff1b0 2026-03-10T14:15:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.795+0000 7f401c95b700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4004077700 0x7f4004079bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.795+0000 7f401c95b700 1 -- 192.168.123.103:0/3975811935 
<== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(121..121 src has 1..121) v4 ==== 6635+0+0 (secure 0 0 0) 0x7f400809b5b0 con 0x7f40180ff1b0 2026-03-10T14:15:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.795+0000 7f40167fc700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4004077700 0x7f4004079bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:26.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.796+0000 7f40167fc700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4004077700 0x7f4004079bb0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f4000005fd0 tx=0x7f4000009500 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:26.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.798+0000 7f401c95b700 1 -- 192.168.123.103:0/3975811935 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4008063d20 con 0x7f40180ff1b0 2026-03-10T14:15:26.981 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.981+0000 7f4017fff700 1 -- 192.168.123.103:0/3975811935 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3ff8000bf0 con 0x7f4004077700 2026-03-10T14:15:26.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.983+0000 7f401c95b700 1 -- 192.168.123.103:0/3975811935 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+441 (secure 0 0 0) 0x7f3ff8000bf0 con 0x7f4004077700 2026-03-10T14:15:26.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.985+0000 
7f400e7fc700 1 -- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4004077700 msgr2=0x7f4004079bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:26.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.985+0000 7f400e7fc700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4004077700 0x7f4004079bb0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f4000005fd0 tx=0x7f4000009500 comp rx=0 tx=0).stop 2026-03-10T14:15:26.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.985+0000 7f400e7fc700 1 -- 192.168.123.103:0/3975811935 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 msgr2=0x7f401810ebf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:26.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.985+0000 7f400e7fc700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 0x7f401810ebf0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f400800f690 tx=0x7f400800f770 comp rx=0 tx=0).stop 2026-03-10T14:15:26.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.986+0000 7f400e7fc700 1 -- 192.168.123.103:0/3975811935 shutdown_connections 2026-03-10T14:15:26.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.986+0000 7f400e7fc700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4004077700 0x7f4004079bb0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.986+0000 7f400e7fc700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f40180ff1b0 0x7f401810ebf0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.986+0000 7f400e7fc700 1 --2- 192.168.123.103:0/3975811935 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f40181003b0 0x7f401810f130 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:26.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.986+0000 7f400e7fc700 1 -- 192.168.123.103:0/3975811935 >> 192.168.123.103:0/3975811935 conn(0x7f40180fa740 msgr2=0x7f40181035e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:26.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.986+0000 7f400e7fc700 1 -- 192.168.123.103:0/3975811935 shutdown_connections 2026-03-10T14:15:26.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:26.986+0000 7f400e7fc700 1 -- 192.168.123.103:0/3975811935 wait complete. 2026-03-10T14:15:27.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.056+0000 7f8b7a324700 1 -- 192.168.123.103:0/1413093906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74072330 msgr2=0x7f8b740770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.056+0000 7f8b7a324700 1 --2- 192.168.123.103:0/1413093906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74072330 0x7f8b740770b0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f8b6c009230 tx=0x7f8b6c009260 comp rx=0 tx=0).stop 2026-03-10T14:15:27.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.056+0000 7f8b7a324700 1 -- 192.168.123.103:0/1413093906 shutdown_connections 2026-03-10T14:15:27.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.056+0000 7f8b7a324700 1 --2- 192.168.123.103:0/1413093906 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74072330 0x7f8b740770b0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.056+0000 7f8b7a324700 1 --2- 192.168.123.103:0/1413093906 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b74071950 0x7f8b74071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.056+0000 7f8b7a324700 1 -- 192.168.123.103:0/1413093906 >> 192.168.123.103:0/1413093906 conn(0x7f8b7406d1a0 msgr2=0x7f8b7406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:27.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:27.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:27.056 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:26 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.057+0000 7f8b7a324700 1 -- 192.168.123.103:0/1413093906 shutdown_connections 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b7a324700 1 -- 192.168.123.103:0/1413093906 wait complete. 
2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b7a324700 1 Processor -- start 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b7a324700 1 -- start start 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b7a324700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b74071950 0x7f8b740824e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b7a324700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74082a20 0x7f8b74082e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b7a324700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b7412dd80 con 0x7f8b74071950 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b7a324700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b7412def0 con 0x7f8b74082a20 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b78b21700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74082a20 0x7f8b74082e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b78b21700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74082a20 0x7f8b74082e90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:34330/0 (socket says 192.168.123.103:34330) 2026-03-10T14:15:27.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.060+0000 7f8b78b21700 1 -- 192.168.123.103:0/1195060662 learned_addr learned my addr 192.168.123.103:0/1195060662 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:27.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.061+0000 7f8b78b21700 1 -- 192.168.123.103:0/1195060662 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b74071950 msgr2=0x7f8b740824e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.061+0000 7f8b78b21700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b74071950 0x7f8b740824e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.061+0000 7f8b78b21700 1 -- 192.168.123.103:0/1195060662 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8b6c008ee0 con 0x7f8b74082a20 2026-03-10T14:15:27.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.061+0000 7f8b78b21700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74082a20 0x7f8b74082e90 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f8b6c0042a0 tx=0x7f8b6c004380 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:27.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.061+0000 7f8b6a7fc700 1 -- 192.168.123.103:0/1195060662 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b6c01c070 con 0x7f8b74082a20 2026-03-10T14:15:27.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.061+0000 7f8b7a324700 1 -- 
192.168.123.103:0/1195060662 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8b7412e170 con 0x7f8b74082a20 2026-03-10T14:15:27.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.061+0000 7f8b7a324700 1 -- 192.168.123.103:0/1195060662 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8b7412e660 con 0x7f8b74082a20 2026-03-10T14:15:27.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.062+0000 7f8b6a7fc700 1 -- 192.168.123.103:0/1195060662 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8b6c0047e0 con 0x7f8b74082a20 2026-03-10T14:15:27.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.062+0000 7f8b6a7fc700 1 -- 192.168.123.103:0/1195060662 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b6c016610 con 0x7f8b74082a20 2026-03-10T14:15:27.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.063+0000 7f8b6a7fc700 1 -- 192.168.123.103:0/1195060662 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8b6c016770 con 0x7f8b74082a20 2026-03-10T14:15:27.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.064+0000 7f8b6a7fc700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8b60077910 0x7f8b60079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.064+0000 7f8b79322700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8b60077910 0x7f8b60079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.063 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.064+0000 7f8b6a7fc700 1 -- 192.168.123.103:0/1195060662 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(121..121 src has 1..121) v4 ==== 6635+0+0 (secure 0 0 0) 0x7f8b6c012070 con 0x7f8b74082a20 2026-03-10T14:15:27.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.065+0000 7f8b79322700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8b60077910 0x7f8b60079dc0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f8b7000b3c0 tx=0x7f8b7000d040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:27.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.065+0000 7f8b7a324700 1 -- 192.168.123.103:0/1195060662 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8b58005320 con 0x7f8b74082a20 2026-03-10T14:15:27.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.068+0000 7f8b6a7fc700 1 -- 192.168.123.103:0/1195060662 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8b6c064aa0 con 0x7f8b74082a20 2026-03-10T14:15:27.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.197+0000 7f8b7a324700 1 -- 192.168.123.103:0/1195060662 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f8b58000bf0 con 0x7f8b60077910 2026-03-10T14:15:27.212 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:15:27.212 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (11m) 21s ago 12m 22.1M - 0.25.0 c8568f914cd2 0b9f79a1191a 2026-03-10T14:15:27.212 
INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 starting - - - - 2026-03-10T14:15:27.212 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (1s) 0s ago 11m 9945k - 19.2.3-678-ge911bdeb 654f31e6858e 54bbafe0555e 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 21s ago 12m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (5m) 0s ago 11m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (11m) 21s ago 11m 93.0M - 9.4.7 954c08fa6188 1fb5820b250e 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (30s) 21s ago 10m 17.8M - 19.2.3-678-ge911bdeb 654f31e6858e b1023e0bcace 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (23s) 21s ago 10m 13.0M - 19.2.3-678-ge911bdeb 654f31e6858e ddad42dde865 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (8s) 0s ago 10m 18.4M - 19.2.3-678-ge911bdeb 654f31e6858e f5060c63df19 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (16s) 0s ago 10m 158M - 19.2.3-678-ge911bdeb 654f31e6858e 89fac44ae15d 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (7m) 21s ago 12m 623M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (6m) 0s ago 11m 499M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (6m) 21s ago 12m 68.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:15:27.213 
INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (6m) 0s ago 11m 63.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (12m) 21s ago 12m 15.6M - 1.5.0 0da6a335fe13 ea3faf07c01f 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (11m) 0s ago 11m 15.5M - 1.5.0 0da6a335fe13 d2c83044f057 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 21s ago 11m 276M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 21s ago 11m 230M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 21s ago 10m 165M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e0d768b70468 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (2m) 0s ago 10m 152M 4096M 19.2.3-678-ge911bdeb 654f31e6858e f7fc2aafa9d9 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (101s) 0s ago 10m 125M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 89f9225212d4 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (80s) 0s ago 10m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6c7573f5f3fa 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (6m) 21s ago 11m 71.7M - 2.43.0 a07b618ecd1d 2e394cc74058 2026-03-10T14:15:27.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.211+0000 7f8b6a7fc700 1 -- 192.168.123.103:0/1195060662 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f8b58000bf0 con 0x7f8b60077910 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 -- 
192.168.123.103:0/1195060662 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8b60077910 msgr2=0x7f8b60079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8b60077910 0x7f8b60079dc0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f8b7000b3c0 tx=0x7f8b7000d040 comp rx=0 tx=0).stop 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 -- 192.168.123.103:0/1195060662 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74082a20 msgr2=0x7f8b74082e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74082a20 0x7f8b74082e90 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f8b6c0042a0 tx=0x7f8b6c004380 comp rx=0 tx=0).stop 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 -- 192.168.123.103:0/1195060662 shutdown_connections 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8b60077910 0x7f8b60079dc0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8b74071950 0x7f8b740824e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 --2- 192.168.123.103:0/1195060662 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8b74082a20 0x7f8b74082e90 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 -- 192.168.123.103:0/1195060662 >> 192.168.123.103:0/1195060662 conn(0x7f8b7406d1a0 msgr2=0x7f8b740764d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 -- 192.168.123.103:0/1195060662 shutdown_connections 2026-03-10T14:15:27.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.216+0000 7f8b7a324700 1 -- 192.168.123.103:0/1195060662 wait complete. 2026-03-10T14:15:27.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.295+0000 7f2e0b436700 1 -- 192.168.123.103:0/1712697193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2e04103680 msgr2=0x7f2e04105a60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.295+0000 7f2e0b436700 1 --2- 192.168.123.103:0/1712697193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2e04103680 0x7f2e04105a60 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f2df4009b00 tx=0x7f2df4009e10 comp rx=0 tx=0).stop 2026-03-10T14:15:27.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.297+0000 7f2e0b436700 1 -- 192.168.123.103:0/1712697193 shutdown_connections 2026-03-10T14:15:27.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.297+0000 7f2e0b436700 1 --2- 192.168.123.103:0/1712697193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2e04103680 0x7f2e04105a60 unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.297+0000 7f2e0b436700 1 --2- 192.168.123.103:0/1712697193 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e04100d60 0x7f2e04103140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.297+0000 7f2e0b436700 1 -- 192.168.123.103:0/1712697193 >> 192.168.123.103:0/1712697193 conn(0x7f2e040fa7a0 msgr2=0x7f2e040fcbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.299+0000 7f2e0b436700 1 -- 192.168.123.103:0/1712697193 shutdown_connections 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.299+0000 7f2e0b436700 1 -- 192.168.123.103:0/1712697193 wait complete. 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.299+0000 7f2e0b436700 1 Processor -- start 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.299+0000 7f2e0b436700 1 -- start start 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.300+0000 7f2e0b436700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e04100d60 0x7f2e04193930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.300+0000 7f2e0b436700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2e04103680 0x7f2e04193e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.300+0000 7f2e0b436700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e04194490 con 0x7f2e04103680 
2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.300+0000 7f2e0b436700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e041945d0 con 0x7f2e04100d60 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.300+0000 7f2e0a434700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e04100d60 0x7f2e04193930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.300+0000 7f2e0a434700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e04100d60 0x7f2e04193930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:34346/0 (socket says 192.168.123.103:34346) 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.300+0000 7f2e0a434700 1 -- 192.168.123.103:0/3577421107 learned_addr learned my addr 192.168.123.103:0/3577421107 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.300+0000 7f2e09c33700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2e04103680 0x7f2e04193e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.301+0000 7f2e0a434700 1 -- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2e04103680 msgr2=0x7f2e04193e70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.301+0000 
7f2e0a434700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2e04103680 0x7f2e04193e70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.301+0000 7f2e0a434700 1 -- 192.168.123.103:0/3577421107 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2df40097e0 con 0x7f2e04100d60 2026-03-10T14:15:27.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.305+0000 7f2e0a434700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e04100d60 0x7f2e04193930 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f2e0000c8a0 tx=0x7f2e0000cbb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:27.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.305+0000 7f2dfb7fe700 1 -- 192.168.123.103:0/3577421107 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e000078c0 con 0x7f2e04100d60 2026-03-10T14:15:27.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.305+0000 7f2dfb7fe700 1 -- 192.168.123.103:0/3577421107 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2e0000f450 con 0x7f2e04100d60 2026-03-10T14:15:27.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.305+0000 7f2e0b436700 1 -- 192.168.123.103:0/3577421107 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2e04199080 con 0x7f2e04100d60 2026-03-10T14:15:27.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.305+0000 7f2dfb7fe700 1 -- 192.168.123.103:0/3577421107 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2e0000e5c0 con 0x7f2e04100d60 
2026-03-10T14:15:27.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.305+0000 7f2e0b436700 1 -- 192.168.123.103:0/3577421107 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2e041995d0 con 0x7f2e04100d60 2026-03-10T14:15:27.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.306+0000 7f2df97fa700 1 -- 192.168.123.103:0/3577421107 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2dec0052f0 con 0x7f2e04100d60 2026-03-10T14:15:27.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.311+0000 7f2dfb7fe700 1 -- 192.168.123.103:0/3577421107 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2e0000f5c0 con 0x7f2e04100d60 2026-03-10T14:15:27.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.312+0000 7f2dfb7fe700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2df007bcd0 0x7f2df007e180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.312+0000 7f2dfb7fe700 1 -- 192.168.123.103:0/3577421107 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(121..121 src has 1..121) v4 ==== 6635+0+0 (secure 0 0 0) 0x7f2e00099eb0 con 0x7f2e04100d60 2026-03-10T14:15:27.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.313+0000 7f2e09c33700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2df007bcd0 0x7f2df007e180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.315+0000 7f2e09c33700 1 --2- 192.168.123.103:0/3577421107 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2df007bcd0 0x7f2df007e180 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f2df400b5c0 tx=0x7f2df4005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:27.319 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:27.319 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:27.319 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:26 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:27.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.321+0000 7f2dfb7fe700 1 -- 192.168.123.103:0/3577421107 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2e00062570 con 0x7f2e04100d60 2026-03-10T14:15:27.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.499+0000 7f2df97fa700 1 -- 192.168.123.103:0/3577421107 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f2dec005c90 con 0x7f2e04100d60 2026-03-10T14:15:27.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.500+0000 7f2dfb7fe700 1 -- 192.168.123.103:0/3577421107 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f2e00061cc0 con 0x7f2e04100d60 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "ceph 
version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:15:27.500 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:15:27.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 -- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2df007bcd0 msgr2=0x7f2df007e180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] 
conn(0x7f2df007bcd0 0x7f2df007e180 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f2df400b5c0 tx=0x7f2df4005fb0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 -- 192.168.123.103:0/3577421107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e04100d60 msgr2=0x7f2e04193930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e04100d60 0x7f2e04193930 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f2e0000c8a0 tx=0x7f2e0000cbb0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 -- 192.168.123.103:0/3577421107 shutdown_connections 2026-03-10T14:15:27.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f2df007bcd0 0x7f2df007e180 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2e04100d60 0x7f2e04193930 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 --2- 192.168.123.103:0/3577421107 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2e04103680 0x7f2e04193e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.503 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.504+0000 7f2e0b436700 1 -- 192.168.123.103:0/3577421107 >> 192.168.123.103:0/3577421107 conn(0x7f2e040fa7a0 msgr2=0x7f2e040fcb80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:27.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.507+0000 7f2e0b436700 1 -- 192.168.123.103:0/3577421107 shutdown_connections 2026-03-10T14:15:27.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.507+0000 7f2e0b436700 1 -- 192.168.123.103:0/3577421107 wait complete. 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.580+0000 7fdb10d69700 1 -- 192.168.123.103:0/1514074600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c072360 msgr2=0x7fdb0c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.580+0000 7fdb10d69700 1 --2- 192.168.123.103:0/1514074600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c072360 0x7fdb0c0770e0 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7fdb0400b600 tx=0x7fdb0400b910 comp rx=0 tx=0).stop 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 -- 192.168.123.103:0/1514074600 shutdown_connections 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 --2- 192.168.123.103:0/1514074600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c072360 0x7fdb0c0770e0 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 --2- 192.168.123.103:0/1514074600 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb0c071980 0x7fdb0c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 -- 192.168.123.103:0/1514074600 >> 192.168.123.103:0/1514074600 conn(0x7fdb0c06d1a0 msgr2=0x7fdb0c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 -- 192.168.123.103:0/1514074600 shutdown_connections 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 -- 192.168.123.103:0/1514074600 wait complete. 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 Processor -- start 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 -- start start 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c071980 0x7fdb0c082620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb0c082b60 0x7fdb0c082fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb0c1b2a90 con 0x7fdb0c071980 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb10d69700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb0c1b2bd0 con 0x7fdb0c082b60 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb0a59c700 1 --2- >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c071980 0x7fdb0c082620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb0a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c071980 0x7fdb0c082620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52370/0 (socket says 192.168.123.103:52370) 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.581+0000 7fdb0a59c700 1 -- 192.168.123.103:0/4051070254 learned_addr learned my addr 192.168.123.103:0/4051070254 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.582+0000 7fdb09d9b700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb0c082b60 0x7fdb0c082fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.582+0000 7fdb0a59c700 1 -- 192.168.123.103:0/4051070254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb0c082b60 msgr2=0x7fdb0c082fd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.582+0000 7fdb0a59c700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fdb0c082b60 0x7fdb0c082fd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.582+0000 7fdb0a59c700 1 -- 
192.168.123.103:0/4051070254 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb0400b050 con 0x7fdb0c071980 2026-03-10T14:15:27.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.582+0000 7fdb0a59c700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c071980 0x7fdb0c082620 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7fdafc00baa0 tx=0x7fdafc00be60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:27.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.583+0000 7fdafb7fe700 1 -- 192.168.123.103:0/4051070254 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdafc00c760 con 0x7fdb0c071980 2026-03-10T14:15:27.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.583+0000 7fdb10d69700 1 -- 192.168.123.103:0/4051070254 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdb0c1b2e30 con 0x7fdb0c071980 2026-03-10T14:15:27.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.583+0000 7fdb10d69700 1 -- 192.168.123.103:0/4051070254 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdb0c1b32f0 con 0x7fdb0c071980 2026-03-10T14:15:27.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.585+0000 7fdafb7fe700 1 -- 192.168.123.103:0/4051070254 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdafc00cda0 con 0x7fdb0c071980 2026-03-10T14:15:27.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.585+0000 7fdafb7fe700 1 -- 192.168.123.103:0/4051070254 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdafc012550 con 0x7fdb0c071980 2026-03-10T14:15:27.586 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.585+0000 7fdafb7fe700 1 -- 192.168.123.103:0/4051070254 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdafc012770 con 0x7fdb0c071980 2026-03-10T14:15:27.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.585+0000 7fdafb7fe700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdaf4079c00 0x7fdaf407c0b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.586+0000 7fdb09d9b700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdaf4079c00 0x7fdaf407c0b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.586+0000 7fdb09d9b700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdaf4079c00 0x7fdaf407c0b0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fdb0400bd90 tx=0x7fdb04007c00 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:27.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.587+0000 7fdb10d69700 1 -- 192.168.123.103:0/4051070254 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdaec005320 con 0x7fdb0c071980 2026-03-10T14:15:27.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.591+0000 7fdafb7fe700 1 -- 192.168.123.103:0/4051070254 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(121..121 src has 1..121) v4 ==== 6635+0+0 (secure 0 0 0) 0x7fdafc066900 con 0x7fdb0c071980 2026-03-10T14:15:27.590 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.591+0000 7fdafb7fe700 1 -- 192.168.123.103:0/4051070254 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdafc09c030 con 0x7fdb0c071980 2026-03-10T14:15:27.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.772+0000 7fdb10d69700 1 -- 192.168.123.103:0/4051070254 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fdaec005cc0 con 0x7fdb0c071980 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.773+0000 7fdafb7fe700 1 -- 192.168.123.103:0/4051070254 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 40 v40) v1 ==== 76+0+2006 (secure 0 0 0) 0x7fdafc09c070 con 0x7fdb0c071980 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:e40 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-10T14:15:22:636548+0000 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:epoch 40 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable 
allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:15:22.636520+0000 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 121 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:up {0=34474,1=34476} 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:15:27.773 
INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 34474 members: 34476,34474 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{0:34474} state up:active seq 8 join_fscid=1 addr [v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:34480} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.104:6824/3487856112,v1:192.168.123.104:6825/3487856112] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{1:34476} state up:active seq 6 join_fscid=1 addr [v2:192.168.123.103:6828/2389872767,v1:192.168.123.103:6829/2389872767] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:44351} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.104:6826/3930678535,v1:192.168.123.104:6827/3930678535] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:15:27.773 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 -- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdaf4079c00 msgr2=0x7fdaf407c0b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.775 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdaf4079c00 0x7fdaf407c0b0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fdb0400bd90 tx=0x7fdb04007c00 comp rx=0 tx=0).stop 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 -- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c071980 msgr2=0x7fdb0c082620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c071980 0x7fdb0c082620 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7fdafc00baa0 tx=0x7fdafc00be60 comp rx=0 tx=0).stop 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 -- 192.168.123.103:0/4051070254 shutdown_connections 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fdaf4079c00 0x7fdaf407c0b0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb0c071980 0x7fdb0c082620 unknown :-1 s=CLOSED pgs=177 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 --2- 192.168.123.103:0/4051070254 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fdb0c082b60 0x7fdb0c082fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 -- 192.168.123.103:0/4051070254 >> 192.168.123.103:0/4051070254 conn(0x7fdb0c06d1a0 msgr2=0x7fdb0c0764e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 -- 192.168.123.103:0/4051070254 shutdown_connections 2026-03-10T14:15:27.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.776+0000 7fdaf97fa700 1 -- 192.168.123.103:0/4051070254 wait complete. 2026-03-10T14:15:27.777 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 40 2026-03-10T14:15:27.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.856+0000 7f98fe9b9700 1 -- 192.168.123.103:0/3509512531 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8072440 msgr2=0x7f98f810be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:27.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.856+0000 7f98fe9b9700 1 --2- 192.168.123.103:0/3509512531 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8072440 0x7f98f810be90 secure :-1 s=READY pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7f98f000b3a0 tx=0x7f98f000b6b0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.856+0000 7f98fe9b9700 1 -- 192.168.123.103:0/3509512531 shutdown_connections 2026-03-10T14:15:27.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.860+0000 7f98fe9b9700 1 --2- 192.168.123.103:0/3509512531 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8072440 0x7f98f810be90 unknown :-1 s=CLOSED pgs=178 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.860+0000 
7f98fe9b9700 1 --2- 192.168.123.103:0/3509512531 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98f8071a60 0x7f98f8071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.860+0000 7f98fe9b9700 1 -- 192.168.123.103:0/3509512531 >> 192.168.123.103:0/3509512531 conn(0x7f98f806d1a0 msgr2=0x7f98f806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:27.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.860+0000 7f98fe9b9700 1 -- 192.168.123.103:0/3509512531 shutdown_connections 2026-03-10T14:15:27.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.861+0000 7f98fe9b9700 1 -- 192.168.123.103:0/3509512531 wait complete. 2026-03-10T14:15:27.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.862+0000 7f98fe9b9700 1 Processor -- start 2026-03-10T14:15:27.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.862+0000 7f98fe9b9700 1 -- start start 2026-03-10T14:15:27.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.862+0000 7f98fe9b9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8071a60 0x7f98f81a4990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.862+0000 7f98fe9b9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98f8072440 0x7f98f81a4ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.862+0000 7f98fe9b9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f98f81a54f0 con 0x7f98f8071a60 2026-03-10T14:15:27.861 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.862+0000 7f98fe9b9700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f98f81a5630 con 0x7f98f8072440 2026-03-10T14:15:27.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.863+0000 7f98fd9b7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8071a60 0x7f98f81a4990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.863+0000 7f98fd9b7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8071a60 0x7f98f81a4990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52380/0 (socket says 192.168.123.103:52380) 2026-03-10T14:15:27.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.863+0000 7f98fd9b7700 1 -- 192.168.123.103:0/3150241629 learned_addr learned my addr 192.168.123.103:0/3150241629 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:27.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.863+0000 7f98fd9b7700 1 -- 192.168.123.103:0/3150241629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98f8072440 msgr2=0x7f98f81a4ed0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:15:27.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.863+0000 7f98fd9b7700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98f8072440 0x7f98f81a4ed0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:27.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.863+0000 7f98fd9b7700 1 -- 192.168.123.103:0/3150241629 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f98f000b050 con 
0x7f98f8071a60 2026-03-10T14:15:27.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.863+0000 7f98fd9b7700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8071a60 0x7f98f81a4990 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7f98ec00b700 tx=0x7f98ec00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:27.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.863+0000 7f98eaffd700 1 -- 192.168.123.103:0/3150241629 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f98ec010820 con 0x7f98f8071a60 2026-03-10T14:15:27.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.864+0000 7f98eaffd700 1 -- 192.168.123.103:0/3150241629 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f98ec010e60 con 0x7f98f8071a60 2026-03-10T14:15:27.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.864+0000 7f98eaffd700 1 -- 192.168.123.103:0/3150241629 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f98ec017570 con 0x7f98f8071a60 2026-03-10T14:15:27.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.866+0000 7f98fe9b9700 1 -- 192.168.123.103:0/3150241629 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f98f81aa0e0 con 0x7f98f8071a60 2026-03-10T14:15:27.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.866+0000 7f98fe9b9700 1 -- 192.168.123.103:0/3150241629 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f98f81aa600 con 0x7f98f8071a60 2026-03-10T14:15:27.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.868+0000 7f98e8ff9700 1 -- 192.168.123.103:0/3150241629 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f98f804ea50 con 0x7f98f8071a60 2026-03-10T14:15:27.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.872+0000 7f98eaffd700 1 -- 192.168.123.103:0/3150241629 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f98ec00f3c0 con 0x7f98f8071a60 2026-03-10T14:15:27.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.874+0000 7f98eaffd700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f98e4077910 0x7f98e4079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:27.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.874+0000 7f98eaffd700 1 -- 192.168.123.103:0/3150241629 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(121..121 src has 1..121) v4 ==== 6635+0+0 (secure 0 0 0) 0x7f98ec099c80 con 0x7f98f8071a60 2026-03-10T14:15:27.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.874+0000 7f98eaffd700 1 -- 192.168.123.103:0/3150241629 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f98ec062340 con 0x7f98f8071a60 2026-03-10T14:15:27.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.876+0000 7f98fd1b6700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f98e4077910 0x7f98e4079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:27.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:27.876+0000 7f98fd1b6700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f98e4077910 0x7f98e4079dc0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f98f0009250 tx=0x7f98f000bf90 
comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:28.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.025+0000 7f98e8ff9700 1 -- 192.168.123.103:0/3150241629 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f98f8061190 con 0x7f98e4077910 2026-03-10T14:15:28.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.026+0000 7f98eaffd700 1 -- 192.168.123.103:0/3150241629 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+466 (secure 0 0 0) 0x7f98f8061190 con 0x7f98e4077910 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "mds", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "ceph-exporter", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "mgr", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "osd" 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "18/23 daemons upgraded", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading 
ceph-exporter daemons", 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:15:28.026 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:15:28.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.034+0000 7f98e8ff9700 1 -- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f98e4077910 msgr2=0x7f98e4079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:28.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.034+0000 7f98e8ff9700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f98e4077910 0x7f98e4079dc0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f98f0009250 tx=0x7f98f000bf90 comp rx=0 tx=0).stop 2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.034+0000 7f98e8ff9700 1 -- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8071a60 msgr2=0x7f98f81a4990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.034+0000 7f98e8ff9700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8071a60 0x7f98f81a4990 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7f98ec00b700 tx=0x7f98ec00bac0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.034+0000 7f98e8ff9700 1 -- 192.168.123.103:0/3150241629 shutdown_connections 2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.034+0000 7f98e8ff9700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f98e4077910 0x7f98e4079dc0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.034+0000 7f98e8ff9700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f98f8071a60 0x7f98f81a4990 unknown :-1 s=CLOSED pgs=179 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.034+0000 7f98e8ff9700 1 --2- 192.168.123.103:0/3150241629 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f98f8072440 0x7f98f81a4ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.035+0000 7f98e8ff9700 1 -- 192.168.123.103:0/3150241629 >> 192.168.123.103:0/3150241629 conn(0x7f98f806d1a0 msgr2=0x7f98f810a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.035+0000 7f98e8ff9700 1 -- 192.168.123.103:0/3150241629 shutdown_connections 2026-03-10T14:15:28.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.035+0000 7f98e8ff9700 1 -- 192.168.123.103:0/3150241629 wait complete. 
2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: pgmap v274: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 8.3 KiB/s wr, 24 op/s; 247/243 objects degraded (101.646%); 0 B/s, 11 objects/s recovering 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='client.34490 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='client.44359 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/3577421107' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.110 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:27 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/4051070254' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:15:28.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.113+0000 7fa5b54ce700 1 -- 192.168.123.103:0/760228726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0071a60 msgr2=0x7fa5b0071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:28.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.113+0000 7fa5b54ce700 1 --2- 192.168.123.103:0/760228726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0071a60 0x7fa5b0071e70 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7fa5a0009b00 tx=0x7fa5a0009e10 comp rx=0 tx=0).stop 2026-03-10T14:15:28.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 -- 192.168.123.103:0/760228726 shutdown_connections 2026-03-10T14:15:28.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 --2- 192.168.123.103:0/760228726 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5b0072440 0x7fa5b010be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 --2- 192.168.123.103:0/760228726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fa5b0071a60 0x7fa5b0071e70 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 -- 192.168.123.103:0/760228726 >> 192.168.123.103:0/760228726 conn(0x7fa5b006d1a0 msgr2=0x7fa5b006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:28.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 -- 192.168.123.103:0/760228726 shutdown_connections 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 -- 192.168.123.103:0/760228726 wait complete. 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 Processor -- start 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 -- start start 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5b0071a60 0x7fa5b0116a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0072440 0x7fa5b0116f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5b01175a0 con 0x7fa5b0072440 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.114+0000 7fa5b54ce700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5b01b26f0 con 0x7fa5b0071a60 
2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5af7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0072440 0x7fa5b0116f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5af7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0072440 0x7fa5b0116f80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:52396/0 (socket says 192.168.123.103:52396) 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5af7fe700 1 -- 192.168.123.103:0/1970354273 learned_addr learned my addr 192.168.123.103:0/1970354273 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5affff700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5b0071a60 0x7fa5b0116a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:28.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5af7fe700 1 -- 192.168.123.103:0/1970354273 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5b0071a60 msgr2=0x7fa5b0116a40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:28.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5af7fe700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5b0071a60 0x7fa5b0116a40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:15:28.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5af7fe700 1 -- 192.168.123.103:0/1970354273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5a00097e0 con 0x7fa5b0072440 2026-03-10T14:15:28.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5af7fe700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0072440 0x7fa5b0116f80 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7fa5a4009fd0 tx=0x7fa5a400eea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:28.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.115+0000 7fa5ad7fa700 1 -- 192.168.123.103:0/1970354273 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5a4009980 con 0x7fa5b0072440 2026-03-10T14:15:28.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.116+0000 7fa5ad7fa700 1 -- 192.168.123.103:0/1970354273 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa5a4004d10 con 0x7fa5b0072440 2026-03-10T14:15:28.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.116+0000 7fa5ad7fa700 1 -- 192.168.123.103:0/1970354273 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5a4010470 con 0x7fa5b0072440 2026-03-10T14:15:28.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.116+0000 7fa5b54ce700 1 -- 192.168.123.103:0/1970354273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa5b01b28f0 con 0x7fa5b0072440 2026-03-10T14:15:28.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.116+0000 7fa5b54ce700 1 -- 192.168.123.103:0/1970354273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 
0x7fa5b01b2df0 con 0x7fa5b0072440 2026-03-10T14:15:28.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.116+0000 7fa5b54ce700 1 -- 192.168.123.103:0/1970354273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa5b0110c20 con 0x7fa5b0072440 2026-03-10T14:15:28.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.119+0000 7fa5ad7fa700 1 -- 192.168.123.103:0/1970354273 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa5a400cca0 con 0x7fa5b0072440 2026-03-10T14:15:28.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.119+0000 7fa5ad7fa700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa5980779e0 0x7fa598079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:28.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.119+0000 7fa5ad7fa700 1 -- 192.168.123.103:0/1970354273 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(121..121 src has 1..121) v4 ==== 6635+0+0 (secure 0 0 0) 0x7fa5a4014070 con 0x7fa5b0072440 2026-03-10T14:15:28.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.120+0000 7fa5ad7fa700 1 -- 192.168.123.103:0/1970354273 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa5a4062d50 con 0x7fa5b0072440 2026-03-10T14:15:28.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.120+0000 7fa5affff700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa5980779e0 0x7fa598079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:28.127 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.121+0000 7fa5affff700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa5980779e0 0x7fa598079e90 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fa5a000b5c0 tx=0x7fa5a001a040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:28.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.287+0000 7fa5b54ce700 1 -- 192.168.123.103:0/1970354273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fa5b002cc30 con 0x7fa5b0072440 2026-03-10T14:15:28.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.288+0000 7fa5ad7fa700 1 -- 192.168.123.103:0/1970354273 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+456 (secure 0 0 0) 0x7fa5a40624a0 con 0x7fa5b0072440 2026-03-10T14:15:28.287 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded, 2 pgs undersized 2026-03-10T14:15:28.287 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 247/243 objects degraded (101.646%), 1 pg degraded, 2 pgs undersized 2026-03-10T14:15:28.287 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is stuck undersized for 70s, current state active+recovery_wait+undersized+degraded+remapped, last acting [0,2] 2026-03-10T14:15:28.287 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is stuck undersized for 91s, current state active+recovering+undersized+remapped, last acting [0,4] 2026-03-10T14:15:28.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 -- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa5980779e0 msgr2=0x7fa598079e90 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:28.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa5980779e0 0x7fa598079e90 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fa5a000b5c0 tx=0x7fa5a001a040 comp rx=0 tx=0).stop 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 -- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0072440 msgr2=0x7fa5b0116f80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0072440 0x7fa5b0116f80 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7fa5a4009fd0 tx=0x7fa5a400eea0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 -- 192.168.123.103:0/1970354273 shutdown_connections 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa5980779e0 0x7fa598079e90 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1970354273 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa5b0071a60 0x7fa5b0116a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1970354273 
>> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5b0072440 0x7fa5b0116f80 unknown :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.291+0000 7fa596ffd700 1 -- 192.168.123.103:0/1970354273 >> 192.168.123.103:0/1970354273 conn(0x7fa5b006d1a0 msgr2=0x7fa5b010a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.292+0000 7fa596ffd700 1 -- 192.168.123.103:0/1970354273 shutdown_connections 2026-03-10T14:15:28.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:28.292+0000 7fa596ffd700 1 -- 192.168.123.103:0/1970354273 wait complete. 2026-03-10T14:15:28.295 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: pgmap v274: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 8.3 KiB/s wr, 24 op/s; 247/243 objects degraded (101.646%); 0 B/s, 11 objects/s recovering 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='client.34490 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='client.44359 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:28.296 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/3577421107' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:28.296 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:27 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/4051070254' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='client.44363 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='client.34506 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1970354273' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.409 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]: dispatch 2026-03-10T14:15:29.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]': finished 2026-03-10T14:15:29.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm04"}]: dispatch 2026-03-10T14:15:29.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm04"}]': finished 2026-03-10T14:15:29.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:29 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:29.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='client.44363 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='client.34506 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1970354273' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]': finished
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm04"}]: dispatch
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm04"}]': finished
2026-03-10T14:15:29.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:29 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: pgmap v275: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 7.6 KiB/s wr, 20 op/s; 247/243 objects degraded (101.646%); 0 B/s, 7 objects/s recovering
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all ceph-exporter
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all iscsi
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all nfs
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: Upgrade: Setting container_image for all nvmeof
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: osdmap e122: 6 total, 6 up, 6 in
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: Upgrade: Updating node-exporter.vm03 (1/2)
2026-03-10T14:15:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:30 vm03.local ceph-mon[103098]: Deploying daemon node-exporter.vm03 on vm03
2026-03-10T14:15:30.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: pgmap v275: 65 pgs: 1 active+recovering+undersized+remapped, 1 active+recovery_wait+undersized+degraded+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 7.6 KiB/s wr, 20 op/s; 247/243 objects degraded (101.646%); 0 B/s, 7 objects/s recovering
2026-03-10T14:15:30.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all ceph-exporter
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all iscsi
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all nfs
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: Upgrade: Setting container_image for all nvmeof
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: osdmap e122: 6 total, 6 up, 6 in
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: Upgrade: Updating node-exporter.vm03 (1/2)
2026-03-10T14:15:30.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:30 vm04.local ceph-mon[92084]: Deploying daemon node-exporter.vm03 on vm03
2026-03-10T14:15:31.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:31 vm03.local ceph-mon[103098]: osdmap e123: 6 total, 6 up, 6 in
2026-03-10T14:15:31.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:31 vm03.local ceph-mon[103098]: pgmap v278: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 8.7 KiB/s wr, 20 op/s; 0 B/s, 11 objects/s recovering
2026-03-10T14:15:31.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:31.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:15:31.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:31 vm03.local ceph-mon[103098]: Health check update: Degraded data redundancy: 1 pg undersized (PG_DEGRADED)
2026-03-10T14:15:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:31 vm04.local ceph-mon[92084]: osdmap e123: 6 total, 6 up, 6 in
2026-03-10T14:15:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:31 vm04.local ceph-mon[92084]: pgmap v278: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 8.7 KiB/s wr, 20 op/s; 0 B/s, 11 objects/s recovering
2026-03-10T14:15:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:15:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:31 vm04.local ceph-mon[92084]: Health check update: Degraded data redundancy: 1 pg undersized (PG_DEGRADED)
2026-03-10T14:15:33.483 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:33.483 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:33 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:33.483 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:33 vm04.local ceph-mon[92084]: Upgrade: Updating node-exporter.vm04 (2/2)
2026-03-10T14:15:33.483 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:33 vm04.local ceph-mon[92084]: Deploying daemon node-exporter.vm04 on vm04
2026-03-10T14:15:33.484 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:33 vm04.local ceph-mon[92084]: pgmap v279: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 8.6 KiB/s wr, 17 op/s; 0 B/s, 11 objects/s recovering
2026-03-10T14:15:33.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:33.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:33 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:33.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:33 vm03.local ceph-mon[103098]: Upgrade: Updating node-exporter.vm04 (2/2)
2026-03-10T14:15:33.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:33 vm03.local ceph-mon[103098]: Deploying daemon node-exporter.vm04 on vm04
2026-03-10T14:15:33.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:33 vm03.local ceph-mon[103098]: pgmap v279: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 8.6 KiB/s wr, 17 op/s; 0 B/s, 11 objects/s recovering
2026-03-10T14:15:35.816 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:35 vm04.local ceph-mon[92084]: pgmap v280: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 6.3 MiB/s rd, 0 B/s wr, 6 op/s; 0 B/s, 5 objects/s recovering
2026-03-10T14:15:35.816 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:35 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:35.816 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:35 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:35.816 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:35 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:15:36.070 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:35 vm03.local ceph-mon[103098]: pgmap v280: 65 pgs: 1 peering, 1 active+recovering+undersized+remapped, 63 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 6.3 MiB/s rd, 0 B/s wr, 6 op/s; 0 B/s, 5 objects/s recovering
2026-03-10T14:15:36.070 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:35 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:36.070 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:35 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:36.070 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:35 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: pgmap v281: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 845 KiB/s rd, 3 op/s; 0 B/s, 11 objects/s recovering
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.801 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:37 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: pgmap v281: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 845 KiB/s rd, 3 op/s; 0 B/s, 11 objects/s recovering
2026-03-10T14:15:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:37.814 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:37 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:38 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:39.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:38 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:39 vm04.local ceph-mon[92084]: pgmap v282: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 728 KiB/s rd, 3 op/s; 0 B/s, 10 objects/s recovering
2026-03-10T14:15:40.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:39 vm04.local ceph-mon[92084]: Upgrade: Updating prometheus.vm03
2026-03-10T14:15:40.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:39 vm04.local ceph-mon[92084]: Deploying daemon prometheus.vm03 on vm03
2026-03-10T14:15:40.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:39 vm03.local ceph-mon[103098]: pgmap v282: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 728 KiB/s rd, 3 op/s; 0 B/s, 10 objects/s recovering
2026-03-10T14:15:40.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:39 vm03.local ceph-mon[103098]: Upgrade: Updating prometheus.vm03
2026-03-10T14:15:40.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:39 vm03.local ceph-mon[103098]: Deploying daemon prometheus.vm03 on vm03
2026-03-10T14:15:42.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:42 vm04.local ceph-mon[92084]: pgmap v283: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.8 KiB/s rd, 3 op/s; 0 B/s, 9 objects/s recovering
2026-03-10T14:15:42.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:42 vm03.local ceph-mon[103098]: pgmap v283: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.8 KiB/s rd, 3 op/s; 0 B/s, 9 objects/s recovering
2026-03-10T14:15:44.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:44 vm04.local ceph-mon[92084]: pgmap v284: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.6 KiB/s rd, 3 op/s; 0 B/s, 7 objects/s recovering
2026-03-10T14:15:44.349 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:44 vm03.local ceph-mon[103098]: pgmap v284: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.6 KiB/s rd, 3 op/s; 0 B/s, 7 objects/s recovering
2026-03-10T14:15:45.497 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:45.498 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:45.498 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:15:45.498 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:45 vm03.local ceph-mon[103098]: pgmap v285: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 0 B/s, 7 objects/s recovering
2026-03-10T14:15:45.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:45.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:45.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:15:45.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:45 vm04.local ceph-mon[92084]: pgmap v285: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s; 0 B/s, 7 objects/s recovering
2026-03-10T14:15:46.504 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:15:46.504 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:46.504 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:46.504 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:46.504 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:46.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:15:46.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:46.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:46.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:46.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: pgmap v286: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 4 op/s; 0 B/s, 11 objects/s recovering
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local
ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:47 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: pgmap v286: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 4 op/s; 0 B/s, 11 objects/s recovering 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 
vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local 
ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:48.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:47 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:49.235 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:48 vm03.local ceph-mon[103098]: Upgrade: Updating alertmanager.vm03 2026-03-10T14:15:49.235 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:48 vm03.local ceph-mon[103098]: Deploying daemon alertmanager.vm03 on vm03 2026-03-10T14:15:49.235 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:49.235 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 
10 14:15:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:49.235 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:48 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:48 vm04.local ceph-mon[92084]: Upgrade: Updating alertmanager.vm03 2026-03-10T14:15:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:48 vm04.local ceph-mon[92084]: Deploying daemon alertmanager.vm03 on vm03 2026-03-10T14:15:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:48 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:50.489 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:50 vm03.local ceph-mon[103098]: pgmap v287: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 4 op/s; 0 B/s, 7 objects/s recovering 2026-03-10T14:15:50.489 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:50 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:50.489 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:50 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:50.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:50 
vm04.local ceph-mon[92084]: pgmap v287: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 4 op/s; 0 B/s, 7 objects/s recovering 2026-03-10T14:15:50.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:50 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:50.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:50 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: pgmap v288: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s; 0 B/s, 12 objects/s recovering 2026-03-10T14:15:51.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.361 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:51 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: pgmap v288: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s; 0 B/s, 12 objects/s recovering 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:51.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard 
get-alertmanager-api-host"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:51.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:51 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:52.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:52 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T14:15:52.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:52 vm04.local ceph-mon[92084]: Upgrade: Updating grafana.vm03 2026-03-10T14:15:52.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:52 vm04.local ceph-mon[92084]: Deploying daemon grafana.vm03 on vm03 2026-03-10T14:15:52.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:52 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T14:15:52.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:52 vm03.local ceph-mon[103098]: Upgrade: Updating grafana.vm03 2026-03-10T14:15:52.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:52 vm03.local ceph-mon[103098]: Deploying daemon grafana.vm03 on vm03 2026-03-10T14:15:53.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:53 vm03.local ceph-mon[103098]: pgmap v289: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 0 B/s, 7 objects/s recovering 2026-03-10T14:15:53.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:53 vm04.local ceph-mon[92084]: pgmap v289: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s; 0 B/s, 7 objects/s recovering 2026-03-10T14:15:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:55 vm04.local ceph-mon[92084]: pgmap v290: 65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s; 0 B/s, 7 objects/s recovering 2026-03-10T14:15:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:55 vm04.local ceph-mon[92084]: osdmap e124: 6 total, 6 up, 6 in 2026-03-10T14:15:56.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:55 vm03.local ceph-mon[103098]: pgmap v290: 
65 pgs: 1 active+recovering+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s; 0 B/s, 7 objects/s recovering 2026-03-10T14:15:56.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:55 vm03.local ceph-mon[103098]: osdmap e124: 6 total, 6 up, 6 in 2026-03-10T14:15:57.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:57 vm03.local ceph-mon[103098]: osdmap e125: 6 total, 6 up, 6 in 2026-03-10T14:15:57.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:57 vm03.local ceph-mon[103098]: pgmap v293: 65 pgs: 1 active+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 0 B/s, 11 objects/s recovering 2026-03-10T14:15:58.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:57 vm04.local ceph-mon[92084]: osdmap e125: 6 total, 6 up, 6 in 2026-03-10T14:15:58.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:57 vm04.local ceph-mon[92084]: pgmap v293: 65 pgs: 1 active+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 0 B/s, 11 objects/s recovering 2026-03-10T14:15:58.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.364+0000 7f173bfff700 1 -- 192.168.123.103:0/2046299117 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 msgr2=0x7f17340a58c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:58.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.364+0000 7f173bfff700 1 --2- 192.168.123.103:0/2046299117 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 0x7f17340a58c0 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7f173c0669f0 tx=0x7f173c0699f0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.365+0000 7f173bfff700 1 -- 192.168.123.103:0/2046299117 shutdown_connections 2026-03-10T14:15:58.364 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.365+0000 7f173bfff700 1 --2- 192.168.123.103:0/2046299117 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 0x7f17340a58c0 unknown :-1 s=CLOSED pgs=182 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.365+0000 7f173bfff700 1 --2- 192.168.123.103:0/2046299117 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17340a4310 0x7f17340a4720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.365+0000 7f173bfff700 1 -- 192.168.123.103:0/2046299117 >> 192.168.123.103:0/2046299117 conn(0x7f173409f7e0 msgr2=0x7f17340a1c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:58.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.366+0000 7f173bfff700 1 -- 192.168.123.103:0/2046299117 shutdown_connections 2026-03-10T14:15:58.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.366+0000 7f173bfff700 1 -- 192.168.123.103:0/2046299117 wait complete. 
2026-03-10T14:15:58.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.366+0000 7f173bfff700 1 Processor -- start 2026-03-10T14:15:58.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.366+0000 7f173bfff700 1 -- start start 2026-03-10T14:15:58.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.366+0000 7f173bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17340a4310 0x7f17340b3450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.366+0000 7f173bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 0x7f17340b39b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.366+0000 7f173bfff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17340b3ff0 con 0x7f17340a5450 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.366+0000 7f173bfff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f173413ed80 con 0x7f17340a4310 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173a7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 0x7f17340b39b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173a7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 0x7f17340b39b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:60372/0 (socket says 192.168.123.103:60372) 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173a7fc700 1 -- 192.168.123.103:0/160127256 learned_addr learned my addr 192.168.123.103:0/160127256 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173a7fc700 1 -- 192.168.123.103:0/160127256 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17340a4310 msgr2=0x7f17340b3450 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173a7fc700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17340a4310 0x7f17340b3450 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173a7fc700 1 -- 192.168.123.103:0/160127256 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f173c067050 con 0x7f17340a5450 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173a7fc700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 0x7f17340b39b0 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f173c066080 tx=0x7f173c0688a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f1723fff700 1 -- 192.168.123.103:0/160127256 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f173c078030 con 0x7f17340a5450 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f1723fff700 1 -- 
192.168.123.103:0/160127256 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f173c073440 con 0x7f17340a5450 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f1723fff700 1 -- 192.168.123.103:0/160127256 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f173c07d3c0 con 0x7f17340a5450 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f173413f000 con 0x7f17340a5450 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.367+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f173413f4c0 con 0x7f17340a5450 2026-03-10T14:15:58.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.368+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1734004f80 con 0x7f17340a5450 2026-03-10T14:15:58.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.371+0000 7f1723fff700 1 -- 192.168.123.103:0/160127256 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f173c068c90 con 0x7f17340a5450 2026-03-10T14:15:58.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.371+0000 7f1723fff700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1724077aa0 0x7f1724079f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.371+0000 7f1723fff700 1 -- 192.168.123.103:0/160127256 <== 
mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f173c08d030 con 0x7f17340a5450 2026-03-10T14:15:58.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.376+0000 7f173affd700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1724077aa0 0x7f1724079f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:58.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.376+0000 7f173affd700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1724077aa0 0x7f1724079f50 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f17340b4720 tx=0x7f1730005c10 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:58.378 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.379+0000 7f1723fff700 1 -- 192.168.123.103:0/160127256 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f173c0c74e0 con 0x7f17340a5450 2026-03-10T14:15:58.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.512+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f17340a9da0 con 0x7f1724077aa0 2026-03-10T14:15:58.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.513+0000 7f1723fff700 1 -- 192.168.123.103:0/160127256 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 (secure 0 0 0) 0x7f17340a9da0 con 0x7f1724077aa0 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.521+0000 7f173bfff700 
1 -- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1724077aa0 msgr2=0x7f1724079f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.521+0000 7f173bfff700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1724077aa0 0x7f1724079f50 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f17340b4720 tx=0x7f1730005c10 comp rx=0 tx=0).stop 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.521+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 msgr2=0x7f17340b39b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.521+0000 7f173bfff700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 0x7f17340b39b0 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f173c066080 tx=0x7f173c0688a0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.522+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 shutdown_connections 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.522+0000 7f173bfff700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1724077aa0 0x7f1724079f50 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.522+0000 7f173bfff700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f17340a4310 0x7f17340b3450 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.522+0000 7f173bfff700 1 --2- 192.168.123.103:0/160127256 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f17340a5450 0x7f17340b39b0 unknown :-1 s=CLOSED pgs=183 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.522+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 >> 192.168.123.103:0/160127256 conn(0x7f173409f7e0 msgr2=0x7f17340a8680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:58.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.522+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 shutdown_connections 2026-03-10T14:15:58.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.522+0000 7f173bfff700 1 -- 192.168.123.103:0/160127256 wait complete. 2026-03-10T14:15:58.530 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:15:58.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.643+0000 7f58ef259700 1 -- 192.168.123.103:0/341715365 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e8072440 msgr2=0x7f58e810be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:58.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.643+0000 7f58ef259700 1 --2- 192.168.123.103:0/341715365 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e8072440 0x7f58e810be90 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f58e4009b00 tx=0x7f58e4009e10 comp rx=0 tx=0).stop 2026-03-10T14:15:58.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.643+0000 7f58ef259700 1 -- 192.168.123.103:0/341715365 shutdown_connections 2026-03-10T14:15:58.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.643+0000 7f58ef259700 1 --2- 192.168.123.103:0/341715365 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e8072440 
0x7f58e810be90 secure :-1 s=CLOSED pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f58e4009b00 tx=0x7f58e4009e10 comp rx=0 tx=0).stop 2026-03-10T14:15:58.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.643+0000 7f58ef259700 1 --2- 192.168.123.103:0/341715365 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58e8071a60 0x7f58e8071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.643+0000 7f58ef259700 1 -- 192.168.123.103:0/341715365 >> 192.168.123.103:0/341715365 conn(0x7f58e806d1a0 msgr2=0x7f58e806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:58.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.645+0000 7f58ef259700 1 -- 192.168.123.103:0/341715365 shutdown_connections 2026-03-10T14:15:58.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.645+0000 7f58ef259700 1 -- 192.168.123.103:0/341715365 wait complete. 
2026-03-10T14:15:58.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.645+0000 7f58ef259700 1 Processor -- start 2026-03-10T14:15:58.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58ef259700 1 -- start start 2026-03-10T14:15:58.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58ef259700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58e8071a60 0x7f58e8117690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58ef259700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e81b0480 0x7f58e81b2860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58ef259700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58e81b2da0 con 0x7f58e81b0480 2026-03-10T14:15:58.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58ef259700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58e81b2f10 con 0x7f58e8071a60 2026-03-10T14:15:58.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58eda56700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e81b0480 0x7f58e81b2860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:58.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58eda56700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e81b0480 0x7f58e81b2860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:60386/0 (socket says 192.168.123.103:60386) 2026-03-10T14:15:58.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58eda56700 1 -- 192.168.123.103:0/986927281 learned_addr learned my addr 192.168.123.103:0/986927281 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:58.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58eda56700 1 -- 192.168.123.103:0/986927281 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58e8071a60 msgr2=0x7f58e8117690 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:15:58.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58eda56700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58e8071a60 0x7f58e8117690 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.646+0000 7f58eda56700 1 -- 192.168.123.103:0/986927281 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58e40097e0 con 0x7f58e81b0480 2026-03-10T14:15:58.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.647+0000 7f58eda56700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e81b0480 0x7f58e81b2860 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f58e40052a0 tx=0x7f58e4003680 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:58.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.647+0000 7f58df7fe700 1 -- 192.168.123.103:0/986927281 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f58e401d070 con 0x7f58e81b0480 2026-03-10T14:15:58.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.647+0000 7f58ef259700 1 -- 
192.168.123.103:0/986927281 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f58e81b3190 con 0x7f58e81b0480 2026-03-10T14:15:58.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.647+0000 7f58ef259700 1 -- 192.168.123.103:0/986927281 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f58e81b3680 con 0x7f58e81b0480 2026-03-10T14:15:58.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.647+0000 7f58df7fe700 1 -- 192.168.123.103:0/986927281 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f58e4003c10 con 0x7f58e81b0480 2026-03-10T14:15:58.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.647+0000 7f58df7fe700 1 -- 192.168.123.103:0/986927281 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f58e4017710 con 0x7f58e81b0480 2026-03-10T14:15:58.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.648+0000 7f58ef259700 1 -- 192.168.123.103:0/986927281 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f58cc005320 con 0x7f58e81b0480 2026-03-10T14:15:58.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.650+0000 7f58df7fe700 1 -- 192.168.123.103:0/986927281 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f58e4017870 con 0x7f58e81b0480 2026-03-10T14:15:58.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.651+0000 7f58df7fe700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f58d40778c0 0x7f58d4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.651+0000 7f58ee257700 1 --2- 192.168.123.103:0/986927281 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f58d40778c0 0x7f58d4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:58.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.651+0000 7f58df7fe700 1 -- 192.168.123.103:0/986927281 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f58e409b0e0 con 0x7f58e81b0480 2026-03-10T14:15:58.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.651+0000 7f58ee257700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f58d40778c0 0x7f58d4079d70 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f58e8072f50 tx=0x7f58e000b040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:58.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.653+0000 7f58df7fe700 1 -- 192.168.123.103:0/986927281 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f58e4063950 con 0x7f58e81b0480 2026-03-10T14:15:58.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.800+0000 7f58ef259700 1 -- 192.168.123.103:0/986927281 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f58cc000bf0 con 0x7f58d40778c0 2026-03-10T14:15:58.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.807+0000 7f58df7fe700 1 -- 192.168.123.103:0/986927281 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 (secure 0 0 0) 0x7f58cc000bf0 con 0x7f58d40778c0 2026-03-10T14:15:58.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.812+0000 7f58ef259700 
1 -- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f58d40778c0 msgr2=0x7f58d4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:58.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.812+0000 7f58ef259700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f58d40778c0 0x7f58d4079d70 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f58e8072f50 tx=0x7f58e000b040 comp rx=0 tx=0).stop 2026-03-10T14:15:58.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.812+0000 7f58ef259700 1 -- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e81b0480 msgr2=0x7f58e81b2860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:58.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.812+0000 7f58ef259700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e81b0480 0x7f58e81b2860 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f58e40052a0 tx=0x7f58e4003680 comp rx=0 tx=0).stop 2026-03-10T14:15:58.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.814+0000 7f58ef259700 1 -- 192.168.123.103:0/986927281 shutdown_connections 2026-03-10T14:15:58.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.814+0000 7f58ef259700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f58d40778c0 0x7f58d4079d70 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.814+0000 7f58ef259700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f58e8071a60 0x7f58e8117690 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T14:15:58.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.814+0000 7f58ef259700 1 --2- 192.168.123.103:0/986927281 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58e81b0480 0x7f58e81b2860 unknown :-1 s=CLOSED pgs=185 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.814+0000 7f58ef259700 1 -- 192.168.123.103:0/986927281 >> 192.168.123.103:0/986927281 conn(0x7f58e806d1a0 msgr2=0x7f58e810a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:58.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.814+0000 7f58ef259700 1 -- 192.168.123.103:0/986927281 shutdown_connections 2026-03-10T14:15:58.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.814+0000 7f58ef259700 1 -- 192.168.123.103:0/986927281 wait complete. 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.944+0000 7fb61076f700 1 -- 192.168.123.103:0/1577369288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb608102760 msgr2=0x7fb608102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.944+0000 7fb61076f700 1 --2- 192.168.123.103:0/1577369288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb608102760 0x7fb608102b70 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fb604009b00 tx=0x7fb604009e10 comp rx=0 tx=0).stop 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.944+0000 7fb61076f700 1 -- 192.168.123.103:0/1577369288 shutdown_connections 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.944+0000 7fb61076f700 1 --2- 192.168.123.103:0/1577369288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb608103960 0x7fb608103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.944+0000 7fb61076f700 1 --2- 192.168.123.103:0/1577369288 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb608102760 0x7fb608102b70 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.944+0000 7fb61076f700 1 -- 192.168.123.103:0/1577369288 >> 192.168.123.103:0/1577369288 conn(0x7fb6080fdcf0 msgr2=0x7fb608100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.944+0000 7fb61076f700 1 -- 192.168.123.103:0/1577369288 shutdown_connections 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.944+0000 7fb61076f700 1 -- 192.168.123.103:0/1577369288 wait complete. 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb61076f700 1 Processor -- start 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb61076f700 1 -- start start 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb61076f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb608102760 0x7fb608197e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb61076f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb608103960 0x7fb6081983a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.946 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb61076f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb6081989c0 con 0x7fb608103960 
2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb61076f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb608198b00 con 0x7fb608102760 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb60dd0a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb608103960 0x7fb6081983a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb60dd0a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb608103960 0x7fb6081983a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60406/0 (socket says 192.168.123.103:60406) 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb60dd0a700 1 -- 192.168.123.103:0/3893332438 learned_addr learned my addr 192.168.123.103:0/3893332438 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.945+0000 7fb60e50b700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb608102760 0x7fb608197e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.946+0000 7fb60dd0a700 1 -- 192.168.123.103:0/3893332438 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb608102760 msgr2=0x7fb608197e60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.946+0000 
7fb60dd0a700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb608102760 0x7fb608197e60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.946+0000 7fb60dd0a700 1 -- 192.168.123.103:0/3893332438 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb6040097e0 con 0x7fb608103960 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.946+0000 7fb60dd0a700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb608103960 0x7fb6081983a0 secure :-1 s=READY pgs=186 cs=0 l=1 rev1=1 crypto rx=0x7fb5f800d8d0 tx=0x7fb5f800dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.946+0000 7fb5ff7fe700 1 -- 192.168.123.103:0/3893332438 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb5f8009880 con 0x7fb608103960 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.946+0000 7fb61076f700 1 -- 192.168.123.103:0/3893332438 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb60819d5b0 con 0x7fb608103960 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.946+0000 7fb61076f700 1 -- 192.168.123.103:0/3893332438 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb60819daa0 con 0x7fb608103960 2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.947+0000 7fb5ff7fe700 1 -- 192.168.123.103:0/3893332438 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb5f8010460 con 0x7fb608103960 
2026-03-10T14:15:58.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.947+0000 7fb5ff7fe700 1 -- 192.168.123.103:0/3893332438 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb5f800f5d0 con 0x7fb608103960 2026-03-10T14:15:58.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.947+0000 7fb5fd7fa700 1 -- 192.168.123.103:0/3893332438 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb5ec005320 con 0x7fb608103960 2026-03-10T14:15:58.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.949+0000 7fb5ff7fe700 1 -- 192.168.123.103:0/3893332438 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb5f80099e0 con 0x7fb608103960 2026-03-10T14:15:58.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.949+0000 7fb5ff7fe700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb5f4077870 0x7fb5f4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:58.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.951+0000 7fb60e50b700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb5f4077870 0x7fb5f4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:58.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.951+0000 7fb5ff7fe700 1 -- 192.168.123.103:0/3893332438 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fb5f8066ad0 con 0x7fb608103960 2026-03-10T14:15:58.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.951+0000 7fb60e50b700 1 --2- 192.168.123.103:0/3893332438 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb5f4077870 0x7fb5f4079d20 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fb60400b5c0 tx=0x7fb604009f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:58.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:58.954+0000 7fb5ff7fe700 1 -- 192.168.123.103:0/3893332438 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb5f80624f0 con 0x7fb608103960 2026-03-10T14:15:59.145 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.143+0000 7fb5fd7fa700 1 -- 192.168.123.103:0/3893332438 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fb5ec000bf0 con 0x7fb5f4077870 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (10s) 9s ago 12m 15.0M - 0.25.0 c8568f914cd2 cf55e8b6005f 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (35s) 9s ago 12m 10.4M - 19.2.3-678-ge911bdeb 654f31e6858e 1bd623640ecf 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (33s) 22s ago 12m 10.1M - 19.2.3-678-ge911bdeb 654f31e6858e 54bbafe0555e 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (6m) 9s ago 12m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (6m) 22s ago 12m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 
starting - - - - 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (62s) 9s ago 10m 80.8M - 19.2.3-678-ge911bdeb 654f31e6858e b1023e0bcace 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (55s) 9s ago 10m 19.6M - 19.2.3-678-ge911bdeb 654f31e6858e ddad42dde865 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (40s) 22s ago 10m 18.3M - 19.2.3-678-ge911bdeb 654f31e6858e f5060c63df19 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (48s) 22s ago 10m 79.0M - 19.2.3-678-ge911bdeb 654f31e6858e 89fac44ae15d 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (7m) 9s ago 13m 629M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (7m) 22s ago 12m 499M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (7m) 9s ago 13m 69.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (6m) 22s ago 12m 55.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (27s) 9s ago 12m 8808k - 1.7.0 72c9c2088986 e10d80d39d78 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (23s) 22s ago 12m 4874k - 1.7.0 72c9c2088986 29f4915b7954 2026-03-10T14:15:59.151 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (6m) 9s ago 11m 284M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 9s ago 11m 234M 
4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 9s ago 11m 171M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e0d768b70468 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (2m) 22s ago 11m 150M 4096M 19.2.3-678-ge911bdeb 654f31e6858e f7fc2aafa9d9 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (2m) 22s ago 11m 124M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 89f9225212d4 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (112s) 22s ago 10m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6c7573f5f3fa 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (14s) 9s ago 12m 53.2M - 2.51.0 1d3b7f56885b 465f119ec393 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.150+0000 7fb5ff7fe700 1 -- 192.168.123.103:0/3893332438 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fb5ec000bf0 con 0x7fb5f4077870 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 -- 192.168.123.103:0/3893332438 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb5f4077870 msgr2=0x7fb5f4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb5f4077870 0x7fb5f4079d20 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fb60400b5c0 tx=0x7fb604009f90 comp rx=0 tx=0).stop 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 -- 192.168.123.103:0/3893332438 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb608103960 msgr2=0x7fb6081983a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb608103960 0x7fb6081983a0 secure :-1 s=READY pgs=186 cs=0 l=1 rev1=1 crypto rx=0x7fb5f800d8d0 tx=0x7fb5f800dbe0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 -- 192.168.123.103:0/3893332438 shutdown_connections 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb5f4077870 0x7fb5f4079d20 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb608102760 0x7fb608197e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 --2- 192.168.123.103:0/3893332438 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb608103960 0x7fb6081983a0 unknown :-1 s=CLOSED pgs=186 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 -- 192.168.123.103:0/3893332438 >> 192.168.123.103:0/3893332438 conn(0x7fb6080fdcf0 msgr2=0x7fb6080fffc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 -- 
192.168.123.103:0/3893332438 shutdown_connections 2026-03-10T14:15:59.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.154+0000 7fb61076f700 1 -- 192.168.123.103:0/3893332438 wait complete. 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.269+0000 7f1153217700 1 -- 192.168.123.103:0/201223239 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c072360 msgr2=0x7f114c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.269+0000 7f1153217700 1 --2- 192.168.123.103:0/201223239 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c072360 0x7f114c0770e0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f114400b3a0 tx=0x7f114400b6b0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.269+0000 7f1153217700 1 -- 192.168.123.103:0/201223239 shutdown_connections 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.269+0000 7f1153217700 1 --2- 192.168.123.103:0/201223239 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c072360 0x7f114c0770e0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.269+0000 7f1153217700 1 --2- 192.168.123.103:0/201223239 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f114c071980 0x7f114c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.269+0000 7f1153217700 1 -- 192.168.123.103:0/201223239 >> 192.168.123.103:0/201223239 conn(0x7f114c06d1a0 msgr2=0x7f114c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.269+0000 7f1153217700 
1 -- 192.168.123.103:0/201223239 shutdown_connections 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.269+0000 7f1153217700 1 -- 192.168.123.103:0/201223239 wait complete. 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f1153217700 1 Processor -- start 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f1153217700 1 -- start start 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f1153217700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f114c071980 0x7f114c082580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f1153217700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c082ac0 0x7f114c082f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f1153217700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f114c1b2a90 con 0x7f114c071980 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f1153217700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f114c1b2bd0 con 0x7f114c082ac0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f114bfff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c082ac0 0x7f114c082f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f114bfff700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c082ac0 0x7f114c082f30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:54004/0 (socket says 192.168.123.103:54004) 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f114bfff700 1 -- 192.168.123.103:0/1102414870 learned_addr learned my addr 192.168.123.103:0/1102414870 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f114bfff700 1 -- 192.168.123.103:0/1102414870 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f114c071980 msgr2=0x7f114c082580 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f114bfff700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f114c071980 0x7f114c082580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f114bfff700 1 -- 192.168.123.103:0/1102414870 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f113c009710 con 0x7f114c082ac0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.270+0000 7f114bfff700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c082ac0 0x7f114c082f30 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f11440095a0 tx=0x7f11440096c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.271+0000 7f1149ffb700 1 -- 192.168.123.103:0/1102414870 <== mon.1 
v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f114400e050 con 0x7f114c082ac0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.271+0000 7f1149ffb700 1 -- 192.168.123.103:0/1102414870 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1144007e90 con 0x7f114c082ac0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.271+0000 7f1149ffb700 1 -- 192.168.123.103:0/1102414870 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f11440049a0 con 0x7f114c082ac0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.274+0000 7f1153217700 1 -- 192.168.123.103:0/1102414870 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f114400b050 con 0x7f114c082ac0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.274+0000 7f1153217700 1 -- 192.168.123.103:0/1102414870 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f114c1b3100 con 0x7f114c082ac0 2026-03-10T14:15:59.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.274+0000 7f1153217700 1 -- 192.168.123.103:0/1102414870 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f114c07c8b0 con 0x7f114c082ac0 2026-03-10T14:15:59.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.278+0000 7f1149ffb700 1 -- 192.168.123.103:0/1102414870 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1144019070 con 0x7f114c082ac0 2026-03-10T14:15:59.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.278+0000 7f1149ffb700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f11340778c0 
0x7f1134079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:59.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.278+0000 7f1149ffb700 1 -- 192.168.123.103:0/1102414870 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f11440a35e0 con 0x7f114c082ac0 2026-03-10T14:15:59.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.279+0000 7f1150fb3700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f11340778c0 0x7f1134079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:59.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.279+0000 7f1149ffb700 1 -- 192.168.123.103:0/1102414870 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f114406bcd0 con 0x7f114c082ac0 2026-03-10T14:15:59.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.289+0000 7f1150fb3700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f11340778c0 0x7f1134079d70 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f113c005d90 tx=0x7f113c005ce0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:59.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.534+0000 7f1153217700 1 -- 192.168.123.103:0/1102414870 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f114c02d050 con 0x7f114c082ac0 2026-03-10T14:15:59.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.535+0000 7f1149ffb700 1 -- 192.168.123.103:0/1102414870 <== mon.1 v2:192.168.123.104:3300/0 7 ==== 
mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f114406b420 con 0x7f114c082ac0 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:15:59.534 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:15:59.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 -- 192.168.123.103:0/1102414870 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f11340778c0 msgr2=0x7f1134079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f11340778c0 0x7f1134079d70 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f113c005d90 tx=0x7f113c005ce0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 -- 192.168.123.103:0/1102414870 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c082ac0 msgr2=0x7f114c082f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c082ac0 0x7f114c082f30 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f11440095a0 tx=0x7f11440096c0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 -- 192.168.123.103:0/1102414870 shutdown_connections 2026-03-10T14:15:59.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f11340778c0 0x7f1134079d70 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f114c071980 0x7f114c082580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.537 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 --2- 192.168.123.103:0/1102414870 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f114c082ac0 0x7f114c082f30 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 -- 192.168.123.103:0/1102414870 >> 192.168.123.103:0/1102414870 conn(0x7f114c06d1a0 msgr2=0x7f114c0764b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:59.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 -- 192.168.123.103:0/1102414870 shutdown_connections 2026-03-10T14:15:59.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.538+0000 7f11337fe700 1 -- 192.168.123.103:0/1102414870 wait complete. 2026-03-10T14:15:59.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.619+0000 7fa3ede36700 1 -- 192.168.123.103:0/2064864230 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 msgr2=0x7fa3e8105cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.619+0000 7fa3ede36700 1 --2- 192.168.123.103:0/2064864230 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 0x7fa3e8105cb0 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7fa3d8009b50 tx=0x7fa3d8009e60 comp rx=0 tx=0).stop 2026-03-10T14:15:59.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.620+0000 7fa3ede36700 1 -- 192.168.123.103:0/2064864230 shutdown_connections 2026-03-10T14:15:59.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.620+0000 7fa3ede36700 1 --2- 192.168.123.103:0/2064864230 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 0x7fa3e8105cb0 unknown :-1 s=CLOSED pgs=187 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:15:59.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.620+0000 7fa3ede36700 1 --2- 192.168.123.103:0/2064864230 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3e8100fb0 0x7fa3e8103390 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.620+0000 7fa3ede36700 1 -- 192.168.123.103:0/2064864230 >> 192.168.123.103:0/2064864230 conn(0x7fa3e80fa990 msgr2=0x7fa3e80fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:59.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.620+0000 7fa3ede36700 1 -- 192.168.123.103:0/2064864230 shutdown_connections 2026-03-10T14:15:59.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.620+0000 7fa3ede36700 1 -- 192.168.123.103:0/2064864230 wait complete. 2026-03-10T14:15:59.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3ede36700 1 Processor -- start 2026-03-10T14:15:59.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3ede36700 1 -- start start 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3ede36700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3e8100fb0 0x7fa3e8193bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3ede36700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 0x7fa3e8194100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3ede36700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3e8194720 con 0x7fa3e81038d0 2026-03-10T14:15:59.620 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3ede36700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3e8194860 con 0x7fa3e8100fb0 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3e77fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3e8100fb0 0x7fa3e8193bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3e77fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3e8100fb0 0x7fa3e8193bc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:54016/0 (socket says 192.168.123.103:54016) 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3e77fe700 1 -- 192.168.123.103:0/1141029427 learned_addr learned my addr 192.168.123.103:0/1141029427 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.621+0000 7fa3e77fe700 1 -- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 msgr2=0x7fa3e8194100 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3e6ffd700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 0x7fa3e8194100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3e77fe700 1 --2- 
192.168.123.103:0/1141029427 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 0x7fa3e8194100 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3e77fe700 1 -- 192.168.123.103:0/1141029427 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa3d80097e0 con 0x7fa3e8100fb0 2026-03-10T14:15:59.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3e6ffd700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 0x7fa3e8194100 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:15:59.621 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3e77fe700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3e8100fb0 0x7fa3e8193bc0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fa3d000eb10 tx=0x7fa3d000eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:59.621 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3e4ff9700 1 -- 192.168.123.103:0/1141029427 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3d000cca0 con 0x7fa3e8100fb0 2026-03-10T14:15:59.621 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa3e8199310 con 0x7fa3e8100fb0 2026-03-10T14:15:59.621 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fa3e8199860 con 0x7fa3e8100fb0 2026-03-10T14:15:59.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3e4ff9700 1 -- 192.168.123.103:0/1141029427 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa3d000ce00 con 0x7fa3e8100fb0 2026-03-10T14:15:59.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.622+0000 7fa3e4ff9700 1 -- 192.168.123.103:0/1141029427 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3d0018990 con 0x7fa3e8100fb0 2026-03-10T14:15:59.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.623+0000 7fa3e4ff9700 1 -- 192.168.123.103:0/1141029427 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa3d0018b50 con 0x7fa3e8100fb0 2026-03-10T14:15:59.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.624+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa3e818ddd0 con 0x7fa3e8100fb0 2026-03-10T14:15:59.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.624+0000 7fa3e4ff9700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa3d407bd30 0x7fa3d407e1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:59.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.624+0000 7fa3e6ffd700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa3d407bd30 0x7fa3d407e1e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:59.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.624+0000 7fa3e4ff9700 1 -- 
192.168.123.103:0/1141029427 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fa3d0014070 con 0x7fa3e8100fb0 2026-03-10T14:15:59.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.624+0000 7fa3e6ffd700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa3d407bd30 0x7fa3d407e1e0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fa3d800b5c0 tx=0x7fa3d8005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:59.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.627+0000 7fa3e4ff9700 1 -- 192.168.123.103:0/1141029427 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa3d0062b90 con 0x7fa3e8100fb0 2026-03-10T14:15:59.736 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:59 vm03.local ceph-mon[103098]: from='client.34514 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:59.736 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:59.736 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:15:59.736 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:59 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:15:59.736 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:59 vm03.local ceph-mon[103098]: pgmap v294: 65 pgs: 1 active+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 
op/s; 0 B/s, 5 objects/s recovering 2026-03-10T14:15:59.736 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:59 vm03.local ceph-mon[103098]: from='client.34518 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:59.736 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:59 vm03.local ceph-mon[103098]: from='client.34522 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:15:59.736 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:15:59 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1102414870' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:15:59.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.779+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fa3e8066e40 con 0x7fa3e8100fb0 2026-03-10T14:15:59.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.780+0000 7fa3e4ff9700 1 -- 192.168.123.103:0/1141029427 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 40 v40) v1 ==== 76+0+2006 (secure 0 0 0) 0x7fa3d00622e0 con 0x7fa3e8100fb0 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:e40 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-10T14:15:22:636548+0000 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:legacy client 
fscid: 1 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:epoch 40 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps allow_standby_replay 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:15:22.636520+0000 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 121 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 
2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:up {0=34474,1=34476} 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 34474 members: 34476,34474 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{0:34474} state up:active seq 8 join_fscid=1 addr [v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:15:59.781 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:34480} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.104:6824/3487856112,v1:192.168.123.104:6825/3487856112] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:15:59.782 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{1:34476} state up:active seq 6 join_fscid=1 addr [v2:192.168.123.103:6828/2389872767,v1:192.168.123.103:6829/2389872767] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:15:59.782 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:44351} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.104:6826/3930678535,v1:192.168.123.104:6827/3930678535] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:15:59.782 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:15:59.782 
INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:15:59.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.784+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa3d407bd30 msgr2=0x7fa3d407e1e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.784+0000 7fa3ede36700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa3d407bd30 0x7fa3d407e1e0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fa3d800b5c0 tx=0x7fa3d8005fb0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.784+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3e8100fb0 msgr2=0x7fa3e8193bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.784+0000 7fa3ede36700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3e8100fb0 0x7fa3e8193bc0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fa3d000eb10 tx=0x7fa3d000eed0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.784+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 shutdown_connections 2026-03-10T14:15:59.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.785+0000 7fa3ede36700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa3d407bd30 0x7fa3d407e1e0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.785+0000 7fa3ede36700 1 --2- 192.168.123.103:0/1141029427 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa3e8100fb0 0x7fa3e8193bc0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.785+0000 7fa3ede36700 1 --2- 192.168.123.103:0/1141029427 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e81038d0 0x7fa3e8194100 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.784 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.785+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 >> 192.168.123.103:0/1141029427 conn(0x7fa3e80fa990 msgr2=0x7fa3e80fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:59.784 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.785+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 shutdown_connections 2026-03-10T14:15:59.784 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.785+0000 7fa3ede36700 1 -- 192.168.123.103:0/1141029427 wait complete. 
2026-03-10T14:15:59.784 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 40 2026-03-10T14:15:59.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.869+0000 7f9af3fff700 1 -- 192.168.123.103:0/2941767702 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4071950 msgr2=0x7f9af4071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.869+0000 7f9af3fff700 1 --2- 192.168.123.103:0/2941767702 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4071950 0x7f9af4071d60 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 crypto rx=0x7f9aec00b600 tx=0x7f9aec00b910 comp rx=0 tx=0).stop 2026-03-10T14:15:59.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 -- 192.168.123.103:0/2941767702 shutdown_connections 2026-03-10T14:15:59.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 --2- 192.168.123.103:0/2941767702 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9af4072330 0x7f9af40770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 --2- 192.168.123.103:0/2941767702 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4071950 0x7f9af4071d60 unknown :-1 s=CLOSED pgs=188 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 -- 192.168.123.103:0/2941767702 >> 192.168.123.103:0/2941767702 conn(0x7f9af406d1a0 msgr2=0x7f9af406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 -- 192.168.123.103:0/2941767702 shutdown_connections 2026-03-10T14:15:59.870 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 -- 192.168.123.103:0/2941767702 wait complete. 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 Processor -- start 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 -- start start 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4071950 0x7f9af40824b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9af4072330 0x7f9af40829f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9af4082f30 con 0x7f9af4071950 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.870+0000 7f9af3fff700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9af4083070 con 0x7f9af4072330 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9af27fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9af4072330 0x7f9af40829f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9af27fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9af4072330 0x7f9af40829f0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:54032/0 (socket says 192.168.123.103:54032) 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9af27fc700 1 -- 192.168.123.103:0/1581834894 learned_addr learned my addr 192.168.123.103:0/1581834894 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9af27fc700 1 -- 192.168.123.103:0/1581834894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4071950 msgr2=0x7f9af40824b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9af27fc700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4071950 0x7f9af40824b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9af27fc700 1 -- 192.168.123.103:0/1581834894 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9aec00b050 con 0x7f9af4072330 2026-03-10T14:15:59.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9af27fc700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9af4072330 0x7f9af40829f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f9ae4012a70 tx=0x7f9ae4012e30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:15:59.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9adbfff700 1 -- 192.168.123.103:0/1581834894 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ae4009730 con 
0x7f9af4072330 2026-03-10T14:15:59.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.871+0000 7f9adbfff700 1 -- 192.168.123.103:0/1581834894 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9ae4009d70 con 0x7f9af4072330 2026-03-10T14:15:59.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.872+0000 7f9adbfff700 1 -- 192.168.123.103:0/1581834894 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ae400c400 con 0x7f9af4072330 2026-03-10T14:15:59.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.874+0000 7f9af3fff700 1 -- 192.168.123.103:0/1581834894 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9af4083300 con 0x7f9af4072330 2026-03-10T14:15:59.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.874+0000 7f9af3fff700 1 -- 192.168.123.103:0/1581834894 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9af41b2dd0 con 0x7f9af4072330 2026-03-10T14:15:59.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.874+0000 7f9ad9ffb700 1 -- 192.168.123.103:0/1581834894 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9ad40052f0 con 0x7f9af4072330 2026-03-10T14:15:59.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.878+0000 7f9adbfff700 1 -- 192.168.123.103:0/1581834894 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9ae401a410 con 0x7f9af4072330 2026-03-10T14:15:59.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.880+0000 7f9adbfff700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9adc0776d0 0x7f9adc079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T14:15:59.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.880+0000 7f9adbfff700 1 -- 192.168.123.103:0/1581834894 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f9ae4029080 con 0x7f9af4072330 2026-03-10T14:15:59.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.880+0000 7f9adbfff700 1 -- 192.168.123.103:0/1581834894 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9ae409d960 con 0x7f9af4072330 2026-03-10T14:15:59.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.889+0000 7f9af2ffd700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9adc0776d0 0x7f9adc079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:15:59.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:15:59.889+0000 7f9af2ffd700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9adc0776d0 0x7f9adc079b80 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f9aec009250 tx=0x7f9aec009f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:00.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.019+0000 7f9ad9ffb700 1 -- 192.168.123.103:0/1581834894 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9ad4000bc0 con 0x7f9adc0776d0 2026-03-10T14:16:00.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.025+0000 7f9adbfff700 1 -- 192.168.123.103:0/1581834894 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 
(secure 0 0 0) 0x7f9ad4000bc0 con 0x7f9adc0776d0 2026-03-10T14:16:00.023 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:16:00.023 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T14:16:00.023 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-10T14:16:00.023 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "mds", 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "ceph-exporter", 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "mgr", 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "osd" 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "18/23 daemons upgraded", 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading grafana daemons", 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:16:00.024 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:16:00.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 -- 192.168.123.103:0/1581834894 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9adc0776d0 msgr2=0x7f9adc079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:00.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 --2- 192.168.123.103:0/1581834894 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9adc0776d0 0x7f9adc079b80 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f9aec009250 tx=0x7f9aec009f90 comp rx=0 tx=0).stop 2026-03-10T14:16:00.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 -- 192.168.123.103:0/1581834894 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9af4072330 msgr2=0x7f9af40829f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:00.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9af4072330 0x7f9af40829f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f9ae4012a70 tx=0x7f9ae4012e30 comp rx=0 tx=0).stop 2026-03-10T14:16:00.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 -- 192.168.123.103:0/1581834894 shutdown_connections 2026-03-10T14:16:00.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9adc0776d0 0x7f9adc079b80 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:00.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4071950 0x7f9af40824b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:00.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 --2- 192.168.123.103:0/1581834894 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9af4072330 0x7f9af40829f0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:16:00.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 -- 192.168.123.103:0/1581834894 >> 192.168.123.103:0/1581834894 conn(0x7f9af406d1a0 msgr2=0x7f9af40758d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:00.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 -- 192.168.123.103:0/1581834894 shutdown_connections 2026-03-10T14:16:00.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.028+0000 7f9af3fff700 1 -- 192.168.123.103:0/1581834894 wait complete. 2026-03-10T14:16:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:59 vm04.local ceph-mon[92084]: from='client.34514 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:59 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:16:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:59 vm04.local ceph-mon[92084]: pgmap v294: 65 pgs: 1 active+undersized+remapped, 64 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.9 KiB/s rd, 3 op/s; 0 B/s, 5 objects/s recovering 2026-03-10T14:16:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:59 vm04.local ceph-mon[92084]: from='client.34518 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 
14:15:59 vm04.local ceph-mon[92084]: from='client.34522 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:00.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:15:59 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1102414870' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:00.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.102+0000 7fd50f59e700 1 -- 192.168.123.103:0/4140711600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd510072330 msgr2=0x7fd5100770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:00.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.102+0000 7fd50f59e700 1 --2- 192.168.123.103:0/4140711600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd510072330 0x7fd5100770b0 secure :-1 s=READY pgs=189 cs=0 l=1 rev1=1 crypto rx=0x7fd508009230 tx=0x7fd508009260 comp rx=0 tx=0).stop 2026-03-10T14:16:00.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.102+0000 7fd50f59e700 1 -- 192.168.123.103:0/4140711600 shutdown_connections 2026-03-10T14:16:00.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.102+0000 7fd50f59e700 1 --2- 192.168.123.103:0/4140711600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd510072330 0x7fd5100770b0 unknown :-1 s=CLOSED pgs=189 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:00.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.102+0000 7fd50f59e700 1 --2- 192.168.123.103:0/4140711600 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd510071950 0x7fd510071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:00.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.102+0000 7fd50f59e700 1 -- 192.168.123.103:0/4140711600 >> 192.168.123.103:0/4140711600 conn(0x7fd51006d1a0 
msgr2=0x7fd51006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:00.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.102+0000 7fd50f59e700 1 -- 192.168.123.103:0/4140711600 shutdown_connections 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.102+0000 7fd50f59e700 1 -- 192.168.123.103:0/4140711600 wait complete. 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50f59e700 1 Processor -- start 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50f59e700 1 -- start start 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50f59e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd510071950 0x7fd5100824a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50f59e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5100829e0 0x7fd510082e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50f59e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd510083e50 con 0x7fd5100829e0 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50f59e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd51012dd80 con 0x7fd510071950 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5100829e0 0x7fd510082e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5100829e0 0x7fd510082e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60496/0 (socket says 192.168.123.103:60496) 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50dd9b700 1 -- 192.168.123.103:0/2889384040 learned_addr learned my addr 192.168.123.103:0/2889384040 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50dd9b700 1 -- 192.168.123.103:0/2889384040 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd510071950 msgr2=0x7fd5100824a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50dd9b700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd510071950 0x7fd5100824a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.103+0000 7fd50dd9b700 1 -- 192.168.123.103:0/2889384040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd508008ee0 con 0x7fd5100829e0 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.104+0000 7fd50dd9b700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5100829e0 0x7fd510082e50 secure :-1 s=READY pgs=190 cs=0 l=1 rev1=1 crypto rx=0x7fd508004740 tx=0x7fd508004820 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:00.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.104+0000 7fd4ff7fe700 1 -- 192.168.123.103:0/2889384040 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd50801d070 con 0x7fd5100829e0 2026-03-10T14:16:00.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.104+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd51012dfa0 con 0x7fd5100829e0 2026-03-10T14:16:00.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.104+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd51012e490 con 0x7fd5100829e0 2026-03-10T14:16:00.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.106+0000 7fd4ff7fe700 1 -- 192.168.123.103:0/2889384040 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd50800ece0 con 0x7fd5100829e0 2026-03-10T14:16:00.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.106+0000 7fd4ff7fe700 1 -- 192.168.123.103:0/2889384040 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd508016b40 con 0x7fd5100829e0 2026-03-10T14:16:00.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.107+0000 7fd4ff7fe700 1 -- 192.168.123.103:0/2889384040 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd508016ca0 con 0x7fd5100829e0 2026-03-10T14:16:00.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.107+0000 7fd4ff7fe700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd4f8077910 0x7fd4f8079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:00.107 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.108+0000 7fd4ff7fe700 1 -- 192.168.123.103:0/2889384040 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fd508012070 con 0x7fd5100829e0 2026-03-10T14:16:00.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.109+0000 7fd50e59c700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd4f8077910 0x7fd4f8079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:00.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.110+0000 7fd50e59c700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd4f8077910 0x7fd4f8079dc0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fd50000b3c0 tx=0x7fd50000d040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:00.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.110+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd4f0005320 con 0x7fd5100829e0 2026-03-10T14:16:00.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.114+0000 7fd4ff7fe700 1 -- 192.168.123.103:0/2889384040 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd508064b60 con 0x7fd5100829e0 2026-03-10T14:16:00.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.284+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fd4f0005190 con 0x7fd5100829e0 
2026-03-10T14:16:00.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.286+0000 7fd4ff7fe700 1 -- 192.168.123.103:0/2889384040 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+216 (secure 0 0 0) 0x7fd508026090 con 0x7fd5100829e0 2026-03-10T14:16:00.286 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN Degraded data redundancy: 1 pg undersized 2026-03-10T14:16:00.286 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1 pg undersized 2026-03-10T14:16:00.286 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is stuck undersized for 102s, current state active+undersized+remapped, last acting [0,2] 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.289+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd4f8077910 msgr2=0x7fd4f8079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.289+0000 7fd50f59e700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd4f8077910 0x7fd4f8079dc0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fd50000b3c0 tx=0x7fd50000d040 comp rx=0 tx=0).stop 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.289+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5100829e0 msgr2=0x7fd510082e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.289+0000 7fd50f59e700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5100829e0 0x7fd510082e50 secure :-1 s=READY pgs=190 cs=0 l=1 rev1=1 crypto rx=0x7fd508004740 
tx=0x7fd508004820 comp rx=0 tx=0).stop 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.290+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 shutdown_connections 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.290+0000 7fd50f59e700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd4f8077910 0x7fd4f8079dc0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.290+0000 7fd50f59e700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd510071950 0x7fd5100824a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.290+0000 7fd50f59e700 1 --2- 192.168.123.103:0/2889384040 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd5100829e0 0x7fd510082e50 unknown :-1 s=CLOSED pgs=190 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:00.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.290+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 >> 192.168.123.103:0/2889384040 conn(0x7fd51006d1a0 msgr2=0x7fd5100764b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:00.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.290+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 shutdown_connections 2026-03-10T14:16:00.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:00.290+0000 7fd50f59e700 1 -- 192.168.123.103:0/2889384040 wait complete. 2026-03-10T14:16:01.027 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:00 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1141029427' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:16:01.027 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:00 vm03.local ceph-mon[103098]: from='client.44389 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:01.027 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:00 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2889384040' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:16:01.027 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:01.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:01.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:16:01.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:00 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1141029427' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:16:01.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:00 vm04.local ceph-mon[92084]: from='client.44389 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:01.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:00 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2889384040' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:16:01.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:01.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:01.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: pgmap v295: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 0 B/s, 5 objects/s recovering 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 1 pg undersized) 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: Cluster is now healthy 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 
vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", 
"name": "container_image", "who": "mds"}]': finished 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T14:16:02.109 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T14:16:02.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: pgmap v295: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 4 op/s; 0 B/s, 5 objects/s recovering 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 1 pg undersized) 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: Cluster is now 
healthy 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T14:16:02.314 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T14:16:02.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: 
dispatch 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T14:16:02.315 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: Upgrade: Finalizing container_image settings 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 
cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: Upgrade: Complete! 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 
vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:16:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: Upgrade: Finalizing container_image settings 2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 
2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-10T14:16:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: Upgrade: Complete!
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:16:03.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:16:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:04 vm04.local ceph-mon[92084]: pgmap v296: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 0 B/s, 5 objects/s recovering
2026-03-10T14:16:04.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:04 vm03.local ceph-mon[103098]: pgmap v296: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.9 KiB/s rd, 4 op/s; 0 B/s, 5 objects/s recovering
2026-03-10T14:16:05.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:05 vm04.local ceph-mon[92084]: pgmap v297: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.6 KiB/s rd, 3 op/s
2026-03-10T14:16:05.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:05 vm03.local ceph-mon[103098]: pgmap v297: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.6 KiB/s rd, 3 op/s
2026-03-10T14:16:07.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:06 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:16:07.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:06 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:16:08.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:07 vm04.local ceph-mon[92084]: pgmap v298: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:16:08.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:07 vm03.local ceph-mon[103098]: pgmap v298: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:16:10.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:09 vm04.local ceph-mon[92084]: pgmap v299: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:10.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:09 vm03.local ceph-mon[103098]: pgmap v299: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:11 vm04.local ceph-mon[92084]: pgmap v300: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:12.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:11 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:16:12.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:11 vm03.local ceph-mon[103098]: pgmap v300: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:11 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:16:14.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:13 vm04.local ceph-mon[92084]: pgmap v301: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:16:14.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:13 vm03.local ceph-mon[103098]: pgmap v301: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:16:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:15 vm04.local ceph-mon[92084]: pgmap v302: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:16:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:16:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:16:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:15 vm03.local ceph-mon[103098]: pgmap v302: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:16:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:16:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:16:18.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:17 vm04.local ceph-mon[92084]: pgmap v303: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:18.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:17 vm03.local ceph-mon[103098]: pgmap v303: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:20.039 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:19 vm03.local ceph-mon[103098]: pgmap v304: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:20.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:19 vm04.local ceph-mon[92084]: pgmap v304: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:21.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:21 vm04.local ceph-mon[92084]: pgmap v305: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:22.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:21 vm03.local ceph-mon[103098]: pgmap v305: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:24.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:23 vm04.local ceph-mon[92084]: pgmap v306: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:16:24.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:23 vm03.local ceph-mon[103098]: pgmap v306: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:16:26.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:25 vm04.local ceph-mon[92084]: pgmap v307: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:16:26.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:25 vm03.local ceph-mon[103098]: pgmap v307: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:16:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:27 vm04.local ceph-mon[92084]: pgmap v308: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:28.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:27 vm03.local ceph-mon[103098]: pgmap v308: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:29 vm04.local ceph-mon[92084]: pgmap v309: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:29 vm03.local ceph-mon[103098]: pgmap v309: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.363+0000 7fedc3074700 1 -- 192.168.123.103:0/949641711 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc100aa0 msgr2=0x7fedbc100eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.363+0000 7fedc3074700 1 --2- 192.168.123.103:0/949641711 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc100aa0 0x7fedbc100eb0 secure :-1 s=READY pgs=191 cs=0 l=1 rev1=1 crypto rx=0x7fedac009b50 tx=0x7fedac009e60 comp rx=0 tx=0).stop
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.363+0000 7fedc3074700 1 -- 192.168.123.103:0/949641711 shutdown_connections
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.363+0000 7fedc3074700 1 --2- 192.168.123.103:0/949641711 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedbc101480 0x7fedbc0fe650 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.363+0000 7fedc3074700 1 --2- 192.168.123.103:0/949641711 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc100aa0 0x7fedbc100eb0 unknown :-1 s=CLOSED pgs=191 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.363+0000 7fedc3074700 1 -- 192.168.123.103:0/949641711 >> 192.168.123.103:0/949641711 conn(0x7fedbc0fa1a0 msgr2=0x7fedbc0fc5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.363+0000 7fedc3074700 1 -- 192.168.123.103:0/949641711 shutdown_connections
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.363+0000 7fedc3074700 1 -- 192.168.123.103:0/949641711 wait complete.
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedc3074700 1 Processor -- start
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedc3074700 1 -- start start
2026-03-10T14:16:30.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedc3074700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedbc100aa0 0x7fedbc1980d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedc3074700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc101480 0x7fedbc198610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedc3074700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fedbc198c30 con 0x7fedbc101480
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedc3074700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fedbc198d70 con 0x7fedbc100aa0
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedbbfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc101480 0x7fedbc198610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedbbfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc101480 0x7fedbc198610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48272/0 (socket says 192.168.123.103:48272)
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedbbfff700 1 -- 192.168.123.103:0/2693786979 learned_addr learned my addr 192.168.123.103:0/2693786979 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedbbfff700 1 -- 192.168.123.103:0/2693786979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedbc100aa0 msgr2=0x7fedbc1980d0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedc0e10700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedbc100aa0 0x7fedbc1980d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedbbfff700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedbc100aa0 0x7fedbc1980d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedbbfff700 1 -- 192.168.123.103:0/2693786979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fedac0097e0 con 0x7fedbc101480
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.365+0000 7fedbbfff700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc101480 0x7fedbc198610 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7fedb000d900 tx=0x7fedb000dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:16:30.364 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.366+0000 7fedb9ffb700 1 -- 192.168.123.103:0/2693786979 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fedb00098e0 con 0x7fedbc101480
2026-03-10T14:16:30.365 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.366+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fedbc19d820 con 0x7fedbc101480
2026-03-10T14:16:30.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.367+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fedbc19dd70 con 0x7fedbc101480
2026-03-10T14:16:30.366 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.367+0000 7fedb9ffb700 1 -- 192.168.123.103:0/2693786979 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fedb0010460 con 0x7fedbc101480
2026-03-10T14:16:30.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.367+0000 7fedb9ffb700 1 -- 192.168.123.103:0/2693786979 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fedb000f5d0 con 0x7fedbc101480
2026-03-10T14:16:30.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.367+0000 7fedb9ffb700 1 -- 192.168.123.103:0/2693786979 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fedb000f770 con 0x7fedbc101480
2026-03-10T14:16:30.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.368+0000 7fedb9ffb700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feda4077a60 0x7feda4079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:16:30.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.368+0000 7fedc0e10700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feda4077a60 0x7feda4079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:16:30.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.368+0000 7fedb9ffb700 1 -- 192.168.123.103:0/2693786979 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fedb0099e50 con 0x7fedbc101480
2026-03-10T14:16:30.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.369+0000 7fedc0e10700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feda4077a60 0x7feda4079f10 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fedac000c00 tx=0x7fedac005fb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:16:30.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.369+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feda8005320 con 0x7fedbc101480
2026-03-10T14:16:30.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.373+0000 7fedb9ffb700 1 -- 192.168.123.103:0/2693786979 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fedb009f050 con 0x7fedbc101480
2026-03-10T14:16:30.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.496+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7feda8000bf0 con 0x7feda4077a60
2026-03-10T14:16:30.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.498+0000 7fedb9ffb700 1 -- 192.168.123.103:0/2693786979 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7feda8000bf0 con 0x7feda4077a60
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feda4077a60 msgr2=0x7feda4079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feda4077a60 0x7feda4079f10 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fedac000c00 tx=0x7fedac005fb0 comp rx=0 tx=0).stop
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc101480 msgr2=0x7fedbc198610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc101480 0x7fedbc198610 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7fedb000d900 tx=0x7fedb000dcc0 comp rx=0 tx=0).stop
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 shutdown_connections
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7feda4077a60 0x7feda4079f10 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedbc100aa0 0x7fedbc1980d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 --2- 192.168.123.103:0/2693786979 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedbc101480 0x7fedbc198610 unknown :-1 s=CLOSED pgs=192 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:30.499 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 >> 192.168.123.103:0/2693786979 conn(0x7fedbc0fa1a0 msgr2=0x7fedbc107d30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:16:30.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 shutdown_connections
2026-03-10T14:16:30.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:30.500+0000 7fedc3074700 1 -- 192.168.123.103:0/2693786979 wait complete.
2026-03-10T14:16:30.550 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps'
2026-03-10T14:16:30.750 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config
2026-03-10T14:16:31.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.024+0000 7f2948874700 1 -- 192.168.123.103:0/4003803545 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29401066c0 msgr2=0x7f2940106a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:16:31.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.024+0000 7f2948874700 1 --2- 192.168.123.103:0/4003803545 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29401066c0 0x7f2940106a90 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7f2938009b00 tx=0x7f2938009e10 comp rx=0 tx=0).stop
2026-03-10T14:16:31.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.024+0000 7f2948874700 1 -- 192.168.123.103:0/4003803545 shutdown_connections
2026-03-10T14:16:31.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.024+0000 7f2948874700 1 --2- 192.168.123.103:0/4003803545 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f2940068490 0x7f2940068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:31.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.024+0000 7f2948874700 1 --2- 192.168.123.103:0/4003803545 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f29401066c0 0x7f2940106a90 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:31.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.024+0000 7f2948874700 1 -- 192.168.123.103:0/4003803545 >> 192.168.123.103:0/4003803545 conn(0x7f29400754a0 msgr2=0x7f29400758a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T14:16:31.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.024+0000 7f2948874700 1 -- 192.168.123.103:0/4003803545 shutdown_connections
2026-03-10T14:16:31.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.025+0000 7f2948874700 1 -- 192.168.123.103:0/4003803545 wait complete.
2026-03-10T14:16:31.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.025+0000 7f2948874700 1 Processor -- start
2026-03-10T14:16:31.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.025+0000 7f2948874700 1 -- start start
2026-03-10T14:16:31.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.025+0000 7f2948874700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2940068490 0x7f2940194010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:16:31.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.026+0000 7f2948874700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29401066c0 0x7f2940194550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:16:31.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.026+0000 7f2948874700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2940194c30 con 0x7f2940068490
2026-03-10T14:16:31.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.026+0000 7f2948874700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29401989c0 con 0x7f29401066c0
2026-03-10T14:16:31.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.026+0000 7f2945e0f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29401066c0 0x7f2940194550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:16:31.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.026+0000 7f2945e0f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29401066c0 0x7f2940194550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:51800/0 (socket says 192.168.123.103:51800)
2026-03-10T14:16:31.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.026+0000 7f2945e0f700 1 -- 192.168.123.103:0/2917410770 learned_addr learned my addr 192.168.123.103:0/2917410770 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-10T14:16:31.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.026+0000 7f2946610700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2940068490 0x7f2940194010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:16:31.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.027+0000 7f2945e0f700 1 -- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2940068490 msgr2=0x7f2940194010 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T14:16:31.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.027+0000 7f2945e0f700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2940068490 0x7f2940194010 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T14:16:31.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.027+0000 7f2945e0f700 1 -- 192.168.123.103:0/2917410770 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29380097e0 con 0x7f29401066c0
2026-03-10T14:16:31.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.027+0000 7f2946610700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2940068490 0x7f2940194010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed!
2026-03-10T14:16:31.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.027+0000 7f2945e0f700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29401066c0 0x7f2940194550 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f293400d8d0 tx=0x7f293400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:16:31.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.028+0000 7f29337fe700 1 -- 192.168.123.103:0/2917410770 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2934009940 con 0x7f29401066c0
2026-03-10T14:16:31.026 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.028+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2940198ca0 con 0x7f29401066c0
2026-03-10T14:16:31.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.028+0000 7f29337fe700 1 -- 192.168.123.103:0/2917410770 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2934010460 con 0x7f29401066c0
2026-03-10T14:16:31.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.028+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f29401991f0 con 0x7f29401066c0
2026-03-10T14:16:31.027 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.029+0000 7f29337fe700 1 -- 192.168.123.103:0/2917410770 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f293400f5d0 con 0x7f29401066c0
2026-03-10T14:16:31.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.029+0000 7f29337fe700 1 -- 192.168.123.103:0/2917410770 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f29340105d0 con 0x7f29401066c0
2026-03-10T14:16:31.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.029+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f294004ea50 con 0x7f29401066c0
2026-03-10T14:16:31.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.030+0000 7f29337fe700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f292c07bc80 0x7f292c07e130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T14:16:31.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.030+0000 7f29337fe700 1 -- 192.168.123.103:0/2917410770 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f2934099900 con 0x7f29401066c0
2026-03-10T14:16:31.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.030+0000 7f2946610700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f292c07bc80 0x7f292c07e130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T14:16:31.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.031+0000 7f2946610700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f292c07bc80 0x7f292c07e130 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f2938006010 tx=0x7f293800b540 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T14:16:31.032 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.034+0000 7f29337fe700 1 -- 192.168.123.103:0/2917410770 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2934061820 con 0x7f29401066c0
2026-03-10T14:16:31.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.154+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f2940195410 con 0x7f292c07bc80
2026-03-10T14:16:31.158 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.159+0000 7f29337fe700 1 -- 192.168.123.103:0/2917410770 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f2940195410 con 0x7f292c07bc80
2026-03-10T14:16:31.158 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (42s) 30s ago 13m 15.0M - 0.25.0 c8568f914cd2 cf55e8b6005f
2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (67s) 30s ago 13m 10.4M - 19.2.3-678-ge911bdeb 654f31e6858e 1bd623640ecf
2026-03-10T14:16:31.161
INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (65s) 54s ago 12m 10.1M - 19.2.3-678-ge911bdeb 654f31e6858e 54bbafe0555e 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (7m) 30s ago 13m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (7m) 54s ago 12m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (32s) 30s ago 12m 44.8M - 10.4.0 c8b91775d855 bbe8b6e1ebb7 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (94s) 30s ago 11m 80.6M - 19.2.3-678-ge911bdeb 654f31e6858e b1023e0bcace 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (87s) 30s ago 11m 19.7M - 19.2.3-678-ge911bdeb 654f31e6858e ddad42dde865 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (72s) 54s ago 11m 18.3M - 19.2.3-678-ge911bdeb 654f31e6858e f5060c63df19 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (80s) 54s ago 11m 79.0M - 19.2.3-678-ge911bdeb 654f31e6858e 89fac44ae15d 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (8m) 30s ago 13m 636M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (7m) 54s ago 12m 499M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (7m) 30s ago 13m 70.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (7m) 54s ago 12m 55.7M 2048M 19.2.3-678-ge911bdeb 
654f31e6858e 111e22858279 2026-03-10T14:16:31.161 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (59s) 30s ago 13m 10.0M - 1.7.0 72c9c2088986 e10d80d39d78 2026-03-10T14:16:31.162 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (55s) 54s ago 12m 4874k - 1.7.0 72c9c2088986 29f4915b7954 2026-03-10T14:16:31.162 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (6m) 30s ago 12m 284M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:16:31.162 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 30s ago 12m 234M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:16:31.162 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (3m) 30s ago 11m 172M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e0d768b70468 2026-03-10T14:16:31.162 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (3m) 54s ago 11m 150M 4096M 19.2.3-678-ge911bdeb 654f31e6858e f7fc2aafa9d9 2026-03-10T14:16:31.162 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (2m) 54s ago 11m 124M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 89f9225212d4 2026-03-10T14:16:31.162 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (2m) 54s ago 11m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6c7573f5f3fa 2026-03-10T14:16:31.162 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (46s) 30s ago 12m 61.9M - 2.51.0 1d3b7f56885b 465f119ec393 2026-03-10T14:16:31.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f292c07bc80 msgr2=0x7f292c07e130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:31.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] 
conn(0x7f292c07bc80 0x7f292c07e130 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f2938006010 tx=0x7f293800b540 comp rx=0 tx=0).stop 2026-03-10T14:16:31.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29401066c0 msgr2=0x7f2940194550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:31.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29401066c0 0x7f2940194550 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f293400d8d0 tx=0x7f293400dc90 comp rx=0 tx=0).stop 2026-03-10T14:16:31.163 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 shutdown_connections 2026-03-10T14:16:31.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f292c07bc80 0x7f292c07e130 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2940068490 0x7f2940194010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 --2- 192.168.123.103:0/2917410770 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f29401066c0 0x7f2940194550 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.164 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 >> 192.168.123.103:0/2917410770 conn(0x7f29400754a0 msgr2=0x7f29400fec60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:31.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 shutdown_connections 2026-03-10T14:16:31.164 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.165+0000 7f2948874700 1 -- 192.168.123.103:0/2917410770 wait complete. 2026-03-10T14:16:31.242 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status' 2026-03-10T14:16:31.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:16:31.400 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:16:31.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.672+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2711807936 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac41066c0 msgr2=0x7f1ac4106a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:31.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.672+0000 7f1ac1ffb700 1 -- 192.168.123.103:0/2711807936 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1aac0056b0 con 0x7f1ac41066c0 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.672+0000 7f1ac99c4700 1 --2- 192.168.123.103:0/2711807936 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac41066c0 0x7f1ac4106a90 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f1aac009b00 tx=0x7f1aac009e10 comp rx=0 tx=0).stop 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.673+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2711807936 shutdown_connections 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.673+0000 7f1ac99c4700 1 --2- 192.168.123.103:0/2711807936 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ac4068490 0x7f1ac4068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.673+0000 7f1ac99c4700 1 --2- 192.168.123.103:0/2711807936 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac41066c0 0x7f1ac4106a90 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.673+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2711807936 >> 192.168.123.103:0/2711807936 conn(0x7f1ac40754a0 msgr2=0x7f1ac40758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.673+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2711807936 shutdown_connections 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.673+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2711807936 wait complete. 
2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac99c4700 1 Processor -- start 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac99c4700 1 -- start start 2026-03-10T14:16:31.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac99c4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac4068490 0x7f1ac410b4a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac2ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac4068490 0x7f1ac410b4a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac99c4700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ac41066c0 0x7f1ac410b9e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac99c4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ac410f5a0 con 0x7f1ac4068490 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac99c4700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ac410bf20 con 0x7f1ac41066c0 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac2ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac4068490 0x7f1ac410b4a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:48316/0 (socket says 192.168.123.103:48316) 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac2ffd700 1 -- 192.168.123.103:0/2418220344 learned_addr learned my addr 192.168.123.103:0/2418220344 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac2ffd700 1 -- 192.168.123.103:0/2418220344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ac41066c0 msgr2=0x7f1ac410b9e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac2ffd700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ac41066c0 0x7f1ac410b9e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.674+0000 7f1ac2ffd700 1 -- 192.168.123.103:0/2418220344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1aac0097e0 con 0x7f1ac4068490 2026-03-10T14:16:31.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.675+0000 7f1ac2ffd700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac4068490 0x7f1ac410b4a0 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f1aac005f50 tx=0x7f1aac004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:31.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.675+0000 7f1ac89c2700 1 -- 192.168.123.103:0/2418220344 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1aac01d070 con 0x7f1ac4068490 2026-03-10T14:16:31.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.675+0000 7f1ac99c4700 1 -- 
192.168.123.103:0/2418220344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1ac410c1a0 con 0x7f1ac4068490 2026-03-10T14:16:31.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.675+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1ac41a6b20 con 0x7f1ac4068490 2026-03-10T14:16:31.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.675+0000 7f1ac89c2700 1 -- 192.168.123.103:0/2418220344 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1aac00bc50 con 0x7f1ac4068490 2026-03-10T14:16:31.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.675+0000 7f1ac89c2700 1 -- 192.168.123.103:0/2418220344 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1aac017610 con 0x7f1ac4068490 2026-03-10T14:16:31.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.676+0000 7f1ac89c2700 1 -- 192.168.123.103:0/2418220344 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1aac017770 con 0x7f1ac4068490 2026-03-10T14:16:31.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.679+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1aa4005320 con 0x7f1ac4068490 2026-03-10T14:16:31.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.679+0000 7f1ac89c2700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ab0077910 0x7f1ab0079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:31.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.679+0000 7f1ac89c2700 1 -- 192.168.123.103:0/2418220344 
<== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f1aac09c490 con 0x7f1ac4068490 2026-03-10T14:16:31.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.682+0000 7f1ac27fc700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ab0077910 0x7f1ab0079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:31.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.682+0000 7f1ac89c2700 1 -- 192.168.123.103:0/2418220344 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1aac064b30 con 0x7f1ac4068490 2026-03-10T14:16:31.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.683+0000 7f1ac27fc700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ab0077910 0x7f1ab0079dc0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f1ac410cc90 tx=0x7f1ab4005c90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:31.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.813+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1aa4000bf0 con 0x7f1ab0077910 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.814+0000 7f1ac89c2700 1 -- 192.168.123.103:0/2418220344 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f1aa4000bf0 con 0x7f1ab0077910 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:16:31.813 
INFO:teuthology.orchestra.run.vm03.stdout: "target_image": null, 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": false, 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stdout: "which": "", 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stdout: "progress": null, 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stdout: "message": "", 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-10T14:16:31.813 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:16:31.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.817+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ab0077910 msgr2=0x7f1ab0079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:31.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.817+0000 7f1ac99c4700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ab0077910 0x7f1ab0079dc0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f1ac410cc90 tx=0x7f1ab4005c90 comp rx=0 tx=0).stop 2026-03-10T14:16:31.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac4068490 msgr2=0x7f1ac410b4a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:31.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac4068490 0x7f1ac410b4a0 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f1aac005f50 tx=0x7f1aac004990 comp rx=0 tx=0).stop 2026-03-10T14:16:31.817 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 shutdown_connections 2026-03-10T14:16:31.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1ab0077910 0x7f1ab0079dc0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ac4068490 0x7f1ac410b4a0 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 --2- 192.168.123.103:0/2418220344 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1ac41066c0 0x7f1ac410b9e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:31.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 >> 192.168.123.103:0/2418220344 conn(0x7f1ac40754a0 msgr2=0x7f1ac40fed00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:31.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 shutdown_connections 2026-03-10T14:16:31.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:31.818+0000 7f1ac99c4700 1 -- 192.168.123.103:0/2418220344 wait complete. 
2026-03-10T14:16:31.882 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-10T14:16:32.030 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:16:32.057 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:31 vm03.local ceph-mon[103098]: from='client.34536 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:32.057 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:31 vm03.local ceph-mon[103098]: pgmap v310: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:32.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.262+0000 7f5104b1c700 1 -- 192.168.123.103:0/2534048628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 msgr2=0x7f5100108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:32.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.262+0000 7f5104b1c700 1 --2- 192.168.123.103:0/2534048628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 0x7f5100108b50 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7f50e8009b00 tx=0x7f50e8009e10 comp rx=0 tx=0).stop 2026-03-10T14:16:32.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.263+0000 7f5104b1c700 1 -- 192.168.123.103:0/2534048628 shutdown_connections 2026-03-10T14:16:32.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.263+0000 7f5104b1c700 1 --2- 192.168.123.103:0/2534048628 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5100102780 0x7f5100102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 
l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.263+0000 7f5104b1c700 1 --2- 192.168.123.103:0/2534048628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 0x7f5100108b50 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.263+0000 7f5104b1c700 1 -- 192.168.123.103:0/2534048628 >> 192.168.123.103:0/2534048628 conn(0x7f51000fe280 msgr2=0x7f5100100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:32.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.264+0000 7f5104b1c700 1 -- 192.168.123.103:0/2534048628 shutdown_connections 2026-03-10T14:16:32.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.264+0000 7f5104b1c700 1 -- 192.168.123.103:0/2534048628 wait complete. 2026-03-10T14:16:32.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.264+0000 7f5104b1c700 1 Processor -- start 2026-03-10T14:16:32.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.264+0000 7f5104b1c700 1 -- start start 2026-03-10T14:16:32.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f5104b1c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5100102780 0x7f51001983e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f5104b1c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 0x7f5100198920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f5104b1c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5100199000 con 
0x7f5100108780 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f5104b1c700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f510019cd90 con 0x7f5100102780 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50fdd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 0x7f5100198920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50fdd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 0x7f5100198920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48330/0 (socket says 192.168.123.103:48330) 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50fdd9b700 1 -- 192.168.123.103:0/3712107258 learned_addr learned my addr 192.168.123.103:0/3712107258 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50fdd9b700 1 -- 192.168.123.103:0/3712107258 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5100102780 msgr2=0x7f51001983e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50fdd9b700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5100102780 0x7f51001983e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50fdd9b700 1 -- 
192.168.123.103:0/3712107258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f50e80097e0 con 0x7f5100108780 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50fdd9b700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 0x7f5100198920 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7f50f000b700 tx=0x7f50f000bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50f77fe700 1 -- 192.168.123.103:0/3712107258 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f50f0010840 con 0x7f5100108780 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50f77fe700 1 -- 192.168.123.103:0/3712107258 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f50f0010e80 con 0x7f5100108780 2026-03-10T14:16:32.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.265+0000 7f50f77fe700 1 -- 192.168.123.103:0/3712107258 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f50f0010840 con 0x7f5100108780 2026-03-10T14:16:32.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.266+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f510019d070 con 0x7f5100108780 2026-03-10T14:16:32.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.266+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f510019d5c0 con 0x7f5100108780 2026-03-10T14:16:32.274 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.267+0000 7f50f77fe700 1 -- 192.168.123.103:0/3712107258 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f50f000f3e0 con 0x7f5100108780 2026-03-10T14:16:32.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.267+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f510004ea50 con 0x7f5100108780 2026-03-10T14:16:32.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.268+0000 7f50f77fe700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f50ec0778d0 0x7f50ec079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:32.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.268+0000 7f50f77fe700 1 -- 192.168.123.103:0/3712107258 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f50f00653a0 con 0x7f5100108780 2026-03-10T14:16:32.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.271+0000 7f50fe59c700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f50ec0778d0 0x7f50ec079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:32.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.271+0000 7f50fe59c700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f50ec0778d0 0x7f50ec079d80 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f50e800b5c0 tx=0x7f50e8009f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:32.274 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.272+0000 7f50f77fe700 1 -- 192.168.123.103:0/3712107258 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f50f0060dc0 con 0x7f5100108780 2026-03-10T14:16:32.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:31 vm04.local ceph-mon[92084]: from='client.34536 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:32.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:31 vm04.local ceph-mon[92084]: pgmap v310: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:32.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.433+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f5100066e40 con 0x7f5100108780 2026-03-10T14:16:32.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.434+0000 7f50f77fe700 1 -- 192.168.123.103:0/3712107258 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f50f0014070 con 0x7f5100108780 2026-03-10T14:16:32.433 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_OK 2026-03-10T14:16:32.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.437+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f50ec0778d0 msgr2=0x7f50ec079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:32.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.437+0000 7f5104b1c700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f50ec0778d0 0x7f50ec079d80 
secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f50e800b5c0 tx=0x7f50e8009f90 comp rx=0 tx=0).stop 2026-03-10T14:16:32.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.437+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 msgr2=0x7f5100198920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:32.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.437+0000 7f5104b1c700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 0x7f5100198920 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7f50f000b700 tx=0x7f50f000bac0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.438+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 shutdown_connections 2026-03-10T14:16:32.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.438+0000 7f5104b1c700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f50ec0778d0 0x7f50ec079d80 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.438+0000 7f5104b1c700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5100102780 0x7f51001983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.438+0000 7f5104b1c700 1 --2- 192.168.123.103:0/3712107258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5100108780 0x7f5100198920 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.438+0000 7f5104b1c700 1 -- 
192.168.123.103:0/3712107258 >> 192.168.123.103:0/3712107258 conn(0x7f51000fe280 msgr2=0x7f51000ffa60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:32.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.438+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 shutdown_connections 2026-03-10T14:16:32.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.438+0000 7f5104b1c700 1 -- 192.168.123.103:0/3712107258 wait complete. 2026-03-10T14:16:32.478 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T14:16:32.620 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:16:32.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.873+0000 7fea8d460700 1 -- 192.168.123.103:0/1903974580 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea8810cbe0 msgr2=0x7fea8810cfb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:32.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.873+0000 7fea8d460700 1 --2- 192.168.123.103:0/1903974580 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea8810cbe0 0x7fea8810cfb0 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7fea70009b00 tx=0x7fea70009e10 comp rx=0 tx=0).stop 2026-03-10T14:16:32.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.873+0000 7fea8d460700 1 -- 192.168.123.103:0/1903974580 shutdown_connections 2026-03-10T14:16:32.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.873+0000 7fea8d460700 1 --2- 192.168.123.103:0/1903974580 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fea88106bc0 0x7fea88107030 unknown :-1 s=CLOSED pgs=0 
cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.873+0000 7fea8d460700 1 --2- 192.168.123.103:0/1903974580 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea8810cbe0 0x7fea8810cfb0 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.873+0000 7fea8d460700 1 -- 192.168.123.103:0/1903974580 >> 192.168.123.103:0/1903974580 conn(0x7fea88074b80 msgr2=0x7fea88076f90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:32.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.874+0000 7fea8d460700 1 -- 192.168.123.103:0/1903974580 shutdown_connections 2026-03-10T14:16:32.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.874+0000 7fea8d460700 1 -- 192.168.123.103:0/1903974580 wait complete. 2026-03-10T14:16:32.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.874+0000 7fea8d460700 1 Processor -- start 2026-03-10T14:16:32.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea8d460700 1 -- start start 2026-03-10T14:16:32.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea8d460700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea88106bc0 0x7fea8819c990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:32.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea8d460700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fea8819ced0 0x7fea881a1370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:32.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea8d460700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fea8819d450 con 
0x7fea88106bc0 2026-03-10T14:16:32.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea8d460700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fea8819d590 con 0x7fea8819ced0 2026-03-10T14:16:32.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea86ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea88106bc0 0x7fea8819c990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:32.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea86ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea88106bc0 0x7fea8819c990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:48342/0 (socket says 192.168.123.103:48342) 2026-03-10T14:16:32.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea86ffd700 1 -- 192.168.123.103:0/3040836527 learned_addr learned my addr 192.168.123.103:0/3040836527 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:16:32.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea86ffd700 1 -- 192.168.123.103:0/3040836527 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fea8819ced0 msgr2=0x7fea881a1370 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:32.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea867fc700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fea8819ced0 0x7fea881a1370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:32.874 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.875+0000 7fea86ffd700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fea8819ced0 0x7fea881a1370 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:32.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.876+0000 7fea86ffd700 1 -- 192.168.123.103:0/3040836527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fea700097e0 con 0x7fea88106bc0 2026-03-10T14:16:32.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.876+0000 7fea867fc700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fea8819ced0 0x7fea881a1370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:16:32.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.876+0000 7fea86ffd700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea88106bc0 0x7fea8819c990 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7fea70009fd0 tx=0x7fea700049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:32.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.876+0000 7fea7ffff700 1 -- 192.168.123.103:0/3040836527 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fea7001d070 con 0x7fea88106bc0 2026-03-10T14:16:32.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.876+0000 7fea7ffff700 1 -- 192.168.123.103:0/3040836527 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fea70004b80 con 0x7fea88106bc0 2026-03-10T14:16:32.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.876+0000 7fea8d460700 1 -- 
192.168.123.103:0/3040836527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fea881a18b0 con 0x7fea88106bc0 2026-03-10T14:16:32.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.876+0000 7fea7ffff700 1 -- 192.168.123.103:0/3040836527 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fea7000f670 con 0x7fea88106bc0 2026-03-10T14:16:32.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.876+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fea881a1e00 con 0x7fea88106bc0 2026-03-10T14:16:32.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.878+0000 7fea7ffff700 1 -- 192.168.123.103:0/3040836527 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fea7000f7d0 con 0x7fea88106bc0 2026-03-10T14:16:32.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.878+0000 7fea7ffff700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fea740779e0 0x7fea74079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:16:32.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.878+0000 7fea7ffff700 1 -- 192.168.123.103:0/3040836527 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fea7009c210 con 0x7fea88106bc0 2026-03-10T14:16:32.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.878+0000 7fea867fc700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fea740779e0 0x7fea74079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:16:32.877 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.879+0000 7fea867fc700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fea740779e0 0x7fea74079e90 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fea78005fd0 tx=0x7fea78005e20 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:16:32.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.879+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fea8810f070 con 0x7fea88106bc0 2026-03-10T14:16:32.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:32.882+0000 7fea7ffff700 1 -- 192.168.123.103:0/3040836527 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fea700649b0 con 0x7fea88106bc0 2026-03-10T14:16:33.037 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.038+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fea8804ea50 con 0x7fea88106bc0 2026-03-10T14:16:33.037 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:32 vm03.local ceph-mon[103098]: from='client.44397 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:33.037 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:32 vm03.local ceph-mon[103098]: from='client.34544 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:33.037 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:32 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/3712107258' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:16:33.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.041+0000 7fea7ffff700 1 -- 192.168.123.103:0/3040836527 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fea70027070 con 0x7fea88106bc0 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T14:16:33.041 
INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:16:33.041 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fea740779e0 msgr2=0x7fea74079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fea740779e0 0x7fea74079e90 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fea78005fd0 tx=0x7fea78005e20 comp rx=0 tx=0).stop 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea88106bc0 msgr2=0x7fea8819c990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea88106bc0 0x7fea8819c990 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7fea70009fd0 tx=0x7fea700049e0 comp rx=0 tx=0).stop 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 shutdown_connections 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fea740779e0 0x7fea74079e90 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:33.044 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fea88106bc0 0x7fea8819c990 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 --2- 192.168.123.103:0/3040836527 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fea8819ced0 0x7fea881a1370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 >> 192.168.123.103:0/3040836527 conn(0x7fea88074b80 msgr2=0x7fea880765e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 shutdown_connections 2026-03-10T14:16:33.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:16:33.045+0000 7fea8d460700 1 -- 192.168.123.103:0/3040836527 wait complete. 
2026-03-10T14:16:33.110 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"' 2026-03-10T14:16:33.248 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:16:33.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:32 vm04.local ceph-mon[92084]: from='client.44397 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:33.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:32 vm04.local ceph-mon[92084]: from='client.34544 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:16:33.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:32 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/3712107258' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T14:16:33.432 INFO:teuthology.orchestra.run.vm03.stdout:wait for servicemap items w/ changing names to refresh 2026-03-10T14:16:33.461 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60' 2026-03-10T14:16:33.602 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:16:33.931 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:33 vm03.local ceph-mon[103098]: pgmap v311: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:16:33.931 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:33 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3040836527' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:34.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:33 vm04.local ceph-mon[92084]: pgmap v311: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:16:34.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:33 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/3040836527' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:16:36.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:35 vm04.local ceph-mon[92084]: pgmap v312: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:16:36.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:35 vm03.local ceph-mon[103098]: pgmap v312: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:16:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:37 vm04.local ceph-mon[92084]: pgmap v313: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:38.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:37 vm03.local ceph-mon[103098]: pgmap v313: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:40.250 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:39 vm03.local ceph-mon[103098]: pgmap v314: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:39 vm04.local ceph-mon[92084]: pgmap v314: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:42.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:41 vm04.local ceph-mon[92084]: pgmap v315: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:42.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:41 vm03.local ceph-mon[103098]: pgmap v315: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:44.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:43 vm04.local ceph-mon[92084]: pgmap v316: 65 pgs: 65 
active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:16:44.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:43 vm03.local ceph-mon[103098]: pgmap v316: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:16:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:45 vm04.local ceph-mon[92084]: pgmap v317: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:16:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:16:46.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:45 vm03.local ceph-mon[103098]: pgmap v317: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:16:46.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:16:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:47 vm04.local ceph-mon[92084]: pgmap v318: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:48.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:47 vm03.local ceph-mon[103098]: pgmap v318: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:49 vm04.local ceph-mon[92084]: pgmap v319: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:49 
vm03.local ceph-mon[103098]: pgmap v319: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:51 vm04.local ceph-mon[92084]: pgmap v320: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:52.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:51 vm03.local ceph-mon[103098]: pgmap v320: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:53 vm04.local ceph-mon[92084]: pgmap v321: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:16:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:53 vm03.local ceph-mon[103098]: pgmap v321: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:16:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:56 vm04.local ceph-mon[92084]: pgmap v322: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:16:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:56 vm03.local ceph-mon[103098]: pgmap v322: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:16:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:16:58 vm04.local ceph-mon[92084]: pgmap v323: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:16:58.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:16:58 vm03.local ceph-mon[103098]: pgmap v323: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:00.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:00 
vm04.local ceph-mon[92084]: pgmap v324: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:00.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:00 vm03.local ceph-mon[103098]: pgmap v324: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:01.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:17:01.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:17:02.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:02 vm04.local ceph-mon[92084]: pgmap v325: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:02.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:17:02.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:02 vm03.local ceph-mon[103098]: pgmap v325: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:02.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:17:03.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-10T14:17:03.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:17:03.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:17:03.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:17:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:17:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:17:04.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:04 vm04.local ceph-mon[92084]: pgmap v326: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:04 vm03.local ceph-mon[103098]: pgmap v326: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:06.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:06 vm04.local ceph-mon[92084]: pgmap v327: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:06 vm03.local ceph-mon[103098]: pgmap v327: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 
2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:08 vm04.local ceph-mon[92084]: pgmap v328: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:08.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:08 vm03.local ceph-mon[103098]: pgmap v328: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:10.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:10 vm04.local ceph-mon[92084]: pgmap v329: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:10.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:10 vm03.local ceph-mon[103098]: pgmap v329: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:12.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:12 vm04.local ceph-mon[92084]: pgmap v330: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:12.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:12 vm03.local ceph-mon[103098]: pgmap v330: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:14.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:14 vm04.local ceph-mon[92084]: pgmap v331: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:14.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:14 vm03.local ceph-mon[103098]: pgmap v331: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:16.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:16 vm04.local ceph-mon[92084]: pgmap v332: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 
KiB/s rd, 3 op/s 2026-03-10T14:17:16.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:16 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:17:16.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:16 vm03.local ceph-mon[103098]: pgmap v332: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:16.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:16 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:17:17.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:17 vm04.local ceph-mon[92084]: pgmap v333: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:17.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:17 vm03.local ceph-mon[103098]: pgmap v333: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:20.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:19 vm04.local ceph-mon[92084]: pgmap v334: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:20.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:19 vm03.local ceph-mon[103098]: pgmap v334: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:22.009 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:21 vm03.local ceph-mon[103098]: pgmap v335: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:22.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:21 vm04.local ceph-mon[92084]: pgmap v335: 65 pgs: 65 active+clean; 151 MiB 
data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:24.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:23 vm04.local ceph-mon[92084]: pgmap v336: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:24.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:23 vm03.local ceph-mon[103098]: pgmap v336: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:26.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:25 vm04.local ceph-mon[92084]: pgmap v337: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:26.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:25 vm03.local ceph-mon[103098]: pgmap v337: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:28.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:27 vm04.local ceph-mon[92084]: pgmap v338: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:28.066 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:27 vm03.local ceph-mon[103098]: pgmap v338: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:30.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:29 vm04.local ceph-mon[92084]: pgmap v339: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:30.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:29 vm03.local ceph-mon[103098]: pgmap v339: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:31.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' 
entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:17:31.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:17:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:31 vm04.local ceph-mon[92084]: pgmap v340: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:32.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:31 vm03.local ceph-mon[103098]: pgmap v340: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:33.907 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-10T14:17:34.060 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:34.083 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:33 vm03.local ceph-mon[103098]: pgmap v341: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:34.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.310+0000 7f75cde5f700 1 -- 192.168.123.103:0/1961872462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c810cbe0 msgr2=0x7f75c810cfb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:34.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.310+0000 7f75cde5f700 1 --2- 192.168.123.103:0/1961872462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c810cbe0 
0x7f75c810cfb0 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f75b0009b00 tx=0x7f75b0009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:34.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.311+0000 7f75cde5f700 1 -- 192.168.123.103:0/1961872462 shutdown_connections 2026-03-10T14:17:34.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.311+0000 7f75cde5f700 1 --2- 192.168.123.103:0/1961872462 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f75c8106bc0 0x7f75c8107030 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.311+0000 7f75cde5f700 1 --2- 192.168.123.103:0/1961872462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c810cbe0 0x7f75c810cfb0 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.311+0000 7f75cde5f700 1 -- 192.168.123.103:0/1961872462 >> 192.168.123.103:0/1961872462 conn(0x7f75c8074b30 msgr2=0x7f75c8076f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:34.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.312+0000 7f75cde5f700 1 -- 192.168.123.103:0/1961872462 shutdown_connections 2026-03-10T14:17:34.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.312+0000 7f75cde5f700 1 -- 192.168.123.103:0/1961872462 wait complete. 
2026-03-10T14:17:34.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.312+0000 7f75cde5f700 1 Processor -- start 2026-03-10T14:17:34.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.312+0000 7f75cde5f700 1 -- start start 2026-03-10T14:17:34.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.313+0000 7f75cde5f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c8106bc0 0x7f75c819c990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:34.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.313+0000 7f75c77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c8106bc0 0x7f75c819c990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:34.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.313+0000 7f75c77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c8106bc0 0x7f75c819c990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:43838/0 (socket says 192.168.123.103:43838) 2026-03-10T14:17:34.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.313+0000 7f75cde5f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f75c819ced0 0x7f75c81a1370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:34.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.313+0000 7f75c77fe700 1 -- 192.168.123.103:0/523690606 learned_addr learned my addr 192.168.123.103:0/523690606 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:34.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.313+0000 7f75cde5f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
-- mon_getmap magic: 0 v1 -- 0x7f75c819d4e0 con 0x7f75c8106bc0 2026-03-10T14:17:34.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.313+0000 7f75c6ffd700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f75c819ced0 0x7f75c81a1370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:34.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.313+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f75c819d620 con 0x7f75c819ced0 2026-03-10T14:17:34.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.314+0000 7f75c77fe700 1 -- 192.168.123.103:0/523690606 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f75c819ced0 msgr2=0x7f75c81a1370 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:34.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.314+0000 7f75c77fe700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f75c819ced0 0x7f75c81a1370 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.314+0000 7f75c77fe700 1 -- 192.168.123.103:0/523690606 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f75b00097e0 con 0x7f75c8106bc0 2026-03-10T14:17:34.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.314+0000 7f75c77fe700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c8106bc0 0x7f75c819c990 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f75b0005370 tx=0x7f75b0004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T14:17:34.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:33 vm04.local ceph-mon[92084]: pgmap v341: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:34.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.314+0000 7f75c4ff9700 1 -- 192.168.123.103:0/523690606 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f75b001d070 con 0x7f75c8106bc0 2026-03-10T14:17:34.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.315+0000 7f75c4ff9700 1 -- 192.168.123.103:0/523690606 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f75b000bc50 con 0x7f75c8106bc0 2026-03-10T14:17:34.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.315+0000 7f75c4ff9700 1 -- 192.168.123.103:0/523690606 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f75b000f670 con 0x7f75c8106bc0 2026-03-10T14:17:34.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.315+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f75c81a1910 con 0x7f75c8106bc0 2026-03-10T14:17:34.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.315+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f75c81a1ee0 con 0x7f75c8106bc0 2026-03-10T14:17:34.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.316+0000 7f75c4ff9700 1 -- 192.168.123.103:0/523690606 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f75b000f7d0 con 0x7f75c8106bc0 2026-03-10T14:17:34.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.316+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f75c810f070 con 0x7f75c8106bc0 2026-03-10T14:17:34.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.316+0000 7f75c4ff9700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f75b4077870 0x7f75b4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:34.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.316+0000 7f75c4ff9700 1 -- 192.168.123.103:0/523690606 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f75b009b280 con 0x7f75c8106bc0 2026-03-10T14:17:34.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.317+0000 7f75c6ffd700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f75b4077870 0x7f75b4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:34.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.317+0000 7f75c6ffd700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f75b4077870 0x7f75b4079d20 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f75b8006fd0 tx=0x7f75b8008040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:34.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.319+0000 7f75c4ff9700 1 -- 192.168.123.103:0/523690606 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f75b00638f0 con 0x7f75c8106bc0 2026-03-10T14:17:34.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.436+0000 
7f75cde5f700 1 -- 192.168.123.103:0/523690606 --> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f75c8066e40 con 0x7f75b4077870 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.441+0000 7f75c4ff9700 1 -- 192.168.123.103:0/523690606 <== mgr.34100 v2:192.168.123.103:6800/65079813 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f75c8066e40 con 0x7f75b4077870 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (106s) 93s ago 14m 15.0M - 0.25.0 c8568f914cd2 cf55e8b6005f 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (2m) 93s ago 14m 10.4M - 19.2.3-678-ge911bdeb 654f31e6858e 1bd623640ecf 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm04 vm04 running (2m) 117s ago 13m 10.1M - 19.2.3-678-ge911bdeb 654f31e6858e 54bbafe0555e 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (8m) 93s ago 14m 7843k - 19.2.3-678-ge911bdeb 654f31e6858e 1a8bbbbe264a 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm04 vm04 running (8m) 117s ago 13m 7856k - 19.2.3-678-ge911bdeb 654f31e6858e cafd0fafe2fc 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (95s) 93s ago 14m 44.8M - 10.4.0 c8b91775d855 bbe8b6e1ebb7 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.aqaspa vm03 running (2m) 93s ago 12m 80.6M - 19.2.3-678-ge911bdeb 654f31e6858e b1023e0bcace 2026-03-10T14:17:34.440 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.itwezo vm03 running (2m) 93s ago 12m 19.7M - 19.2.3-678-ge911bdeb 654f31e6858e 
ddad42dde865 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.puavjd vm04 running (2m) 117s ago 12m 18.3M - 19.2.3-678-ge911bdeb 654f31e6858e f5060c63df19 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm04.sslxuq vm04 running (2m) 117s ago 12m 79.0M - 19.2.3-678-ge911bdeb 654f31e6858e 89fac44ae15d 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.rwbbep vm03 *:8443,9283,8765 running (9m) 93s ago 14m 636M - 19.2.3-678-ge911bdeb 654f31e6858e f23d24d786c6 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm04.ywwcto vm04 *:8443,9283,8765 running (8m) 117s ago 13m 499M - 19.2.3-678-ge911bdeb 654f31e6858e d43ddeefc7d3 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (8m) 93s ago 14m 70.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e c2a0f005ef9d 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm04 vm04 running (8m) 117s ago 13m 55.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 111e22858279 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (2m) 93s ago 14m 10.0M - 1.7.0 72c9c2088986 e10d80d39d78 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm04 vm04 *:9100 running (118s) 117s ago 13m 4874k - 1.7.0 72c9c2088986 29f4915b7954 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (7m) 93s ago 13m 284M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6e24e5898f4d 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (4m) 93s ago 13m 234M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bf01c6df2120 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (4m) 93s ago 13m 172M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e0d768b70468 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm04 running (4m) 117s ago 12m 150M 4096M 
19.2.3-678-ge911bdeb 654f31e6858e f7fc2aafa9d9 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm04 running (3m) 117s ago 12m 124M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 89f9225212d4 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm04 running (3m) 117s ago 12m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 6c7573f5f3fa 2026-03-10T14:17:34.441 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (110s) 93s ago 13m 61.9M - 2.51.0 1d3b7f56885b 465f119ec393 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f75b4077870 msgr2=0x7f75b4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f75b4077870 0x7f75b4079d20 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f75b8006fd0 tx=0x7f75b8008040 comp rx=0 tx=0).stop 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c8106bc0 msgr2=0x7f75c819c990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c8106bc0 0x7f75c819c990 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f75b0005370 tx=0x7f75b0004970 comp rx=0 tx=0).stop 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 shutdown_connections 
2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f75b4077870 0x7f75b4079d20 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f75c8106bc0 0x7f75c819c990 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 --2- 192.168.123.103:0/523690606 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f75c819ced0 0x7f75c81a1370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.444+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 >> 192.168.123.103:0/523690606 conn(0x7f75c8074b30 msgr2=0x7f75c80764f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.445+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 shutdown_connections 2026-03-10T14:17:34.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.445+0000 7f75cde5f700 1 -- 192.168.123.103:0/523690606 wait complete. 
2026-03-10T14:17:34.505 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T14:17:34.661 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.917+0000 7fa59d57b700 1 -- 192.168.123.103:0/3065796338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 msgr2=0x7fa5981009b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.917+0000 7fa59d57b700 1 --2- 192.168.123.103:0/3065796338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 0x7fa5981009b0 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7fa588009b00 tx=0x7fa588009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.918+0000 7fa59d57b700 1 -- 192.168.123.103:0/3065796338 shutdown_connections 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.918+0000 7fa59d57b700 1 --2- 192.168.123.103:0/3065796338 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 0x7fa5981009b0 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.918+0000 7fa59d57b700 1 --2- 192.168.123.103:0/3065796338 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa598106560 0x7fa598106930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.917 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.918+0000 7fa59d57b700 1 -- 192.168.123.103:0/3065796338 >> 192.168.123.103:0/3065796338 conn(0x7fa5980fc000 msgr2=0x7fa5980fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.918+0000 7fa59d57b700 1 -- 192.168.123.103:0/3065796338 shutdown_connections 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.918+0000 7fa59d57b700 1 -- 192.168.123.103:0/3065796338 wait complete. 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa59d57b700 1 Processor -- start 2026-03-10T14:17:34.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa59d57b700 1 -- start start 2026-03-10T14:17:34.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa59d57b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 0x7fa598198c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:34.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa59d57b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa598106560 0x7fa5981991a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:34.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa59d57b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa598193cf0 con 0x7fa598100540 2026-03-10T14:17:34.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa59d57b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa598193e60 con 0x7fa598106560 2026-03-10T14:17:34.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa5967fc700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa598106560 0x7fa5981991a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:34.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa5967fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa598106560 0x7fa5981991a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49526/0 (socket says 192.168.123.103:49526) 2026-03-10T14:17:34.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.919+0000 7fa5967fc700 1 -- 192.168.123.103:0/1533304211 learned_addr learned my addr 192.168.123.103:0/1533304211 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:34.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.920+0000 7fa5967fc700 1 -- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 msgr2=0x7fa598198c40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:34.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.920+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 0x7fa598198c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:34.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.920+0000 7fa5967fc700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 0x7fa598198c40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:34.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.920+0000 7fa5967fc700 1 -- 
192.168.123.103:0/1533304211 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5880097e0 con 0x7fa598106560 2026-03-10T14:17:34.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.921+0000 7fa5967fc700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa598106560 0x7fa5981991a0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fa588005b40 tx=0x7fa58800bfd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:34.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.921+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 0x7fa598198c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:34.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.921+0000 7fa58ffff700 1 -- 192.168.123.103:0/1533304211 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa58801d070 con 0x7fa598106560 2026-03-10T14:17:34.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.921+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa5981940e0 con 0x7fa598106560 2026-03-10T14:17:34.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.921+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa5981945d0 con 0x7fa598106560 2026-03-10T14:17:34.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.922+0000 7fa58ffff700 1 -- 192.168.123.103:0/1533304211 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa58800f460 con 0x7fa598106560 
2026-03-10T14:17:34.921 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.922+0000 7fa58ffff700 1 -- 192.168.123.103:0/1533304211 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa588021620 con 0x7fa598106560 2026-03-10T14:17:34.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.923+0000 7fa58ffff700 1 -- 192.168.123.103:0/1533304211 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa588021780 con 0x7fa598106560 2026-03-10T14:17:34.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.923+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa59804ea50 con 0x7fa598106560 2026-03-10T14:17:34.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.924+0000 7fa58ffff700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa584077870 0x7fa584079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:34.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.924+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa584077870 0x7fa584079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:34.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.925+0000 7fa58ffff700 1 -- 192.168.123.103:0/1533304211 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fa58809adc0 con 0x7fa598106560 2026-03-10T14:17:34.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.925+0000 7fa596ffd700 1 --2- 192.168.123.103:0/1533304211 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa584077870 0x7fa584079d20 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fa580009dd0 tx=0x7fa580009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:34.925 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:34.927+0000 7fa58ffff700 1 -- 192.168.123.103:0/1533304211 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa58809e030 con 0x7fa598106560 2026-03-10T14:17:35.090 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.089+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fa598066e40 con 0x7fa598106560 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.093+0000 7fa58ffff700 1 -- 192.168.123.103:0/1533304211 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fa5880634e0 con 0x7fa598106560 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-10T14:17:35.092 
INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T14:17:35.092 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-10T14:17:35.093 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-10T14:17:35.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa584077870 msgr2=0x7fa584079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:35.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa584077870 0x7fa584079d20 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fa580009dd0 tx=0x7fa580009450 comp rx=0 tx=0).stop 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa598106560 msgr2=0x7fa5981991a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 --2- 
192.168.123.103:0/1533304211 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa598106560 0x7fa5981991a0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fa588005b40 tx=0x7fa58800bfd0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 shutdown_connections 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa584077870 0x7fa584079d20 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa598100540 0x7fa598198c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 --2- 192.168.123.103:0/1533304211 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa598106560 0x7fa5981991a0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 >> 192.168.123.103:0/1533304211 conn(0x7fa5980fc000 msgr2=0x7fa5980fd780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 shutdown_connections 2026-03-10T14:17:35.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.096+0000 7fa59d57b700 1 -- 192.168.123.103:0/1533304211 wait complete. 
2026-03-10T14:17:35.160 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-10T14:17:35.310 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:35.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.573+0000 7fc9722a9700 1 -- 192.168.123.103:0/290453295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c1066c0 msgr2=0x7fc96c106a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:35.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.573+0000 7fc9722a9700 1 --2- 192.168.123.103:0/290453295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c1066c0 0x7fc96c106a90 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7fc954009b00 tx=0x7fc954009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:35.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.574+0000 7fc9722a9700 1 -- 192.168.123.103:0/290453295 shutdown_connections 2026-03-10T14:17:35.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.574+0000 7fc9722a9700 1 --2- 192.168.123.103:0/290453295 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc96c068490 0x7fc96c068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.574+0000 7fc9722a9700 1 --2- 192.168.123.103:0/290453295 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c1066c0 0x7fc96c106a90 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.573 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.575+0000 7fc9722a9700 1 -- 192.168.123.103:0/290453295 >> 192.168.123.103:0/290453295 conn(0x7fc96c0754a0 msgr2=0x7fc96c0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:35.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.575+0000 7fc9722a9700 1 -- 192.168.123.103:0/290453295 shutdown_connections 2026-03-10T14:17:35.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.575+0000 7fc9722a9700 1 -- 192.168.123.103:0/290453295 wait complete. 2026-03-10T14:17:35.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc9722a9700 1 Processor -- start 2026-03-10T14:17:35.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc9722a9700 1 -- start start 2026-03-10T14:17:35.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc9722a9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c068490 0x7fc96c194220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:35.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc9722a9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc96c194760 0x7fc96c198bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:35.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc9722a9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc96c194d70 con 0x7fc96c068490 2026-03-10T14:17:35.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc9722a9700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc96c194ee0 con 0x7fc96c194760 2026-03-10T14:17:35.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc96b7fe700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc96c194760 0x7fc96c198bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:35.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc96b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc96c194760 0x7fc96c198bd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49556/0 (socket says 192.168.123.103:49556) 2026-03-10T14:17:35.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.576+0000 7fc96b7fe700 1 -- 192.168.123.103:0/1660128183 learned_addr learned my addr 192.168.123.103:0/1660128183 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:35.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.577+0000 7fc96bfff700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c068490 0x7fc96c194220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:35.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.577+0000 7fc96b7fe700 1 -- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c068490 msgr2=0x7fc96c194220 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:35.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.577+0000 7fc96b7fe700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c068490 0x7fc96c194220 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.577+0000 7fc96b7fe700 1 -- 
192.168.123.103:0/1660128183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9540097e0 con 0x7fc96c194760 2026-03-10T14:17:35.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.577+0000 7fc96bfff700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c068490 0x7fc96c194220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:17:35.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.577+0000 7fc96b7fe700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc96c194760 0x7fc96c198bd0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fc95c00d900 tx=0x7fc95c00dc10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:35.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.578+0000 7fc9697fa700 1 -- 192.168.123.103:0/1660128183 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc95c0049e0 con 0x7fc96c194760 2026-03-10T14:17:35.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.578+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc96c1991d0 con 0x7fc96c194760 2026-03-10T14:17:35.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.578+0000 7fc9697fa700 1 -- 192.168.123.103:0/1660128183 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc95c005500 con 0x7fc96c194760 2026-03-10T14:17:35.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.578+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc96c199720 con 0x7fc96c194760 
2026-03-10T14:17:35.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.579+0000 7fc9697fa700 1 -- 192.168.123.103:0/1660128183 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc95c004bd0 con 0x7fc96c194760 2026-03-10T14:17:35.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.579+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc96c199a50 con 0x7fc96c194760 2026-03-10T14:17:35.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.580+0000 7fc9697fa700 1 -- 192.168.123.103:0/1660128183 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc95c009b30 con 0x7fc96c194760 2026-03-10T14:17:35.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.580+0000 7fc9697fa700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc95807bcd0 0x7fc95807e180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:35.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.580+0000 7fc9697fa700 1 -- 192.168.123.103:0/1660128183 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fc95c099c50 con 0x7fc96c194760 2026-03-10T14:17:35.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.581+0000 7fc96bfff700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc95807bcd0 0x7fc95807e180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:35.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.581+0000 7fc96bfff700 1 --2- 192.168.123.103:0/1660128183 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc95807bcd0 0x7fc95807e180 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fc954000c00 tx=0x7fc95400b560 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:35.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.582+0000 7fc9697fa700 1 -- 192.168.123.103:0/1660128183 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc95c099e60 con 0x7fc96c194760 2026-03-10T14:17:35.743 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.743+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc96c103950 con 0x7fc96c194760 2026-03-10T14:17:35.743 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.744+0000 7fc9697fa700 1 -- 192.168.123.103:0/1660128183 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fc95c062340 con 0x7fc96c194760 2026-03-10T14:17:35.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.747+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc95807bcd0 msgr2=0x7fc95807e180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:35.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.747+0000 7fc9722a9700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc95807bcd0 0x7fc95807e180 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fc954000c00 tx=0x7fc95400b560 comp rx=0 tx=0).stop 2026-03-10T14:17:35.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.747+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc96c194760 msgr2=0x7fc96c198bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:35.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.747+0000 7fc9722a9700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc96c194760 0x7fc96c198bd0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fc95c00d900 tx=0x7fc95c00dc10 comp rx=0 tx=0).stop 2026-03-10T14:17:35.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.748+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 shutdown_connections 2026-03-10T14:17:35.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.748+0000 7fc9722a9700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fc95807bcd0 0x7fc95807e180 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.748+0000 7fc9722a9700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc96c068490 0x7fc96c194220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.748+0000 7fc9722a9700 1 --2- 192.168.123.103:0/1660128183 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fc96c194760 0x7fc96c198bd0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:35.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.748+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 >> 192.168.123.103:0/1660128183 conn(0x7fc96c0754a0 msgr2=0x7fc96c0fefb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:35.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.748+0000 7fc9722a9700 1 -- 
192.168.123.103:0/1660128183 shutdown_connections 2026-03-10T14:17:35.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:35.748+0000 7fc9722a9700 1 -- 192.168.123.103:0/1660128183 wait complete. 2026-03-10T14:17:35.756 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-10T14:17:35.814 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-10T14:17:35.980 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:36.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:35 vm03.local ceph-mon[103098]: from='client.34556 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:17:36.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:35 vm03.local ceph-mon[103098]: pgmap v342: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:36.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:35 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1533304211' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:17:36.004 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:35 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1660128183' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:17:36.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.254+0000 7fedd3f6f700 1 -- 192.168.123.103:0/843140855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc10ed80 msgr2=0x7fedcc06d260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:36.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.254+0000 7fedd3f6f700 1 --2- 192.168.123.103:0/843140855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc10ed80 0x7fedcc06d260 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7fedc8009b00 tx=0x7fedc8009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:36.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.254+0000 7fedd3f6f700 1 -- 192.168.123.103:0/843140855 shutdown_connections 2026-03-10T14:17:36.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.254+0000 7fedd3f6f700 1 --2- 192.168.123.103:0/843140855 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedcc06d7a0 0x7fedcc06dc10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.254+0000 7fedd3f6f700 1 --2- 192.168.123.103:0/843140855 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc10ed80 0x7fedcc06d260 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.254+0000 7fedd3f6f700 1 -- 192.168.123.103:0/843140855 >> 192.168.123.103:0/843140855 conn(0x7fedcc06c830 msgr2=0x7fedcc071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:36.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.254+0000 7fedd3f6f700 1 -- 192.168.123.103:0/843140855 shutdown_connections 2026-03-10T14:17:36.253 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.254+0000 7fedd3f6f700 1 -- 192.168.123.103:0/843140855 wait complete. 2026-03-10T14:17:36.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.255+0000 7fedd3f6f700 1 Processor -- start 2026-03-10T14:17:36.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.255+0000 7fedd3f6f700 1 -- start start 2026-03-10T14:17:36.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.255+0000 7fedd3f6f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc06d7a0 0x7fedcc117540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:36.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.255+0000 7fedd3f6f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedcc10ed80 0x7fedcc1124f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:36.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.255+0000 7fedd3f6f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fedcc112a30 con 0x7fedcc06d7a0 2026-03-10T14:17:36.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.255+0000 7fedd3f6f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fedcc112ba0 con 0x7fedcc10ed80 2026-03-10T14:17:36.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.257+0000 7fedd150a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedcc10ed80 0x7fedcc1124f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.257+0000 7fedd1d0b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc06d7a0 0x7fedcc117540 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.257+0000 7fedd1d0b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc06d7a0 0x7fedcc117540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:43902/0 (socket says 192.168.123.103:43902) 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.257+0000 7fedd1d0b700 1 -- 192.168.123.103:0/1379684100 learned_addr learned my addr 192.168.123.103:0/1379684100 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.258+0000 7fedd1d0b700 1 -- 192.168.123.103:0/1379684100 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedcc10ed80 msgr2=0x7fedcc1124f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.258+0000 7fedd1d0b700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedcc10ed80 0x7fedcc1124f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.258+0000 7fedd1d0b700 1 -- 192.168.123.103:0/1379684100 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fedc80097e0 con 0x7fedcc06d7a0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.258+0000 7fedd150a700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedcc10ed80 0x7fedcc1124f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_auth_done state changed! 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.258+0000 7fedd1d0b700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc06d7a0 0x7fedcc117540 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7fedc80038b0 tx=0x7fedc800bfd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.258+0000 7fedc2ffd700 1 -- 192.168.123.103:0/1379684100 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fedc801d070 con 0x7fedcc06d7a0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.258+0000 7fedc2ffd700 1 -- 192.168.123.103:0/1379684100 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fedc8003e00 con 0x7fedcc06d7a0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.259+0000 7fedc2ffd700 1 -- 192.168.123.103:0/1379684100 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fedc80178f0 con 0x7fedcc06d7a0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.260+0000 7fedd3f6f700 1 -- 192.168.123.103:0/1379684100 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fedcc112e80 con 0x7fedcc06d7a0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.260+0000 7fedd3f6f700 1 -- 192.168.123.103:0/1379684100 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fedcc113350 con 0x7fedcc06d7a0 2026-03-10T14:17:36.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.264+0000 7fedd3f6f700 1 -- 192.168.123.103:0/1379684100 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fedcc109e10 con 0x7fedcc06d7a0 2026-03-10T14:17:36.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.266+0000 7fedc2ffd700 1 -- 192.168.123.103:0/1379684100 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fedc80052d0 con 0x7fedcc06d7a0 2026-03-10T14:17:36.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.267+0000 7fedc2ffd700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fedb8077720 0x7fedb8079bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:36.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.267+0000 7fedd150a700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fedb8077720 0x7fedb8079bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:36.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.268+0000 7fedc2ffd700 1 -- 192.168.123.103:0/1379684100 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fedc8029330 con 0x7fedcc06d7a0 2026-03-10T14:17:36.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.269+0000 7fedd150a700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fedb8077720 0x7fedb8079bd0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fedbc009fd0 tx=0x7fedbc009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:36.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.273+0000 7fedc2ffd700 1 -- 192.168.123.103:0/1379684100 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fedc8063ae0 con 0x7fedcc06d7a0 2026-03-10T14:17:36.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:35 vm04.local ceph-mon[92084]: from='client.34556 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T14:17:36.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:35 vm04.local ceph-mon[92084]: pgmap v342: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:36.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:35 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1533304211' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:17:36.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:35 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1660128183' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:17:36.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.442+0000 7fedd3f6f700 1 -- 192.168.123.103:0/1379684100 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fedcc113a20 con 0x7fedcc06d7a0 2026-03-10T14:17:36.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.447+0000 7fedc2ffd700 1 -- 192.168.123.103:0/1379684100 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fedc8005c00 con 0x7fedcc06d7a0 2026-03-10T14:17:36.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 7fedc0ff9700 1 -- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fedb8077720 msgr2=0x7fedb8079bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:36.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 
7fedc0ff9700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fedb8077720 0x7fedb8079bd0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fedbc009fd0 tx=0x7fedbc009450 comp rx=0 tx=0).stop 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 7fedc0ff9700 1 -- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc06d7a0 msgr2=0x7fedcc117540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 7fedc0ff9700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc06d7a0 0x7fedcc117540 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7fedc80038b0 tx=0x7fedc800bfd0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 7fedc0ff9700 1 -- 192.168.123.103:0/1379684100 shutdown_connections 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 7fedc0ff9700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fedb8077720 0x7fedb8079bd0 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 7fedc0ff9700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fedcc06d7a0 0x7fedcc117540 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 7fedc0ff9700 1 --2- 192.168.123.103:0/1379684100 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fedcc10ed80 0x7fedcc1124f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.450+0000 7fedc0ff9700 1 -- 192.168.123.103:0/1379684100 >> 192.168.123.103:0/1379684100 conn(0x7fedcc06c830 msgr2=0x7fedcc118910 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.451+0000 7fedc0ff9700 1 -- 192.168.123.103:0/1379684100 shutdown_connections 2026-03-10T14:17:36.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.451+0000 7fedc0ff9700 1 -- 192.168.123.103:0/1379684100 wait complete. 2026-03-10T14:17:36.464 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-10T14:17:36.507 DEBUG:teuthology.parallel:result is None 2026-03-10T14:17:36.507 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T14:17:36.511 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-10T14:17:36.511 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- bash -c 'ceph fs dump' 2026-03-10T14:17:36.676 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:36.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.958+0000 7fb45833d700 1 -- 192.168.123.103:0/2267745343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 msgr2=0x7fb4501009b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:36.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.958+0000 7fb45833d700 1 --2- 192.168.123.103:0/2267745343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 0x7fb4501009b0 secure :-1 s=READY pgs=206 
cs=0 l=1 rev1=1 crypto rx=0x7fb44c009b00 tx=0x7fb44c009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:36.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.959+0000 7fb45833d700 1 -- 192.168.123.103:0/2267745343 shutdown_connections 2026-03-10T14:17:36.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.959+0000 7fb45833d700 1 --2- 192.168.123.103:0/2267745343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 0x7fb4501009b0 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.959+0000 7fb45833d700 1 --2- 192.168.123.103:0/2267745343 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb450106560 0x7fb450106930 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.959+0000 7fb45833d700 1 -- 192.168.123.103:0/2267745343 >> 192.168.123.103:0/2267745343 conn(0x7fb4500fc000 msgr2=0x7fb4500fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:36.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.960+0000 7fb45833d700 1 -- 192.168.123.103:0/2267745343 shutdown_connections 2026-03-10T14:17:36.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.960+0000 7fb45833d700 1 -- 192.168.123.103:0/2267745343 wait complete. 
2026-03-10T14:17:36.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.960+0000 7fb45833d700 1 Processor -- start 2026-03-10T14:17:36.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.960+0000 7fb45833d700 1 -- start start 2026-03-10T14:17:36.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.961+0000 7fb45833d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 0x7fb4501982d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:36.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.961+0000 7fb45833d700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb450106560 0x7fb450198810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.961+0000 7fb45833d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb450198e60 con 0x7fb450100540 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.961+0000 7fb45833d700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb450198fa0 con 0x7fb450106560 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.961+0000 7fb4558d8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb450106560 0x7fb450198810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.961+0000 7fb4558d8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb450106560 0x7fb450198810 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:49598/0 (socket says 192.168.123.103:49598) 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.961+0000 7fb4558d8700 1 -- 192.168.123.103:0/568146616 learned_addr learned my addr 192.168.123.103:0/568146616 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.961+0000 7fb4560d9700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 0x7fb4501982d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb4558d8700 1 -- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 msgr2=0x7fb4501982d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb4558d8700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 0x7fb4501982d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb4558d8700 1 -- 192.168.123.103:0/568146616 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb44c0097e0 con 0x7fb450106560 2026-03-10T14:17:36.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb4560d9700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 0x7fb4501982d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:17:36.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb4558d8700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb450106560 0x7fb450198810 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fb44c004930 tx=0x7fb44c004a10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:36.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb4477fe700 1 -- 192.168.123.103:0/568146616 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb44c01d070 con 0x7fb450106560 2026-03-10T14:17:36.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb45833d700 1 -- 192.168.123.103:0/568146616 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb45019cd90 con 0x7fb450106560 2026-03-10T14:17:36.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb4477fe700 1 -- 192.168.123.103:0/568146616 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb44c00bc50 con 0x7fb450106560 2026-03-10T14:17:36.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb4477fe700 1 -- 192.168.123.103:0/568146616 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb44c00f7e0 con 0x7fb450106560 2026-03-10T14:17:36.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.962+0000 7fb45833d700 1 -- 192.168.123.103:0/568146616 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb45019d280 con 0x7fb450106560 2026-03-10T14:17:36.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.963+0000 7fb4457fa700 1 -- 192.168.123.103:0/568146616 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7fb4380052f0 con 0x7fb450106560 2026-03-10T14:17:36.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.965+0000 7fb4477fe700 1 -- 192.168.123.103:0/568146616 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb44c00f940 con 0x7fb450106560 2026-03-10T14:17:36.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.965+0000 7fb4477fe700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb43c0778c0 0x7fb43c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:36.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.965+0000 7fb4560d9700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb43c0778c0 0x7fb43c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:36.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.965+0000 7fb4477fe700 1 -- 192.168.123.103:0/568146616 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fb44c09be80 con 0x7fb450106560 2026-03-10T14:17:36.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.966+0000 7fb4560d9700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb43c0778c0 0x7fb43c079d70 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fb440005fd0 tx=0x7fb440005e20 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:36.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:36.967+0000 7fb4477fe700 1 -- 192.168.123.103:0/568146616 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7fb44c064570 con 0x7fb450106560 2026-03-10T14:17:37.109 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:36 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1379684100' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:17:37.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.112+0000 7fb4457fa700 1 -- 192.168.123.103:0/568146616 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb438005c90 con 0x7fb450106560 2026-03-10T14:17:37.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.113+0000 7fb4477fe700 1 -- 192.168.123.103:0/568146616 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 40 v40) v1 ==== 76+0+2006 (secure 0 0 0) 0x7fb44c063cc0 con 0x7fb450106560 2026-03-10T14:17:37.112 INFO:teuthology.orchestra.run.vm03.stdout:e40 2026-03-10T14:17:37.112 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-10T14:15:22:636548+0000 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:epoch 40 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:flags 32 joinable allow_snaps allow_multimds_snaps 
allow_standby_replay 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-10T14:05:22.222304+0000 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-10T14:15:22.636520+0000 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 121 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 2 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:in 0,1 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:up {0=34474,1=34476} 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-10T14:17:37.113 
INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 34474 members: 34476,34474 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.aqaspa{0:34474} state up:active seq 8 join_fscid=1 addr [v2:192.168.123.103:6826/2839841111,v1:192.168.123.103:6827/2839841111] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.sslxuq{0:34480} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.104:6824/3487856112,v1:192.168.123.104:6825/3487856112] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.itwezo{1:34476} state up:active seq 6 join_fscid=1 addr [v2:192.168.123.103:6828/2389872767,v1:192.168.123.103:6829/2389872767] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm04.puavjd{1:44351} state up:standby-replay seq 1 join_fscid=1 addr [v2:192.168.123.104:6826/3930678535,v1:192.168.123.104:6827/3930678535] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:37.113 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:37.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.115+0000 7fb4457fa700 1 -- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb43c0778c0 msgr2=0x7fb43c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:37.114 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.115+0000 7fb4457fa700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb43c0778c0 0x7fb43c079d70 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fb440005fd0 tx=0x7fb440005e20 comp rx=0 tx=0).stop 2026-03-10T14:17:37.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.116+0000 7fb4457fa700 1 -- 192.168.123.103:0/568146616 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb450106560 msgr2=0x7fb450198810 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:37.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.116+0000 7fb4457fa700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb450106560 0x7fb450198810 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fb44c004930 tx=0x7fb44c004a10 comp rx=0 tx=0).stop 2026-03-10T14:17:37.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.116+0000 7fb4457fa700 1 -- 192.168.123.103:0/568146616 shutdown_connections 2026-03-10T14:17:37.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.116+0000 7fb4457fa700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fb43c0778c0 0x7fb43c079d70 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.116+0000 7fb4457fa700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450100540 0x7fb4501982d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.116+0000 7fb4457fa700 1 --2- 192.168.123.103:0/568146616 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fb450106560 
0x7fb450198810 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.116+0000 7fb4457fa700 1 -- 192.168.123.103:0/568146616 >> 192.168.123.103:0/568146616 conn(0x7fb4500fc000 msgr2=0x7fb4500fd780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:37.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.116+0000 7fb4457fa700 1 -- 192.168.123.103:0/568146616 shutdown_connections 2026-03-10T14:17:37.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.117+0000 7fb4457fa700 1 -- 192.168.123.103:0/568146616 wait complete. 2026-03-10T14:17:37.116 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 40 2026-03-10T14:17:37.187 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks... 2026-03-10T14:17:37.191 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 2026-03-10T14:17:37.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:36 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/1379684100' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T14:17:37.346 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.629+0000 7f1b66afc700 1 -- 192.168.123.103:0/254800301 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b60102810 msgr2=0x7f1b60102c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.629+0000 7f1b66afc700 1 --2- 192.168.123.103:0/254800301 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b60102810 0x7f1b60102c80 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7f1b54009b50 tx=0x7f1b54009e60 comp rx=0 tx=0).stop 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.629+0000 7f1b66afc700 1 -- 192.168.123.103:0/254800301 shutdown_connections 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.629+0000 7f1b66afc700 1 --2- 192.168.123.103:0/254800301 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b60102810 0x7f1b60102c80 secure :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7f1b54009b50 tx=0x7f1b54009e60 comp rx=0 tx=0).stop 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.629+0000 7f1b66afc700 1 --2- 192.168.123.103:0/254800301 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1b60108810 0x7f1b60108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.629+0000 7f1b66afc700 1 -- 192.168.123.103:0/254800301 >> 192.168.123.103:0/254800301 conn(0x7f1b600fe330 msgr2=0x7f1b60100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:37.630 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.631+0000 7f1b66afc700 1 -- 192.168.123.103:0/254800301 shutdown_connections 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.631+0000 7f1b66afc700 1 -- 192.168.123.103:0/254800301 wait complete. 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b66afc700 1 Processor -- start 2026-03-10T14:17:37.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b66afc700 1 -- start start 2026-03-10T14:17:37.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b66afc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1b60108810 0x7f1b60075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:37.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b66afc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b600757a0 0x7f1b60075c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:37.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b66afc700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b600792e0 con 0x7f1b600757a0 2026-03-10T14:17:37.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b66afc700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b60079450 con 0x7f1b60108810 2026-03-10T14:17:37.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b5ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b600757a0 0x7f1b60075c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:37.632 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b5ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b600757a0 0x7f1b60075c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:43942/0 (socket says 192.168.123.103:43942) 2026-03-10T14:17:37.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b5ffff700 1 -- 192.168.123.103:0/2985055324 learned_addr learned my addr 192.168.123.103:0/2985055324 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:37.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b5ffff700 1 -- 192.168.123.103:0/2985055324 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1b60108810 msgr2=0x7f1b60075260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:37.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b5ffff700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f1b60108810 0x7f1b60075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.632+0000 7f1b5ffff700 1 -- 192.168.123.103:0/2985055324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1b540097e0 con 0x7f1b600757a0 2026-03-10T14:17:37.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.633+0000 7f1b5ffff700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b600757a0 0x7f1b60075c10 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f1b54005450 tx=0x7f1b54004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:37.632 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.633+0000 7f1b5dffb700 1 -- 192.168.123.103:0/2985055324 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b5401d070 con 0x7f1b600757a0 2026-03-10T14:17:37.632 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.633+0000 7f1b5dffb700 1 -- 192.168.123.103:0/2985055324 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1b54022470 con 0x7f1b600757a0 2026-03-10T14:17:37.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.633+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1b601a6c20 con 0x7f1b600757a0 2026-03-10T14:17:37.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.633+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1b601a70e0 con 0x7f1b600757a0 2026-03-10T14:17:37.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.635+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1b6010ad80 con 0x7f1b600757a0 2026-03-10T14:17:37.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.636+0000 7f1b5dffb700 1 -- 192.168.123.103:0/2985055324 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b5400f670 con 0x7f1b600757a0 2026-03-10T14:17:37.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.636+0000 7f1b5dffb700 1 -- 192.168.123.103:0/2985055324 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1b5400f890 con 0x7f1b600757a0 2026-03-10T14:17:37.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.636+0000 7f1b5dffb700 1 --2- 
192.168.123.103:0/2985055324 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1b48077ab0 0x7f1b48079f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:37.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.636+0000 7f1b64898700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1b48077ab0 0x7f1b48079f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:37.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.636+0000 7f1b64898700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1b48077ab0 0x7f1b48079f60 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f1b60103950 tx=0x7f1b500098b0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:37.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.636+0000 7f1b5dffb700 1 -- 192.168.123.103:0/2985055324 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f1b5409c150 con 0x7f1b600757a0 2026-03-10T14:17:37.637 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.638+0000 7f1b5dffb700 1 -- 192.168.123.103:0/2985055324 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1b540648f0 con 0x7f1b600757a0 2026-03-10T14:17:37.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.784+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f1b6004ea50 con 0x7f1b600757a0 2026-03-10T14:17:37.783 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.785+0000 7f1b5dffb700 1 -- 192.168.123.103:0/2985055324 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 40 v40) v1 ==== 94+0+5303 (secure 0 0 0) 0x7f1b54064040 con 0x7f1b600757a0 2026-03-10T14:17:37.784 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:37.784 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":40,"btime":"2026-03-10T14:15:22:636548+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[],"filesystems":[{"mdsmap":{"epoch":40,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:22.636520+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0,1],"up":{"mds_0":34474,"mds_1":34476},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34474":{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":0,"incarnation":33,"state":"up:active","state_seq":8,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_34476":{"gid":34476,"name":"cephfs.vm03.itwezo","rank":1,"incarnation":38,"state":"up:active","state_seq":6,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}}},"gid_34480":{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_44351":{"gid":44351,"name":"cephfs.vm04.puavjd","rank":1,"incarnation":0,"state":"up:standby-replay","state_seq":1,"addr":"192.168.123.104:6827/3930678535","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3930678535},{"type":"v1","addr":"192.168.123.104:6827","nonce":3930678535}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34474,"qdb_cluster":[34476,34474]},"id":1}]} 2026-03-10T14:17:37.786 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.787+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1b48077ab0 msgr2=0x7f1b48079f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:37.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.787+0000 7f1b66afc700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1b48077ab0 0x7f1b48079f60 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f1b60103950 tx=0x7f1b500098b0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.787+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b600757a0 msgr2=0x7f1b60075c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:37.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.787+0000 7f1b66afc700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b600757a0 0x7f1b60075c10 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f1b54005450 tx=0x7f1b54004970 comp rx=0 tx=0).stop 2026-03-10T14:17:37.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.788+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 shutdown_connections 2026-03-10T14:17:37.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.788+0000 7f1b66afc700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f1b48077ab0 0x7f1b48079f60 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.788+0000 7f1b66afc700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7f1b60108810 0x7f1b60075260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.786 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.788+0000 7f1b66afc700 1 --2- 192.168.123.103:0/2985055324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b600757a0 0x7f1b60075c10 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:37.787 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.788+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 >> 192.168.123.103:0/2985055324 conn(0x7f1b600fe330 msgr2=0x7f1b600ffd30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:37.787 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.788+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 shutdown_connections 2026-03-10T14:17:37.787 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:37.788+0000 7f1b66afc700 1 -- 192.168.123.103:0/2985055324 wait complete. 2026-03-10T14:17:37.787 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 40 2026-03-10T14:17:37.833 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 12, 'max_mds': 2, 'flags': 50} 2026-03-10T14:17:37.834 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 13 2026-03-10T14:17:37.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:37 vm03.local ceph-mon[103098]: pgmap v343: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:37.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:37 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/568146616' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:17:37.895 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:37 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/2985055324' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T14:17:38.000 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:38.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.281+0000 7ff6fdb15700 1 -- 192.168.123.103:0/2227132640 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f80fff00 msgr2=0x7ff6f8100370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:38.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.281+0000 7ff6fdb15700 1 --2- 192.168.123.103:0/2227132640 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f80fff00 0x7ff6f8100370 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7ff6e8009b00 tx=0x7ff6e8009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:38.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.282+0000 7ff6fdb15700 1 -- 192.168.123.103:0/2227132640 shutdown_connections 2026-03-10T14:17:38.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.282+0000 7ff6fdb15700 1 --2- 192.168.123.103:0/2227132640 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f80fff00 0x7ff6f8100370 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.282+0000 7ff6fdb15700 1 --2- 192.168.123.103:0/2227132640 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8104520 0x7ff6f81048f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.282+0000 7ff6fdb15700 1 -- 192.168.123.103:0/2227132640 >> 192.168.123.103:0/2227132640 conn(0x7ff6f80754a0 msgr2=0x7ff6f80758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:38.281 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.282+0000 7ff6fdb15700 1 -- 192.168.123.103:0/2227132640 shutdown_connections 2026-03-10T14:17:38.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.282+0000 7ff6fdb15700 1 -- 192.168.123.103:0/2227132640 wait complete. 2026-03-10T14:17:38.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6fdb15700 1 Processor -- start 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6fdb15700 1 -- start start 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6fdb15700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f80fff00 0x7ff6f810a1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6fdb15700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8104520 0x7ff6f810a720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6fdb15700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6f810ac60 con 0x7ff6f80fff00 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6fdb15700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6f810add0 con 0x7ff6f8104520 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6f6ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8104520 0x7ff6f810a720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:38.282 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6f6ffd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8104520 0x7ff6f810a720 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49634/0 (socket says 192.168.123.103:49634) 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6f6ffd700 1 -- 192.168.123.103:0/3140097253 learned_addr learned my addr 192.168.123.103:0/3140097253 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.283+0000 7ff6f6ffd700 1 -- 192.168.123.103:0/3140097253 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f80fff00 msgr2=0x7ff6f810a1e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.284+0000 7ff6f6ffd700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f80fff00 0x7ff6f810a1e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.284+0000 7ff6f6ffd700 1 -- 192.168.123.103:0/3140097253 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6e80097e0 con 0x7ff6f8104520 2026-03-10T14:17:38.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.284+0000 7ff6f6ffd700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8104520 0x7ff6f810a720 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7ff6e8009ad0 tx=0x7ff6e80052e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:38.283 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.284+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/3140097253 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6e801d070 con 0x7ff6f8104520 2026-03-10T14:17:38.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.284+0000 7ff6fdb15700 1 -- 192.168.123.103:0/3140097253 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6f810b050 con 0x7ff6f8104520 2026-03-10T14:17:38.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.284+0000 7ff6fdb15700 1 -- 192.168.123.103:0/3140097253 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6f8071ae0 con 0x7ff6f8104520 2026-03-10T14:17:38.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.285+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/3140097253 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff6e800bc50 con 0x7ff6f8104520 2026-03-10T14:17:38.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.285+0000 7ff6fdb15700 1 -- 192.168.123.103:0/3140097253 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff6f804ea50 con 0x7ff6f8104520 2026-03-10T14:17:38.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.285+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/3140097253 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6e800f650 con 0x7ff6f8104520 2026-03-10T14:17:38.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.286+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/3140097253 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff6e8022470 con 0x7ff6f8104520 2026-03-10T14:17:38.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.286+0000 7ff6f4ff9700 1 --2- 
192.168.123.103:0/3140097253 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e4077910 0x7ff6e4079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:38.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.286+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/3140097253 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7ff6e809b930 con 0x7ff6f8104520 2026-03-10T14:17:38.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.287+0000 7ff6f77fe700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e4077910 0x7ff6e4079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:38.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.287+0000 7ff6f77fe700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e4077910 0x7ff6e4079dc0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7ff6e000a9b0 tx=0x7ff6e0005c90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:38.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.290+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/3140097253 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff6e80640d0 con 0x7ff6f8104520 2026-03-10T14:17:38.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:37 vm04.local ceph-mon[92084]: pgmap v343: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:38.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:37 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/568146616' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T14:17:38.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:37 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/2985055324' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T14:17:38.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.445+0000 7ff6fdb15700 1 -- 192.168.123.103:0/3140097253 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7ff6f810b790 con 0x7ff6f8104520 2026-03-10T14:17:38.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.446+0000 7ff6f4ff9700 1 -- 192.168.123.103:0/3140097253 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v40) v1 ==== 107+0+4942 (secure 0 0 0) 0x7ff6e8031090 con 0x7ff6f8104520 2026-03-10T14:17:38.445 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:38.445 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":13,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[],"filesystems":[{"mdsmap":{"epoch":13,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:06:44.825228+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_14486":{"gid":14486,"name":"cephfs.vm03.itwezo","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.103:6829/1684840896","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1684840896},{"type":"v1","addr":"192.168.123.103:6829","nonce":1684840896}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[1],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24273":{"gid":24273,"name":"cephfs.vm04.puavjd","rank":1,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.104:6827/3934202413","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3934202413},{"type":"v1","addr":"192.168.123.104:6827","nonce":3934202413}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:38.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.451+0000 7ff6ee7fc700 1 -- 192.168.123.103:0/3140097253 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e4077910 msgr2=0x7ff6e4079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.451+0000 7ff6ee7fc700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e4077910 0x7ff6e4079dc0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7ff6e000a9b0 tx=0x7ff6e0005c90 comp rx=0 tx=0).stop 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.451+0000 7ff6ee7fc700 1 -- 192.168.123.103:0/3140097253 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8104520 msgr2=0x7ff6f810a720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:38.454 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.451+0000 7ff6ee7fc700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8104520 0x7ff6f810a720 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7ff6e8009ad0 tx=0x7ff6e80052e0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.454+0000 7ff6ee7fc700 1 -- 192.168.123.103:0/3140097253 shutdown_connections 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.454+0000 7ff6ee7fc700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff6e4077910 0x7ff6e4079dc0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.454+0000 7ff6ee7fc700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6f80fff00 0x7ff6f810a1e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.454+0000 7ff6ee7fc700 1 --2- 192.168.123.103:0/3140097253 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff6f8104520 0x7ff6f810a720 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.455+0000 7ff6ee7fc700 1 -- 192.168.123.103:0/3140097253 >> 192.168.123.103:0/3140097253 conn(0x7ff6f80754a0 msgr2=0x7ff6f80fece0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.455+0000 7ff6ee7fc700 1 -- 192.168.123.103:0/3140097253 shutdown_connections 2026-03-10T14:17:38.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.455+0000 7ff6ee7fc700 1 -- 
192.168.123.103:0/3140097253 wait complete. 2026-03-10T14:17:38.459 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 13 2026-03-10T14:17:38.524 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 14 2026-03-10T14:17:38.676 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:38.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.929+0000 7f04adabe700 1 -- 192.168.123.103:0/1582924971 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 msgr2=0x7f04a8108a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:38.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.929+0000 7f04adabe700 1 --2- 192.168.123.103:0/1582924971 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 0x7f04a8108a30 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f0498009b00 tx=0x7f0498009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:38.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.930+0000 7f04adabe700 1 -- 192.168.123.103:0/1582924971 shutdown_connections 2026-03-10T14:17:38.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.930+0000 7f04adabe700 1 --2- 192.168.123.103:0/1582924971 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04a8102660 0x7f04a8102ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.930+0000 7f04adabe700 1 --2- 192.168.123.103:0/1582924971 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 0x7f04a8108a30 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.929 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.930+0000 7f04adabe700 1 -- 192.168.123.103:0/1582924971 >> 192.168.123.103:0/1582924971 conn(0x7f04a80fe140 msgr2=0x7f04a8100550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:38.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.930+0000 7f04adabe700 1 -- 192.168.123.103:0/1582924971 shutdown_connections 2026-03-10T14:17:38.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.931+0000 7f04adabe700 1 -- 192.168.123.103:0/1582924971 wait complete. 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.931+0000 7f04adabe700 1 Processor -- start 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.931+0000 7f04adabe700 1 -- start start 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.931+0000 7f04adabe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04a8102660 0x7f04a8075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.931+0000 7f04adabe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 0x7f04a80757a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.931+0000 7f04adabe700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f04a80793f0 con 0x7f04a8108660 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.931+0000 7f04adabe700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f04a8075ce0 con 0x7f04a8102660 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f04a77fe700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04a8102660 0x7f04a8075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f04a77fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04a8102660 0x7f04a8075260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49656/0 (socket says 192.168.123.103:49656) 2026-03-10T14:17:38.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f04a77fe700 1 -- 192.168.123.103:0/3127171411 learned_addr learned my addr 192.168.123.103:0/3127171411 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f04a77fe700 1 -- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 msgr2=0x7f04a80757a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f049edff700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 0x7f04a80757a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f04a77fe700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 0x7f04a80757a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f04a77fe700 1 -- 
192.168.123.103:0/3127171411 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f04980097e0 con 0x7f04a8102660 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f04a77fe700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04a8102660 0x7f04a8075260 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f0498000c00 tx=0x7f0498004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.932+0000 7f049edff700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 0x7f04a80757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.933+0000 7f04a57fa700 1 -- 192.168.123.103:0/3127171411 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f049801d070 con 0x7f04a8102660 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.933+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f04a8075f60 con 0x7f04a8102660 2026-03-10T14:17:38.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.933+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f04a81a6a10 con 0x7f04a8102660 2026-03-10T14:17:38.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.933+0000 7f04a57fa700 1 -- 192.168.123.103:0/3127171411 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f049800bc50 con 0x7f04a8102660 
2026-03-10T14:17:38.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.933+0000 7f04a57fa700 1 -- 192.168.123.103:0/3127171411 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f049800f670 con 0x7f04a8102660 2026-03-10T14:17:38.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.934+0000 7f04a57fa700 1 -- 192.168.123.103:0/3127171411 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f049800f7d0 con 0x7f04a8102660 2026-03-10T14:17:38.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.935+0000 7f04a57fa700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f04940778c0 0x7f0494079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:38.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.935+0000 7f049edff700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f04940778c0 0x7f0494079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:38.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.935+0000 7f04a57fa700 1 -- 192.168.123.103:0/3127171411 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f049809b340 con 0x7f04a8102660 2026-03-10T14:17:38.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.935+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0488005320 con 0x7f04a8102660 2026-03-10T14:17:38.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.936+0000 7f049edff700 1 --2- 192.168.123.103:0/3127171411 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f04940778c0 0x7f0494079d70 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f0490005fd0 tx=0x7f0490005ee0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:38.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:38.938+0000 7f04a57fa700 1 -- 192.168.123.103:0/3127171411 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f04980639b0 con 0x7f04a8102660 2026-03-10T14:17:39.078 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:38 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3140097253' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T14:17:39.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.079+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7f0488005190 con 0x7f04a8102660 2026-03-10T14:17:39.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.081+0000 7f04a57fa700 1 -- 192.168.123.103:0/3127171411 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v40) v1 ==== 107+0+4943 (secure 0 0 0) 0x7f0498027020 con 0x7f04a8102660 2026-03-10T14:17:39.081 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:39.081 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":14,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[],"filesystems":[{"mdsmap":{"epoch":14,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:07:48.854532+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[0],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor 
table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_14486":{"gid":14486,"name":"cephfs.vm03.itwezo","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.103:6829/1684840896","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1684840896},{"type":"v1","addr":"192.168.123.103:6829","nonce":1684840896}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[1],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24273":{"gid":24273,"name":"cephfs.vm04.puavjd","rank":1,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.104:6827/3934202413","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3934202413},{"type":"v1","addr":"192.168.123.104:6827","nonce":3934202413}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:39.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f04940778c0 msgr2=0x7f0494079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f04940778c0 0x7f0494079d70 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f0490005fd0 tx=0x7f0490005ee0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04a8102660 msgr2=0x7f04a8075260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:39.084 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04a8102660 0x7f04a8075260 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f0498000c00 tx=0x7f0498004970 comp rx=0 tx=0).stop 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 shutdown_connections 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f04940778c0 0x7f0494079d70 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f04a8102660 0x7f04a8075260 secure :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f0498000c00 tx=0x7f0498004970 comp rx=0 tx=0).stop 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 --2- 192.168.123.103:0/3127171411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04a8108660 0x7f04a80757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.085+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 >> 192.168.123.103:0/3127171411 conn(0x7f04a80fe140 msgr2=0x7f04a80ff8a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.086+0000 7f04adabe700 1 -- 192.168.123.103:0/3127171411 shutdown_connections 2026-03-10T14:17:39.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.086+0000 7f04adabe700 
1 -- 192.168.123.103:0/3127171411 wait complete. 2026-03-10T14:17:39.085 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-10T14:17:39.313 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 15 2026-03-10T14:17:39.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:38 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/3140097253' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T14:17:39.477 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:39.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.737+0000 7efd5256b700 1 -- 192.168.123.103:0/1397035057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c073a00 msgr2=0x7efd4c110ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:39.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.737+0000 7efd5256b700 1 --2- 192.168.123.103:0/1397035057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c073a00 0x7efd4c110ff0 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7efd3c009b00 tx=0x7efd3c009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:39.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.738+0000 7efd5256b700 1 -- 192.168.123.103:0/1397035057 shutdown_connections 2026-03-10T14:17:39.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.738+0000 7efd5256b700 1 --2- 192.168.123.103:0/1397035057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c073a00 0x7efd4c110ff0 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.738+0000 7efd5256b700 1 --2- 
192.168.123.103:0/1397035057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd4c0730f0 0x7efd4c0734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.738+0000 7efd5256b700 1 -- 192.168.123.103:0/1397035057 >> 192.168.123.103:0/1397035057 conn(0x7efd4c0fc000 msgr2=0x7efd4c0fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:39.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.739+0000 7efd5256b700 1 -- 192.168.123.103:0/1397035057 shutdown_connections 2026-03-10T14:17:39.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.739+0000 7efd5256b700 1 -- 192.168.123.103:0/1397035057 wait complete. 2026-03-10T14:17:39.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.739+0000 7efd5256b700 1 Processor -- start 2026-03-10T14:17:39.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd5256b700 1 -- start start 2026-03-10T14:17:39.738 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd5256b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c0730f0 0x7efd4c1a2520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:39.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd5256b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd4c073a00 0x7efd4c1a2a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:39.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd5256b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd4c1a30f0 con 0x7efd4c0730f0 2026-03-10T14:17:39.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd5256b700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd4c19c5f0 con 0x7efd4c073a00 2026-03-10T14:17:39.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd4bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c0730f0 0x7efd4c1a2520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:39.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd4b7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd4c073a00 0x7efd4c1a2a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:39.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd4bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c0730f0 0x7efd4c1a2520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:43980/0 (socket says 192.168.123.103:43980) 2026-03-10T14:17:39.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.740+0000 7efd4bfff700 1 -- 192.168.123.103:0/1206442807 learned_addr learned my addr 192.168.123.103:0/1206442807 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:39.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.741+0000 7efd4bfff700 1 -- 192.168.123.103:0/1206442807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd4c073a00 msgr2=0x7efd4c1a2a60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:39.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.741+0000 7efd4bfff700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd4c073a00 0x7efd4c1a2a60 
unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.741+0000 7efd4bfff700 1 -- 192.168.123.103:0/1206442807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd3c0097e0 con 0x7efd4c0730f0 2026-03-10T14:17:39.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.741+0000 7efd4b7fe700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd4c073a00 0x7efd4c1a2a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:39.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.741+0000 7efd4bfff700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c0730f0 0x7efd4c1a2520 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7efd3400cc60 tx=0x7efd340074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:39.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.741+0000 7efd497fa700 1 -- 192.168.123.103:0/1206442807 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd34007af0 con 0x7efd4c0730f0 2026-03-10T14:17:39.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.742+0000 7efd497fa700 1 -- 192.168.123.103:0/1206442807 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7efd34004d10 con 0x7efd4c0730f0 2026-03-10T14:17:39.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.742+0000 7efd497fa700 1 -- 192.168.123.103:0/1206442807 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd340056e0 con 0x7efd4c0730f0 2026-03-10T14:17:39.741 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.742+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efd4c19c8d0 con 0x7efd4c0730f0 2026-03-10T14:17:39.741 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.742+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efd4c19cdd0 con 0x7efd4c0730f0 2026-03-10T14:17:39.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.743+0000 7efd497fa700 1 -- 192.168.123.103:0/1206442807 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7efd34004750 con 0x7efd4c0730f0 2026-03-10T14:17:39.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.743+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efd4c10e770 con 0x7efd4c0730f0 2026-03-10T14:17:39.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.743+0000 7efd497fa700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7efd380778c0 0x7efd38079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:39.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.744+0000 7efd497fa700 1 -- 192.168.123.103:0/1206442807 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7efd34013070 con 0x7efd4c0730f0 2026-03-10T14:17:39.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.744+0000 7efd4b7fe700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7efd380778c0 0x7efd38079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:39.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.746+0000 7efd4b7fe700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7efd380778c0 0x7efd38079d70 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7efd3c009fd0 tx=0x7efd3c005c00 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:39.745 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.747+0000 7efd497fa700 1 -- 192.168.123.103:0/1206442807 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7efd34062460 con 0x7efd4c0730f0 2026-03-10T14:17:39.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.895+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7efd4c04ea50 con 0x7efd4c0730f0 2026-03-10T14:17:39.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.896+0000 7efd497fa700 1 -- 192.168.123.103:0/1206442807 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v40) v1 ==== 107+0+4957 (secure 0 0 0) 0x7efd34061bb0 con 0x7efd4c0730f0 2026-03-10T14:17:39.896 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:39.896 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":15,"btime":"2026-03-10T14:12:35:280795+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no 
anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[],"filesystems":[{"mdsmap":{"epoch":15,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:12:35.280794+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[0],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file 
layout v2","feature_10":"snaprealm v2"}}},"gid_14486":{"gid":14486,"name":"cephfs.vm03.itwezo","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.103:6829/1684840896","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1684840896},{"type":"v1","addr":"192.168.123.103:6829","nonce":1684840896}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24273":{"gid":24273,"name":"cephfs.vm04.puavjd","rank":1,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.104:6827/3934202413","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3934202413},{"type":"v1","addr":"192.168.123.104:6827","nonce":3934202413}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[14470,24263]},"id":1}]} 2026-03-10T14:17:39.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.900+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7efd380778c0 msgr2=0x7efd38079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:39.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.900+0000 7efd5256b700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7efd380778c0 0x7efd38079d70 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7efd3c009fd0 tx=0x7efd3c005c00 comp rx=0 tx=0).stop 2026-03-10T14:17:39.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.900+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c0730f0 msgr2=0x7efd4c1a2520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:39.899 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.900+0000 7efd5256b700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c0730f0 0x7efd4c1a2520 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7efd3400cc60 tx=0x7efd340074a0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.901+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 shutdown_connections 2026-03-10T14:17:39.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.901+0000 7efd5256b700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7efd380778c0 0x7efd38079d70 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.901+0000 7efd5256b700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd4c0730f0 0x7efd4c1a2520 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.901+0000 7efd5256b700 1 --2- 192.168.123.103:0/1206442807 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7efd4c073a00 0x7efd4c1a2a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:39.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.901+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 >> 192.168.123.103:0/1206442807 conn(0x7efd4c0fc000 msgr2=0x7efd4c102b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:39.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.901+0000 7efd5256b700 1 -- 192.168.123.103:0/1206442807 shutdown_connections 2026-03-10T14:17:39.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:39.902+0000 7efd5256b700 1 -- 
192.168.123.103:0/1206442807 wait complete. 2026-03-10T14:17:39.901 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 15 2026-03-10T14:17:39.957 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 16 2026-03-10T14:17:40.121 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:40.151 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:39 vm03.local ceph-mon[103098]: pgmap v344: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:40.151 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:39 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3127171411' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T14:17:40.151 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:39 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1206442807' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T14:17:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:39 vm04.local ceph-mon[92084]: pgmap v344: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:40.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:39 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/3127171411' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T14:17:40.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:39 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/1206442807' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T14:17:40.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.382+0000 7f005bfaa700 1 -- 192.168.123.103:0/373006159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 msgr2=0x7f00540734c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:40.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.382+0000 7f005bfaa700 1 --2- 192.168.123.103:0/373006159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 0x7f00540734c0 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f0048009b00 tx=0x7f0048009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:40.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.382+0000 7f005bfaa700 1 -- 192.168.123.103:0/373006159 shutdown_connections 2026-03-10T14:17:40.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.382+0000 7f005bfaa700 1 --2- 192.168.123.103:0/373006159 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0054073a00 0x7f0054110ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.382+0000 7f005bfaa700 1 --2- 192.168.123.103:0/373006159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 0x7f00540734c0 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.382+0000 7f005bfaa700 1 -- 192.168.123.103:0/373006159 >> 192.168.123.103:0/373006159 conn(0x7f00540fc000 msgr2=0x7f00540fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:40.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.383+0000 7f005bfaa700 1 -- 192.168.123.103:0/373006159 shutdown_connections 2026-03-10T14:17:40.381 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.383+0000 7f005bfaa700 1 -- 192.168.123.103:0/373006159 wait complete. 2026-03-10T14:17:40.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.383+0000 7f005bfaa700 1 Processor -- start 2026-03-10T14:17:40.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.383+0000 7f005bfaa700 1 -- start start 2026-03-10T14:17:40.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.384+0000 7f005bfaa700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 0x7f005419c680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.384+0000 7f005bfaa700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0054073a00 0x7f005419cbc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.384+0000 7f005bfaa700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f005419d1c0 con 0x7f00540730f0 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.384+0000 7f005bfaa700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f005419d330 con 0x7f0054073a00 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.384+0000 7f0059d46700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 0x7f005419c680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.384+0000 7f0059d46700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 0x7f005419c680 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44006/0 (socket says 192.168.123.103:44006) 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.384+0000 7f0059d46700 1 -- 192.168.123.103:0/277149105 learned_addr learned my addr 192.168.123.103:0/277149105 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.384+0000 7f0059545700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0054073a00 0x7f005419cbc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f0059d46700 1 -- 192.168.123.103:0/277149105 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0054073a00 msgr2=0x7f005419cbc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f0059d46700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0054073a00 0x7f005419cbc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f0059d46700 1 -- 192.168.123.103:0/277149105 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00480097e0 con 0x7f00540730f0 2026-03-10T14:17:40.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f0059d46700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 0x7f005419c680 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto 
rx=0x7f00480094d0 tx=0x7f0048004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:40.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f0046ffd700 1 -- 192.168.123.103:0/277149105 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f004801d070 con 0x7f00540730f0 2026-03-10T14:17:40.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f00541a00d0 con 0x7f00540730f0 2026-03-10T14:17:40.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00541a0620 con 0x7f00540730f0 2026-03-10T14:17:40.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f0046ffd700 1 -- 192.168.123.103:0/277149105 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0048022470 con 0x7f00540730f0 2026-03-10T14:17:40.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.385+0000 7f0046ffd700 1 -- 192.168.123.103:0/277149105 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f004800f670 con 0x7f00540730f0 2026-03-10T14:17:40.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.387+0000 7f0046ffd700 1 -- 192.168.123.103:0/277149105 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f004800f8a0 con 0x7f00540730f0 2026-03-10T14:17:40.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.387+0000 7f0046ffd700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f00400778c0 0x7f0040079d70 unknown :-1 s=NONE pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:40.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.388+0000 7f0046ffd700 1 -- 192.168.123.103:0/277149105 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f004809c3a0 con 0x7f00540730f0 2026-03-10T14:17:40.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.388+0000 7f0059545700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f00400778c0 0x7f0040079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:40.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.388+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f00541a0260 con 0x7f00540730f0 2026-03-10T14:17:40.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.391+0000 7f0059545700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f00400778c0 0x7f0040079d70 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f005419dca0 tx=0x7f005000a380 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:40.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.392+0000 7f0046ffd700 1 -- 192.168.123.103:0/277149105 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f00541a0260 con 0x7f00540730f0 2026-03-10T14:17:40.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.535+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs 
dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7f00541a0260 con 0x7f00540730f0 2026-03-10T14:17:40.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.538+0000 7f0046ffd700 1 -- 192.168.123.103:0/277149105 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v40) v1 ==== 107+0+4956 (secure 0 0 0) 0x7f00541a0260 con 0x7f00540730f0 2026-03-10T14:17:40.537 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:40.537 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":16,"btime":"2026-03-10T14:12:48:855184+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[],"filesystems":[{"mdsmap":{"epoch":16,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:12:48.656761+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline 
data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_14486":{"gid":14486,"name":"cephfs.vm03.itwezo","rank":0,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.103:6829/1684840896","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1684840896},{"type":"v1","addr":"192.168.123.103:6829","nonce":1684840896}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24273":{"gid":24273,"name":"cephfs.vm04.puavjd","rank":1,"incarnation":0,"state":"up:standby-replay","state_seq":2,"addr":"192.168.123.104:6827/3934202413","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3934202413},{"type":"v1","addr":"192.168.123.104:6827","nonce":3934202413}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263,14470]},"id":1}]} 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f00400778c0 
msgr2=0x7f0040079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f00400778c0 0x7f0040079d70 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f005419dca0 tx=0x7f005000a380 comp rx=0 tx=0).stop 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 msgr2=0x7f005419c680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 0x7f005419c680 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f00480094d0 tx=0x7f0048004930 comp rx=0 tx=0).stop 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 shutdown_connections 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f00400778c0 0x7f0040079d70 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 --2- 192.168.123.103:0/277149105 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00540730f0 0x7f005419c680 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 --2- 
192.168.123.103:0/277149105 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0054073a00 0x7f005419cbc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.541+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 >> 192.168.123.103:0/277149105 conn(0x7f00540fc000 msgr2=0x7f0054102b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.542+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 shutdown_connections 2026-03-10T14:17:40.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.542+0000 7f005bfaa700 1 -- 192.168.123.103:0/277149105 wait complete. 2026-03-10T14:17:40.541 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 16 2026-03-10T14:17:40.584 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 17 2026-03-10T14:17:40.727 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.996+0000 7f855cd2e700 1 -- 192.168.123.103:0/3962242658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85581016e0 msgr2=0x7f8558101ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.996+0000 7f855cd2e700 1 --2- 192.168.123.103:0/3962242658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85581016e0 0x7f8558101ab0 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f8540009b00 tx=0x7f8540009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.997+0000 7f855cd2e700 1 -- 
192.168.123.103:0/3962242658 shutdown_connections 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.997+0000 7f855cd2e700 1 --2- 192.168.123.103:0/3962242658 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f8558101ff0 0x7f855810a4f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.997+0000 7f855cd2e700 1 --2- 192.168.123.103:0/3962242658 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85581016e0 0x7f8558101ab0 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.997+0000 7f855cd2e700 1 -- 192.168.123.103:0/3962242658 >> 192.168.123.103:0/3962242658 conn(0x7f85580faf00 msgr2=0x7f85580fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.997+0000 7f855cd2e700 1 -- 192.168.123.103:0/3962242658 shutdown_connections 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.997+0000 7f855cd2e700 1 -- 192.168.123.103:0/3962242658 wait complete. 
2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855cd2e700 1 Processor -- start 2026-03-10T14:17:40.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855cd2e700 1 -- start start 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855cd2e700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85581016e0 0x7f85580ff340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855cd2e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8558101ff0 0x7f85580ff880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855cd2e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85581010f0 con 0x7f8558101ff0 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855cd2e700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8558101260 con 0x7f85581016e0 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85581016e0 0x7f85580ff340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85581016e0 0x7f85580ff340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am 
v2:192.168.123.103:49730/0 (socket says 192.168.123.103:49730) 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.998+0000 7f855659c700 1 -- 192.168.123.103:0/1141684007 learned_addr learned my addr 192.168.123.103:0/1141684007 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.999+0000 7f855659c700 1 -- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8558101ff0 msgr2=0x7f85580ff880 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.999+0000 7f8555d9b700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8558101ff0 0x7f85580ff880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.999+0000 7f855659c700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8558101ff0 0x7f85580ff880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.999+0000 7f855659c700 1 -- 192.168.123.103:0/1141684007 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85400097e0 con 0x7f85581016e0 2026-03-10T14:17:40.997 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.999+0000 7f8555d9b700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8558101ff0 0x7f85580ff880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T14:17:40.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:40.999+0000 7f855659c700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85581016e0 0x7f85580ff340 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f85400048c0 tx=0x7f85400049a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:40.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.000+0000 7f854f7fe700 1 -- 192.168.123.103:0/1141684007 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f854001d070 con 0x7f85581016e0 2026-03-10T14:17:40.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.000+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85580ffe20 con 0x7f85581016e0 2026-03-10T14:17:40.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.000+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85581a6af0 con 0x7f85581016e0 2026-03-10T14:17:40.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.000+0000 7f854f7fe700 1 -- 192.168.123.103:0/1141684007 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f854000bc50 con 0x7f85581016e0 2026-03-10T14:17:40.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.000+0000 7f854f7fe700 1 -- 192.168.123.103:0/1141684007 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f854000f670 con 0x7f85581016e0 2026-03-10T14:17:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.002+0000 7f854f7fe700 1 -- 192.168.123.103:0/1141684007 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8540022470 con 
0x7f85581016e0 2026-03-10T14:17:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.002+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8538005320 con 0x7f85581016e0 2026-03-10T14:17:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.002+0000 7f854f7fe700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8544077870 0x7f8544079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.002+0000 7f854f7fe700 1 -- 192.168.123.103:0/1141684007 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f854009b400 con 0x7f85581016e0 2026-03-10T14:17:41.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.003+0000 7f8555d9b700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8544077870 0x7f8544079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:41.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.003+0000 7f8555d9b700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8544077870 0x7f8544079d20 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f8548005fd0 tx=0x7f8548005ee0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:41.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.005+0000 7f854f7fe700 1 -- 192.168.123.103:0/1141684007 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7f8540063a70 con 0x7f85581016e0 2026-03-10T14:17:41.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:40 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/277149105' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T14:17:41.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.149+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7f8538005190 con 0x7f85581016e0 2026-03-10T14:17:41.149 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.150+0000 7f854f7fe700 1 -- 192.168.123.103:0/1141684007 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 17 v40) v1 ==== 107+0+3377 (secure 0 0 0) 0x7f8540027d50 con 0x7f85581016e0 2026-03-10T14:17:41.149 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:41.150 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":17,"btime":"2026-03-10T14:14:11:668902+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:10.670686+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[14470,24263]},"id":1}]} 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.153+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8544077870 msgr2=0x7f8544079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.153+0000 7f855cd2e700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8544077870 0x7f8544079d20 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f8548005fd0 tx=0x7f8548005ee0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.153+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85581016e0 msgr2=0x7f85580ff340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:41.152 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.153+0000 7f855cd2e700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85581016e0 0x7f85580ff340 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f85400048c0 tx=0x7f85400049a0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.154+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 shutdown_connections 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.154+0000 7f855cd2e700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f8544077870 0x7f8544079d20 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.154+0000 7f855cd2e700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f85581016e0 0x7f85580ff340 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.154+0000 7f855cd2e700 1 --2- 192.168.123.103:0/1141684007 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8558101ff0 0x7f85580ff880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.154+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 >> 192.168.123.103:0/1141684007 conn(0x7f85580faf00 msgr2=0x7f8558104d00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.154+0000 7f855cd2e700 1 -- 192.168.123.103:0/1141684007 shutdown_connections 2026-03-10T14:17:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.154+0000 7f855cd2e700 1 -- 
192.168.123.103:0/1141684007 wait complete. 2026-03-10T14:17:41.153 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 17 2026-03-10T14:17:41.213 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 17 2026-03-10T14:17:41.213 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 18 2026-03-10T14:17:41.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:40 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/277149105' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T14:17:41.356 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:41.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.627+0000 7fefe2fd9700 1 -- 192.168.123.103:0/3088481057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc108810 msgr2=0x7fefdc108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:41.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.627+0000 7fefe2fd9700 1 --2- 192.168.123.103:0/3088481057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc108810 0x7fefdc108be0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7fefcc009b00 tx=0x7fefcc009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:41.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.628+0000 7fefe2fd9700 1 -- 192.168.123.103:0/3088481057 shutdown_connections 2026-03-10T14:17:41.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.628+0000 7fefe2fd9700 1 --2- 192.168.123.103:0/3088481057 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fefdc102810 0x7fefdc102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.626 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.628+0000 7fefe2fd9700 1 --2- 192.168.123.103:0/3088481057 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc108810 0x7fefdc108be0 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.628+0000 7fefe2fd9700 1 -- 192.168.123.103:0/3088481057 >> 192.168.123.103:0/3088481057 conn(0x7fefdc0fe330 msgr2=0x7fefdc100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:41.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.628+0000 7fefe2fd9700 1 -- 192.168.123.103:0/3088481057 shutdown_connections 2026-03-10T14:17:41.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.628+0000 7fefe2fd9700 1 -- 192.168.123.103:0/3088481057 wait complete. 2026-03-10T14:17:41.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.628+0000 7fefe2fd9700 1 Processor -- start 2026-03-10T14:17:41.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefe2fd9700 1 -- start start 2026-03-10T14:17:41.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefe2fd9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc102810 0x7fefdc198440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefe0d75700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc102810 0x7fefdc198440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefe2fd9700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fefdc108810 0x7fefdc198980 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefe0d75700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc102810 0x7fefdc198440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44040/0 (socket says 192.168.123.103:44040) 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefe0d75700 1 -- 192.168.123.103:0/1790275416 learned_addr learned my addr 192.168.123.103:0/1790275416 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefe2fd9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fefdc199060 con 0x7fefdc102810 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fefdc19cdf0 con 0x7fefdc108810 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.629+0000 7fefdbfff700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fefdc108810 0x7fefdc198980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefe0d75700 1 -- 192.168.123.103:0/1790275416 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fefdc108810 msgr2=0x7fefdc198980 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefe0d75700 1 --2- 
192.168.123.103:0/1790275416 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fefdc108810 0x7fefdc198980 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefe0d75700 1 -- 192.168.123.103:0/1790275416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fefcc0097e0 con 0x7fefdc102810 2026-03-10T14:17:41.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefe0d75700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc102810 0x7fefdc198440 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7fefcc009fd0 tx=0x7fefcc004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:41.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefd9ffb700 1 -- 192.168.123.103:0/1790275416 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fefcc01d070 con 0x7fefdc102810 2026-03-10T14:17:41.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefd9ffb700 1 -- 192.168.123.103:0/1790275416 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fefcc004b80 con 0x7fefdc102810 2026-03-10T14:17:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fefdc19d070 con 0x7fefdc102810 2026-03-10T14:17:41.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefd9ffb700 1 -- 192.168.123.103:0/1790275416 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fefcc00f700 con 0x7fefdc102810 2026-03-10T14:17:41.630 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.630+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fefdc19d560 con 0x7fefdc102810 2026-03-10T14:17:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.632+0000 7fefd9ffb700 1 -- 192.168.123.103:0/1790275416 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fefcc00f860 con 0x7fefdc102810 2026-03-10T14:17:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.632+0000 7fefd9ffb700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fefc4077870 0x7fefc4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.632+0000 7fefd9ffb700 1 -- 192.168.123.103:0/1790275416 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fefcc09b1b0 con 0x7fefdc102810 2026-03-10T14:17:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.632+0000 7fefdbfff700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fefc4077870 0x7fefc4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:41.631 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.632+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fefdc04ea50 con 0x7fefdc102810 2026-03-10T14:17:41.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.633+0000 7fefdbfff700 1 --2- 192.168.123.103:0/1790275416 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fefc4077870 0x7fefc4079d20 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fefdc199a60 tx=0x7fefd0009380 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:41.634 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.635+0000 7fefd9ffb700 1 -- 192.168.123.103:0/1790275416 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fefcc063b00 con 0x7fefdc102810 2026-03-10T14:17:41.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.773+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7fefdc1997a0 con 0x7fefdc102810 2026-03-10T14:17:41.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.777+0000 7fefd9ffb700 1 -- 192.168.123.103:0/1790275416 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v40) v1 ==== 107+0+4942 (secure 0 0 0) 0x7fefcc027020 con 0x7fefdc102810 2026-03-10T14:17:41.776 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:41.776 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":18,"btime":"2026-03-10T14:14:12:694222+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34412,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1608434114","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1608434114},{"type":"v1","addr":"192.168.123.103:6829","nonce":1608434114}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":18,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:11.784655+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263,14470]},"id":1}]} 2026-03-10T14:17:41.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.780+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fefc4077870 msgr2=0x7fefc4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:41.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.780+0000 7fefe2fd9700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fefc4077870 0x7fefc4079d20 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fefdc199a60 tx=0x7fefd0009380 comp rx=0 tx=0).stop 2026-03-10T14:17:41.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.780+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc102810 msgr2=0x7fefdc198440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:41.779 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.780+0000 7fefe2fd9700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc102810 0x7fefdc198440 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7fefcc009fd0 tx=0x7fefcc004930 comp rx=0 tx=0).stop 2026-03-10T14:17:41.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.780+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 shutdown_connections 2026-03-10T14:17:41.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.781+0000 7fefe2fd9700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fefc4077870 0x7fefc4079d20 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.781+0000 7fefe2fd9700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fefdc102810 0x7fefdc198440 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.781+0000 7fefe2fd9700 1 --2- 192.168.123.103:0/1790275416 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fefdc108810 0x7fefdc198980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:41.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.781+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 >> 192.168.123.103:0/1790275416 conn(0x7fefdc0fe330 msgr2=0x7fefdc0ffb10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:41.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.781+0000 7fefe2fd9700 1 -- 192.168.123.103:0/1790275416 shutdown_connections 2026-03-10T14:17:41.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:41.781+0000 7fefe2fd9700 1 -- 
192.168.123.103:0/1790275416 wait complete. 2026-03-10T14:17:41.781 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 18 2026-03-10T14:17:41.821 DEBUG:tasks.fs:max_mds reduced in epoch 18 2026-03-10T14:17:41.821 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 18 2026-03-10T14:17:41.821 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 19 2026-03-10T14:17:41.977 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:42.033 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:41 vm03.local ceph-mon[103098]: pgmap v345: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:42.033 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:41 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1141684007' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T14:17:42.033 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:41 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1790275416' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T14:17:42.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.258+0000 7f7039502700 1 -- 192.168.123.103:0/3850532081 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 msgr2=0x7f7034068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.258+0000 7f7039502700 1 --2- 192.168.123.103:0/3850532081 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 0x7f7034068900 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f701c009b00 tx=0x7f701c009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.259+0000 7f7039502700 1 -- 192.168.123.103:0/3850532081 shutdown_connections 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.259+0000 7f7039502700 1 --2- 192.168.123.103:0/3850532081 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 0x7f7034068900 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.259+0000 7f7039502700 1 --2- 192.168.123.103:0/3850532081 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f70341013a0 0x7f7034101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.259+0000 7f7039502700 1 -- 192.168.123.103:0/3850532081 >> 192.168.123.103:0/3850532081 conn(0x7f70340754a0 msgr2=0x7f70340758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.259+0000 7f7039502700 1 -- 192.168.123.103:0/3850532081 shutdown_connections 
2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.259+0000 7f7039502700 1 -- 192.168.123.103:0/3850532081 wait complete. 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.259+0000 7f7039502700 1 Processor -- start 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.259+0000 7f7039502700 1 -- start start 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f7039502700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 0x7f703410cb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f7039502700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f70341013a0 0x7f7034103a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f7039502700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f703410d1d0 con 0x7f7034068490 2026-03-10T14:17:42.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f7039502700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f703410d340 con 0x7f70341013a0 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f70327fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f70341013a0 0x7f7034103a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f70327fc700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f70341013a0 
0x7f7034103a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49770/0 (socket says 192.168.123.103:49770) 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f70327fc700 1 -- 192.168.123.103:0/804572445 learned_addr learned my addr 192.168.123.103:0/804572445 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f70327fc700 1 -- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 msgr2=0x7f703410cb40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.260+0000 7f7032ffd700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 0x7f703410cb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.261+0000 7f70327fc700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 0x7f703410cb40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.261+0000 7f70327fc700 1 -- 192.168.123.103:0/804572445 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f701c0097e0 con 0x7f70341013a0 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.261+0000 7f7032ffd700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 0x7f703410cb40 unknown :-1 s=CLOSED pgs=0 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.261+0000 7f70327fc700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f70341013a0 0x7f7034103a20 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f701c00b5c0 tx=0x7f701c0049b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:42.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.261+0000 7f702bfff700 1 -- 192.168.123.103:0/804572445 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f701c01d070 con 0x7f70341013a0 2026-03-10T14:17:42.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.261+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7034103fc0 con 0x7f70341013a0 2026-03-10T14:17:42.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.261+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7034104510 con 0x7f70341013a0 2026-03-10T14:17:42.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.262+0000 7f702bfff700 1 -- 192.168.123.103:0/804572445 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f701c00bc50 con 0x7f70341013a0 2026-03-10T14:17:42.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.262+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f703404ea50 con 0x7f70341013a0 2026-03-10T14:17:42.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.262+0000 7f702bfff700 1 -- 192.168.123.103:0/804572445 <== 
mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f701c00f670 con 0x7f70341013a0 2026-03-10T14:17:42.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.267+0000 7f702bfff700 1 -- 192.168.123.103:0/804572445 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f701c022470 con 0x7f70341013a0 2026-03-10T14:17:42.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.268+0000 7f702bfff700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7020077710 0x7f7020079bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:42.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.268+0000 7f7032ffd700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7020077710 0x7f7020079bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:42.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.268+0000 7f7032ffd700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7020077710 0x7f7020079bc0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f7024007950 tx=0x7f7024008040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:42.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.268+0000 7f702bfff700 1 -- 192.168.123.103:0/804572445 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f701c09bb40 con 0x7f70341013a0 2026-03-10T14:17:42.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.268+0000 7f702bfff700 1 -- 192.168.123.103:0/804572445 <== mon.1 v2:192.168.123.104:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f701c0cba90 con 0x7f70341013a0 2026-03-10T14:17:42.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:41 vm04.local ceph-mon[92084]: pgmap v345: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:42.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:41 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1141684007' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T14:17:42.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:41 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1790275416' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T14:17:42.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.408+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7f7034066e40 con 0x7f70341013a0 2026-03-10T14:17:42.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.409+0000 7f702bfff700 1 -- 192.168.123.103:0/804572445 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v40) v1 ==== 107+0+4938 (secure 0 0 0) 0x7f701c0642e0 con 0x7f70341013a0 2026-03-10T14:17:42.408 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:42.408 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":19,"btime":"2026-03-10T14:14:12:710257+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34412,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1608434114","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1608434114},{"type":"v1","addr":"192.168.123.103:6829","nonce":1608434114}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:12.710257+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:stopping","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263]},"id":1}]} 2026-03-10T14:17:42.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7020077710 msgr2=0x7f7020079bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7020077710 0x7f7020079bc0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f7024007950 tx=0x7f7024008040 comp rx=0 tx=0).stop 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f70341013a0 msgr2=0x7f7034103a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:42.411 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f70341013a0 0x7f7034103a20 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f701c00b5c0 tx=0x7f701c0049b0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 shutdown_connections 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7020077710 0x7f7020079bc0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7034068490 0x7f703410cb40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 --2- 192.168.123.103:0/804572445 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f70341013a0 0x7f7034103a20 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 >> 192.168.123.103:0/804572445 conn(0x7f70340754a0 msgr2=0x7f70340fddd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 shutdown_connections 2026-03-10T14:17:42.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.412+0000 7f7039502700 1 -- 192.168.123.103:0/804572445 
wait complete. 2026-03-10T14:17:42.412 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 19 2026-03-10T14:17:42.477 DEBUG:tasks.fs:max_mds reduced in epoch 19 2026-03-10T14:17:42.477 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 19 2026-03-10T14:17:42.477 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 20 2026-03-10T14:17:42.624 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:42.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.873+0000 7fd3bee06700 1 -- 192.168.123.103:0/116509310 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 msgr2=0x7fd3b81056e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:42.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.873+0000 7fd3bee06700 1 --2- 192.168.123.103:0/116509310 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 0x7fd3b81056e0 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7fd3a8009b00 tx=0x7fd3a8009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:42.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.874+0000 7fd3bee06700 1 -- 192.168.123.103:0/116509310 shutdown_connections 2026-03-10T14:17:42.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.874+0000 7fd3bee06700 1 --2- 192.168.123.103:0/116509310 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3b8068490 0x7fd3b8068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.874+0000 7fd3bee06700 1 --2- 192.168.123.103:0/116509310 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 0x7fd3b81056e0 unknown :-1 s=CLOSED pgs=219 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.874+0000 7fd3bee06700 1 -- 192.168.123.103:0/116509310 >> 192.168.123.103:0/116509310 conn(0x7fd3b8075240 msgr2=0x7fd3b8075640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:42.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.874+0000 7fd3bee06700 1 -- 192.168.123.103:0/116509310 shutdown_connections 2026-03-10T14:17:42.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.874+0000 7fd3bee06700 1 -- 192.168.123.103:0/116509310 wait complete. 2026-03-10T14:17:42.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.874+0000 7fd3bee06700 1 Processor -- start 2026-03-10T14:17:42.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bee06700 1 -- start start 2026-03-10T14:17:42.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bee06700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3b8068490 0x7fd3b8199ab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:42.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bee06700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 0x7fd3b8199ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:42.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bee06700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd3b819a5f0 con 0x7fd3b8105310 2026-03-10T14:17:42.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bee06700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd3b8193b30 con 0x7fd3b8068490 2026-03-10T14:17:42.874 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bde04700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3b8068490 0x7fd3b8199ab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:42.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bde04700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3b8068490 0x7fd3b8199ab0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49784/0 (socket says 192.168.123.103:49784) 2026-03-10T14:17:42.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bde04700 1 -- 192.168.123.103:0/4076786871 learned_addr learned my addr 192.168.123.103:0/4076786871 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:42.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.875+0000 7fd3bd603700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 0x7fd3b8199ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:42.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3bde04700 1 -- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 msgr2=0x7fd3b8199ff0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:42.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3bde04700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 0x7fd3b8199ff0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:42.874 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3bde04700 1 -- 192.168.123.103:0/4076786871 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd3a80097e0 con 0x7fd3b8068490 2026-03-10T14:17:42.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3bd603700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 0x7fd3b8199ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:42.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3bde04700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3b8068490 0x7fd3b8199ab0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd3a8004930 tx=0x7fd3a8004a10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:42.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3aeffd700 1 -- 192.168.123.103:0/4076786871 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd3a801d070 con 0x7fd3b8068490 2026-03-10T14:17:42.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd3b8193db0 con 0x7fd3b8068490 2026-03-10T14:17:42.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd3b81942a0 con 0x7fd3b8068490 2026-03-10T14:17:42.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3aeffd700 1 -- 192.168.123.103:0/4076786871 <== mon.1 v2:192.168.123.104:3300/0 2 ==== 
config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd3a800bc50 con 0x7fd3b8068490 2026-03-10T14:17:42.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.876+0000 7fd3aeffd700 1 -- 192.168.123.103:0/4076786871 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd3a800f700 con 0x7fd3b8068490 2026-03-10T14:17:42.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.878+0000 7fd3aeffd700 1 -- 192.168.123.103:0/4076786871 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd3a800f860 con 0x7fd3b8068490 2026-03-10T14:17:42.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.878+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd39c005320 con 0x7fd3b8068490 2026-03-10T14:17:42.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.878+0000 7fd3aeffd700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd3a407bc80 0x7fd3a407e130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:42.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.878+0000 7fd3aeffd700 1 -- 192.168.123.103:0/4076786871 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fd3a809b160 con 0x7fd3b8068490 2026-03-10T14:17:42.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.879+0000 7fd3bd603700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd3a407bc80 0x7fd3a407e130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:42.877 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.879+0000 7fd3bd603700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd3a407bc80 0x7fd3a407e130 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7fd3b4005fd0 tx=0x7fd3b4005e20 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:42.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:42.881+0000 7fd3aeffd700 1 -- 192.168.123.103:0/4076786871 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd3a8063880 con 0x7fd3b8068490 2026-03-10T14:17:43.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.021+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7fd39c005190 con 0x7fd3b8068490 2026-03-10T14:17:43.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.021+0000 7fd3aeffd700 1 -- 192.168.123.103:0/4076786871 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v40) v1 ==== 107+0+4939 (secure 0 0 0) 0x7fd3a8062fd0 con 0x7fd3b8068490 2026-03-10T14:17:43.020 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:43.020 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":20,"btime":"2026-03-10T14:14:18:710963+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34412,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1608434114","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1608434114},{"type":"v1","addr":"192.168.123.103:6829","nonce":1608434114}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":20,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:18.660815+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0,1],"up":{"mds_0":24263,"mds_1":14470},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14470":{"gid":14470,"name":"cephfs.vm03.aqaspa","rank":1,"incarnation":6,"state":"up:stopping","state_seq":3,"addr":"192.168.123.103:6827/3503287793","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3503287793},{"type":"v1","addr":"192.168.123.103:6827","nonce":3503287793}]},"join_fscid":1,"export_targets":[0],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263]},"id":1}]} 2026-03-10T14:17:43.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.024+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd3a407bc80 msgr2=0x7fd3a407e130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:43.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.024+0000 7fd3bee06700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd3a407bc80 0x7fd3a407e130 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7fd3b4005fd0 tx=0x7fd3b4005e20 comp rx=0 tx=0).stop 2026-03-10T14:17:43.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.024+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3b8068490 msgr2=0x7fd3b8199ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:43.023 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.024+0000 7fd3bee06700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3b8068490 0x7fd3b8199ab0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd3a8004930 tx=0x7fd3a8004a10 comp rx=0 tx=0).stop 2026-03-10T14:17:43.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.025+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 shutdown_connections 2026-03-10T14:17:43.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.025+0000 7fd3bee06700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fd3a407bc80 0x7fd3a407e130 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.025+0000 7fd3bee06700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fd3b8068490 0x7fd3b8199ab0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.025+0000 7fd3bee06700 1 --2- 192.168.123.103:0/4076786871 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd3b8105310 0x7fd3b8199ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.025+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 >> 192.168.123.103:0/4076786871 conn(0x7fd3b8075240 msgr2=0x7fd3b80feb20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:43.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.025+0000 7fd3bee06700 1 -- 192.168.123.103:0/4076786871 shutdown_connections 2026-03-10T14:17:43.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.025+0000 7fd3bee06700 1 -- 
192.168.123.103:0/4076786871 wait complete. 2026-03-10T14:17:43.024 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 20 2026-03-10T14:17:43.067 DEBUG:tasks.fs:max_mds reduced in epoch 20 2026-03-10T14:17:43.068 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 20 2026-03-10T14:17:43.068 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 21 2026-03-10T14:17:43.220 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:43.244 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:42 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/804572445' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T14:17:43.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:42 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/804572445' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T14:17:43.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.481+0000 7f5d48aba700 1 -- 192.168.123.103:0/4100692871 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44102810 msgr2=0x7f5d44102c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:43.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.481+0000 7f5d48aba700 1 --2- 192.168.123.103:0/4100692871 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44102810 0x7f5d44102c80 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f5d34009b00 tx=0x7f5d34009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:43.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.481+0000 7f5d48aba700 1 -- 192.168.123.103:0/4100692871 shutdown_connections 2026-03-10T14:17:43.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.481+0000 7f5d48aba700 1 --2- 192.168.123.103:0/4100692871 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44102810 0x7f5d44102c80 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.481+0000 7f5d48aba700 1 --2- 192.168.123.103:0/4100692871 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d44108810 0x7f5d44108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.481+0000 7f5d48aba700 1 -- 192.168.123.103:0/4100692871 >> 192.168.123.103:0/4100692871 conn(0x7f5d440fe330 msgr2=0x7f5d44100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:43.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.482+0000 7f5d48aba700 1 -- 192.168.123.103:0/4100692871 shutdown_connections 
2026-03-10T14:17:43.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.482+0000 7f5d48aba700 1 -- 192.168.123.103:0/4100692871 wait complete. 2026-03-10T14:17:43.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.482+0000 7f5d48aba700 1 Processor -- start 2026-03-10T14:17:43.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.482+0000 7f5d48aba700 1 -- start start 2026-03-10T14:17:43.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d48aba700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d44102810 0x7f5d44198440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d48aba700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44108810 0x7f5d44198980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d48aba700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d44199060 con 0x7f5d44108810 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d48aba700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d4419cdf0 con 0x7f5d44102810 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d41d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44108810 0x7f5d44198980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d41d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44108810 
0x7f5d44198980 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44092/0 (socket says 192.168.123.103:44092) 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d41d9b700 1 -- 192.168.123.103:0/2131183268 learned_addr learned my addr 192.168.123.103:0/2131183268 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d41d9b700 1 -- 192.168.123.103:0/2131183268 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d44102810 msgr2=0x7f5d44198440 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d41d9b700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d44102810 0x7f5d44198440 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d41d9b700 1 -- 192.168.123.103:0/2131183268 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d340097e0 con 0x7f5d44108810 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.483+0000 7f5d41d9b700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44108810 0x7f5d44198980 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f5d340094d0 tx=0x7f5d340049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:43.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.484+0000 7f5d3b7fe700 1 -- 192.168.123.103:0/2131183268 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f5d3401d070 con 0x7f5d44108810 2026-03-10T14:17:43.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.484+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d4419d070 con 0x7f5d44108810 2026-03-10T14:17:43.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.484+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d4419d560 con 0x7f5d44108810 2026-03-10T14:17:43.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.485+0000 7f5d3b7fe700 1 -- 192.168.123.103:0/2131183268 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5d3400bc50 con 0x7f5d44108810 2026-03-10T14:17:43.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.486+0000 7f5d3b7fe700 1 -- 192.168.123.103:0/2131183268 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d3400f800 con 0x7f5d44108810 2026-03-10T14:17:43.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.486+0000 7f5d3b7fe700 1 -- 192.168.123.103:0/2131183268 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5d3400fa20 con 0x7f5d44108810 2026-03-10T14:17:43.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.486+0000 7f5d3b7fe700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5d30077a60 0x7f5d30079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:43.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.486+0000 7f5d4259c700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5d30077a60 0x7f5d30079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 
l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:43.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.487+0000 7f5d3b7fe700 1 -- 192.168.123.103:0/2131183268 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f5d3409c3e0 con 0x7f5d44108810 2026-03-10T14:17:43.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.487+0000 7f5d4259c700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5d30077a60 0x7f5d30079f10 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f5d44103950 tx=0x7f5d2c006cb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:43.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.487+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d24005320 con 0x7f5d44108810 2026-03-10T14:17:43.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.490+0000 7f5d3b7fe700 1 -- 192.168.123.103:0/2131183268 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5d34064b80 con 0x7f5d44108810 2026-03-10T14:17:43.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.636+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7f5d24005190 con 0x7f5d44108810 2026-03-10T14:17:43.636 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.637+0000 7f5d3b7fe700 1 -- 192.168.123.103:0/2131183268 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped 
fsmap epoch 21 v40) v1 ==== 107+0+4139 (secure 0 0 0) 0x7f5d34027020 con 0x7f5d44108810 2026-03-10T14:17:43.636 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:43.636 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":21,"btime":"2026-03-10T14:14:44:039384+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34412,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1608434114","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1608434114},{"type":"v1","addr":"192.168.123.103:6829","nonce":1608434114}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable 
ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:43.661835+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24263},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses 
versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263]},"id":1}]} 2026-03-10T14:17:43.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5d30077a60 msgr2=0x7f5d30079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:43.638 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5d30077a60 0x7f5d30079f10 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f5d44103950 tx=0x7f5d2c006cb0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44108810 msgr2=0x7f5d44198980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44108810 0x7f5d44198980 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f5d340094d0 tx=0x7f5d340049e0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 shutdown_connections 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 --2- 192.168.123.103:0/2131183268 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f5d30077a60 0x7f5d30079f10 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f5d44102810 0x7f5d44198440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 --2- 192.168.123.103:0/2131183268 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d44108810 0x7f5d44198980 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 >> 192.168.123.103:0/2131183268 conn(0x7f5d440fe330 msgr2=0x7f5d440ffa20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 shutdown_connections 2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:43.640+0000 7f5d48aba700 1 -- 192.168.123.103:0/2131183268 wait complete. 
2026-03-10T14:17:43.639 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 21 2026-03-10T14:17:43.703 DEBUG:tasks.fs:max_mds reduced in epoch 21 2026-03-10T14:17:43.703 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 21 2026-03-10T14:17:43.704 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 22 2026-03-10T14:17:43.852 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:44.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.106+0000 7fba28ee8700 1 -- 192.168.123.103:0/1388101068 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 msgr2=0x7fba24108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:44.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.106+0000 7fba28ee8700 1 --2- 192.168.123.103:0/1388101068 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 0x7fba24108b50 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7fba0c009b00 tx=0x7fba0c009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:44.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.107+0000 7fba28ee8700 1 -- 192.168.123.103:0/1388101068 shutdown_connections 2026-03-10T14:17:44.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.107+0000 7fba28ee8700 1 --2- 192.168.123.103:0/1388101068 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba24102780 0x7fba24102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:44.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.107+0000 7fba28ee8700 1 --2- 192.168.123.103:0/1388101068 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 0x7fba24108b50 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:44.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.107+0000 7fba28ee8700 1 -- 192.168.123.103:0/1388101068 >> 192.168.123.103:0/1388101068 conn(0x7fba240fe280 msgr2=0x7fba24100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:44.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.107+0000 7fba28ee8700 1 -- 192.168.123.103:0/1388101068 shutdown_connections 2026-03-10T14:17:44.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.107+0000 7fba28ee8700 1 -- 192.168.123.103:0/1388101068 wait complete. 2026-03-10T14:17:44.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.107+0000 7fba28ee8700 1 Processor -- start 2026-03-10T14:17:44.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba28ee8700 1 -- start start 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba28ee8700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba24102780 0x7fba241983f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba28ee8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 0x7fba24198930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba28ee8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba24199010 con 0x7fba24108780 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba28ee8700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba2419cda0 con 0x7fba24102780 2026-03-10T14:17:44.107 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba21d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 0x7fba24198930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba21d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 0x7fba24198930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60812/0 (socket says 192.168.123.103:60812) 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba21d9b700 1 -- 192.168.123.103:0/2978134709 learned_addr learned my addr 192.168.123.103:0/2978134709 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba2259c700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba24102780 0x7fba241983f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba21d9b700 1 -- 192.168.123.103:0/2978134709 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba24102780 msgr2=0x7fba241983f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba21d9b700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba24102780 0x7fba241983f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:44.107 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba21d9b700 1 -- 192.168.123.103:0/2978134709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fba0c0097e0 con 0x7fba24108780 2026-03-10T14:17:44.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.108+0000 7fba21d9b700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 0x7fba24198930 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7fba1400eb10 tx=0x7fba1400ee20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:44.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.109+0000 7fba1b7fe700 1 -- 192.168.123.103:0/2978134709 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fba1400cc40 con 0x7fba24108780 2026-03-10T14:17:44.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.109+0000 7fba1b7fe700 1 -- 192.168.123.103:0/2978134709 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fba1400cda0 con 0x7fba24108780 2026-03-10T14:17:44.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.109+0000 7fba1b7fe700 1 -- 192.168.123.103:0/2978134709 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fba14018810 con 0x7fba24108780 2026-03-10T14:17:44.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.109+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fba2419d080 con 0x7fba24108780 2026-03-10T14:17:44.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.109+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fba2419d5a0 con 
0x7fba24108780 2026-03-10T14:17:44.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.110+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fba2404ea50 con 0x7fba24108780 2026-03-10T14:17:44.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.110+0000 7fba1b7fe700 1 -- 192.168.123.103:0/2978134709 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fba14010ba0 con 0x7fba24108780 2026-03-10T14:17:44.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.112+0000 7fba1b7fe700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fba100778c0 0x7fba10079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:44.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.113+0000 7fba1b7fe700 1 -- 192.168.123.103:0/2978134709 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fba14014070 con 0x7fba24108780 2026-03-10T14:17:44.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.113+0000 7fba2259c700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fba100778c0 0x7fba10079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:44.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.113+0000 7fba2259c700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fba100778c0 0x7fba10079d70 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fba0c006010 tx=0x7fba0c00b560 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T14:17:44.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.114+0000 7fba1b7fe700 1 -- 192.168.123.103:0/2978134709 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fba140621d0 con 0x7fba24108780 2026-03-10T14:17:44.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:43 vm03.local ceph-mon[103098]: pgmap v346: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:44.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:43 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/4076786871' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T14:17:44.253 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:43 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2131183268' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T14:17:44.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.254+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7fba24066e40 con 0x7fba24108780 2026-03-10T14:17:44.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.255+0000 7fba1b7fe700 1 -- 192.168.123.103:0/2978134709 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v40) v1 ==== 107+0+4922 (secure 0 0 0) 0x7fba14061920 con 0x7fba24108780 2026-03-10T14:17:44.255 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:44.255 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":22,"btime":"2026-03-10T14:14:45:042272+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base 
v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34412,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1608434114","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1608434114},{"type":"v1","addr":"192.168.123.103:6829","nonce":1608434114}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":18},{"gid":34442,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/3588884656","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3588884656},{"type":"v1","addr":"192.168.123.103:6827","nonce":3588884656}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:43.661835+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24263},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263]},"id":1}]} 2026-03-10T14:17:44.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.259+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fba100778c0 msgr2=0x7fba10079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:44.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.259+0000 7fba28ee8700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fba100778c0 0x7fba10079d70 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fba0c006010 tx=0x7fba0c00b560 comp rx=0 tx=0).stop 2026-03-10T14:17:44.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.259+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 msgr2=0x7fba24198930 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.259+0000 7fba28ee8700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 0x7fba24198930 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7fba1400eb10 tx=0x7fba1400ee20 comp rx=0 tx=0).stop 2026-03-10T14:17:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.260+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 shutdown_connections 2026-03-10T14:17:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.260+0000 7fba28ee8700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fba100778c0 0x7fba10079d70 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.260+0000 7fba28ee8700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fba24102780 0x7fba241983f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.260+0000 7fba28ee8700 1 --2- 192.168.123.103:0/2978134709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fba24108780 0x7fba24198930 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.260+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 >> 192.168.123.103:0/2978134709 conn(0x7fba240fe280 msgr2=0x7fba240ffa60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.260+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 shutdown_connections 2026-03-10T14:17:44.260 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:44.261+0000 7fba28ee8700 1 -- 192.168.123.103:0/2978134709 wait complete. 2026-03-10T14:17:44.260 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 22 2026-03-10T14:17:44.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:43 vm04.local ceph-mon[92084]: pgmap v346: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:44.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:43 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/4076786871' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T14:17:44.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:43 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/2131183268' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T14:17:44.673 DEBUG:tasks.fs:max_mds reduced in epoch 22 2026-03-10T14:17:44.674 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 22 2026-03-10T14:17:44.674 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 23 2026-03-10T14:17:44.830 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:45.202 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:44 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/2978134709' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T14:17:45.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.288+0000 7f7096a0b700 1 -- 192.168.123.103:0/1812417467 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7090073070 msgr2=0x7f7090073440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:45.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.288+0000 7f7096a0b700 1 --2- 192.168.123.103:0/1812417467 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7090073070 0x7f7090073440 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f7078009b30 tx=0x7f7078009e40 comp rx=0 tx=0).stop 2026-03-10T14:17:45.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.288+0000 7f7096a0b700 1 -- 192.168.123.103:0/1812417467 shutdown_connections 2026-03-10T14:17:45.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.288+0000 7f7096a0b700 1 --2- 192.168.123.103:0/1812417467 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7090073980 0x7f709010c8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.288+0000 7f7096a0b700 1 --2- 192.168.123.103:0/1812417467 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7090073070 0x7f7090073440 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.288+0000 7f7096a0b700 1 -- 192.168.123.103:0/1812417467 >> 192.168.123.103:0/1812417467 conn(0x7f70900fbfc0 msgr2=0x7f70900fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:45.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.289+0000 7f7096a0b700 1 -- 192.168.123.103:0/1812417467 shutdown_connections 
2026-03-10T14:17:45.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.289+0000 7f7096a0b700 1 -- 192.168.123.103:0/1812417467 wait complete. 2026-03-10T14:17:45.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.289+0000 7f7096a0b700 1 Processor -- start 2026-03-10T14:17:45.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.289+0000 7f7096a0b700 1 -- start start 2026-03-10T14:17:45.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.290+0000 7f7096a0b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7090073070 0x7f70901983c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:45.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.290+0000 7f7096a0b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7090073980 0x7f7090198900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:45.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.290+0000 7f7096a0b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7090198fe0 con 0x7f7090073070 2026-03-10T14:17:45.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.290+0000 7f7096a0b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f709019cd70 con 0x7f7090073980 2026-03-10T14:17:45.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.290+0000 7f708f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7090073980 0x7f7090198900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:45.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.290+0000 7f708f7fe700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7090073980 
0x7f7090198900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49018/0 (socket says 192.168.123.103:49018) 2026-03-10T14:17:45.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.290+0000 7f708f7fe700 1 -- 192.168.123.103:0/1369998573 learned_addr learned my addr 192.168.123.103:0/1369998573 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:45.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.290+0000 7f708ffff700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7090073070 0x7f70901983c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:45.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.291+0000 7f708f7fe700 1 -- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7090073070 msgr2=0x7f70901983c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:45.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.291+0000 7f708f7fe700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7090073070 0x7f70901983c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.291+0000 7f708f7fe700 1 -- 192.168.123.103:0/1369998573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f70780097e0 con 0x7f7090073980 2026-03-10T14:17:45.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.291+0000 7f708f7fe700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7090073980 0x7f7090198900 secure :-1 s=READY pgs=85 
cs=0 l=1 rev1=1 crypto rx=0x7f708000b700 tx=0x7f708000ba10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:45.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.292+0000 7f708d7fa700 1 -- 192.168.123.103:0/1369998573 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f70800107c0 con 0x7f7090073980 2026-03-10T14:17:45.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.292+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f709019d050 con 0x7f7090073980 2026-03-10T14:17:45.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.292+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f709019d5a0 con 0x7f7090073980 2026-03-10T14:17:45.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.293+0000 7f708d7fa700 1 -- 192.168.123.103:0/1369998573 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7080010e00 con 0x7f7090073980 2026-03-10T14:17:45.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.293+0000 7f708d7fa700 1 -- 192.168.123.103:0/1369998573 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f708000f360 con 0x7f7090073980 2026-03-10T14:17:45.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.294+0000 7f708d7fa700 1 -- 192.168.123.103:0/1369998573 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7080010920 con 0x7f7090073980 2026-03-10T14:17:45.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.294+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f7070005320 con 0x7f7090073980 2026-03-10T14:17:45.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.294+0000 7f708d7fa700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f707c077870 0x7f707c079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:45.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.294+0000 7f708ffff700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f707c077870 0x7f707c079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:45.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.294+0000 7f708d7fa700 1 -- 192.168.123.103:0/1369998573 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f7080099280 con 0x7f7090073980 2026-03-10T14:17:45.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.295+0000 7f708ffff700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f707c077870 0x7f707c079d20 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f7078009b00 tx=0x7f7078005ab0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:45.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.297+0000 7f708d7fa700 1 -- 192.168.123.103:0/1369998573 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7080061220 con 0x7f7090073980 2026-03-10T14:17:45.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:44 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2978134709' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T14:17:45.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.437+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7f7070005190 con 0x7f7090073980 2026-03-10T14:17:45.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.438+0000 7f708d7fa700 1 -- 192.168.123.103:0/1369998573 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v40) v1 ==== 107+0+4139 (secure 0 0 0) 0x7f7080061040 con 0x7f7090073980 2026-03-10T14:17:45.437 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:45.437 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":23,"btime":"2026-03-10T14:14:54:516822+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34412,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1608434114","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1608434114},{"type":"v1","addr":"192.168.123.103:6829","nonce":1608434114}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:43.661835+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24263},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263]},"id":1}]} 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.441+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f707c077870 msgr2=0x7f707c079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.441+0000 7f7096a0b700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f707c077870 0x7f707c079d20 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f7078009b00 tx=0x7f7078005ab0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.441+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7090073980 msgr2=0x7f7090198900 secure 
:-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.441+0000 7f7096a0b700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7090073980 0x7f7090198900 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f708000b700 tx=0x7f708000ba10 comp rx=0 tx=0).stop 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.442+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 shutdown_connections 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.442+0000 7f7096a0b700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f707c077870 0x7f707c079d20 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.442+0000 7f7096a0b700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7090073070 0x7f70901983c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.442+0000 7f7096a0b700 1 --2- 192.168.123.103:0/1369998573 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7090073980 0x7f7090198900 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.442+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 >> 192.168.123.103:0/1369998573 conn(0x7f70900fbfc0 msgr2=0x7f70901070e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:45.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.442+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 shutdown_connections 2026-03-10T14:17:45.441 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.442+0000 7f7096a0b700 1 -- 192.168.123.103:0/1369998573 wait complete. 2026-03-10T14:17:45.441 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 23 2026-03-10T14:17:45.506 DEBUG:tasks.fs:max_mds reduced in epoch 23 2026-03-10T14:17:45.506 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 23 2026-03-10T14:17:45.506 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 24 2026-03-10T14:17:45.664 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.927+0000 7fe4bcc54700 1 -- 192.168.123.103:0/3187729469 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8108780 msgr2=0x7fe4b8108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.927+0000 7fe4bcc54700 1 --2- 192.168.123.103:0/3187729469 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8108780 0x7fe4b8108b50 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fe4a0009a60 tx=0x7fe4a0009d70 comp rx=0 tx=0).stop 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.928+0000 7fe4bcc54700 1 -- 192.168.123.103:0/3187729469 shutdown_connections 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.928+0000 7fe4bcc54700 1 --2- 192.168.123.103:0/3187729469 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe4b8102780 0x7fe4b8102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.928+0000 7fe4bcc54700 1 --2- 192.168.123.103:0/3187729469 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8108780 0x7fe4b8108b50 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.928+0000 7fe4bcc54700 1 -- 192.168.123.103:0/3187729469 >> 192.168.123.103:0/3187729469 conn(0x7fe4b80fe280 msgr2=0x7fe4b8100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.928+0000 7fe4bcc54700 1 -- 192.168.123.103:0/3187729469 shutdown_connections 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.928+0000 7fe4bcc54700 1 -- 192.168.123.103:0/3187729469 wait complete. 2026-03-10T14:17:45.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.929+0000 7fe4bcc54700 1 Processor -- start 2026-03-10T14:17:45.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.929+0000 7fe4bcc54700 1 -- start start 2026-03-10T14:17:45.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.929+0000 7fe4bcc54700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8102780 0x7fe4b8198350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:45.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.929+0000 7fe4bcc54700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe4b8108780 0x7fe4b8198890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:45.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.929+0000 7fe4bcc54700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe4b8198f70 con 0x7fe4b8108780 2026-03-10T14:17:45.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.929+0000 7fe4bcc54700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7fe4b819cd00 con 0x7fe4b8102780 2026-03-10T14:17:45.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8102780 0x7fe4b8198350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:45.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b659c700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8102780 0x7fe4b8198350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49032/0 (socket says 192.168.123.103:49032) 2026-03-10T14:17:45.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b659c700 1 -- 192.168.123.103:0/1630033306 learned_addr learned my addr 192.168.123.103:0/1630033306 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b5d9b700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe4b8108780 0x7fe4b8198890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b659c700 1 -- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe4b8108780 msgr2=0x7fe4b8198890 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b659c700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe4b8108780 0x7fe4b8198890 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b659c700 1 -- 192.168.123.103:0/1630033306 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe4a80097e0 con 0x7fe4b8102780 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b5d9b700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe4b8108780 0x7fe4b8198890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4b659c700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8102780 0x7fe4b8198350 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fe4a0000c00 tx=0x7fe4a000f740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4af7fe700 1 -- 192.168.123.103:0/1630033306 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe4a001d070 con 0x7fe4b8102780 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.930+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe4a0009710 con 0x7fe4b8102780 2026-03-10T14:17:45.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.931+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe4b819d2e0 con 0x7fe4b8102780 2026-03-10T14:17:45.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.931+0000 7fe4af7fe700 1 -- 
192.168.123.103:0/1630033306 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe4a000fba0 con 0x7fe4b8102780 2026-03-10T14:17:45.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.931+0000 7fe4af7fe700 1 -- 192.168.123.103:0/1630033306 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe4a0017710 con 0x7fe4b8102780 2026-03-10T14:17:45.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.932+0000 7fe4af7fe700 1 -- 192.168.123.103:0/1630033306 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe4a0017870 con 0x7fe4b8102780 2026-03-10T14:17:45.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.933+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe4b804ea50 con 0x7fe4b8102780 2026-03-10T14:17:45.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.933+0000 7fe4af7fe700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe4a4077910 0x7fe4a4079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:45.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.933+0000 7fe4b5d9b700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe4a4077910 0x7fe4a4079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:45.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.933+0000 7fe4af7fe700 1 -- 192.168.123.103:0/1630033306 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fe4a009b150 con 0x7fe4b8102780 
2026-03-10T14:17:45.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.933+0000 7fe4b5d9b700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe4a4077910 0x7fe4a4079dc0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7fe4b8199970 tx=0x7fe4a8009500 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:45.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:45.936+0000 7fe4af7fe700 1 -- 192.168.123.103:0/1630033306 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe4a000fd80 con 0x7fe4b8102780 2026-03-10T14:17:46.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.085+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7fe4b819d5c0 con 0x7fe4b8102780 2026-03-10T14:17:46.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.086+0000 7fe4af7fe700 1 -- 192.168.123.103:0/1630033306 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v40) v1 ==== 107+0+4990 (secure 0 0 0) 0x7fe4a0063280 con 0x7fe4b8102780 2026-03-10T14:17:46.085 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:46.085 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":24,"btime":"2026-03-10T14:14:57:544182+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34412,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1608434114","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1608434114},{"type":"v1","addr":"192.168.123.103:6829","nonce":1608434114}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":18},{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:43.661835+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24263},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263]},"id":1}]} 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe4a4077910 msgr2=0x7fe4a4079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe4a4077910 0x7fe4a4079dc0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7fe4b8199970 tx=0x7fe4a8009500 comp rx=0 tx=0).stop 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8102780 msgr2=0x7fe4b8198350 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8102780 0x7fe4b8198350 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fe4a0000c00 tx=0x7fe4a000f740 comp rx=0 tx=0).stop 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 shutdown_connections 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe4a4077910 0x7fe4a4079dc0 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe4b8102780 0x7fe4b8198350 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 --2- 192.168.123.103:0/1630033306 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe4b8108780 0x7fe4b8198890 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.089+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 >> 192.168.123.103:0/1630033306 conn(0x7fe4b80fe280 msgr2=0x7fe4b80ff9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:46.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.090+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 shutdown_connections 2026-03-10T14:17:46.088 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.090+0000 7fe4bcc54700 1 -- 192.168.123.103:0/1630033306 wait complete. 2026-03-10T14:17:46.089 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 24 2026-03-10T14:17:46.155 DEBUG:tasks.fs:max_mds reduced in epoch 24 2026-03-10T14:17:46.155 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 24 2026-03-10T14:17:46.155 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 25 2026-03-10T14:17:46.311 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:46 vm04.local ceph-mon[92084]: pgmap v347: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:46 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1369998573' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T14:17:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:17:46.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:46 vm03.local ceph-mon[103098]: pgmap v347: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:46.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:46 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1369998573' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T14:17:46.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:17:46.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.555+0000 7fa704dfd700 1 -- 192.168.123.103:0/1305311260 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 msgr2=0x7fa700102c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:46.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.555+0000 7fa704dfd700 1 --2- 192.168.123.103:0/1305311260 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 0x7fa700102c80 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7fa6f0009b00 tx=0x7fa6f0009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:46.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.555+0000 7fa704dfd700 1 -- 192.168.123.103:0/1305311260 shutdown_connections 2026-03-10T14:17:46.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.555+0000 7fa704dfd700 1 --2- 192.168.123.103:0/1305311260 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 0x7fa700102c80 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:46.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.555+0000 7fa704dfd700 1 --2- 192.168.123.103:0/1305311260 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa700108810 0x7fa700108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:46.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.555+0000 7fa704dfd700 1 -- 192.168.123.103:0/1305311260 >> 192.168.123.103:0/1305311260 
conn(0x7fa7000fe330 msgr2=0x7fa700100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:46.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.556+0000 7fa704dfd700 1 -- 192.168.123.103:0/1305311260 shutdown_connections 2026-03-10T14:17:46.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.556+0000 7fa704dfd700 1 -- 192.168.123.103:0/1305311260 wait complete. 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.556+0000 7fa704dfd700 1 Processor -- start 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.556+0000 7fa704dfd700 1 -- start start 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.556+0000 7fa704dfd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 0x7fa7001983e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.556+0000 7fa704dfd700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa700108810 0x7fa700198920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.556+0000 7fa704dfd700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa700198f70 con 0x7fa700102810 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.556+0000 7fa704dfd700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa7001990b0 con 0x7fa700108810 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fdd9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa700108810 0x7fa700198920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fdd9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa700108810 0x7fa700198920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49040/0 (socket says 192.168.123.103:49040) 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fdd9b700 1 -- 192.168.123.103:0/1285548732 learned_addr learned my addr 192.168.123.103:0/1285548732 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fe59c700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 0x7fa7001983e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:46.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fdd9b700 1 -- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 msgr2=0x7fa7001983e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:46.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fdd9b700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 0x7fa7001983e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:46.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fdd9b700 1 -- 192.168.123.103:0/1285548732 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa6f00097e0 con 0x7fa700108810 
2026-03-10T14:17:46.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fe59c700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 0x7fa7001983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T14:17:46.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.557+0000 7fa6fdd9b700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa700108810 0x7fa700198920 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fa6f0004930 tx=0x7fa6f0004a10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:46.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.558+0000 7fa6f77fe700 1 -- 192.168.123.103:0/1285548732 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa6f001d070 con 0x7fa700108810 2026-03-10T14:17:46.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.558+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa70019cea0 con 0x7fa700108810 2026-03-10T14:17:46.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.558+0000 7fa6f77fe700 1 -- 192.168.123.103:0/1285548732 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa6f000bc50 con 0x7fa700108810 2026-03-10T14:17:46.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.558+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa70019d390 con 0x7fa700108810 2026-03-10T14:17:46.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.558+0000 7fa6f77fe700 1 -- 192.168.123.103:0/1285548732 <== mon.1 
v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa6f0022620 con 0x7fa700108810 2026-03-10T14:17:46.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.558+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa70004ea50 con 0x7fa700108810 2026-03-10T14:17:46.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.559+0000 7fa6f77fe700 1 -- 192.168.123.103:0/1285548732 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa6f0022b50 con 0x7fa700108810 2026-03-10T14:17:46.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.559+0000 7fa6f77fe700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa6ec0778c0 0x7fa6ec079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:46.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.560+0000 7fa6fe59c700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa6ec0778c0 0x7fa6ec079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:46.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.560+0000 7fa6f77fe700 1 -- 192.168.123.103:0/1285548732 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fa6f009b4b0 con 0x7fa700108810 2026-03-10T14:17:46.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.560+0000 7fa6fe59c700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa6ec0778c0 0x7fa6ec079d70 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto 
rx=0x7fa700103950 tx=0x7fa6e800a680 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:46.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.562+0000 7fa6f77fe700 1 -- 192.168.123.103:0/1285548732 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa6f0063aa0 con 0x7fa700108810 2026-03-10T14:17:46.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.707+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": "json"} v 0) v1 -- 0x7fa70019d660 con 0x7fa700108810 2026-03-10T14:17:46.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.708+0000 7fa6f77fe700 1 -- 192.168.123.103:0/1285548732 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v40) v1 ==== 107+0+4207 (secure 0 0 0) 0x7fa6f00631f0 con 0x7fa700108810 2026-03-10T14:17:46.707 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:46.707 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":25,"btime":"2026-03-10T14:15:00:514482+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:43.661835+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24263},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263]},"id":1}]} 2026-03-10T14:17:46.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa6ec0778c0 msgr2=0x7fa6ec079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:46.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa6ec0778c0 0x7fa6ec079d70 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7fa700103950 tx=0x7fa6e800a680 comp rx=0 tx=0).stop 2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa700108810 msgr2=0x7fa700198920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa700108810 0x7fa700198920 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fa6f0004930 tx=0x7fa6f0004a10 comp rx=0 tx=0).stop 2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 shutdown_connections 2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa6ec0778c0 0x7fa6ec079d70 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa700102810 0x7fa7001983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 --2- 192.168.123.103:0/1285548732 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa700108810 0x7fa700198920 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.711+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 >> 192.168.123.103:0/1285548732 conn(0x7fa7000fe330 msgr2=0x7fa7000ffb00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.712+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 shutdown_connections 2026-03-10T14:17:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:46.712+0000 7fa704dfd700 1 -- 192.168.123.103:0/1285548732 wait complete. 
2026-03-10T14:17:46.711 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 25 2026-03-10T14:17:46.770 DEBUG:tasks.fs:max_mds reduced in epoch 25 2026-03-10T14:17:46.770 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 25 2026-03-10T14:17:46.770 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 26 2026-03-10T14:17:46.910 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.151+0000 7f3d3cb44700 1 -- 192.168.123.103:0/2362670501 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 msgr2=0x7f3d38068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.151+0000 7f3d3cb44700 1 --2- 192.168.123.103:0/2362670501 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 0x7f3d38068900 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f3d28009b00 tx=0x7f3d28009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.152+0000 7f3d3cb44700 1 -- 192.168.123.103:0/2362670501 shutdown_connections 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.152+0000 7f3d3cb44700 1 --2- 192.168.123.103:0/2362670501 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 0x7f3d38068900 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.152+0000 7f3d3cb44700 1 --2- 192.168.123.103:0/2362670501 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d381013a0 0x7f3d38101770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.152+0000 7f3d3cb44700 1 -- 192.168.123.103:0/2362670501 >> 192.168.123.103:0/2362670501 conn(0x7f3d380754a0 msgr2=0x7f3d380758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.152+0000 7f3d3cb44700 1 -- 192.168.123.103:0/2362670501 shutdown_connections 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.152+0000 7f3d3cb44700 1 -- 192.168.123.103:0/2362670501 wait complete. 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d3cb44700 1 Processor -- start 2026-03-10T14:17:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d3cb44700 1 -- start start 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d3cb44700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 0x7f3d38198330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d3cb44700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d381013a0 0x7f3d38198870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d3cb44700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d38198ec0 con 0x7f3d38068490 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d35d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d381013a0 0x7f3d38198870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d35d9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d381013a0 0x7f3d38198870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49064/0 (socket says 192.168.123.103:49064) 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d35d9b700 1 -- 192.168.123.103:0/1797759560 learned_addr learned my addr 192.168.123.103:0/1797759560 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.153+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d38199000 con 0x7f3d381013a0 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d3659c700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 0x7f3d38198330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d35d9b700 1 -- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 msgr2=0x7f3d38198330 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:47.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d35d9b700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 0x7f3d38198330 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.152 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d35d9b700 1 -- 192.168.123.103:0/1797759560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d280097e0 con 0x7f3d381013a0 2026-03-10T14:17:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d3659c700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 0x7f3d38198330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d35d9b700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d381013a0 0x7f3d38198870 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f3d28004930 tx=0x7f3d28004a10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d2f7fe700 1 -- 192.168.123.103:0/1797759560 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d2801d070 con 0x7f3d381013a0 2026-03-10T14:17:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d2f7fe700 1 -- 192.168.123.103:0/1797759560 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3d2800bc50 con 0x7f3d381013a0 2026-03-10T14:17:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d3819cdf0 con 0x7f3d381013a0 2026-03-10T14:17:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d2f7fe700 1 -- 192.168.123.103:0/1797759560 <== mon.1 v2:192.168.123.104:3300/0 3 ==== 
mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d2800f830 con 0x7f3d381013a0 2026-03-10T14:17:47.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.154+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d3819d2e0 con 0x7f3d381013a0 2026-03-10T14:17:47.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.155+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d3804ea50 con 0x7f3d381013a0 2026-03-10T14:17:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.156+0000 7f3d2f7fe700 1 -- 192.168.123.103:0/1797759560 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3d2800f990 con 0x7f3d381013a0 2026-03-10T14:17:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.156+0000 7f3d2f7fe700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3d240778c0 0x7f3d24079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.156+0000 7f3d3659c700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3d240778c0 0x7f3d24079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.157+0000 7f3d2f7fe700 1 -- 192.168.123.103:0/1797759560 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f3d2809bee0 con 0x7f3d381013a0 2026-03-10T14:17:47.155 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.157+0000 7f3d3659c700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3d240778c0 0x7f3d24079d70 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f3d20005fd0 tx=0x7f3d20005e20 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:47.157 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.158+0000 7f3d2f7fe700 1 -- 192.168.123.103:0/1797759560 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3d280645d0 con 0x7f3d381013a0 2026-03-10T14:17:47.302 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:47 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1630033306' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T14:17:47.302 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:47 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1285548732' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T14:17:47.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.302+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7f3d38066e40 con 0x7f3d381013a0 2026-03-10T14:17:47.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.305+0000 7f3d2f7fe700 1 -- 192.168.123.103:0/1797759560 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v40) v1 ==== 107+0+5058 (secure 0 0 0) 0x7f3d28063d20 con 0x7f3d381013a0 2026-03-10T14:17:47.304 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:47.304 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":26,"btime":"2026-03-10T14:15:04:771420+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":26}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:14:43.661835+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24263},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24263":{"gid":24263,"name":"cephfs.vm04.sslxuq","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.104:6825/291713758","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":291713758},{"type":"v1","addr":"192.168.123.104:6825","nonce":291713758}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24263,"qdb_cluster":[24263]},"id":1}]} 2026-03-10T14:17:47.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.307+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3d240778c0 msgr2=0x7f3d24079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:47.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.307+0000 7f3d3cb44700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3d240778c0 0x7f3d24079d70 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f3d20005fd0 tx=0x7f3d20005e20 comp rx=0 tx=0).stop 2026-03-10T14:17:47.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.308+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d381013a0 msgr2=0x7f3d38198870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:47.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.308+0000 7f3d3cb44700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d381013a0 0x7f3d38198870 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f3d28004930 tx=0x7f3d28004a10 comp rx=0 tx=0).stop 2026-03-10T14:17:47.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.308+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 shutdown_connections 2026-03-10T14:17:47.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.308+0000 7f3d3cb44700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f3d240778c0 0x7f3d24079d70 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:17:47.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.309+0000 7f3d3cb44700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3d38068490 0x7f3d38198330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.309+0000 7f3d3cb44700 1 --2- 192.168.123.103:0/1797759560 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f3d381013a0 0x7f3d38198870 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.309+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 >> 192.168.123.103:0/1797759560 conn(0x7f3d380754a0 msgr2=0x7f3d380fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:47.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.309+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 shutdown_connections 2026-03-10T14:17:47.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.309+0000 7f3d3cb44700 1 -- 192.168.123.103:0/1797759560 wait complete. 2026-03-10T14:17:47.308 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 26 2026-03-10T14:17:47.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:47 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1630033306' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T14:17:47.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:47 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/1285548732' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T14:17:47.360 DEBUG:tasks.fs:max_mds reduced in epoch 26 2026-03-10T14:17:47.360 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 26 2026-03-10T14:17:47.360 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 27 2026-03-10T14:17:47.514 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:47.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.764+0000 7f729bbce700 1 -- 192.168.123.103:0/2055026544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7294108810 msgr2=0x7f7294108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:47.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.764+0000 7f729bbce700 1 --2- 192.168.123.103:0/2055026544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7294108810 0x7f7294108be0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f7284009b50 tx=0x7f7284009e60 comp rx=0 tx=0).stop 2026-03-10T14:17:47.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.765+0000 7f729bbce700 1 -- 192.168.123.103:0/2055026544 shutdown_connections 2026-03-10T14:17:47.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.765+0000 7f729bbce700 1 --2- 192.168.123.103:0/2055026544 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7294102810 0x7f7294102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.765+0000 7f729bbce700 1 --2- 192.168.123.103:0/2055026544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7294108810 0x7f7294108be0 unknown 
:-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.765+0000 7f729bbce700 1 -- 192.168.123.103:0/2055026544 >> 192.168.123.103:0/2055026544 conn(0x7f72940fe330 msgr2=0x7f7294100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:47.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.765+0000 7f729bbce700 1 -- 192.168.123.103:0/2055026544 shutdown_connections 2026-03-10T14:17:47.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.765+0000 7f729bbce700 1 -- 192.168.123.103:0/2055026544 wait complete. 2026-03-10T14:17:47.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f729bbce700 1 Processor -- start 2026-03-10T14:17:47.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f729bbce700 1 -- start start 2026-03-10T14:17:47.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f729bbce700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7294102810 0x7f729419e510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:47.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f729bbce700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f729419ea50 0x7f7294198630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:47.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f729bbce700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f729419f010 con 0x7f729419ea50 2026-03-10T14:17:47.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f729bbce700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7294198b70 con 0x7f7294102810 2026-03-10T14:17:47.765 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f7299169700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f729419ea50 0x7f7294198630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:47.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f7299169700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f729419ea50 0x7f7294198630 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60924/0 (socket says 192.168.123.103:60924) 2026-03-10T14:17:47.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.766+0000 7f7299169700 1 -- 192.168.123.103:0/3407642276 learned_addr learned my addr 192.168.123.103:0/3407642276 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:47.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f7299169700 1 -- 192.168.123.103:0/3407642276 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7294102810 msgr2=0x7f729419e510 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:47.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f729996a700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7294102810 0x7f729419e510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:47.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f7299169700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7294102810 0x7f729419e510 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.766 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f7299169700 1 -- 192.168.123.103:0/3407642276 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f72840097e0 con 0x7f729419ea50 2026-03-10T14:17:47.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f729996a700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7294102810 0x7f729419e510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:47.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f7299169700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f729419ea50 0x7f7294198630 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f729000eb10 tx=0x7f729000eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:47.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f728affd700 1 -- 192.168.123.103:0/3407642276 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f729000cca0 con 0x7f729419ea50 2026-03-10T14:17:47.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f728affd700 1 -- 192.168.123.103:0/3407642276 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f729000ce00 con 0x7f729419ea50 2026-03-10T14:17:47.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.767+0000 7f728affd700 1 -- 192.168.123.103:0/3407642276 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72900189c0 con 0x7f729419ea50 2026-03-10T14:17:47.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.768+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7294198e50 con 0x7f729419ea50 2026-03-10T14:17:47.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.768+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72941993a0 con 0x7f729419ea50 2026-03-10T14:17:47.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.768+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f729410ad80 con 0x7f729419ea50 2026-03-10T14:17:47.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.771+0000 7f728affd700 1 -- 192.168.123.103:0/3407642276 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7290018b20 con 0x7f729419ea50 2026-03-10T14:17:47.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.772+0000 7f728affd700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7280077990 0x7f7280079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:47.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.772+0000 7f729996a700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7280077990 0x7f7280079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:47.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.772+0000 7f728affd700 1 -- 192.168.123.103:0/3407642276 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f7290014070 con 0x7f729419ea50 2026-03-10T14:17:47.771 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.772+0000 7f728affd700 1 -- 192.168.123.103:0/3407642276 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f72900ca9f0 con 0x7f729419ea50 2026-03-10T14:17:47.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.773+0000 7f729996a700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7280077990 0x7f7280079e40 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f7284009b20 tx=0x7f72840058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:47.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.914+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7f729404ea50 con 0x7f729419ea50 2026-03-10T14:17:47.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.916+0000 7f728affd700 1 -- 192.168.123.103:0/3407642276 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v40) v1 ==== 107+0+4261 (secure 0 0 0) 0x7f7290062b50 con 0x7f729419ea50 2026-03-10T14:17:47.915 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:47.915 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":27,"btime":"2026-03-10T14:15:07:526477+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34408,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":18},{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24},{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:07.526476+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":120,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[1],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:47.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.918+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7280077990 msgr2=0x7f7280079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:47.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.918+0000 7f729bbce700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7280077990 0x7f7280079e40 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f7284009b20 tx=0x7f72840058e0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.918+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f729419ea50 msgr2=0x7f7294198630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:47.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.918+0000 7f729bbce700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f729419ea50 0x7f7294198630 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f729000eb10 tx=0x7f729000eed0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.919+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 shutdown_connections 2026-03-10T14:17:47.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.919+0000 7f729bbce700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f7280077990 0x7f7280079e40 unknown :-1 s=CLOSED 
pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.919+0000 7f729bbce700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f7294102810 0x7f729419e510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.919+0000 7f729bbce700 1 --2- 192.168.123.103:0/3407642276 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f729419ea50 0x7f7294198630 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:47.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.919+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 >> 192.168.123.103:0/3407642276 conn(0x7f72940fe330 msgr2=0x7f72940fff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:47.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.919+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 shutdown_connections 2026-03-10T14:17:47.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:47.919+0000 7f729bbce700 1 -- 192.168.123.103:0/3407642276 wait complete. 
2026-03-10T14:17:47.919 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 27 2026-03-10T14:17:47.990 DEBUG:tasks.fs:max_mds reduced in epoch 27 2026-03-10T14:17:47.990 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 27 2026-03-10T14:17:47.990 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 28 2026-03-10T14:17:48.144 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:48.169 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:48 vm03.local ceph-mon[103098]: pgmap v348: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:48.169 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:48 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1797759560' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T14:17:48.169 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:48 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3407642276' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T14:17:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:48 vm04.local ceph-mon[92084]: pgmap v348: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:48 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1797759560' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T14:17:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:48 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/3407642276' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T14:17:48.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.409+0000 7fa10d208700 1 -- 192.168.123.103:0/1691386125 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 msgr2=0x7fa1081051e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:48.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.409+0000 7fa10d208700 1 --2- 192.168.123.103:0/1691386125 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 0x7fa1081051e0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fa0f8009b00 tx=0x7fa0f8009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:48.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.410+0000 7fa10d208700 1 -- 192.168.123.103:0/1691386125 shutdown_connections 2026-03-10T14:17:48.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.410+0000 7fa10d208700 1 --2- 192.168.123.103:0/1691386125 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 0x7fa1081051e0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:48.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.410+0000 7fa10d208700 1 --2- 192.168.123.103:0/1691386125 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1080686f0 0x7fa108068ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:48.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.410+0000 7fa10d208700 1 -- 192.168.123.103:0/1691386125 >> 192.168.123.103:0/1691386125 conn(0x7fa1080754a0 msgr2=0x7fa1080758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:48.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.410+0000 7fa10d208700 1 -- 192.168.123.103:0/1691386125 shutdown_connections 2026-03-10T14:17:48.409 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.410+0000 7fa10d208700 1 -- 192.168.123.103:0/1691386125 wait complete. 2026-03-10T14:17:48.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa10d208700 1 Processor -- start 2026-03-10T14:17:48.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa10d208700 1 -- start start 2026-03-10T14:17:48.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa10d208700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1080686f0 0x7fa108196150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:48.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa106d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1080686f0 0x7fa108196150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:48.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa106d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1080686f0 0x7fa108196150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60942/0 (socket says 192.168.123.103:60942) 2026-03-10T14:17:48.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa10d208700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 0x7fa108196690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa10d208700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa108196d70 con 0x7fa1080686f0 2026-03-10T14:17:48.411 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa10d208700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa10819ab00 con 0x7fa108069000 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.411+0000 7fa106d9d700 1 -- 192.168.123.103:0/2965203598 learned_addr learned my addr 192.168.123.103:0/2965203598 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.412+0000 7fa10659c700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 0x7fa108196690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.412+0000 7fa10659c700 1 -- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1080686f0 msgr2=0x7fa108196150 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.412+0000 7fa10659c700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1080686f0 0x7fa108196150 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.412+0000 7fa10659c700 1 -- 192.168.123.103:0/2965203598 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa0f0009710 con 0x7fa108069000 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.412+0000 7fa106d9d700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1080686f0 0x7fa108196150 unknown :-1 s=CLOSED pgs=0 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.412+0000 7fa10659c700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 0x7fa108196690 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fa0f8009b00 tx=0x7fa0f800ba00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.412+0000 7fa0fffff700 1 -- 192.168.123.103:0/2965203598 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa0f801d070 con 0x7fa108069000 2026-03-10T14:17:48.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.412+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa0f80097e0 con 0x7fa108069000 2026-03-10T14:17:48.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.413+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa10819b000 con 0x7fa108069000 2026-03-10T14:17:48.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.413+0000 7fa0fffff700 1 -- 192.168.123.103:0/2965203598 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa0f800f460 con 0x7fa108069000 2026-03-10T14:17:48.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.413+0000 7fa0fffff700 1 -- 192.168.123.103:0/2965203598 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa0f8005170 con 0x7fa108069000 2026-03-10T14:17:48.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.414+0000 7fa0fffff700 1 -- 192.168.123.103:0/2965203598 <== mon.1 
v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa0f800fa90 con 0x7fa108069000 2026-03-10T14:17:48.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.414+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa10804ea50 con 0x7fa108069000 2026-03-10T14:17:48.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.415+0000 7fa0fffff700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa0f4077870 0x7fa0f4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:48.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.415+0000 7fa0fffff700 1 -- 192.168.123.103:0/2965203598 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fa0f809ac20 con 0x7fa108069000 2026-03-10T14:17:48.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.415+0000 7fa106d9d700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa0f4077870 0x7fa0f4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:48.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.415+0000 7fa106d9d700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa0f4077870 0x7fa0f4079d20 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7fa0f0005950 tx=0x7fa0f0012040 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:48.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.418+0000 7fa0fffff700 1 -- 192.168.123.103:0/2965203598 <== mon.1 
v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa0f8063340 con 0x7fa108069000 2026-03-10T14:17:48.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.561+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7fa108066e40 con 0x7fa108069000 2026-03-10T14:17:48.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.564+0000 7fa0fffff700 1 -- 192.168.123.103:0/2965203598 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v40) v1 ==== 107+0+4272 (secure 0 0 0) 0x7fa0f8062a90 con 0x7fa108069000 2026-03-10T14:17:48.563 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:48.563 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":28,"btime":"2026-03-10T14:15:07:532106+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts 
on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26}],"filesystems":[{"mdsmap":{"epoch":28,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:07.532102+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":120,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":34408},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34408":{"gid":34408,"name":"cephfs.vm04.puavjd","rank":0,"incarnation":28,"state":"up:replay","state_seq":1,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:48.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.567+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa0f4077870 msgr2=0x7fa0f4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:48.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.567+0000 7fa10d208700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa0f4077870 0x7fa0f4079d20 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7fa0f0005950 tx=0x7fa0f0012040 comp rx=0 tx=0).stop 2026-03-10T14:17:48.566 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 msgr2=0x7fa108196690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:48.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 0x7fa108196690 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fa0f8009b00 tx=0x7fa0f800ba00 comp rx=0 tx=0).stop 2026-03-10T14:17:48.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 shutdown_connections 2026-03-10T14:17:48.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fa0f4077870 0x7fa0f4079d20 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:48.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1080686f0 0x7fa108196150 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:48.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 --2- 192.168.123.103:0/2965203598 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fa108069000 0x7fa108196690 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:48.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 >> 192.168.123.103:0/2965203598 conn(0x7fa1080754a0 msgr2=0x7fa1080fea20 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T14:17:48.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 shutdown_connections 2026-03-10T14:17:48.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:48.568+0000 7fa10d208700 1 -- 192.168.123.103:0/2965203598 wait complete. 2026-03-10T14:17:48.567 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 28 2026-03-10T14:17:48.635 DEBUG:tasks.fs:max_mds reduced in epoch 28 2026-03-10T14:17:48.635 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 28 2026-03-10T14:17:48.635 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 29 2026-03-10T14:17:48.785 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:49.066 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:49 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/2965203598' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T14:17:49.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.067+0000 7f551221f700 1 -- 192.168.123.103:0/1634323514 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c069000 msgr2=0x7f550c1051e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:49.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.067+0000 7f551221f700 1 --2- 192.168.123.103:0/1634323514 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c069000 0x7f550c1051e0 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f54f4009b00 tx=0x7f54f4009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:49.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.068+0000 7f551221f700 1 -- 192.168.123.103:0/1634323514 shutdown_connections 2026-03-10T14:17:49.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.068+0000 7f551221f700 1 --2- 192.168.123.103:0/1634323514 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c069000 0x7f550c1051e0 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.068+0000 7f551221f700 1 --2- 192.168.123.103:0/1634323514 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f550c0686f0 0x7f550c068ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.068+0000 7f551221f700 1 -- 192.168.123.103:0/1634323514 >> 192.168.123.103:0/1634323514 conn(0x7f550c0754a0 msgr2=0x7f550c0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:49.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.069+0000 7f551221f700 1 -- 192.168.123.103:0/1634323514 shutdown_connections 
2026-03-10T14:17:49.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.069+0000 7f551221f700 1 -- 192.168.123.103:0/1634323514 wait complete. 2026-03-10T14:17:49.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.069+0000 7f551221f700 1 Processor -- start 2026-03-10T14:17:49.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f551221f700 1 -- start start 2026-03-10T14:17:49.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f551221f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c0686f0 0x7f550c198340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f551221f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f550c069000 0x7f550c198880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f551221f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f550c198f60 con 0x7f550c0686f0 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f551221f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f550c19ccf0 con 0x7f550c069000 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f5503fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f550c069000 0x7f550c198880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f550b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c0686f0 
0x7f550c198340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f5503fff700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f550c069000 0x7f550c198880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49090/0 (socket says 192.168.123.103:49090) 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.070+0000 7f5503fff700 1 -- 192.168.123.103:0/566627079 learned_addr learned my addr 192.168.123.103:0/566627079 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.071+0000 7f5503fff700 1 -- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c0686f0 msgr2=0x7f550c198340 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.071+0000 7f5503fff700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c0686f0 0x7f550c198340 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.071+0000 7f5503fff700 1 -- 192.168.123.103:0/566627079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f54f40097e0 con 0x7f550c069000 2026-03-10T14:17:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.071+0000 7f550b7fe700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c0686f0 0x7f550c198340 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:17:49.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.071+0000 7f5503fff700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f550c069000 0x7f550c198880 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f54f400b5c0 tx=0x7f54f40048c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:49.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.071+0000 7f55097fa700 1 -- 192.168.123.103:0/566627079 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54f401d070 con 0x7f550c069000 2026-03-10T14:17:49.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.071+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f550c19cf70 con 0x7f550c069000 2026-03-10T14:17:49.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.071+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f550c19d460 con 0x7f550c069000 2026-03-10T14:17:49.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.072+0000 7f55097fa700 1 -- 192.168.123.103:0/566627079 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f54f4022470 con 0x7f550c069000 2026-03-10T14:17:49.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.072+0000 7f55097fa700 1 -- 192.168.123.103:0/566627079 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f54f400f670 con 0x7f550c069000 2026-03-10T14:17:49.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.073+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f550c04ea50 con 0x7f550c069000 2026-03-10T14:17:49.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.074+0000 7f55097fa700 1 -- 192.168.123.103:0/566627079 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f54f40225e0 con 0x7f550c069000 2026-03-10T14:17:49.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.074+0000 7f55097fa700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54ec0778c0 0x7f54ec079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:49.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.074+0000 7f55097fa700 1 -- 192.168.123.103:0/566627079 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f54f409b5f0 con 0x7f550c069000 2026-03-10T14:17:49.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.074+0000 7f550b7fe700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54ec0778c0 0x7f54ec079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:49.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.075+0000 7f550b7fe700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54ec0778c0 0x7f54ec079d70 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f550c076740 tx=0x7f54fc00b410 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:49.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.076+0000 7f55097fa700 1 -- 192.168.123.103:0/566627079 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f54f4063d90 con 0x7f550c069000 2026-03-10T14:17:49.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.220+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7f550c066e40 con 0x7f550c069000 2026-03-10T14:17:49.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.221+0000 7f55097fa700 1 -- 192.168.123.103:0/566627079 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v40) v1 ==== 107+0+5127 (secure 0 0 0) 0x7f54f40634e0 con 0x7f550c069000 2026-03-10T14:17:49.220 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:49.220 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":29,"btime":"2026-03-10T14:15:10:753474+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":29}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:10.616188+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":120,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":34408},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34408":{"gid":34408,"name":"cephfs.vm04.puavjd","rank":0,"incarnation":28,"state":"up:reconnect","state_seq":16,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:49.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.223+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54ec0778c0 msgr2=0x7f54ec079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:49.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.223+0000 7f551221f700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54ec0778c0 0x7f54ec079d70 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f550c076740 tx=0x7f54fc00b410 comp rx=0 tx=0).stop 2026-03-10T14:17:49.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f550c069000 msgr2=0x7f550c198880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:49.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f550c069000 0x7f550c198880 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f54f400b5c0 tx=0x7f54f40048c0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 shutdown_connections 2026-03-10T14:17:49.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f54ec0778c0 0x7f54ec079d70 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:17:49.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f550c0686f0 0x7f550c198340 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 --2- 192.168.123.103:0/566627079 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f550c069000 0x7f550c198880 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 >> 192.168.123.103:0/566627079 conn(0x7f550c0754a0 msgr2=0x7f550c1021b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:49.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 shutdown_connections 2026-03-10T14:17:49.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.224+0000 7f551221f700 1 -- 192.168.123.103:0/566627079 wait complete. 2026-03-10T14:17:49.223 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 29 2026-03-10T14:17:49.294 DEBUG:tasks.fs:max_mds reduced in epoch 29 2026-03-10T14:17:49.294 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 29 2026-03-10T14:17:49.294 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 30 2026-03-10T14:17:49.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:49 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2965203598' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T14:17:49.455 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:49.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.710+0000 7f0f7fa06700 1 -- 192.168.123.103:0/2011010635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f780730f0 msgr2=0x7f0f780734c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:49.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.710+0000 7f0f7fa06700 1 --2- 192.168.123.103:0/2011010635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f780730f0 0x7f0f780734c0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f0f68009b50 tx=0x7f0f68009e60 comp rx=0 tx=0).stop 2026-03-10T14:17:49.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.711+0000 7f0f7fa06700 1 -- 192.168.123.103:0/2011010635 shutdown_connections 2026-03-10T14:17:49.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.711+0000 7f0f7fa06700 1 --2- 192.168.123.103:0/2011010635 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f78073a00 0x7f0f78111040 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.711+0000 7f0f7fa06700 1 --2- 192.168.123.103:0/2011010635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f780730f0 0x7f0f780734c0 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.711+0000 7f0f7fa06700 1 -- 192.168.123.103:0/2011010635 >> 192.168.123.103:0/2011010635 conn(0x7f0f780fc090 msgr2=0x7f0f780fe4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:49.710 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.711+0000 7f0f7fa06700 1 -- 192.168.123.103:0/2011010635 shutdown_connections 2026-03-10T14:17:49.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.712+0000 7f0f7fa06700 1 -- 192.168.123.103:0/2011010635 wait complete. 2026-03-10T14:17:49.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.712+0000 7f0f7fa06700 1 Processor -- start 2026-03-10T14:17:49.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.712+0000 7f0f7fa06700 1 -- start start 2026-03-10T14:17:49.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.712+0000 7f0f7fa06700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f78073a00 0x7f0f781a27a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:49.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7fa06700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f781a2ce0 0x7f0f7819c820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:49.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7fa06700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f781a32a0 con 0x7f0f78073a00 2026-03-10T14:17:49.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7fa06700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f7819cd60 con 0x7f0f781a2ce0 2026-03-10T14:17:49.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7cfa1700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f781a2ce0 0x7f0f7819c820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:49.711 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7cfa1700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f781a2ce0 0x7f0f7819c820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49110/0 (socket says 192.168.123.103:49110) 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7cfa1700 1 -- 192.168.123.103:0/613327173 learned_addr learned my addr 192.168.123.103:0/613327173 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7cfa1700 1 -- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f78073a00 msgr2=0x7f0f781a27a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7d7a2700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f78073a00 0x7f0f781a27a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7cfa1700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f78073a00 0x7f0f781a27a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7cfa1700 1 -- 192.168.123.103:0/613327173 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0f680097e0 con 0x7f0f781a2ce0 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.713+0000 7f0f7d7a2700 1 
--2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f78073a00 0x7f0f781a27a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.714+0000 7f0f7cfa1700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f781a2ce0 0x7f0f7819c820 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f0f7400eb10 tx=0x7f0f7400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.714+0000 7f0f6e7fc700 1 -- 192.168.123.103:0/613327173 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0f7400cca0 con 0x7f0f781a2ce0 2026-03-10T14:17:49.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.714+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0f7819d040 con 0x7f0f781a2ce0 2026-03-10T14:17:49.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.714+0000 7f0f6e7fc700 1 -- 192.168.123.103:0/613327173 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0f7400ce00 con 0x7f0f781a2ce0 2026-03-10T14:17:49.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.714+0000 7f0f6e7fc700 1 -- 192.168.123.103:0/613327173 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0f74018950 con 0x7f0f781a2ce0 2026-03-10T14:17:49.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.714+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0f7819d590 con 0x7f0f781a2ce0 2026-03-10T14:17:49.714 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.715+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0f7810e7c0 con 0x7f0f781a2ce0 2026-03-10T14:17:49.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.715+0000 7f0f6e7fc700 1 -- 192.168.123.103:0/613327173 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0f74039b10 con 0x7f0f781a2ce0 2026-03-10T14:17:49.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.716+0000 7f0f6e7fc700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0f640778c0 0x7f0f64079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:49.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.716+0000 7f0f6e7fc700 1 -- 192.168.123.103:0/613327173 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f0f74014070 con 0x7f0f781a2ce0 2026-03-10T14:17:49.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.716+0000 7f0f7d7a2700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0f640778c0 0x7f0f64079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:49.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.717+0000 7f0f7d7a2700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0f640778c0 0x7f0f64079d70 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f0f68005950 tx=0x7f0f680058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:49.717 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.718+0000 7f0f6e7fc700 1 -- 192.168.123.103:0/613327173 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0f74062ee0 con 0x7f0f781a2ce0 2026-03-10T14:17:49.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.865+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 30, "format": "json"} v 0) v1 -- 0x7f0f7804ea50 con 0x7f0f781a2ce0 2026-03-10T14:17:49.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.865+0000 7f0f6e7fc700 1 -- 192.168.123.103:0/613327173 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 30, "format": "json"}]=0 dumped fsmap epoch 30 v40) v1 ==== 107+0+5124 (secure 0 0 0) 0x7f0f74062630 con 0x7f0f781a2ce0 2026-03-10T14:17:49.864 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:49.864 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":30,"btime":"2026-03-10T14:15:11:758034+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":29}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:10.762081+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":120,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":34408},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34408":{"gid":34408,"name":"cephfs.vm04.puavjd","rank":0,"incarnation":28,"state":"up:rejoin","state_seq":17,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:49.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0f640778c0 msgr2=0x7f0f64079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0f640778c0 0x7f0f64079d70 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f0f68005950 tx=0x7f0f680058e0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f781a2ce0 msgr2=0x7f0f7819c820 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f781a2ce0 0x7f0f7819c820 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f0f7400eb10 tx=0x7f0f7400eed0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 shutdown_connections 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0f640778c0 0x7f0f64079d70 secure :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f0f68005950 tx=0x7f0f680058e0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0f78073a00 0x7f0f781a27a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 --2- 192.168.123.103:0/613327173 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0f781a2ce0 0x7f0f7819c820 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 >> 192.168.123.103:0/613327173 conn(0x7f0f780fc090 msgr2=0x7f0f78102b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:49.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 shutdown_connections 2026-03-10T14:17:49.868 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:49.868+0000 7f0f7fa06700 1 -- 192.168.123.103:0/613327173 wait complete. 2026-03-10T14:17:49.868 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 30 2026-03-10T14:17:49.933 DEBUG:tasks.fs:max_mds reduced in epoch 30 2026-03-10T14:17:49.933 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 30 2026-03-10T14:17:49.934 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 31 2026-03-10T14:17:50.096 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:50.122 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:50 vm03.local ceph-mon[103098]: pgmap v349: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:50.122 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:50 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/566627079' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T14:17:50.122 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:50 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/613327173' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-10T14:17:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:50 vm04.local ceph-mon[92084]: pgmap v349: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:50 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/566627079' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T14:17:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:50 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/613327173' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-10T14:17:50.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.367+0000 7f49a4fbf700 1 -- 192.168.123.103:0/4136553597 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 msgr2=0x7f49a01051e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:50.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.367+0000 7f49a4fbf700 1 --2- 192.168.123.103:0/4136553597 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 0x7f49a01051e0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f4990009b00 tx=0x7f4990009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:50.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.368+0000 7f49a4fbf700 1 -- 192.168.123.103:0/4136553597 shutdown_connections 2026-03-10T14:17:50.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.368+0000 7f49a4fbf700 1 --2- 192.168.123.103:0/4136553597 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 0x7f49a01051e0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:50.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.368+0000 7f49a4fbf700 1 --2- 192.168.123.103:0/4136553597 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f49a00686f0 0x7f49a0068ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:50.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.368+0000 7f49a4fbf700 1 -- 192.168.123.103:0/4136553597 >> 192.168.123.103:0/4136553597 conn(0x7f49a00754a0 
msgr2=0x7f49a00758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:50.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.369+0000 7f49a4fbf700 1 -- 192.168.123.103:0/4136553597 shutdown_connections 2026-03-10T14:17:50.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.369+0000 7f49a4fbf700 1 -- 192.168.123.103:0/4136553597 wait complete. 2026-03-10T14:17:50.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f49a4fbf700 1 Processor -- start 2026-03-10T14:17:50.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f49a4fbf700 1 -- start start 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f49a4fbf700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f49a00686f0 0x7f49a0198450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f49a4fbf700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 0x7f49a0198990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f49a4fbf700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49a0199070 con 0x7f49a00686f0 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f49a4fbf700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49a019ce00 con 0x7f49a0069000 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f499dd9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 0x7f49a0198990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f499dd9b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 0x7f49a0198990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49134/0 (socket says 192.168.123.103:49134) 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.370+0000 7f499dd9b700 1 -- 192.168.123.103:0/2992230822 learned_addr learned my addr 192.168.123.103:0/2992230822 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f499e59c700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f49a00686f0 0x7f49a0198450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f499dd9b700 1 -- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f49a00686f0 msgr2=0x7f49a0198450 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f499dd9b700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f49a00686f0 0x7f49a0198450 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:50.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f499dd9b700 1 -- 192.168.123.103:0/2992230822 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4988009710 con 0x7f49a0069000 
2026-03-10T14:17:50.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f499e59c700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f49a00686f0 0x7f49a0198450 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:50.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f499dd9b700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 0x7f49a0198990 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f499000ba30 tx=0x7f499000bb10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:50.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f49977fe700 1 -- 192.168.123.103:0/2992230822 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f499001d070 con 0x7f49a0069000 2026-03-10T14:17:50.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f49900097e0 con 0x7f49a0069000 2026-03-10T14:17:50.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f49977fe700 1 -- 192.168.123.103:0/2992230822 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f499000f460 con 0x7f49a0069000 2026-03-10T14:17:50.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.371+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f49a019d3e0 con 0x7f49a0069000 2026-03-10T14:17:50.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.373+0000 7f49977fe700 1 -- 192.168.123.103:0/2992230822 <== mon.1 
v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f499000fc20 con 0x7f49a0069000 2026-03-10T14:17:50.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.373+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f49a0104960 con 0x7f49a0069000 2026-03-10T14:17:50.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.376+0000 7f49977fe700 1 -- 192.168.123.103:0/2992230822 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f499000f5d0 con 0x7f49a0069000 2026-03-10T14:17:50.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.376+0000 7f49977fe700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f498c0778c0 0x7f498c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:50.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.376+0000 7f49977fe700 1 -- 192.168.123.103:0/2992230822 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f499009b160 con 0x7f49a0069000 2026-03-10T14:17:50.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.377+0000 7f499e59c700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f498c0778c0 0x7f498c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:50.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.378+0000 7f499e59c700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f498c0778c0 0x7f498c079d70 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto 
rx=0x7f4988009fb0 tx=0x7f4988009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:50.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.378+0000 7f49977fe700 1 -- 192.168.123.103:0/2992230822 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f49900637d0 con 0x7f49a0069000 2026-03-10T14:17:50.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.519+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 31, "format": "json"} v 0) v1 -- 0x7f49a004ea50 con 0x7f49a0069000 2026-03-10T14:17:50.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.520+0000 7f49977fe700 1 -- 192.168.123.103:0/2992230822 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 31, "format": "json"}]=0 dumped fsmap epoch 31 v40) v1 ==== 107+0+5133 (secure 0 0 0) 0x7f4990062f20 con 0x7f49a0069000 2026-03-10T14:17:50.519 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:50.519 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":31,"btime":"2026-03-10T14:15:12:761135+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":29}],"filesystems":[{"mdsmap":{"epoch":31,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:12.761134+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":120,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":34408},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34408":{"gid":34408,"name":"cephfs.vm04.puavjd","rank":0,"incarnation":28,"state":"up:active","state_seq":18,"addr":"192.168.123.104:6827/3074647252","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3074647252},{"type":"v1","addr":"192.168.123.104:6827","nonce":3074647252}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34408,"qdb_cluster":[34408]},"id":1}]} 2026-03-10T14:17:50.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.522+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f498c0778c0 msgr2=0x7f498c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:50.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.522+0000 7f49a4fbf700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f498c0778c0 0x7f498c079d70 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f4988009fb0 tx=0x7f4988009450 comp rx=0 tx=0).stop 2026-03-10T14:17:50.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.523+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 msgr2=0x7f49a0198990 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:50.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.523+0000 7f49a4fbf700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 0x7f49a0198990 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f499000ba30 tx=0x7f499000bb10 comp rx=0 tx=0).stop 2026-03-10T14:17:50.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.523+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 shutdown_connections 2026-03-10T14:17:50.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.523+0000 7f49a4fbf700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f498c0778c0 0x7f498c079d70 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:50.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.523+0000 7f49a4fbf700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f49a00686f0 0x7f49a0198450 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:50.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.523+0000 7f49a4fbf700 1 --2- 192.168.123.103:0/2992230822 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f49a0069000 0x7f49a0198990 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:50.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.523+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 >> 192.168.123.103:0/2992230822 conn(0x7f49a00754a0 msgr2=0x7f49a0101730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:50.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.524+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 shutdown_connections 2026-03-10T14:17:50.522 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:50.524+0000 7f49a4fbf700 1 -- 192.168.123.103:0/2992230822 wait complete. 2026-03-10T14:17:50.523 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 31 2026-03-10T14:17:50.627 DEBUG:tasks.fs:max_mds reduced in epoch 31 2026-03-10T14:17:50.628 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 31 2026-03-10T14:17:50.628 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 32 2026-03-10T14:17:50.788 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:51.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.070+0000 7fca7a5d3700 1 -- 192.168.123.103:0/734945399 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca7410e9e0 msgr2=0x7fca7410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:51.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.070+0000 7fca7a5d3700 1 --2- 192.168.123.103:0/734945399 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca7410e9e0 0x7fca7410edb0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fca6c00b3a0 tx=0x7fca6c00b6b0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.070+0000 7fca7a5d3700 1 -- 192.168.123.103:0/734945399 shutdown_connections 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.070+0000 7fca7a5d3700 1 --2- 192.168.123.103:0/734945399 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca74071b60 0x7fca74071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.070+0000 7fca7a5d3700 1 --2- 192.168.123.103:0/734945399 >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca7410e9e0 0x7fca7410edb0 secure :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fca6c00b3a0 tx=0x7fca6c00b6b0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.070+0000 7fca7a5d3700 1 -- 192.168.123.103:0/734945399 >> 192.168.123.103:0/734945399 conn(0x7fca7406c6c0 msgr2=0x7fca7406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.071+0000 7fca7a5d3700 1 -- 192.168.123.103:0/734945399 shutdown_connections 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.071+0000 7fca7a5d3700 1 -- 192.168.123.103:0/734945399 wait complete. 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.071+0000 7fca7a5d3700 1 Processor -- start 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.072+0000 7fca7a5d3700 1 -- start start 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.072+0000 7fca7a5d3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca74071b60 0x7fca74115730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.072+0000 7fca7a5d3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca74119770 0x7fca74115c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.072+0000 7fca7a5d3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca74116240 con 0x7fca74119770 2026-03-10T14:17:51.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.072+0000 7fca7a5d3700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca741163b0 con 0x7fca74071b60 2026-03-10T14:17:51.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.073+0000 7fca795d1700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca74071b60 0x7fca74115730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:51.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.073+0000 7fca795d1700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca74071b60 0x7fca74115730 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:49160/0 (socket says 192.168.123.103:49160) 2026-03-10T14:17:51.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.073+0000 7fca795d1700 1 -- 192.168.123.103:0/2639527215 learned_addr learned my addr 192.168.123.103:0/2639527215 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:51.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.073+0000 7fca78dd0700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca74119770 0x7fca74115c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:51.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.073+0000 7fca795d1700 1 -- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca74119770 msgr2=0x7fca74115c70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:51.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.073+0000 7fca795d1700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fca74119770 0x7fca74115c70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.073+0000 7fca795d1700 1 -- 192.168.123.103:0/2639527215 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca6c00b050 con 0x7fca74071b60 2026-03-10T14:17:51.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.073+0000 7fca795d1700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca74071b60 0x7fca74115730 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fca6c015040 tx=0x7fca6c003ce0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:51.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.074+0000 7fca6a7fc700 1 -- 192.168.123.103:0/2639527215 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca6c00e040 con 0x7fca74071b60 2026-03-10T14:17:51.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.074+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca741b7b60 con 0x7fca74071b60 2026-03-10T14:17:51.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.074+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca741b7ec0 con 0x7fca74071b60 2026-03-10T14:17:51.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.074+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fca7410bc30 con 0x7fca74071b60 2026-03-10T14:17:51.073 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.075+0000 7fca6a7fc700 1 -- 192.168.123.103:0/2639527215 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fca6c007cb0 con 0x7fca74071b60 2026-03-10T14:17:51.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.075+0000 7fca6a7fc700 1 -- 192.168.123.103:0/2639527215 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca6c01e9b0 con 0x7fca74071b60 2026-03-10T14:17:51.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.076+0000 7fca6a7fc700 1 -- 192.168.123.103:0/2639527215 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fca6c004460 con 0x7fca74071b60 2026-03-10T14:17:51.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.076+0000 7fca6a7fc700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca60077910 0x7fca60079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:51.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.076+0000 7fca6a7fc700 1 -- 192.168.123.103:0/2639527215 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fca6c02c030 con 0x7fca74071b60 2026-03-10T14:17:51.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.077+0000 7fca78dd0700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca60077910 0x7fca60079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:51.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.077+0000 7fca78dd0700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] 
conn(0x7fca60077910 0x7fca60079dc0 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fca64009de0 tx=0x7fca64009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:51.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.080+0000 7fca6a7fc700 1 -- 192.168.123.103:0/2639527215 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fca6c064210 con 0x7fca74071b60 2026-03-10T14:17:51.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.222+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 32, "format": "json"} v 0) v1 -- 0x7fca74116d70 con 0x7fca74071b60 2026-03-10T14:17:51.220 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:51 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2992230822' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-10T14:17:51.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.223+0000 7fca6a7fc700 1 -- 192.168.123.103:0/2639527215 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 32, "format": "json"}]=0 dumped fsmap epoch 32 v40) v1 ==== 107+0+4329 (secure 0 0 0) 0x7fca6c063960 con 0x7fca74071b60 2026-03-10T14:17:51.222 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:51.223 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":32,"btime":"2026-03-10T14:15:13:809495+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":29}],"filesystems":[{"mdsmap":{"epoch":32,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:13.809495+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[1],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:51.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca60077910 msgr2=0x7fca60079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:51.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca60077910 0x7fca60079dc0 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fca64009de0 tx=0x7fca64009450 comp rx=0 tx=0).stop 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca74071b60 msgr2=0x7fca74115730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca74071b60 0x7fca74115730 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fca6c015040 tx=0x7fca6c003ce0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 shutdown_connections 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fca60077910 0x7fca60079dc0 unknown :-1 s=CLOSED 
pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fca74071b60 0x7fca74115730 secure :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fca6c015040 tx=0x7fca6c003ce0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 --2- 192.168.123.103:0/2639527215 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fca74119770 0x7fca74115c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 >> 192.168.123.103:0/2639527215 conn(0x7fca7406c6c0 msgr2=0x7fca7406fb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 shutdown_connections 2026-03-10T14:17:51.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.227+0000 7fca7a5d3700 1 -- 192.168.123.103:0/2639527215 wait complete. 2026-03-10T14:17:51.227 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 32 2026-03-10T14:17:51.290 DEBUG:tasks.fs:max_mds reduced in epoch 32 2026-03-10T14:17:51.290 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 32 2026-03-10T14:17:51.290 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 33 2026-03-10T14:17:51.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:51 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2992230822' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-10T14:17:51.451 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:51.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.718+0000 7f27a3d4f700 1 -- 192.168.123.103:0/811036725 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c102780 msgr2=0x7f279c102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:51.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.718+0000 7f27a3d4f700 1 --2- 192.168.123.103:0/811036725 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c102780 0x7f279c102bf0 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7f2798009b50 tx=0x7f2798009e60 comp rx=0 tx=0).stop 2026-03-10T14:17:51.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.720+0000 7f27a3d4f700 1 -- 192.168.123.103:0/811036725 shutdown_connections 2026-03-10T14:17:51.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.720+0000 7f27a3d4f700 1 --2- 192.168.123.103:0/811036725 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c102780 0x7f279c102bf0 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.720+0000 7f27a3d4f700 1 --2- 192.168.123.103:0/811036725 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f279c108780 0x7f279c108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.720+0000 7f27a3d4f700 1 -- 192.168.123.103:0/811036725 >> 192.168.123.103:0/811036725 conn(0x7f279c0fe280 msgr2=0x7f279c100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:51.718 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.720+0000 7f27a3d4f700 1 -- 192.168.123.103:0/811036725 shutdown_connections 2026-03-10T14:17:51.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.720+0000 7f27a3d4f700 1 -- 192.168.123.103:0/811036725 wait complete. 2026-03-10T14:17:51.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a3d4f700 1 Processor -- start 2026-03-10T14:17:51.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a3d4f700 1 -- start start 2026-03-10T14:17:51.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a3d4f700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f279c102780 0x7f279c1983b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:51.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a3d4f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c108780 0x7f279c1988f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:51.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a3d4f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f279c198fd0 con 0x7f279c108780 2026-03-10T14:17:51.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a3d4f700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f279c19cd10 con 0x7f279c102780 2026-03-10T14:17:51.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a12ea700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c108780 0x7f279c1988f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:51.720 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a12ea700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c108780 0x7f279c1988f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:32802/0 (socket says 192.168.123.103:32802) 2026-03-10T14:17:51.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.721+0000 7f27a12ea700 1 -- 192.168.123.103:0/702560127 learned_addr learned my addr 192.168.123.103:0/702560127 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:51.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f27a1aeb700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f279c102780 0x7f279c1983b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:51.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f27a12ea700 1 -- 192.168.123.103:0/702560127 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f279c102780 msgr2=0x7f279c1983b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:51.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f27a12ea700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f279c102780 0x7f279c1983b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f27a12ea700 1 -- 192.168.123.103:0/702560127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f27980097e0 con 0x7f279c108780 2026-03-10T14:17:51.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 
7f27a1aeb700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f279c102780 0x7f279c1983b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:51.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f27a12ea700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c108780 0x7f279c1988f0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f2798005950 tx=0x7f27980049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:51.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f278effd700 1 -- 192.168.123.103:0/702560127 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f279801d070 con 0x7f279c108780 2026-03-10T14:17:51.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f278effd700 1 -- 192.168.123.103:0/702560127 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2798004b80 con 0x7f279c108780 2026-03-10T14:17:51.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f278effd700 1 -- 192.168.123.103:0/702560127 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f279800f670 con 0x7f279c108780 2026-03-10T14:17:51.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.722+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f279c19cf90 con 0x7f279c108780 2026-03-10T14:17:51.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.723+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f279c19d480 con 0x7f279c108780 
2026-03-10T14:17:51.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.724+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f279c04ea50 con 0x7f279c108780 2026-03-10T14:17:51.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.725+0000 7f278effd700 1 -- 192.168.123.103:0/702560127 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f279800bc50 con 0x7f279c108780 2026-03-10T14:17:51.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.725+0000 7f278effd700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f27880779e0 0x7f2788079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:51.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.725+0000 7f278effd700 1 -- 192.168.123.103:0/702560127 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f279809be60 con 0x7f279c108780 2026-03-10T14:17:51.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.727+0000 7f278effd700 1 -- 192.168.123.103:0/702560127 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2798064550 con 0x7f279c108780 2026-03-10T14:17:51.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.727+0000 7f27a1aeb700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f27880779e0 0x7f2788079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:51.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.728+0000 7f27a1aeb700 1 --2- 
192.168.123.103:0/702560127 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f27880779e0 0x7f2788079e90 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f279c1038c0 tx=0x7f2790009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:51.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.873+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 33, "format": "json"} v 0) v1 -- 0x7f279c066e40 con 0x7f279c108780 2026-03-10T14:17:51.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.874+0000 7f278effd700 1 -- 192.168.123.103:0/702560127 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 33, "format": "json"}]=0 dumped fsmap epoch 33 v40) v1 ==== 107+0+4408 (secure 0 0 0) 0x7f2798063ca0 con 0x7f279c108780 2026-03-10T14:17:51.873 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:51.873 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":33,"btime":"2026-03-10T14:15:13:816839+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":29}],"filesystems":[{"mdsmap":{"epoch":33,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:13.816835+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34474},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34474":{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":0,"incarnation":33,"state":"up:replay","state_seq":1,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:51.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f27880779e0 msgr2=0x7f2788079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:51.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f27880779e0 0x7f2788079e90 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f279c1038c0 tx=0x7f2790009450 comp rx=0 tx=0).stop 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c108780 msgr2=0x7f279c1988f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c108780 0x7f279c1988f0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f2798005950 tx=0x7f27980049e0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 shutdown_connections 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f27880779e0 0x7f2788079e90 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f279c102780 0x7f279c1983b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 --2- 192.168.123.103:0/702560127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f279c108780 0x7f279c1988f0 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.877+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 >> 192.168.123.103:0/702560127 conn(0x7f279c0fe280 msgr2=0x7f279c0ffbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.878+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 shutdown_connections 2026-03-10T14:17:51.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:51.878+0000 7f27a3d4f700 1 -- 192.168.123.103:0/702560127 wait complete. 
2026-03-10T14:17:51.877 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 33 2026-03-10T14:17:51.947 DEBUG:tasks.fs:max_mds reduced in epoch 33 2026-03-10T14:17:51.948 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 33 2026-03-10T14:17:51.948 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 34 2026-03-10T14:17:52.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:52 vm03.local ceph-mon[103098]: pgmap v350: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:52.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:52 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2639527215' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-10T14:17:52.028 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:52 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/702560127' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-10T14:17:52.102 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:52 vm04.local ceph-mon[92084]: pgmap v350: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:52 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/2639527215' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-10T14:17:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:52 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/702560127' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-10T14:17:52.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.345+0000 7fe30dcb3700 1 -- 192.168.123.103:0/3825853324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 msgr2=0x7fe308102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:52.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.345+0000 7fe30dcb3700 1 --2- 192.168.123.103:0/3825853324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 0x7fe308102bf0 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7fe2f8009b50 tx=0x7fe2f8009e60 comp rx=0 tx=0).stop 2026-03-10T14:17:52.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.345+0000 7fe30dcb3700 1 -- 192.168.123.103:0/3825853324 shutdown_connections 2026-03-10T14:17:52.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.345+0000 7fe30dcb3700 1 --2- 192.168.123.103:0/3825853324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 0x7fe308102bf0 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.345+0000 7fe30dcb3700 1 --2- 192.168.123.103:0/3825853324 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe308108780 0x7fe308108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.345+0000 7fe30dcb3700 1 -- 192.168.123.103:0/3825853324 >> 192.168.123.103:0/3825853324 conn(0x7fe3080fe280 msgr2=0x7fe308100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:52.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.346+0000 7fe30dcb3700 1 -- 192.168.123.103:0/3825853324 shutdown_connections 
2026-03-10T14:17:52.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.346+0000 7fe30dcb3700 1 -- 192.168.123.103:0/3825853324 wait complete. 2026-03-10T14:17:52.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.346+0000 7fe30dcb3700 1 Processor -- start 2026-03-10T14:17:52.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.346+0000 7fe30dcb3700 1 -- start start 2026-03-10T14:17:52.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe30dcb3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 0x7fe308075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:52.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe30dcb3700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe308108780 0x7fe3080757a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:52.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe30dcb3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe3080793a0 con 0x7fe308102780 2026-03-10T14:17:52.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe30dcb3700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe308075ce0 con 0x7fe308108780 2026-03-10T14:17:52.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe3077fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 0x7fe308075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:52.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe3077fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 
0x7fe308075260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:32818/0 (socket says 192.168.123.103:32818) 2026-03-10T14:17:52.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe3077fe700 1 -- 192.168.123.103:0/564190331 learned_addr learned my addr 192.168.123.103:0/564190331 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:52.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe306ffd700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe308108780 0x7fe3080757a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:52.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe306ffd700 1 -- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 msgr2=0x7fe308075260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:52.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe306ffd700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 0x7fe308075260 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.347+0000 7fe306ffd700 1 -- 192.168.123.103:0/564190331 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe2f80097e0 con 0x7fe308108780 2026-03-10T14:17:52.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.348+0000 7fe306ffd700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe308108780 0x7fe3080757a0 secure :-1 s=READY pgs=98 cs=0 
l=1 rev1=1 crypto rx=0x7fe2f8005950 tx=0x7fe2f8004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:52.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.348+0000 7fe304ff9700 1 -- 192.168.123.103:0/564190331 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2f801d070 con 0x7fe308108780 2026-03-10T14:17:52.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.348+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe308075f60 con 0x7fe308108780 2026-03-10T14:17:52.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.348+0000 7fe304ff9700 1 -- 192.168.123.103:0/564190331 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe2f8022470 con 0x7fe308108780 2026-03-10T14:17:52.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.348+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe3081a6b10 con 0x7fe308108780 2026-03-10T14:17:52.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.349+0000 7fe304ff9700 1 -- 192.168.123.103:0/564190331 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe2f800f610 con 0x7fe308108780 2026-03-10T14:17:52.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.350+0000 7fe304ff9700 1 -- 192.168.123.103:0/564190331 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe2f800ba40 con 0x7fe308108780 2026-03-10T14:17:52.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.350+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7fe30804ea50 con 0x7fe308108780 2026-03-10T14:17:52.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.350+0000 7fe304ff9700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2f40778e0 0x7fe2f4079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:52.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.350+0000 7fe3077fe700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2f40778e0 0x7fe2f4079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:52.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.351+0000 7fe304ff9700 1 -- 192.168.123.103:0/564190331 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7fe2f809b140 con 0x7fe308108780 2026-03-10T14:17:52.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.351+0000 7fe3077fe700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2f40778e0 0x7fe2f4079d90 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7fe2f0005950 tx=0x7fe2f0009450 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:52.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.353+0000 7fe304ff9700 1 -- 192.168.123.103:0/564190331 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe2f80637e0 con 0x7fe308108780 2026-03-10T14:17:52.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.487+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- 
mon_command({"prefix": "fs dump", "epoch": 34, "format": "json"} v 0) v1 -- 0x7fe3080767e0 con 0x7fe308108780 2026-03-10T14:17:52.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.488+0000 7fe304ff9700 1 -- 192.168.123.103:0/564190331 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 34, "format": "json"}]=0 dumped fsmap epoch 34 v40) v1 ==== 107+0+4411 (secure 0 0 0) 0x7fe2f800bcf0 con 0x7fe308108780 2026-03-10T14:17:52.487 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:52.487 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":34,"btime":"2026-03-10T14:15:17:037880+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":29}],"filesystems":[{"mdsmap":{"epoch":34,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:16.402800+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34474},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34474":{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":0,"incarnation":33,"state":"up:reconnect","state_seq":6,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2f40778e0 msgr2=0x7fe2f4079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2f40778e0 0x7fe2f4079d90 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7fe2f0005950 tx=0x7fe2f0009450 comp rx=0 tx=0).stop 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] 
conn(0x7fe308108780 msgr2=0x7fe3080757a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe308108780 0x7fe3080757a0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fe2f8005950 tx=0x7fe2f8004930 comp rx=0 tx=0).stop 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 shutdown_connections 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7fe2f40778e0 0x7fe2f4079d90 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe308102780 0x7fe308075260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 --2- 192.168.123.103:0/564190331 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7fe308108780 0x7fe3080757a0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 >> 192.168.123.103:0/564190331 conn(0x7fe3080fe280 msgr2=0x7fe3080ffbd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 shutdown_connections 
2026-03-10T14:17:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.491+0000 7fe30dcb3700 1 -- 192.168.123.103:0/564190331 wait complete. 2026-03-10T14:17:52.491 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 34 2026-03-10T14:17:52.539 DEBUG:tasks.fs:max_mds reduced in epoch 34 2026-03-10T14:17:52.539 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 34 2026-03-10T14:17:52.539 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 35 2026-03-10T14:17:52.680 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:52.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.922+0000 7f457fc8b700 1 -- 192.168.123.103:0/1704458151 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578102790 msgr2=0x7f4578102c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:52.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.922+0000 7f457fc8b700 1 --2- 192.168.123.103:0/1704458151 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578102790 0x7f4578102c00 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7f4574009b00 tx=0x7f4574009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:52.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.923+0000 7f457fc8b700 1 -- 192.168.123.103:0/1704458151 shutdown_connections 2026-03-10T14:17:52.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.923+0000 7f457fc8b700 1 --2- 192.168.123.103:0/1704458151 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578102790 0x7f4578102c00 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.923+0000 7f457fc8b700 1 --2- 
192.168.123.103:0/1704458151 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4578108790 0x7f4578108b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.923+0000 7f457fc8b700 1 -- 192.168.123.103:0/1704458151 >> 192.168.123.103:0/1704458151 conn(0x7f45780fe2b0 msgr2=0x7f45781006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:52.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.923+0000 7f457fc8b700 1 -- 192.168.123.103:0/1704458151 shutdown_connections 2026-03-10T14:17:52.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.924+0000 7f457fc8b700 1 -- 192.168.123.103:0/1704458151 wait complete. 2026-03-10T14:17:52.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.924+0000 7f457fc8b700 1 Processor -- start 2026-03-10T14:17:52.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.924+0000 7f457fc8b700 1 -- start start 2026-03-10T14:17:52.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.924+0000 7f457fc8b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4578102790 0x7f4578198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:52.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.924+0000 7f457fc8b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578108790 0x7f45781988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:52.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.924+0000 7f457fc8b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4578198fb0 con 0x7f4578108790 2026-03-10T14:17:52.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.924+0000 7f457fc8b700 1 -- --> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f457819cd40 con 0x7f4578102790 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457d226700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578108790 0x7f45781988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457d226700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578108790 0x7f45781988d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:32842/0 (socket says 192.168.123.103:32842) 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457d226700 1 -- 192.168.123.103:0/3861717580 learned_addr learned my addr 192.168.123.103:0/3861717580 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457d226700 1 -- 192.168.123.103:0/3861717580 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4578102790 msgr2=0x7f4578198390 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457da27700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4578102790 0x7f4578198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457d226700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4578102790 
0x7f4578198390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457d226700 1 -- 192.168.123.103:0/3861717580 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45740097e0 con 0x7f4578108790 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457da27700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4578102790 0x7f4578198390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:52.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.925+0000 7f457d226700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578108790 0x7f45781988d0 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f45740048c0 tx=0x7f45740049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:52.925 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.926+0000 7f456effd700 1 -- 192.168.123.103:0/3861717580 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f457401d070 con 0x7f4578108790 2026-03-10T14:17:52.925 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.926+0000 7f456effd700 1 -- 192.168.123.103:0/3861717580 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f457400bc50 con 0x7f4578108790 2026-03-10T14:17:52.925 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.926+0000 7f456effd700 1 -- 192.168.123.103:0/3861717580 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f457400f780 con 0x7f4578108790 2026-03-10T14:17:52.925 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.926+0000 7f457fc8b700 1 -- 192.168.123.103:0/3861717580 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f457819cfc0 con 0x7f4578108790 2026-03-10T14:17:52.925 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.926+0000 7f457fc8b700 1 -- 192.168.123.103:0/3861717580 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f457819d480 con 0x7f4578108790 2026-03-10T14:17:52.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.927+0000 7f456cff9700 1 -- 192.168.123.103:0/3861717580 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f457804ea50 con 0x7f4578108790 2026-03-10T14:17:52.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.928+0000 7f456effd700 1 -- 192.168.123.103:0/3861717580 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4574022a50 con 0x7f4578108790 2026-03-10T14:17:52.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.928+0000 7f456effd700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4564077990 0x7f4564079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:52.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.928+0000 7f456effd700 1 -- 192.168.123.103:0/3861717580 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f457409b440 con 0x7f4578108790 2026-03-10T14:17:52.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.931+0000 7f456effd700 1 -- 192.168.123.103:0/3861717580 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4574063b30 con 
0x7f4578108790 2026-03-10T14:17:52.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.931+0000 7f457da27700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4564077990 0x7f4564079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:52.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:52.932+0000 7f457da27700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4564077990 0x7f4564079e40 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f4568005fd0 tx=0x7f4568005dc0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:53.083 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:53 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/564190331' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 34, "format": "json"}]: dispatch 2026-03-10T14:17:53.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.088+0000 7f456cff9700 1 -- 192.168.123.103:0/3861717580 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 35, "format": "json"} v 0) v1 -- 0x7f4578066e40 con 0x7f4578108790 2026-03-10T14:17:53.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.089+0000 7f456effd700 1 -- 192.168.123.103:0/3861717580 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 35, "format": "json"}]=0 dumped fsmap epoch 35 v40) v1 ==== 107+0+4408 (secure 0 0 0) 0x7f4574027020 con 0x7f4578108790 2026-03-10T14:17:53.088 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:53.088 
INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":35,"btime":"2026-03-10T14:15:18:044104+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds 
uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":29}],"filesystems":[{"mdsmap":{"epoch":35,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:17.049490+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34474},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34474":{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":0,"incarnation":33,"state":"up:rejoin","state_seq":7,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T14:17:53.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.092+0000 7f456cff9700 1 -- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4564077990 msgr2=0x7f4564079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:53.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.092+0000 7f456cff9700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4564077990 0x7f4564079e40 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f4568005fd0 tx=0x7f4568005dc0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.092+0000 7f456cff9700 1 -- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578108790 msgr2=0x7f45781988d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:53.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.092+0000 7f456cff9700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578108790 0x7f45781988d0 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f45740048c0 tx=0x7f45740049a0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.093+0000 7f456cff9700 1 -- 192.168.123.103:0/3861717580 shutdown_connections 2026-03-10T14:17:53.091 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.093+0000 7f456cff9700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f4564077990 0x7f4564079e40 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.093+0000 7f456cff9700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f4578102790 0x7f4578198390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.093+0000 7f456cff9700 1 --2- 192.168.123.103:0/3861717580 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4578108790 0x7f45781988d0 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.093+0000 7f456cff9700 1 -- 192.168.123.103:0/3861717580 >> 192.168.123.103:0/3861717580 conn(0x7f45780fe2b0 msgr2=0x7f45780ffb00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:53.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.093+0000 7f456cff9700 1 -- 192.168.123.103:0/3861717580 shutdown_connections 2026-03-10T14:17:53.092 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.093+0000 7f456cff9700 1 -- 192.168.123.103:0/3861717580 wait complete. 
2026-03-10T14:17:53.093 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 35 2026-03-10T14:17:53.152 DEBUG:tasks.fs:max_mds reduced in epoch 35 2026-03-10T14:17:53.152 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 35 2026-03-10T14:17:53.152 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 36 2026-03-10T14:17:53.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:53 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/564190331' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 34, "format": "json"}]: dispatch 2026-03-10T14:17:53.332 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:53.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.687+0000 7ff50e863700 1 -- 192.168.123.103:0/1889416805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508108550 msgr2=0x7ff508108920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:53.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.688+0000 7ff50e863700 1 --2- 192.168.123.103:0/1889416805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508108550 0x7ff508108920 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7ff4f8009b00 tx=0x7ff4f8009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:53.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.688+0000 7ff50e863700 1 -- 192.168.123.103:0/1889416805 shutdown_connections 2026-03-10T14:17:53.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.688+0000 7ff50e863700 1 --2- 192.168.123.103:0/1889416805 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff508102530 0x7ff5081029a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.687 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.688+0000 7ff50e863700 1 --2- 192.168.123.103:0/1889416805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508108550 0x7ff508108920 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.688+0000 7ff50e863700 1 -- 192.168.123.103:0/1889416805 >> 192.168.123.103:0/1889416805 conn(0x7ff5080fe070 msgr2=0x7ff508100480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:53.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.689+0000 7ff50e863700 1 -- 192.168.123.103:0/1889416805 shutdown_connections 2026-03-10T14:17:53.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.689+0000 7ff50e863700 1 -- 192.168.123.103:0/1889416805 wait complete. 2026-03-10T14:17:53.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.689+0000 7ff50e863700 1 Processor -- start 2026-03-10T14:17:53.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.689+0000 7ff50e863700 1 -- start start 2026-03-10T14:17:53.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.690+0000 7ff50e863700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508102530 0x7ff508198120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:53.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.690+0000 7ff50e863700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff508108550 0x7ff508198660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:53.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.690+0000 7ff507fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508102530 0x7ff508198120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:53.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.690+0000 7ff507fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508102530 0x7ff508198120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46588/0 (socket says 192.168.123.103:46588) 2026-03-10T14:17:53.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.690+0000 7ff507fff700 1 -- 192.168.123.103:0/2939165179 learned_addr learned my addr 192.168.123.103:0/2939165179 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:53.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.691+0000 7ff50e863700 1 -- 192.168.123.103:0/2939165179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff508198d40 con 0x7ff508102530 2026-03-10T14:17:53.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.691+0000 7ff5077fe700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff508108550 0x7ff508198660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:53.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.691+0000 7ff50e863700 1 -- 192.168.123.103:0/2939165179 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff50819c8c0 con 0x7ff508108550 2026-03-10T14:17:53.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.693+0000 7ff507fff700 1 -- 192.168.123.103:0/2939165179 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff508108550 msgr2=0x7ff508198660 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:53.691 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.693+0000 7ff507fff700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff508108550 0x7ff508198660 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.693+0000 7ff507fff700 1 -- 192.168.123.103:0/2939165179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff4f80097e0 con 0x7ff508102530 2026-03-10T14:17:53.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.693+0000 7ff507fff700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508102530 0x7ff508198120 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7ff4f8005f50 tx=0x7ff4f80051d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:53.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.695+0000 7ff5057fa700 1 -- 192.168.123.103:0/2939165179 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff4f801d070 con 0x7ff508102530 2026-03-10T14:17:53.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.695+0000 7ff5057fa700 1 -- 192.168.123.103:0/2939165179 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff4f800bc50 con 0x7ff508102530 2026-03-10T14:17:53.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.695+0000 7ff5057fa700 1 -- 192.168.123.103:0/2939165179 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff4f800f920 con 0x7ff508102530 2026-03-10T14:17:53.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.695+0000 7ff50e863700 1 -- 192.168.123.103:0/2939165179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7ff50819cb40 con 0x7ff508102530 2026-03-10T14:17:53.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.695+0000 7ff50e863700 1 -- 192.168.123.103:0/2939165179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff50819d030 con 0x7ff508102530 2026-03-10T14:17:53.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.696+0000 7ff5057fa700 1 -- 192.168.123.103:0/2939165179 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff4f800fa80 con 0x7ff508102530 2026-03-10T14:17:53.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.696+0000 7ff5057fa700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff4f00779e0 0x7ff4f0079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:53.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.697+0000 7ff5057fa700 1 -- 192.168.123.103:0/2939165179 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7ff4f809b2f0 con 0x7ff508102530 2026-03-10T14:17:53.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.697+0000 7ff5077fe700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff4f00779e0 0x7ff4f0079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:53.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.697+0000 7ff50e863700 1 -- 192.168.123.103:0/2939165179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff50804f2a0 con 0x7ff508102530 2026-03-10T14:17:53.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.697+0000 
7ff5077fe700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff4f00779e0 0x7ff4f0079e90 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7ff4fc005950 tx=0x7ff4fc0058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:53.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.700+0000 7ff5057fa700 1 -- 192.168.123.103:0/2939165179 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff4f8063a90 con 0x7ff508102530 2026-03-10T14:17:53.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.843+0000 7ff50e863700 1 -- 192.168.123.103:0/2939165179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 36, "format": "json"} v 0) v1 -- 0x7ff508103ca0 con 0x7ff508102530 2026-03-10T14:17:53.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.844+0000 7ff5057fa700 1 -- 192.168.123.103:0/2939165179 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 36, "format": "json"}]=0 dumped fsmap epoch 36 v40) v1 ==== 107+0+5268 (secure 0 0 0) 0x7ff4f8027020 con 0x7ff508102530 2026-03-10T14:17:53.842 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:53.843 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":36,"btime":"2026-03-10T14:15:19:273266+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":29},{"gid":44351,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3930678535","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3930678535},{"type":"v1","addr":"192.168.123.104:6827","nonce":3930678535}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":36}],"filesystems":[{"mdsmap":{"epoch":36,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:19.273264+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34474},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34474":{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":0,"incarnation":33,"state":"up:active","state_seq":8,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34474,"qdb_cluster":[34474]},"id":1}]} 2026-03-10T14:17:53.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.851+0000 7ff4eeffd700 1 -- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff4f00779e0 msgr2=0x7ff4f0079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:53.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.851+0000 7ff4eeffd700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff4f00779e0 0x7ff4f0079e90 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7ff4fc005950 tx=0x7ff4fc0058e0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.851+0000 7ff4eeffd700 1 -- 192.168.123.103:0/2939165179 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508102530 msgr2=0x7ff508198120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:53.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.851+0000 7ff4eeffd700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508102530 0x7ff508198120 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7ff4f8005f50 tx=0x7ff4f80051d0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.851+0000 7ff4eeffd700 1 -- 192.168.123.103:0/2939165179 shutdown_connections 2026-03-10T14:17:53.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.852+0000 7ff4eeffd700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff4f00779e0 0x7ff4f0079e90 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.852+0000 7ff4eeffd700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff508102530 0x7ff508198120 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.852+0000 7ff4eeffd700 1 --2- 192.168.123.103:0/2939165179 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff508108550 0x7ff508198660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:53.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.852+0000 7ff4eeffd700 1 -- 192.168.123.103:0/2939165179 >> 192.168.123.103:0/2939165179 conn(0x7ff5080fe070 msgr2=0x7ff5080ff990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:53.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.852+0000 7ff4eeffd700 1 -- 
192.168.123.103:0/2939165179 shutdown_connections 2026-03-10T14:17:53.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:53.852+0000 7ff4eeffd700 1 -- 192.168.123.103:0/2939165179 wait complete. 2026-03-10T14:17:53.854 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 36 2026-03-10T14:17:53.933 DEBUG:tasks.fs:max_mds reduced in epoch 36 2026-03-10T14:17:53.934 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 36 2026-03-10T14:17:53.934 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 37 2026-03-10T14:17:54.082 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:54.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:54 vm03.local ceph-mon[103098]: pgmap v351: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:54.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:54 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/3861717580' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 35, "format": "json"}]: dispatch 2026-03-10T14:17:54.145 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:54 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2939165179' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 36, "format": "json"}]: dispatch 2026-03-10T14:17:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:54 vm04.local ceph-mon[92084]: pgmap v351: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:17:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:54 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/3861717580' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 35, "format": "json"}]: dispatch 2026-03-10T14:17:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:54 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/2939165179' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 36, "format": "json"}]: dispatch 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.338+0000 7f0e7497b700 1 -- 192.168.123.103:0/1925821929 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c102810 msgr2=0x7f0e6c102c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.338+0000 7f0e7497b700 1 --2- 192.168.123.103:0/1925821929 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c102810 0x7f0e6c102c80 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f0e60009b50 tx=0x7f0e60009e60 comp rx=0 tx=0).stop 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.338+0000 7f0e7497b700 1 -- 192.168.123.103:0/1925821929 shutdown_connections 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.338+0000 7f0e7497b700 1 --2- 192.168.123.103:0/1925821929 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c102810 0x7f0e6c102c80 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.338+0000 7f0e7497b700 1 --2- 192.168.123.103:0/1925821929 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e6c108810 0x7f0e6c108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.338+0000 7f0e7497b700 1 -- 192.168.123.103:0/1925821929 >> 192.168.123.103:0/1925821929 conn(0x7f0e6c0fe330 
msgr2=0x7f0e6c100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.338+0000 7f0e7497b700 1 -- 192.168.123.103:0/1925821929 shutdown_connections 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.338+0000 7f0e7497b700 1 -- 192.168.123.103:0/1925821929 wait complete. 2026-03-10T14:17:54.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.339+0000 7f0e7497b700 1 Processor -- start 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.339+0000 7f0e7497b700 1 -- start start 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.339+0000 7f0e7497b700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e6c102810 0x7f0e6c075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.339+0000 7f0e7497b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c108810 0x7f0e6c0757a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.339+0000 7f0e7497b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e6c0793f0 con 0x7f0e6c108810 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.339+0000 7f0e7497b700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e6c075ce0 con 0x7f0e6c102810 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.339+0000 7f0e72717700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e6c102810 0x7f0e6c075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e72717700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e6c102810 0x7f0e6c075260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.104:3300/0 says I am v2:192.168.123.103:37532/0 (socket says 192.168.123.103:37532) 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e72717700 1 -- 192.168.123.103:0/1757126044 learned_addr learned my addr 192.168.123.103:0/1757126044 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e71f16700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c108810 0x7f0e6c0757a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:54.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e72717700 1 -- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c108810 msgr2=0x7f0e6c0757a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:54.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e72717700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c108810 0x7f0e6c0757a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e72717700 1 -- 192.168.123.103:0/1757126044 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e600097e0 con 0x7f0e6c102810 
2026-03-10T14:17:54.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e71f16700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c108810 0x7f0e6c0757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:17:54.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e72717700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e6c102810 0x7f0e6c075260 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f0e64009fd0 tx=0x7f0e6400edf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:54.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.340+0000 7f0e5f7fe700 1 -- 192.168.123.103:0/1757126044 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e64009980 con 0x7f0e6c102810 2026-03-10T14:17:54.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.341+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0e6c075fc0 con 0x7f0e6c102810 2026-03-10T14:17:54.339 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.341+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0e6c1a6bb0 con 0x7f0e6c102810 2026-03-10T14:17:54.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.341+0000 7f0e5f7fe700 1 -- 192.168.123.103:0/1757126044 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0e64004500 con 0x7f0e6c102810 2026-03-10T14:17:54.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.341+0000 7f0e5f7fe700 1 -- 192.168.123.103:0/1757126044 <== mon.1 
v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e64010430 con 0x7f0e6c102810 2026-03-10T14:17:54.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.342+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0e50005320 con 0x7f0e6c102810 2026-03-10T14:17:54.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.342+0000 7f0e5f7fe700 1 -- 192.168.123.103:0/1757126044 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0e6400cc40 con 0x7f0e6c102810 2026-03-10T14:17:54.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.342+0000 7f0e5f7fe700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e580778e0 0x7f0e58079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:54.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.342+0000 7f0e5f7fe700 1 -- 192.168.123.103:0/1757126044 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f0e64014070 con 0x7f0e6c102810 2026-03-10T14:17:54.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.343+0000 7f0e71f16700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e580778e0 0x7f0e58079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:54.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.343+0000 7f0e71f16700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e580778e0 0x7f0e58079d90 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto 
rx=0x7f0e60009b20 tx=0x7f0e600058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:54.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.345+0000 7f0e5f7fe700 1 -- 192.168.123.103:0/1757126044 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0e64062580 con 0x7f0e6c102810 2026-03-10T14:17:54.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.484+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 37, "format": "json"} v 0) v1 -- 0x7f0e50005190 con 0x7f0e6c102810 2026-03-10T14:17:54.485 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:54.485 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":37,"btime":"2026-03-10T14:15:21:615555+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34476,"name":"cephfs.vm03.itwezo","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses 
versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":26},{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":29},{"gid":44351,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3930678535","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3930678535},{"type":"v1","addr":"192.168.123.104:6827","nonce":3930678535}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":36}],"filesystems":[{"mdsmap":{"epoch":37,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:20.618219+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0],"up":{"mds_0":34474},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34474":{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":0,"incarnation":33,"state":"up:active","state_seq":8,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34474,"qdb_cluster":[34474]},"id":1}]} 2026-03-10T14:17:54.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.485+0000 7f0e5f7fe700 1 -- 192.168.123.103:0/1757126044 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 37, "format": "json"}]=0 dumped fsmap epoch 37 v40) v1 ==== 107+0+5268 (secure 0 0 0) 0x7f0e64061cd0 con 0x7f0e6c102810 2026-03-10T14:17:54.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.487+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e580778e0 msgr2=0x7f0e58079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:54.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.487+0000 7f0e7497b700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e580778e0 0x7f0e58079d90 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f0e60009b20 tx=0x7f0e600058e0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e6c102810 msgr2=0x7f0e6c075260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:54.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e6c102810 0x7f0e6c075260 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f0e64009fd0 tx=0x7f0e6400edf0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 
-- 192.168.123.103:0/1757126044 shutdown_connections 2026-03-10T14:17:54.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f0e580778e0 0x7f0e58079d90 unknown :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f0e6c102810 0x7f0e6c075260 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 --2- 192.168.123.103:0/1757126044 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e6c108810 0x7f0e6c0757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 >> 192.168.123.103:0/1757126044 conn(0x7f0e6c0fe330 msgr2=0x7f0e6c0ffb70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:54.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 shutdown_connections 2026-03-10T14:17:54.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.488+0000 7f0e7497b700 1 -- 192.168.123.103:0/1757126044 wait complete. 
2026-03-10T14:17:54.487 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 37 2026-03-10T14:17:54.548 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 37 2026-03-10T14:17:54.548 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 38 2026-03-10T14:17:54.686 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:54.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.909+0000 7f97b1c97700 1 -- 192.168.123.103:0/2484054343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 msgr2=0x7f97ac068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:54.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.909+0000 7f97b1c97700 1 --2- 192.168.123.103:0/2484054343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 0x7f97ac068900 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7f979c009b00 tx=0x7f979c009e10 comp rx=0 tx=0).stop 2026-03-10T14:17:54.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.910+0000 7f97b1c97700 1 -- 192.168.123.103:0/2484054343 shutdown_connections 2026-03-10T14:17:54.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.910+0000 7f97b1c97700 1 --2- 192.168.123.103:0/2484054343 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 0x7f97ac068900 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.910+0000 7f97b1c97700 1 --2- 192.168.123.103:0/2484054343 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f97ac1013c0 0x7f97ac101790 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.909 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.910+0000 7f97b1c97700 1 -- 192.168.123.103:0/2484054343 >> 192.168.123.103:0/2484054343 conn(0x7f97ac0754a0 msgr2=0x7f97ac0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:54.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.910+0000 7f97b1c97700 1 -- 192.168.123.103:0/2484054343 shutdown_connections 2026-03-10T14:17:54.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.910+0000 7f97b1c97700 1 -- 192.168.123.103:0/2484054343 wait complete. 2026-03-10T14:17:54.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.910+0000 7f97b1c97700 1 Processor -- start 2026-03-10T14:17:54.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.911+0000 7f97b1c97700 1 -- start start 2026-03-10T14:17:54.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.911+0000 7f97b1c97700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 0x7f97ac198340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:54.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.911+0000 7f97ab7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 0x7f97ac198340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:54.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.912+0000 7f97ab7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 0x7f97ac198340 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46620/0 (socket says 192.168.123.103:46620) 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.912+0000 7f97b1c97700 1 --2- >> 
[v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f97ac1013c0 0x7f97ac198880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.912+0000 7f97b1c97700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97ac198ed0 con 0x7f97ac068490 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.912+0000 7f97b1c97700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97ac199010 con 0x7f97ac1013c0 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.912+0000 7f97ab7fe700 1 -- 192.168.123.103:0/275464704 learned_addr learned my addr 192.168.123.103:0/275464704 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.912+0000 7f97aaffd700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f97ac1013c0 0x7f97ac198880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.912+0000 7f97aaffd700 1 -- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 msgr2=0x7f97ac198340 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.913+0000 7f97aaffd700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 0x7f97ac198340 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.913+0000 7f97aaffd700 1 -- 
192.168.123.103:0/275464704 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f979c0097e0 con 0x7f97ac1013c0 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.913+0000 7f97ab7fe700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 0x7f97ac198340 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T14:17:54.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.913+0000 7f97aaffd700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f97ac1013c0 0x7f97ac198880 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f979c009fd0 tx=0x7f979c004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:54.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.913+0000 7f97a8ff9700 1 -- 192.168.123.103:0/275464704 <== mon.1 v2:192.168.123.104:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f979c01d070 con 0x7f97ac1013c0 2026-03-10T14:17:54.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.913+0000 7f97a8ff9700 1 -- 192.168.123.103:0/275464704 <== mon.1 v2:192.168.123.104:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f979c004b90 con 0x7f97ac1013c0 2026-03-10T14:17:54.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.913+0000 7f97b1c97700 1 -- 192.168.123.103:0/275464704 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f97ac19ce00 con 0x7f97ac1013c0 2026-03-10T14:17:54.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.913+0000 7f97b1c97700 1 -- 192.168.123.103:0/275464704 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f97ac19d2f0 con 0x7f97ac1013c0 
2026-03-10T14:17:54.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.914+0000 7f97a8ff9700 1 -- 192.168.123.103:0/275464704 <== mon.1 v2:192.168.123.104:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f979c00f670 con 0x7f97ac1013c0 2026-03-10T14:17:54.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.914+0000 7f97a27fc700 1 -- 192.168.123.103:0/275464704 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f97ac04ea50 con 0x7f97ac1013c0 2026-03-10T14:17:54.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.915+0000 7f97a8ff9700 1 -- 192.168.123.103:0/275464704 <== mon.1 v2:192.168.123.104:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f979c00bc50 con 0x7f97ac1013c0 2026-03-10T14:17:54.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.916+0000 7f97a8ff9700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9798077700 0x7f9798079bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:54.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.916+0000 7f97a8ff9700 1 -- 192.168.123.103:0/275464704 <== mon.1 v2:192.168.123.104:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f979c09b340 con 0x7f97ac1013c0 2026-03-10T14:17:54.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.916+0000 7f97ab7fe700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9798077700 0x7f9798079bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:54.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.916+0000 7f97ab7fe700 1 --2- 192.168.123.103:0/275464704 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9798077700 0x7f9798079bb0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f979400f520 tx=0x7f9794005f90 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:54.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:54.919+0000 7f97a8ff9700 1 -- 192.168.123.103:0/275464704 <== mon.1 v2:192.168.123.104:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f979c063ae0 con 0x7f97ac1013c0 2026-03-10T14:17:55.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.065+0000 7f97a27fc700 1 -- 192.168.123.103:0/275464704 --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 38, "format": "json"} v 0) v1 -- 0x7f97ac066e40 con 0x7f97ac1013c0 2026-03-10T14:17:55.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.066+0000 7f97a8ff9700 1 -- 192.168.123.103:0/275464704 <== mon.1 v2:192.168.123.104:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 38, "format": "json"}]=0 dumped fsmap epoch 38 v40) v1 ==== 107+0+5285 (secure 0 0 0) 0x7f979c063230 con 0x7f97ac1013c0 2026-03-10T14:17:55.065 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:55.065 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":38,"btime":"2026-03-10T14:15:21:621951+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":29},{"gid":44351,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3930678535","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3930678535},{"type":"v1","addr":"192.168.123.104:6827","nonce":3930678535}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":36}],"filesystems":[{"mdsmap":{"epoch":38,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:21.621947+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0,1],"up":{"mds_0":34474,"mds_1":34476},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34474":{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":0,"incarnation":33,"state":"up:active","state_seq":8,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor 
log segments","feature_12":"quiesce subvolumes"}}},"gid_34476":{"gid":34476,"name":"cephfs.vm03.itwezo","rank":1,"incarnation":38,"state":"up:starting","state_seq":1,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34474,"qdb_cluster":[34474]},"id":1}]} 2026-03-10T14:17:55.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 -- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9798077700 msgr2=0x7f9798079bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:55.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9798077700 0x7f9798079bb0 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f979400f520 tx=0x7f9794005f90 comp rx=0 tx=0).stop 2026-03-10T14:17:55.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 -- 192.168.123.103:0/275464704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f97ac1013c0 msgr2=0x7f97ac198880 secure 
:-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:55.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f97ac1013c0 0x7f97ac198880 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f979c009fd0 tx=0x7f979c004930 comp rx=0 tx=0).stop 2026-03-10T14:17:55.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 -- 192.168.123.103:0/275464704 shutdown_connections 2026-03-10T14:17:55.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f9798077700 0x7f9798079bb0 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac068490 0x7f97ac198340 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 --2- 192.168.123.103:0/275464704 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f97ac1013c0 0x7f97ac198880 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.069+0000 7f97a27fc700 1 -- 192.168.123.103:0/275464704 >> 192.168.123.103:0/275464704 conn(0x7f97ac0754a0 msgr2=0x7f97ac0fddd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:55.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.070+0000 7f97a27fc700 1 -- 192.168.123.103:0/275464704 shutdown_connections 2026-03-10T14:17:55.068 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.070+0000 7f97a27fc700 1 -- 192.168.123.103:0/275464704 wait complete. 2026-03-10T14:17:55.069 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 38 2026-03-10T14:17:55.132 DEBUG:tasks.fs:allow_standby_replay disabled in epoch 38 2026-03-10T14:17:55.132 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph fs dump --format=json 39 2026-03-10T14:17:55.278 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:55.305 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:55 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/1757126044' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 37, "format": "json"}]: dispatch 2026-03-10T14:17:55.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:55 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/1757126044' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 37, "format": "json"}]: dispatch 2026-03-10T14:17:55.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.522+0000 7f9a14bb6700 1 -- 192.168.123.103:0/35443719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a100686f0 msgr2=0x7f9a10068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:55.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.522+0000 7f9a14bb6700 1 --2- 192.168.123.103:0/35443719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a100686f0 0x7f9a10068ac0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f99f8009b50 tx=0x7f99f8009e60 comp rx=0 tx=0).stop 2026-03-10T14:17:55.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.523+0000 7f9a14bb6700 1 -- 192.168.123.103:0/35443719 shutdown_connections 2026-03-10T14:17:55.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.523+0000 7f9a14bb6700 1 --2- 192.168.123.103:0/35443719 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a10069000 0x7f9a101051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.523+0000 7f9a14bb6700 1 --2- 192.168.123.103:0/35443719 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a100686f0 0x7f9a10068ac0 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.523+0000 7f9a14bb6700 1 -- 192.168.123.103:0/35443719 >> 192.168.123.103:0/35443719 conn(0x7f9a100754a0 msgr2=0x7f9a100758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:55.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.523+0000 7f9a14bb6700 1 -- 192.168.123.103:0/35443719 shutdown_connections 2026-03-10T14:17:55.522 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.523+0000 7f9a14bb6700 1 -- 192.168.123.103:0/35443719 wait complete. 2026-03-10T14:17:55.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.523+0000 7f9a14bb6700 1 Processor -- start 2026-03-10T14:17:55.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a14bb6700 1 -- start start 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a14bb6700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a100686f0 0x7f9a10194010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a14bb6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a10069000 0x7f9a10194550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a14bb6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a10194c30 con 0x7f9a10069000 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a14bb6700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a101989c0 con 0x7f9a100686f0 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a0dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a10069000 0x7f9a10194550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a0dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a10069000 0x7f9a10194550 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46634/0 (socket says 192.168.123.103:46634) 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a0dd9b700 1 -- 192.168.123.103:0/1963524459 learned_addr learned my addr 192.168.123.103:0/1963524459 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.524+0000 7f9a0dd9b700 1 -- 192.168.123.103:0/1963524459 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a100686f0 msgr2=0x7f9a10194010 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a0e59c700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a100686f0 0x7f9a10194010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a0dd9b700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a100686f0 0x7f9a10194010 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a0dd9b700 1 -- 192.168.123.103:0/1963524459 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f99f80097e0 con 0x7f9a10069000 2026-03-10T14:17:55.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a0e59c700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a100686f0 0x7f9a10194010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T14:17:55.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a0dd9b700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a10069000 0x7f9a10194550 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f9a0000eb10 tx=0x7f9a0000eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:55.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a077fe700 1 -- 192.168.123.103:0/1963524459 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9a0000cca0 con 0x7f9a10069000 2026-03-10T14:17:55.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a077fe700 1 -- 192.168.123.103:0/1963524459 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9a0000ce00 con 0x7f9a10069000 2026-03-10T14:17:55.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9a10198ca0 con 0x7f9a10069000 2026-03-10T14:17:55.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.525+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9a101991f0 con 0x7f9a10069000 2026-03-10T14:17:55.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.526+0000 7f9a077fe700 1 -- 192.168.123.103:0/1963524459 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9a000105e0 con 0x7f9a10069000 2026-03-10T14:17:55.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.527+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9a10108b30 con 0x7f9a10069000 2026-03-10T14:17:55.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.527+0000 7f9a077fe700 1 -- 192.168.123.103:0/1963524459 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9a00010810 con 0x7f9a10069000 2026-03-10T14:17:55.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.527+0000 7f9a077fe700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f99fc07bcd0 0x7f99fc07e180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:55.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.527+0000 7f9a077fe700 1 -- 192.168.123.103:0/1963524459 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f9a00014070 con 0x7f9a10069000 2026-03-10T14:17:55.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.529+0000 7f9a0e59c700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f99fc07bcd0 0x7f99fc07e180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:55.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.529+0000 7f9a0e59c700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f99fc07bcd0 0x7f99fc07e180 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7f99f800b5c0 tx=0x7f99f80058e0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:55.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.530+0000 7f9a077fe700 1 -- 192.168.123.103:0/1963524459 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9a00062840 con 0x7f9a10069000 2026-03-10T14:17:55.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.667+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 39, "format": "json"} v 0) v1 -- 0x7f9a1004ea50 con 0x7f9a10069000 2026-03-10T14:17:55.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.668+0000 7f9a077fe700 1 -- 192.168.123.103:0/1963524459 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 39, "format": "json"}]=0 dumped fsmap epoch 39 v40) v1 ==== 107+0+5288 (secure 0 0 0) 0x7f9a00061f90 con 0x7f9a10069000 2026-03-10T14:17:55.667 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:17:55.667 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":39,"btime":"2026-03-10T14:15:22:628805+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34480,"name":"cephfs.vm04.sslxuq","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6825/3487856112","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6824","nonce":3487856112},{"type":"v1","addr":"192.168.123.104:6825","nonce":3487856112}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode 
in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":29},{"gid":44351,"name":"cephfs.vm04.puavjd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.104:6827/3930678535","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.104:6826","nonce":3930678535},{"type":"v1","addr":"192.168.123.104:6827","nonce":3930678535}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":36}],"filesystems":[{"mdsmap":{"epoch":39,"flags":50,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":true,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":32,"explicitly_allowed_features":32,"created":"2026-03-10T14:05:22.222304+0000","modified":"2026-03-10T14:15:22.628803+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":121,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0,1],"up":{"mds_0":34474,"mds_1":34476},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34474":{"gid":34474,"name":"cephfs.vm03.aqaspa","rank":0,"incarnation":33,"state":"up:active","state_seq":8,"addr":"192.168.123.103:6827/2839841111","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":2839841111},{"type":"v1","addr":"192.168.123.103:6827","nonce":2839841111}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_34476":{"gid":34476,"name":"cephfs.vm03.itwezo","rank":1,"incarnation":38,"state":"up:active","state_seq":6,"addr":"192.168.123.103:6829/2389872767","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":2389872767},{"type":"v1","addr":"192.168.123.103:6829","nonce":2389872767}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor 
log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34474,"qdb_cluster":[34474,34476]},"id":1}]} 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.670+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f99fc07bcd0 msgr2=0x7f99fc07e180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.670+0000 7f9a14bb6700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f99fc07bcd0 0x7f99fc07e180 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7f99f800b5c0 tx=0x7f99f80058e0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.670+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a10069000 msgr2=0x7f9a10194550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.670+0000 7f9a14bb6700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a10069000 0x7f9a10194550 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f9a0000eb10 tx=0x7f9a0000eed0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.671+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 shutdown_connections 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.671+0000 7f9a14bb6700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f99fc07bcd0 0x7f99fc07e180 unknown :-1 s=CLOSED pgs=169 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.671+0000 7f9a14bb6700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f9a100686f0 0x7f9a10194010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.671+0000 7f9a14bb6700 1 --2- 192.168.123.103:0/1963524459 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9a10069000 0x7f9a10194550 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.671+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 >> 192.168.123.103:0/1963524459 conn(0x7f9a100754a0 msgr2=0x7f9a100ff710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.671+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 shutdown_connections 2026-03-10T14:17:55.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:55.671+0000 7f9a14bb6700 1 -- 192.168.123.103:0/1963524459 wait complete. 2026-03-10T14:17:55.670 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 39 2026-03-10T14:17:55.707 DEBUG:teuthology.run_tasks:Unwinding manager ceph-fuse 2026-03-10T14:17:55.710 INFO:tasks.ceph_fuse:Unmounting ceph-fuse clients... 
2026-03-10T14:17:55.710 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:17:55.710 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T14:17:55.726 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:17:55.726 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T14:17:55.781 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd blocklist ls 2026-03-10T14:17:55.957 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:56.199 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:56 vm03.local ceph-mon[103098]: pgmap v352: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:56.199 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:56 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/275464704' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 38, "format": "json"}]: dispatch 2026-03-10T14:17:56.199 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:56 vm03.local ceph-mon[103098]: from='client.? 
192.168.123.103:0/1963524459' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 39, "format": "json"}]: dispatch 2026-03-10T14:17:56.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.198+0000 7f6fec9ec700 1 -- 192.168.123.103:0/3721123776 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6fe4101ff0 msgr2=0x7f6fe410a4f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:56.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.198+0000 7f6fec9ec700 1 --2- 192.168.123.103:0/3721123776 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6fe4101ff0 0x7f6fe410a4f0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f6fd8009a60 tx=0x7f6fd8009d70 comp rx=0 tx=0).stop 2026-03-10T14:17:56.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.200+0000 7f6fec9ec700 1 -- 192.168.123.103:0/3721123776 shutdown_connections 2026-03-10T14:17:56.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.200+0000 7f6fec9ec700 1 --2- 192.168.123.103:0/3721123776 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6fe4101ff0 0x7f6fe410a4f0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.200+0000 7f6fec9ec700 1 --2- 192.168.123.103:0/3721123776 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6fe41016e0 0x7f6fe4101ab0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.200+0000 7f6fec9ec700 1 -- 192.168.123.103:0/3721123776 >> 192.168.123.103:0/3721123776 conn(0x7f6fe40faf00 msgr2=0x7f6fe40fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:56.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.201+0000 7f6fec9ec700 1 -- 192.168.123.103:0/3721123776 shutdown_connections 
2026-03-10T14:17:56.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.201+0000 7f6fec9ec700 1 -- 192.168.123.103:0/3721123776 wait complete. 2026-03-10T14:17:56.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.201+0000 7f6fec9ec700 1 Processor -- start 2026-03-10T14:17:56.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.201+0000 7f6fec9ec700 1 -- start start 2026-03-10T14:17:56.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fec9ec700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6fe41016e0 0x7f6fe4196150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:56.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fec9ec700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6fe4101ff0 0x7f6fe4196690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:56.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fec9ec700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6fe4196d70 con 0x7f6fe4101ff0 2026-03-10T14:17:56.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fe9f87700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6fe4101ff0 0x7f6fe4196690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:56.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fe9f87700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6fe4101ff0 0x7f6fe4196690 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46648/0 (socket says 192.168.123.103:46648) 
2026-03-10T14:17:56.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fe9f87700 1 -- 192.168.123.103:0/2487745397 learned_addr learned my addr 192.168.123.103:0/2487745397 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:56.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fec9ec700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6fe419ab00 con 0x7f6fe41016e0 2026-03-10T14:17:56.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fe9f87700 1 -- 192.168.123.103:0/2487745397 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6fe41016e0 msgr2=0x7f6fe4196150 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:56.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fe9f87700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6fe41016e0 0x7f6fe4196150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.202+0000 7f6fe9f87700 1 -- 192.168.123.103:0/2487745397 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6fdc0097e0 con 0x7f6fe4101ff0 2026-03-10T14:17:56.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.203+0000 7f6fe9f87700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6fe4101ff0 0x7f6fe4196690 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f6fd800b5c0 tx=0x7f6fd800fb00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:56.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.203+0000 7f6fd77fe700 1 -- 192.168.123.103:0/2487745397 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 
(secure 0 0 0) 0x7f6fd801d070 con 0x7f6fe4101ff0 2026-03-10T14:17:56.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.203+0000 7f6fd77fe700 1 -- 192.168.123.103:0/2487745397 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6fd80037c0 con 0x7f6fe4101ff0 2026-03-10T14:17:56.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.203+0000 7f6fd77fe700 1 -- 192.168.123.103:0/2487745397 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6fd8017830 con 0x7f6fe4101ff0 2026-03-10T14:17:56.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.203+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6fd8009710 con 0x7f6fe4101ff0 2026-03-10T14:17:56.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.203+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6fe419b000 con 0x7f6fe4101ff0 2026-03-10T14:17:56.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.204+0000 7f6fd77fe700 1 -- 192.168.123.103:0/2487745397 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6fd8017990 con 0x7f6fe4101ff0 2026-03-10T14:17:56.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.204+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6fe404ea50 con 0x7f6fe4101ff0 2026-03-10T14:17:56.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.206+0000 7f6fd77fe700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6fd0077870 0x7f6fd0079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0).connect 2026-03-10T14:17:56.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.206+0000 7f6fd77fe700 1 -- 192.168.123.103:0/2487745397 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7f6fd809af40 con 0x7f6fe4101ff0 2026-03-10T14:17:56.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.206+0000 7f6fea788700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6fd0077870 0x7f6fd0079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:56.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.206+0000 7f6fea788700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6fd0077870 0x7f6fd0079d20 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f6fdc0097b0 tx=0x7f6fdc009700 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:56.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.208+0000 7f6fd77fe700 1 -- 192.168.123.103:0/2487745397 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6fd80646b0 con 0x7f6fe4101ff0 2026-03-10T14:17:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:56 vm04.local ceph-mon[92084]: pgmap v352: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:17:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:56 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/275464704' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 38, "format": "json"}]: dispatch 2026-03-10T14:17:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:56 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/1963524459' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 39, "format": "json"}]: dispatch 2026-03-10T14:17:56.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.326+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f6fe40689d0 con 0x7f6fe4101ff0 2026-03-10T14:17:56.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.327+0000 7f6fd77fe700 1 -- 192.168.123.103:0/2487745397 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 35 entries v125) v1 ==== 81+0+2139 (secure 0 0 0) 0x7f6fd8063e00 con 0x7f6fe4101ff0 2026-03-10T14:17:56.325 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6827/3074647252 2026-03-11T14:15:13.809253+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6826/3074647252 2026-03-11T14:15:13.809253+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6824/291713758 2026-03-11T14:15:07.526290+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2591561901 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/2096941445 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1542797438 2026-03-11T14:03:07.554401+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1489630015 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6825/291713758 
2026-03-11T14:15:07.526290+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/14041722 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/2419158435 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3848481828 2026-03-11T14:03:43.076138+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1546295591 2026-03-11T14:08:01.256654+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/675588081 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1921920925 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6829/3678786563 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3383031364 2026-03-11T14:03:07.554401+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1628969362 2026-03-11T14:03:43.076138+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1771049444 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6828/3678786563 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/621911225 2026-03-11T14:03:07.554401+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1586580482 2026-03-11T14:08:01.256654+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/330658303 2026-03-11T14:03:43.076138+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/2 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.326 
INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/2419158435 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/2 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/1524456666 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/3116813445 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1469376943 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3603802148 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/152036745 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/4124605323 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3410780149 2026-03-11T14:08:01.256654+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1419867485 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2268959518 2026-03-11T14:08:01.256654+0000 2026-03-10T14:17:56.326 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1435569031 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.329+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6fd0077870 msgr2=0x7f6fd0079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.329+0000 7f6fec9ec700 1 --2- 192.168.123.103:0/2487745397 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6fd0077870 0x7f6fd0079d20 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f6fdc0097b0 tx=0x7f6fdc009700 comp rx=0 tx=0).stop 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.329+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6fe4101ff0 msgr2=0x7f6fe4196690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.329+0000 7f6fec9ec700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6fe4101ff0 0x7f6fe4196690 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f6fd800b5c0 tx=0x7f6fd800fb00 comp rx=0 tx=0).stop 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.330+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 shutdown_connections 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.330+0000 7f6fec9ec700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7f6fd0077870 0x7f6fd0079d20 unknown :-1 s=CLOSED pgs=170 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.330+0000 7f6fec9ec700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7f6fe41016e0 0x7f6fe4196150 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.330+0000 7f6fec9ec700 1 --2- 192.168.123.103:0/2487745397 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6fe4101ff0 0x7f6fe4196690 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.330+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 >> 192.168.123.103:0/2487745397 conn(0x7f6fe40faf00 msgr2=0x7f6fe4104ca0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:56.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.330+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 shutdown_connections 2026-03-10T14:17:56.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.330+0000 7f6fec9ec700 1 -- 192.168.123.103:0/2487745397 wait complete. 2026-03-10T14:17:56.329 INFO:teuthology.orchestra.run.vm03.stderr:listed 35 entries 2026-03-10T14:17:56.391 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-10T14:17:56.391 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T14:17:56.409 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph osd blocklist ls 2026-03-10T14:17:56.600 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config 2026-03-10T14:17:56.849 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.850+0000 7ff2ab63a700 1 -- 192.168.123.103:0/2133204745 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 msgr2=0x7ff2a410c8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:56.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.850+0000 7ff2ab63a700 1 --2- 192.168.123.103:0/2133204745 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 0x7ff2a410c8a0 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7ff2a0009b30 tx=0x7ff2a0009e40 comp rx=0 tx=0).stop 2026-03-10T14:17:56.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.851+0000 7ff2ab63a700 1 -- 192.168.123.103:0/2133204745 shutdown_connections 
2026-03-10T14:17:56.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.851+0000 7ff2ab63a700 1 --2- 192.168.123.103:0/2133204745 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 0x7ff2a410c8a0 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.851+0000 7ff2ab63a700 1 --2- 192.168.123.103:0/2133204745 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff2a4073070 0x7ff2a4073440 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.851+0000 7ff2ab63a700 1 -- 192.168.123.103:0/2133204745 >> 192.168.123.103:0/2133204745 conn(0x7ff2a40fc000 msgr2=0x7ff2a40fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:56.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.851+0000 7ff2ab63a700 1 -- 192.168.123.103:0/2133204745 shutdown_connections 2026-03-10T14:17:56.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.851+0000 7ff2ab63a700 1 -- 192.168.123.103:0/2133204745 wait complete. 
2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2ab63a700 1 Processor -- start 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2ab63a700 1 -- start start 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2ab63a700 1 --2- >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff2a4073070 0x7ff2a4198410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2ab63a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 0x7ff2a4198950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2ab63a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2a4199030 con 0x7ff2a4073980 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2ab63a700 1 -- --> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2a419cdc0 con 0x7ff2a4073070 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2a8bd5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 0x7ff2a4198950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2a8bd5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 0x7ff2a4198950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am 
v2:192.168.123.103:46652/0 (socket says 192.168.123.103:46652) 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.852+0000 7ff2a8bd5700 1 -- 192.168.123.103:0/686376294 learned_addr learned my addr 192.168.123.103:0/686376294 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-10T14:17:56.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.853+0000 7ff2a8bd5700 1 -- 192.168.123.103:0/686376294 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff2a4073070 msgr2=0x7ff2a4198410 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T14:17:56.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.853+0000 7ff2a8bd5700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff2a4073070 0x7ff2a4198410 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.853+0000 7ff2a8bd5700 1 -- 192.168.123.103:0/686376294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff2a00097e0 con 0x7ff2a4073980 2026-03-10T14:17:56.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.853+0000 7ff2a8bd5700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 0x7ff2a4198950 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7ff2a0004c40 tx=0x7ff2a000b900 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:56.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.853+0000 7ff29a7fc700 1 -- 192.168.123.103:0/686376294 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2a001d070 con 0x7ff2a4073980 2026-03-10T14:17:56.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.853+0000 7ff2ab63a700 1 -- 
192.168.123.103:0/686376294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff2a419d040 con 0x7ff2a4073980 2026-03-10T14:17:56.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.853+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2a419d530 con 0x7ff2a4073980 2026-03-10T14:17:56.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.855+0000 7ff29a7fc700 1 -- 192.168.123.103:0/686376294 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff2a000bd40 con 0x7ff2a4073980 2026-03-10T14:17:56.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.855+0000 7ff29a7fc700 1 -- 192.168.123.103:0/686376294 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2a000f8d0 con 0x7ff2a4073980 2026-03-10T14:17:56.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.855+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff288005320 con 0x7ff2a4073980 2026-03-10T14:17:56.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.858+0000 7ff29a7fc700 1 -- 192.168.123.103:0/686376294 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff2a000faf0 con 0x7ff2a4073980 2026-03-10T14:17:56.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.858+0000 7ff29a7fc700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff290077800 0x7ff290079cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T14:17:56.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.859+0000 7ff29a7fc700 1 -- 192.168.123.103:0/686376294 <== 
mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(125..125 src has 1..125) v4 ==== 6577+0+0 (secure 0 0 0) 0x7ff2a009c3c0 con 0x7ff2a4073980 2026-03-10T14:17:56.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.859+0000 7ff2a93d6700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff290077800 0x7ff290079cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T14:17:56.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.859+0000 7ff29a7fc700 1 -- 192.168.123.103:0/686376294 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff2a00224c0 con 0x7ff2a4073980 2026-03-10T14:17:56.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.859+0000 7ff2a93d6700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff290077800 0x7ff290079cb0 secure :-1 s=READY pgs=171 cs=0 l=1 rev1=1 crypto rx=0x7ff294009780 tx=0x7ff294006cb0 comp rx=0 tx=0).ready entity=mgr.34100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T14:17:56.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.993+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7ff288005f70 con 0x7ff2a4073980 2026-03-10T14:17:56.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.994+0000 7ff29a7fc700 1 -- 192.168.123.103:0/686376294 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 35 entries v125) v1 ==== 81+0+2139 (secure 0 0 0) 0x7ff2a0064b60 con 0x7ff2a4073980 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6827/3074647252 
2026-03-11T14:15:13.809253+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6826/3074647252 2026-03-11T14:15:13.809253+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6824/291713758 2026-03-11T14:15:07.526290+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2591561901 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/2096941445 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1542797438 2026-03-11T14:03:07.554401+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1489630015 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6825/291713758 2026-03-11T14:15:07.526290+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/14041722 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/2419158435 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3848481828 2026-03-11T14:03:43.076138+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1546295591 2026-03-11T14:08:01.256654+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/675588081 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1921920925 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6829/3678786563 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3383031364 2026-03-11T14:03:07.554401+0000 2026-03-10T14:17:56.993 
INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1628969362 2026-03-11T14:03:43.076138+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1771049444 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:6828/3678786563 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.993 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/621911225 2026-03-11T14:03:07.554401+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1586580482 2026-03-11T14:08:01.256654+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/330658303 2026-03-11T14:03:43.076138+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/2 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/2419158435 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/2 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/1524456666 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/3116813445 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1469376943 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3603802148 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.104:0/152036745 2026-03-11T14:08:26.756484+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/4124605323 2026-03-11T14:02:53.370766+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3410780149 2026-03-11T14:08:01.256654+0000 
2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1419867485 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2268959518 2026-03-11T14:08:01.256654+0000 2026-03-10T14:17:56.994 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1435569031 2026-03-11T14:09:00.437684+0000 2026-03-10T14:17:56.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.996+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff290077800 msgr2=0x7ff290079cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:56.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.996+0000 7ff2ab63a700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff290077800 0x7ff290079cb0 secure :-1 s=READY pgs=171 cs=0 l=1 rev1=1 crypto rx=0x7ff294009780 tx=0x7ff294006cb0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.996+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 msgr2=0x7ff2a4198950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T14:17:56.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.997+0000 7ff2ab63a700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 0x7ff2a4198950 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7ff2a0004c40 tx=0x7ff2a000b900 comp rx=0 tx=0).stop 2026-03-10T14:17:56.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.997+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 shutdown_connections 2026-03-10T14:17:56.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.997+0000 7ff2ab63a700 1 --2- 192.168.123.103:0/686376294 >> 
[v2:192.168.123.103:6800/65079813,v1:192.168.123.103:6801/65079813] conn(0x7ff290077800 0x7ff290079cb0 unknown :-1 s=CLOSED pgs=171 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.997+0000 7ff2ab63a700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.104:3300/0,v1:192.168.123.104:6789/0] conn(0x7ff2a4073070 0x7ff2a4198410 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.997+0000 7ff2ab63a700 1 --2- 192.168.123.103:0/686376294 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff2a4073980 0x7ff2a4198950 unknown :-1 s=CLOSED pgs=244 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T14:17:56.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.997+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 >> 192.168.123.103:0/686376294 conn(0x7ff2a40fc000 msgr2=0x7ff2a41070e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T14:17:56.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.997+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 shutdown_connections 2026-03-10T14:17:56.996 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-10T14:17:56.998+0000 7ff2ab63a700 1 -- 192.168.123.103:0/686376294 wait complete. 2026-03-10T14:17:56.997 INFO:teuthology.orchestra.run.vm03.stderr:listed 35 entries 2026-03-10T14:17:57.047 INFO:tasks.cephfs.fuse_mount:Running fusermount -u on ubuntu@vm03.local... 2026-03-10T14:17:57.048 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T14:17:57.048 DEBUG:teuthology.orchestra.run.vm03:> sudo fusermount -u /home/ubuntu/cephtest/mnt.0 2026-03-10T14:17:57.084 INFO:teuthology.orchestra.run:waiting for 300 2026-03-10T14:17:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:57 vm04.local ceph-mon[92084]: from='client.? 
192.168.123.103:0/2487745397' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T14:17:57.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:57 vm04.local ceph-mon[92084]: from='client.? 192.168.123.103:0/686376294' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T14:17:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:57 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/2487745397' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T14:17:57.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:57 vm03.local ceph-mon[103098]: from='client.? 192.168.123.103:0/686376294' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T14:17:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:17:58 vm04.local ceph-mon[92084]: pgmap v353: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:17:58.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:17:58 vm03.local ceph-mon[103098]: pgmap v353: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:00.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:00 vm03.local ceph-mon[103098]: pgmap v354: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:00.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:00 vm04.local ceph-mon[92084]: pgmap v354: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:01.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:18:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:01 vm04.local 
ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:18:02.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:02 vm03.local ceph-mon[103098]: pgmap v355: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:02 vm04.local ceph-mon[92084]: pgmap v355: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:03.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:18:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:18:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:18:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:18:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:18:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T14:18:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:18:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:18:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:04 vm03.local ceph-mon[103098]: pgmap v356: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:18:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:04 vm04.local ceph-mon[92084]: pgmap v356: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:18:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:06 vm03.local ceph-mon[103098]: pgmap v357: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:18:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:06 vm04.local ceph-mon[92084]: pgmap v357: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:18:07.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:07 vm03.local ceph-mon[103098]: pgmap v358: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:08.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:07 vm04.local ceph-mon[92084]: pgmap v358: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:10.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:09 vm04.local ceph-mon[92084]: pgmap v359: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 
2026-03-10T14:18:10.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:09 vm03.local ceph-mon[103098]: pgmap v359: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:12.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:11 vm04.local ceph-mon[92084]: pgmap v360: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:12.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:11 vm03.local ceph-mon[103098]: pgmap v360: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:18:14.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:13 vm04.local ceph-mon[92084]: pgmap v361: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:18:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:13 vm03.local ceph-mon[103098]: pgmap v361: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:18:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:15 vm04.local ceph-mon[92084]: pgmap v362: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:18:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:18:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:15 vm03.local ceph-mon[103098]: pgmap v362: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:18:16.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd 
blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:18:18.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:17 vm04.local ceph-mon[92084]: pgmap v363: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:18.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:17 vm03.local ceph-mon[103098]: pgmap v363: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:20.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:19 vm04.local ceph-mon[92084]: pgmap v364: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:20.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:19 vm03.local ceph-mon[103098]: pgmap v364: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:22.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:21 vm04.local ceph-mon[92084]: pgmap v365: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:22.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:21 vm03.local ceph-mon[103098]: pgmap v365: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:24.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:23 vm04.local ceph-mon[92084]: pgmap v366: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:18:24.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:23 vm03.local ceph-mon[103098]: pgmap v366: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:18:26.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:25 vm04.local ceph-mon[92084]: pgmap v367: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:18:26.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:25 vm03.local ceph-mon[103098]: pgmap v367: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:18:28.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:27 vm03.local ceph-mon[103098]: pgmap v368: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:28.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:27 vm04.local ceph-mon[92084]: pgmap v368: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:30.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:29 vm03.local ceph-mon[103098]: pgmap v369: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:29 vm04.local ceph-mon[92084]: pgmap v369: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:31.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:18:31.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:18:32.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:31 vm03.local ceph-mon[103098]: pgmap v370: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:32.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:31 vm04.local ceph-mon[92084]: pgmap v370: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:34.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:33 vm03.local ceph-mon[103098]: pgmap v371: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:18:34.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:33 vm04.local ceph-mon[92084]: pgmap v371: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:18:36.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:35 vm03.local ceph-mon[103098]: pgmap v372: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:18:36.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:35 vm04.local ceph-mon[92084]: pgmap v372: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:18:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:37 vm04.local ceph-mon[92084]: pgmap v373: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:38.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:37 vm03.local ceph-mon[103098]: pgmap v373: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:39 vm04.local ceph-mon[92084]: pgmap v374: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:40.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:39 vm03.local ceph-mon[103098]: pgmap v374: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:41.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:41 vm04.local ceph-mon[92084]: pgmap v375: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:41.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:41 vm03.local ceph-mon[103098]: pgmap v375: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:43 vm04.local ceph-mon[92084]: pgmap v376: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:18:44.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:43 vm03.local ceph-mon[103098]: pgmap v376: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:18:46.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:45 vm04.local ceph-mon[92084]: pgmap v377: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:18:46.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:18:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:45 vm03.local ceph-mon[103098]: pgmap v377: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:18:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:18:48.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:47 vm04.local ceph-mon[92084]: pgmap v378: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:47 vm03.local ceph-mon[103098]: pgmap v378: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:50.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:49 vm04.local ceph-mon[92084]: pgmap v379: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:50.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:49 vm03.local ceph-mon[103098]: pgmap v379: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:52.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:51 vm04.local ceph-mon[92084]: pgmap v380: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:52.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:51 vm03.local ceph-mon[103098]: pgmap v380: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:54.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:53 vm04.local ceph-mon[92084]: pgmap v381: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:18:54.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:53 vm03.local ceph-mon[103098]: pgmap v381: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:18:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:55 vm04.local ceph-mon[92084]: pgmap v382: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:18:56.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:55 vm03.local ceph-mon[103098]: pgmap v382: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:18:58.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:57 vm04.local ceph-mon[92084]: pgmap v383: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:18:58.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:57 vm03.local ceph-mon[103098]: pgmap v383: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:00.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:18:59 vm03.local ceph-mon[103098]: pgmap v384: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:00.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:18:59 vm04.local ceph-mon[92084]: pgmap v384: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:01.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:00 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:19:01.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:00 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:19:02.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:01 vm03.local ceph-mon[103098]: pgmap v385: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:02.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:01 vm04.local ceph-mon[92084]: pgmap v385: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:03.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:02 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:19:03.277 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:02 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:19:04.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:03 vm03.local ceph-mon[103098]: pgmap v386: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:19:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:19:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:19:04.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:03 vm04.local ceph-mon[92084]: pgmap v386: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:19:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:19:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:19:06.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:05 vm03.local ceph-mon[103098]: pgmap v387: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:06.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:05 vm04.local ceph-mon[92084]: pgmap v387: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:08.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:07 vm03.local ceph-mon[103098]: pgmap v388: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:08.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:07 vm04.local ceph-mon[92084]: pgmap v388: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:10.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:09 vm03.local ceph-mon[103098]: pgmap v389: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:10.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:09 vm04.local ceph-mon[92084]: pgmap v389: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:12.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:11 vm04.local ceph-mon[92084]: pgmap v390: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:12.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:11 vm03.local ceph-mon[103098]: pgmap v390: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:13 vm04.local ceph-mon[92084]: pgmap v391: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:14.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:13 vm03.local ceph-mon[103098]: pgmap v391: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:15 vm04.local ceph-mon[92084]: pgmap v392: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:19:16.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:15 vm03.local ceph-mon[103098]: pgmap v392: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:16.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:19:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:17 vm04.local ceph-mon[92084]: pgmap v393: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:18.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:17 vm03.local ceph-mon[103098]: pgmap v393: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:19 vm04.local ceph-mon[92084]: pgmap v394: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:20.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:19 vm03.local ceph-mon[103098]: pgmap v394: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:22.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:21 vm04.local ceph-mon[92084]: pgmap v395: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:22.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:21 vm03.local ceph-mon[103098]: pgmap v395: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:24.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:23 vm04.local ceph-mon[92084]: pgmap v396: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:24.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:23 vm03.local ceph-mon[103098]: pgmap v396: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:25 vm04.local ceph-mon[92084]: pgmap v397: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:26.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:25 vm03.local ceph-mon[103098]: pgmap v397: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:28.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:27 vm03.local ceph-mon[103098]: pgmap v398: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:28.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:27 vm04.local ceph-mon[92084]: pgmap v398: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:30.280 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:29 vm04.local ceph-mon[92084]: pgmap v399: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:30.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:29 vm03.local ceph-mon[103098]: pgmap v399: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:31.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:19:31.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:19:32.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:31 vm04.local ceph-mon[92084]: pgmap v400: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:32.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:31 vm03.local ceph-mon[103098]: pgmap v400: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:34.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:33 vm04.local ceph-mon[92084]: pgmap v401: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:34.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:33 vm03.local ceph-mon[103098]: pgmap v401: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:36.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:35 vm04.local ceph-mon[92084]: pgmap v402: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:36.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:35 vm03.local ceph-mon[103098]: pgmap v402: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:37 vm04.local ceph-mon[92084]: pgmap v403: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:38.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:37 vm03.local ceph-mon[103098]: pgmap v403: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:40.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:40 vm04.local ceph-mon[92084]: pgmap v404: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:40.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:40 vm03.local ceph-mon[103098]: pgmap v404: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:42.331 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:42 vm03.local ceph-mon[103098]: pgmap v405: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:42.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:42 vm04.local ceph-mon[92084]: pgmap v405: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:44.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:44 vm03.local ceph-mon[103098]: pgmap v406: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:44.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:44 vm04.local ceph-mon[92084]: pgmap v406: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:46.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:46 vm03.local ceph-mon[103098]: pgmap v407: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:46.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:46 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:19:46.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:46 vm04.local ceph-mon[92084]: pgmap v407: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:46.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:46 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:19:48.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:48 vm03.local ceph-mon[103098]: pgmap v408: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:48.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:48 vm04.local ceph-mon[92084]: pgmap v408: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:50.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:50 vm04.local ceph-mon[92084]: pgmap v409: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:50.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:50 vm03.local ceph-mon[103098]: pgmap v409: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:52.451 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:52 vm03.local ceph-mon[103098]: pgmap v410: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:52.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:52 vm04.local ceph-mon[92084]: pgmap v410: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:54.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:54 vm04.local ceph-mon[92084]: pgmap v411: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:54.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:54 vm03.local ceph-mon[103098]: pgmap v411: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:19:56.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:56 vm04.local ceph-mon[92084]: pgmap v412: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:56.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:56 vm03.local ceph-mon[103098]: pgmap v412: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:19:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:19:58 vm04.local ceph-mon[92084]: pgmap v413: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:19:58.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:19:58 vm03.local ceph-mon[103098]: pgmap v413: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:00.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:00 vm04.local ceph-mon[92084]: pgmap v414: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:00.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:00 vm04.local ceph-mon[92084]: overall HEALTH_OK
2026-03-10T14:20:00.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:00 vm03.local ceph-mon[103098]: pgmap v414: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:00.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:00 vm03.local ceph-mon[103098]: overall HEALTH_OK
2026-03-10T14:20:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:20:01.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:20:02.562 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:02 vm03.local ceph-mon[103098]: pgmap v415: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:02 vm04.local ceph-mon[92084]: pgmap v415: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:03.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:20:03.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T14:20:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:04 vm04.local ceph-mon[92084]: pgmap v416: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.6 KiB/s rd, 3 op/s
2026-03-10T14:20:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:20:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:20:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:20:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:20:04.314 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:20:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:04 vm03.local ceph-mon[103098]: pgmap v416: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.6 KiB/s rd, 3 op/s
2026-03-10T14:20:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm04", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:20:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-10T14:20:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T14:20:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T14:20:04.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep'
2026-03-10T14:20:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:06 vm04.local ceph-mon[92084]: pgmap v417: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:06.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:06 vm03.local ceph-mon[103098]: pgmap v417: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:08 vm04.local ceph-mon[92084]: pgmap v418: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:08 vm03.local ceph-mon[103098]: pgmap v418: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:10.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:10 vm04.local ceph-mon[92084]: pgmap v419: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:10.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:10 vm03.local ceph-mon[103098]: pgmap v419: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:12.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:12 vm04.local ceph-mon[92084]: pgmap v420: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:12.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:12 vm03.local ceph-mon[103098]: pgmap v420: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.0 KiB/s rd, 3 op/s
2026-03-10T14:20:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:14 vm04.local ceph-mon[92084]: pgmap v421: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:20:14.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:14 vm03.local ceph-mon[103098]: pgmap v421: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:20:15.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:15 vm04.local ceph-mon[92084]: pgmap v422: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:20:15.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:15 vm03.local ceph-mon[103098]: pgmap v422: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:20:16.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:16 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:20:16.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:16 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:20:17.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:17 vm04.local ceph-mon[92084]: pgmap v423: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:17.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:17 vm03.local ceph-mon[103098]: pgmap v423: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:20.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:19 vm04.local ceph-mon[92084]: pgmap v424: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:20.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:19 vm03.local ceph-mon[103098]: pgmap v424: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:22.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:21 vm04.local ceph-mon[92084]: pgmap v425: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:22.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:21 vm03.local ceph-mon[103098]: pgmap v425: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:24.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:23 vm04.local ceph-mon[92084]: pgmap v426: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:20:24.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:23 vm03.local ceph-mon[103098]: pgmap v426: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:20:26.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:25 vm04.local ceph-mon[92084]: pgmap v427: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:20:26.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:25 vm03.local ceph-mon[103098]: pgmap v427: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s
2026-03-10T14:20:28.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:27 vm03.local ceph-mon[103098]: pgmap v428: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:28.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:27 vm04.local ceph-mon[92084]: pgmap v428: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:29 vm04.local ceph-mon[92084]: pgmap v429: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:30.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:29 vm03.local ceph-mon[103098]: pgmap v429: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:31.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:31 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:20:31.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:31 vm03.local ceph-mon[103098]: pgmap v430: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:31 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T14:20:32.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:31 vm04.local ceph-mon[92084]: pgmap v430: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s
2026-03-10T14:20:34.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:33 vm04.local ceph-mon[92084]: pgmap v431: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s
2026-03-10T14:20:34.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10
14:20:33 vm03.local ceph-mon[103098]: pgmap v431: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:20:36.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:35 vm04.local ceph-mon[92084]: pgmap v432: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:20:36.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:35 vm03.local ceph-mon[103098]: pgmap v432: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:20:38.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:37 vm04.local ceph-mon[92084]: pgmap v433: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:38.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:37 vm03.local ceph-mon[103098]: pgmap v433: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:40.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:39 vm04.local ceph-mon[92084]: pgmap v434: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:40.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:39 vm03.local ceph-mon[103098]: pgmap v434: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:42.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:41 vm04.local ceph-mon[92084]: pgmap v435: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:42.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:41 vm03.local ceph-mon[103098]: pgmap v435: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:44.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 
14:20:43 vm03.local ceph-mon[103098]: pgmap v436: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:20:44.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:43 vm04.local ceph-mon[92084]: pgmap v436: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:20:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:45 vm03.local ceph-mon[103098]: pgmap v437: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:20:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:20:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:45 vm04.local ceph-mon[92084]: pgmap v437: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:20:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:20:48.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:47 vm03.local ceph-mon[103098]: pgmap v438: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:48.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:47 vm04.local ceph-mon[92084]: pgmap v438: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:50.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:49 vm03.local ceph-mon[103098]: pgmap v439: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:50.313 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:49 vm04.local ceph-mon[92084]: pgmap v439: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:51 vm04.local ceph-mon[92084]: pgmap v440: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:52.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:51 vm03.local ceph-mon[103098]: pgmap v440: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:54.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:53 vm04.local ceph-mon[92084]: pgmap v441: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:20:54.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:53 vm03.local ceph-mon[103098]: pgmap v441: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:20:56.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:55 vm04.local ceph-mon[92084]: pgmap v442: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:20:56.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:55 vm03.local ceph-mon[103098]: pgmap v442: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:20:58.175 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:57 vm03.local ceph-mon[103098]: pgmap v443: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:20:58.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:57 vm04.local ceph-mon[92084]: pgmap v443: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:00.313 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:20:59 vm04.local ceph-mon[92084]: pgmap v444: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:00.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:20:59 vm03.local ceph-mon[103098]: pgmap v444: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:01.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:21:01.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:01 vm04.local ceph-mon[92084]: pgmap v445: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:01.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:21:01.858 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:01 vm03.local ceph-mon[103098]: pgmap v445: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:04.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:03 vm04.local ceph-mon[92084]: pgmap v446: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:04.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:21:04.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:21:04.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:21:04.064 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:03 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:21:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:03 vm03.local ceph-mon[103098]: pgmap v446: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:21:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:21:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:21:04.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:03 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:21:06.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:05 vm04.local ceph-mon[92084]: pgmap v447: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:06.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:05 vm03.local ceph-mon[103098]: pgmap v447: 65 pgs: 65 active+clean; 151 MiB 
data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:08.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:07 vm04.local ceph-mon[92084]: pgmap v448: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:08.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:07 vm03.local ceph-mon[103098]: pgmap v448: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:10.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:09 vm04.local ceph-mon[92084]: pgmap v449: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:10.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:09 vm03.local ceph-mon[103098]: pgmap v449: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:12.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:11 vm04.local ceph-mon[92084]: pgmap v450: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:12.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:11 vm03.local ceph-mon[103098]: pgmap v450: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:14.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:13 vm03.local ceph-mon[103098]: pgmap v451: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:14.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:13 vm04.local ceph-mon[92084]: pgmap v451: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:15 vm04.local ceph-mon[92084]: pgmap v452: 65 pgs: 65 active+clean; 151 MiB data, 
749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:16.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:21:16.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:15 vm03.local ceph-mon[103098]: pgmap v452: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:16.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:21:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:17 vm04.local ceph-mon[92084]: pgmap v453: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:18.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:17 vm03.local ceph-mon[103098]: pgmap v453: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:20.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:19 vm04.local ceph-mon[92084]: pgmap v454: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:20.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:19 vm03.local ceph-mon[103098]: pgmap v454: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:22.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:21 vm04.local ceph-mon[92084]: pgmap v455: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:22.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:21 vm03.local ceph-mon[103098]: pgmap 
v455: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:24.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:23 vm04.local ceph-mon[92084]: pgmap v456: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:24.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:23 vm03.local ceph-mon[103098]: pgmap v456: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:26.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:25 vm04.local ceph-mon[92084]: pgmap v457: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:26.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:25 vm03.local ceph-mon[103098]: pgmap v457: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:28.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:27 vm04.local ceph-mon[92084]: pgmap v458: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:28.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:27 vm03.local ceph-mon[103098]: pgmap v458: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:29 vm04.local ceph-mon[92084]: pgmap v459: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:30.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:29 vm03.local ceph-mon[103098]: pgmap v459: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:31.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:30 vm04.local ceph-mon[92084]: 
from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:21:31.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:21:32.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:31 vm04.local ceph-mon[92084]: pgmap v460: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:32.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:31 vm03.local ceph-mon[103098]: pgmap v460: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:34.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:34 vm04.local ceph-mon[92084]: pgmap v461: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:34.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:34 vm03.local ceph-mon[103098]: pgmap v461: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:36.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:36 vm04.local ceph-mon[92084]: pgmap v462: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:36.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:36 vm03.local ceph-mon[103098]: pgmap v462: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:38.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:38 vm04.local ceph-mon[92084]: pgmap v463: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:38.357 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:38 vm03.local ceph-mon[103098]: pgmap v463: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:39.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:39 vm04.local ceph-mon[92084]: pgmap v464: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:39.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:39 vm03.local ceph-mon[103098]: pgmap v464: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:42.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:41 vm04.local ceph-mon[92084]: pgmap v465: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:42.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:41 vm03.local ceph-mon[103098]: pgmap v465: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:43 vm04.local ceph-mon[92084]: pgmap v466: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:44.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:43 vm03.local ceph-mon[103098]: pgmap v466: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:45 vm03.local ceph-mon[103098]: pgmap v467: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T14:21:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:45 vm04.local ceph-mon[92084]: pgmap v467: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:46.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:21:48.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:48 vm03.local ceph-mon[103098]: pgmap v468: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:48.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:48 vm04.local ceph-mon[92084]: pgmap v468: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:50.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:50 vm04.local ceph-mon[92084]: pgmap v469: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:50.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:50 vm03.local ceph-mon[103098]: pgmap v469: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:52.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:52 vm03.local ceph-mon[103098]: pgmap v470: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:52.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:52 vm04.local ceph-mon[92084]: pgmap v470: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:54.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:54 vm03.local ceph-mon[103098]: pgmap v471: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 
1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:54.366 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:54 vm04.local ceph-mon[92084]: pgmap v471: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:21:56.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:56 vm03.local ceph-mon[103098]: pgmap v472: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:56.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:56 vm04.local ceph-mon[92084]: pgmap v472: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:21:58.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:21:58 vm03.local ceph-mon[103098]: pgmap v473: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:21:58.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:21:58 vm04.local ceph-mon[92084]: pgmap v473: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:00.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:00 vm03.local ceph-mon[103098]: pgmap v474: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:00.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:00 vm04.local ceph-mon[92084]: pgmap v474: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:01.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:01 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:22:01.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:01 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:22:02.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:02 vm03.local ceph-mon[103098]: pgmap v475: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:02.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:02 vm04.local ceph-mon[92084]: pgmap v475: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:04.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:04 vm03.local ceph-mon[103098]: pgmap v476: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:22:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:22:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:22:04.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:04 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:22:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:04 vm04.local ceph-mon[92084]: pgmap v476: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:04 vm04.local ceph-mon[92084]: from='mgr.34100 
192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T14:22:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T14:22:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T14:22:04.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:04 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' 2026-03-10T14:22:06.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:06 vm03.local ceph-mon[103098]: pgmap v477: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:06.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:06 vm04.local ceph-mon[92084]: pgmap v477: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:08.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:08 vm04.local ceph-mon[92084]: pgmap v478: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:08.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:08 vm03.local ceph-mon[103098]: pgmap v478: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:10.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:10 vm04.local ceph-mon[92084]: pgmap v479: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:10.607 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:10 vm03.local ceph-mon[103098]: 
pgmap v479: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:11.563 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:11 vm04.local ceph-mon[92084]: pgmap v480: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:11.608 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:11 vm03.local ceph-mon[103098]: pgmap v480: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:14.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:13 vm04.local ceph-mon[92084]: pgmap v481: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:14.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:13 vm03.local ceph-mon[103098]: pgmap v481: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:15 vm04.local ceph-mon[92084]: pgmap v482: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:16.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:15 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:22:16.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:15 vm03.local ceph-mon[103098]: pgmap v482: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:16.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:15 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:22:18.107 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:17 vm03.local ceph-mon[103098]: pgmap v483: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:18.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:17 vm04.local ceph-mon[92084]: pgmap v483: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:20.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:19 vm03.local ceph-mon[103098]: pgmap v484: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:20.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:19 vm04.local ceph-mon[92084]: pgmap v484: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:22.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:21 vm03.local ceph-mon[103098]: pgmap v485: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:22.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:21 vm04.local ceph-mon[92084]: pgmap v485: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:24.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:23 vm03.local ceph-mon[103098]: pgmap v486: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:24.116 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:23 vm04.local ceph-mon[92084]: pgmap v486: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:26.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:25 vm03.local ceph-mon[103098]: pgmap v487: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:26.313 
INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:25 vm04.local ceph-mon[92084]: pgmap v487: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:28.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:27 vm04.local ceph-mon[92084]: pgmap v488: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:28.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:27 vm03.local ceph-mon[103098]: pgmap v488: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:30.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:29 vm04.local ceph-mon[92084]: pgmap v489: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:30.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:29 vm03.local ceph-mon[103098]: pgmap v489: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:31.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:30 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:22:31.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:30 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:22:32.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:31 vm04.local ceph-mon[92084]: pgmap v490: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:32.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:31 vm03.local ceph-mon[103098]: pgmap v490: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 
2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:34.220 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:33 vm03.local ceph-mon[103098]: pgmap v491: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:34.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:33 vm04.local ceph-mon[92084]: pgmap v491: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:36.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:35 vm04.local ceph-mon[92084]: pgmap v492: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:36.357 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:35 vm03.local ceph-mon[103098]: pgmap v492: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:38.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:38 vm03.local ceph-mon[103098]: pgmap v493: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:38.564 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:38 vm04.local ceph-mon[92084]: pgmap v493: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:40.813 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:40 vm04.local ceph-mon[92084]: pgmap v494: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:40.857 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:40 vm03.local ceph-mon[103098]: pgmap v494: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:42.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:41 vm04.local ceph-mon[92084]: pgmap v495: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 
KiB/s rd, 4 op/s 2026-03-10T14:22:42.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:41 vm03.local ceph-mon[103098]: pgmap v495: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:44.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:43 vm04.local ceph-mon[92084]: pgmap v496: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:44.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:43 vm03.local ceph-mon[103098]: pgmap v496: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:46.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:45 vm04.local ceph-mon[92084]: pgmap v497: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:46.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:45 vm04.local ceph-mon[92084]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:22:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:45 vm03.local ceph-mon[103098]: pgmap v497: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:46.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:45 vm03.local ceph-mon[103098]: from='mgr.34100 192.168.123.103:0/3760038152' entity='mgr.vm03.rwbbep' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T14:22:48.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:47 vm04.local ceph-mon[92084]: pgmap v498: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:48.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:47 vm03.local ceph-mon[103098]: pgmap v498: 65 pgs: 65 active+clean; 151 MiB 
data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:50.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:49 vm04.local ceph-mon[92084]: pgmap v499: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:50.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:49 vm03.local ceph-mon[103098]: pgmap v499: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:52.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:51 vm03.local ceph-mon[103098]: pgmap v500: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:52.313 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:51 vm04.local ceph-mon[92084]: pgmap v500: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 4 op/s 2026-03-10T14:22:54.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:53 vm03.local ceph-mon[103098]: pgmap v501: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:54.116 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:53 vm04.local ceph-mon[92084]: pgmap v501: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 1.7 KiB/s rd, 3 op/s 2026-03-10T14:22:56.063 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:55 vm04.local ceph-mon[92084]: pgmap v502: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:56.108 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:55 vm03.local ceph-mon[103098]: pgmap v502: 65 pgs: 65 active+clean; 151 MiB data, 749 MiB used, 119 GiB / 120 GiB avail; 2.1 KiB/s rd, 3 op/s 2026-03-10T14:22:56.131 ERROR:tasks.cephfs.fuse_mount:process failed to terminate after unmount. This probably indicates a bug within ceph-fuse. 
2026-03-10T14:22:56.131 ERROR:teuthology.run_tasks:Manager failed: ceph-fuse
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T14:22:56.132 DEBUG:teuthology.run_tasks:Unwinding manager cephadm
2026-03-10T14:22:56.135 INFO:tasks.cephadm:Teardown begin
2026-03-10T14:22:56.135 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephadm.py", line 2252, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T14:22:56.135 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T14:22:56.161 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T14:22:56.187 INFO:tasks.cephadm:Disabling cephadm mgr module
2026-03-10T14:22:56.187 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid b81bf660-1c89-11f1-b612-27d302cdb124 -- ceph mgr module disable cephadm
2026-03-10T14:22:56.347 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/mon.vm03/config
2026-03-10T14:22:56.498 INFO:teuthology.orchestra.run.vm03.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory
2026-03-10T14:22:56.515 DEBUG:teuthology.orchestra.run:got remote process result: 125
2026-03-10T14:22:56.515 INFO:tasks.cephadm:Cleaning up testdir ceph.* files...
2026-03-10T14:22:56.515 DEBUG:teuthology.orchestra.run.vm03:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-10T14:22:56.532 DEBUG:teuthology.orchestra.run.vm04:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-10T14:22:56.547 INFO:tasks.cephadm:Stopping all daemons...
2026-03-10T14:22:56.547 INFO:tasks.cephadm.mon.vm03:Stopping mon.vm03...
2026-03-10T14:22:56.547 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03 2026-03-10T14:22:56.681 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:56 vm03.local systemd[1]: Stopping Ceph mon.vm03 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:22:56.937 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:56 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[103094]: 2026-03-10T14:22:56.681+0000 7f3df9a1f640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm03 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:22:56.937 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:56 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03[103094]: 2026-03-10T14:22:56.681+0000 7f3df9a1f640 -1 mon.vm03@0(leader) e3 *** Got Signal Terminated *** 2026-03-10T14:22:56.937 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 10 14:22:56 vm03.local podman[153481]: 2026-03-10 14:22:56.939177317 +0000 UTC m=+0.272390802 container died c2a0f005ef9d25695d3e74722de7ae656b06ac4b6bde2933041a0986fe82fead (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm03, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0) 2026-03-10T14:22:57.020 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm03.service' 2026-03-10T14:22:57.056 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T14:22:57.056 INFO:tasks.cephadm.mon.vm03:Stopped mon.vm03 2026-03-10T14:22:57.056 INFO:tasks.cephadm.mon.vm04:Stopping mon.vm04... 2026-03-10T14:22:57.056 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm04 2026-03-10T14:22:57.410 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:57 vm04.local systemd[1]: Stopping Ceph mon.vm04 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:22:57.410 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:57 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04[92080]: 2026-03-10T14:22:57.177+0000 7f898cdf7640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm04 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:22:57.410 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:57 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04[92080]: 2026-03-10T14:22:57.177+0000 7f898cdf7640 -1 mon.vm04@1(peon) e3 *** Got Signal Terminated *** 2026-03-10T14:22:57.410 INFO:journalctl@ceph.mon.vm04.vm04.stdout:Mar 10 14:22:57 vm04.local podman[122227]: 2026-03-10 14:22:57.412233998 +0000 UTC m=+0.255085961 container died 111e2285827974b1bf30ba12303f7bca9e3c39a4934cc50785de31c8619c29f0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-mon-vm04, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0) 2026-03-10T14:22:57.494 DEBUG:teuthology.orchestra.run.vm04:> sudo pkill -f 'journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@mon.vm04.service' 2026-03-10T14:22:57.527 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T14:22:57.527 INFO:tasks.cephadm.mon.vm04:Stopped mon.vm04 2026-03-10T14:22:57.527 INFO:tasks.cephadm.osd.0:Stopping osd.0... 2026-03-10T14:22:57.527 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.0 2026-03-10T14:22:57.858 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:22:57 vm03.local systemd[1]: Stopping Ceph osd.0 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:22:57.858 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:22:57 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[109293]: 2026-03-10T14:22:57.631+0000 7fbabd5d7640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:22:57.858 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:22:57 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[109293]: 2026-03-10T14:22:57.631+0000 7fbabd5d7640 -1 osd.0 125 *** Got signal Terminated *** 2026-03-10T14:22:57.858 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:22:57 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0[109293]: 2026-03-10T14:22:57.631+0000 7fbabd5d7640 -1 osd.0 125 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:23:02.924 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:23:02 vm03.local podman[153597]: 2026-03-10 14:23:02.675328279 +0000 UTC m=+5.055769665 container died 6e24e5898f4d2cc880aa183a560b9a87a532849eec08cd87bbf36ff8eda787a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True) 2026-03-10T14:23:02.924 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 
10 14:23:02 vm03.local podman[153597]: 2026-03-10 14:23:02.698602997 +0000 UTC m=+5.079044364 container remove 6e24e5898f4d2cc880aa183a560b9a87a532849eec08cd87bbf36ff8eda787a4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223) 2026-03-10T14:23:02.924 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:23:02 vm03.local bash[153597]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0 2026-03-10T14:23:02.924 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:23:02 vm03.local podman[153662]: 2026-03-10 14:23:02.833536496 +0000 UTC m=+0.016009691 container create 18b41a2229d217ee579f616419d55efd258b5aeab14e9f2b65ba33758d1f5d7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:23:02.924 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:23:02 vm03.local podman[153662]: 2026-03-10 14:23:02.875327165 +0000 UTC m=+0.057800360 container init 18b41a2229d217ee579f616419d55efd258b5aeab14e9f2b65ba33758d1f5d7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T14:23:02.924 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:23:02 vm03.local podman[153662]: 2026-03-10 14:23:02.87806461 +0000 UTC m=+0.060537805 container start 18b41a2229d217ee579f616419d55efd258b5aeab14e9f2b65ba33758d1f5d7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T14:23:02.925 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:23:02 vm03.local podman[153662]: 2026-03-10 14:23:02.878894684 +0000 UTC m=+0.061367879 container attach 18b41a2229d217ee579f616419d55efd258b5aeab14e9f2b65ba33758d1f5d7c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-0-deactivate, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T14:23:02.925 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 10 14:23:02 vm03.local podman[153662]: 2026-03-10 14:23:02.827095779 +0000 UTC m=+0.009568984 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:23:03.057 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.0.service' 2026-03-10T14:23:03.102 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T14:23:03.102 INFO:tasks.cephadm.osd.0:Stopped osd.0 2026-03-10T14:23:03.102 
INFO:tasks.cephadm.osd.1:Stopping osd.1... 2026-03-10T14:23:03.103 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.1 2026-03-10T14:23:03.176 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:03 vm03.local systemd[1]: Stopping Ceph osd.1 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:23:03.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:03 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[114446]: 2026-03-10T14:23:03.240+0000 7f92edd19640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:23:03.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:03 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[114446]: 2026-03-10T14:23:03.240+0000 7f92edd19640 -1 osd.1 125 *** Got signal Terminated *** 2026-03-10T14:23:03.608 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:03 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1[114446]: 2026-03-10T14:23:03.240+0000 7f92edd19640 -1 osd.1 125 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:23:08.544 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:08 vm03.local podman[153759]: 2026-03-10 14:23:08.278640243 +0000 UTC m=+5.050995830 container died bf01c6df212002144d9e3f7fb6a019e7c690e0d146a614e0aa020a3dee497632 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3) 2026-03-10T14:23:08.545 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:08 vm03.local podman[153759]: 2026-03-10 14:23:08.306763278 +0000 UTC m=+5.079118865 container remove bf01c6df212002144d9e3f7fb6a019e7c690e0d146a614e0aa020a3dee497632 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T14:23:08.545 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:08 vm03.local bash[153759]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1 2026-03-10T14:23:08.545 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:08 vm03.local podman[153840]: 2026-03-10 14:23:08.45429542 +0000 UTC m=+0.017566285 container create 8076c05e791ab6c9fb764465a9ea2a1e9aebd840b2aeada9e3f83b2ae3242455 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T14:23:08.545 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:08 vm03.local podman[153840]: 2026-03-10 14:23:08.495636318 +0000 UTC m=+0.058907193 container init 8076c05e791ab6c9fb764465a9ea2a1e9aebd840b2aeada9e3f83b2ae3242455 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True) 2026-03-10T14:23:08.545 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:08 vm03.local podman[153840]: 2026-03-10 14:23:08.498530415 +0000 UTC m=+0.061801280 container start 8076c05e791ab6c9fb764465a9ea2a1e9aebd840b2aeada9e3f83b2ae3242455 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223) 2026-03-10T14:23:08.545 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:08 vm03.local podman[153840]: 2026-03-10 14:23:08.505076078 +0000 UTC m=+0.068346953 container attach 8076c05e791ab6c9fb764465a9ea2a1e9aebd840b2aeada9e3f83b2ae3242455 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-1-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True) 2026-03-10T14:23:08.545 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 10 14:23:08 vm03.local podman[153840]: 2026-03-10 14:23:08.447368473 +0000 UTC 
m=+0.010639348 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:23:08.666 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.1.service' 2026-03-10T14:23:08.744 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T14:23:08.744 INFO:tasks.cephadm.osd.1:Stopped osd.1 2026-03-10T14:23:08.744 INFO:tasks.cephadm.osd.2:Stopping osd.2... 2026-03-10T14:23:08.744 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.2 2026-03-10T14:23:09.108 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:08 vm03.local systemd[1]: Stopping Ceph osd.2 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:23:09.108 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:08 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[118520]: 2026-03-10T14:23:08.890+0000 7f2abf695640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:23:09.108 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:08 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[118520]: 2026-03-10T14:23:08.890+0000 7f2abf695640 -1 osd.2 125 *** Got signal Terminated *** 2026-03-10T14:23:09.108 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:08 vm03.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2[118520]: 2026-03-10T14:23:08.890+0000 7f2abf695640 -1 osd.2 125 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:23:14.192 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:13 vm03.local podman[153936]: 2026-03-10 14:23:13.924774696 +0000 UTC m=+5.049400681 container died e0d768b7046892884878c67e1381d02cdf55a824347f3dedd3b317c45ba4ad05 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:23:14.192 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:13 vm03.local podman[153936]: 2026-03-10 14:23:13.957232375 +0000 UTC m=+5.081858360 container remove e0d768b7046892884878c67e1381d02cdf55a824347f3dedd3b317c45ba4ad05 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:23:14.192 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:13 vm03.local bash[153936]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2 
2026-03-10T14:23:14.192 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:14 vm03.local podman[154004]: 2026-03-10 14:23:14.103524096 +0000 UTC m=+0.019316361 container create 706076b42a5e715ae1930e918509afed3e527fd5219c2c04bcfb3de428827572 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default) 2026-03-10T14:23:14.192 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:14 vm03.local podman[154004]: 2026-03-10 14:23:14.143534914 +0000 UTC m=+0.059327188 container init 706076b42a5e715ae1930e918509afed3e527fd5219c2c04bcfb3de428827572 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20260223, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) 2026-03-10T14:23:14.192 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:14 vm03.local podman[154004]: 2026-03-10 14:23:14.147502862 +0000 UTC m=+0.063295116 container start 706076b42a5e715ae1930e918509afed3e527fd5219c2c04bcfb3de428827572 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:23:14.192 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:14 vm03.local podman[154004]: 2026-03-10 14:23:14.148505659 +0000 UTC m=+0.064297923 container attach 706076b42a5e715ae1930e918509afed3e527fd5219c2c04bcfb3de428827572 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-2-deactivate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:23:14.192 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 10 14:23:14 vm03.local podman[154004]: 2026-03-10 14:23:14.096502281 +0000 UTC m=+0.012294545 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:23:14.303 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.2.service' 2026-03-10T14:23:14.339 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T14:23:14.339 INFO:tasks.cephadm.osd.2:Stopped osd.2 2026-03-10T14:23:14.339 INFO:tasks.cephadm.osd.3:Stopping osd.3... 2026-03-10T14:23:14.339 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.3 2026-03-10T14:23:14.813 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:14 vm04.local systemd[1]: Stopping Ceph osd.3 for b81bf660-1c89-11f1-b612-27d302cdb124... 
2026-03-10T14:23:14.813 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:14 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[98440]: 2026-03-10T14:23:14.443+0000 7f4350859640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:23:14.814 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:14 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[98440]: 2026-03-10T14:23:14.443+0000 7f4350859640 -1 osd.3 125 *** Got signal Terminated *** 2026-03-10T14:23:14.814 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:14 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3[98440]: 2026-03-10T14:23:14.443+0000 7f4350859640 -1 osd.3 125 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:23:19.731 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:19 vm04.local podman[122333]: 2026-03-10 14:23:19.4800234 +0000 UTC m=+5.050879579 container died f7fc2aafa9d94bca3b636d8f1ea6262f9c4afe89932b2a8faa2e9ccf992163be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T14:23:19.731 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 
14:23:19 vm04.local podman[122333]: 2026-03-10 14:23:19.509690384 +0000 UTC m=+5.080546563 container remove f7fc2aafa9d94bca3b636d8f1ea6262f9c4afe89932b2a8faa2e9ccf992163be (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default) 2026-03-10T14:23:19.731 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:19 vm04.local bash[122333]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3 2026-03-10T14:23:19.731 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:19 vm04.local podman[122401]: 2026-03-10 14:23:19.640151486 +0000 UTC m=+0.014730555 container create 029fd8bdd493c5364fee0020ec718f809a84b2a27d4508a0092a50aaef18e05b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T14:23:19.731 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:19 vm04.local podman[122401]: 2026-03-10 14:23:19.680490638 +0000 UTC m=+0.055069707 container init 029fd8bdd493c5364fee0020ec718f809a84b2a27d4508a0092a50aaef18e05b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223) 2026-03-10T14:23:19.731 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:19 vm04.local podman[122401]: 2026-03-10 14:23:19.689359179 +0000 UTC m=+0.063938248 container start 029fd8bdd493c5364fee0020ec718f809a84b2a27d4508a0092a50aaef18e05b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2) 2026-03-10T14:23:19.731 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:19 vm04.local podman[122401]: 2026-03-10 14:23:19.694845339 +0000 UTC m=+0.069424408 container attach 029fd8bdd493c5364fee0020ec718f809a84b2a27d4508a0092a50aaef18e05b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-3-deactivate, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T14:23:19.731 INFO:journalctl@ceph.osd.3.vm04.stdout:Mar 10 14:23:19 vm04.local podman[122401]: 2026-03-10 14:23:19.634126027 +0000 UTC m=+0.008705107 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T14:23:19.840 DEBUG:teuthology.orchestra.run.vm04:> sudo pkill -f 'journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.3.service' 2026-03-10T14:23:19.871 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T14:23:19.871 
INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-10T14:23:19.871 INFO:tasks.cephadm.osd.4:Stopping osd.4... 2026-03-10T14:23:19.871 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.4 2026-03-10T14:23:20.018 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:19 vm04.local systemd[1]: Stopping Ceph osd.4 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:23:20.313 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:20 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[102211]: 2026-03-10T14:23:20.019+0000 7f2934d0d640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:23:20.313 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:20 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[102211]: 2026-03-10T14:23:20.019+0000 7f2934d0d640 -1 osd.4 125 *** Got signal Terminated *** 2026-03-10T14:23:20.313 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:20 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4[102211]: 2026-03-10T14:23:20.019+0000 7f2934d0d640 -1 osd.4 125 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:23:25.049 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:24 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:24.781+0000 7fe24281a640 -1 osd.5 125 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-10T14:23:00.870952+0000 front 2026-03-10T14:23:00.870927+0000 (oldest deadline 2026-03-10T14:23:24.370686+0000) 2026-03-10T14:23:25.314 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:25 vm04.local podman[122496]: 2026-03-10 14:23:25.049959317 +0000 UTC m=+5.044033582 container died 89f9225212d4f6f2c196a5d4f8cf9312d316f9b19bd283499954ba54538cdbcd 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 2026-03-10T14:23:25.315 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:25 vm04.local podman[122496]: 2026-03-10 14:23:25.08161844 +0000 UTC m=+5.075692705 container remove 89f9225212d4f6f2c196a5d4f8cf9312d316f9b19bd283499954ba54538cdbcd (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:23:25.315 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:25 vm04.local bash[122496]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4 
2026-03-10T14:23:25.315 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:25 vm04.local podman[122575]: 2026-03-10 14:23:25.224040552 +0000 UTC m=+0.017581532 container create 9147bef796d294333fbd2d7fb8682d5cea8fe2eb72b04f979379d504b10a4e78 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2) 2026-03-10T14:23:25.315 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:25 vm04.local podman[122575]: 2026-03-10 14:23:25.26572347 +0000 UTC m=+0.059264450 container init 9147bef796d294333fbd2d7fb8682d5cea8fe2eb72b04f979379d504b10a4e78 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223) 2026-03-10T14:23:25.315 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:25 vm04.local podman[122575]: 2026-03-10 14:23:25.268914162 +0000 UTC m=+0.062455142 container start 9147bef796d294333fbd2d7fb8682d5cea8fe2eb72b04f979379d504b10a4e78 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T14:23:25.315 INFO:journalctl@ceph.osd.4.vm04.stdout:Mar 10 14:23:25 vm04.local podman[122575]: 2026-03-10 14:23:25.272905243 +0000 UTC m=+0.066446223 container attach 9147bef796d294333fbd2d7fb8682d5cea8fe2eb72b04f979379d504b10a4e78 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-4-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20260223) 2026-03-10T14:23:25.426 DEBUG:teuthology.orchestra.run.vm04:> sudo pkill -f 'journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.4.service' 2026-03-10T14:23:25.463 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T14:23:25.463 INFO:tasks.cephadm.osd.4:Stopped osd.4 2026-03-10T14:23:25.463 INFO:tasks.cephadm.osd.5:Stopping osd.5... 2026-03-10T14:23:25.463 DEBUG:teuthology.orchestra.run.vm04:> sudo systemctl stop ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.5 2026-03-10T14:23:25.612 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:25 vm04.local systemd[1]: Stopping Ceph osd.5 for b81bf660-1c89-11f1-b612-27d302cdb124... 2026-03-10T14:23:25.612 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:25 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:25.613+0000 7fe246a13640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T14:23:25.612 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:25 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:25.613+0000 7fe246a13640 -1 osd.5 125 *** Got signal Terminated *** 2026-03-10T14:23:26.063 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:25 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:25.613+0000 7fe246a13640 -1 osd.5 125 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T14:23:26.063 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:25 vm04.local 
ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:25.769+0000 7fe24281a640 -1 osd.5 125 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-10T14:23:00.870952+0000 front 2026-03-10T14:23:00.870927+0000 (oldest deadline 2026-03-10T14:23:24.370686+0000) 2026-03-10T14:23:27.063 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:26 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:26.758+0000 7fe24281a640 -1 osd.5 125 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-10T14:23:00.870952+0000 front 2026-03-10T14:23:00.870927+0000 (oldest deadline 2026-03-10T14:23:24.370686+0000) 2026-03-10T14:23:28.063 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:27 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:27.759+0000 7fe24281a640 -1 osd.5 125 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-10T14:23:00.870952+0000 front 2026-03-10T14:23:00.870927+0000 (oldest deadline 2026-03-10T14:23:24.370686+0000) 2026-03-10T14:23:29.064 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:28 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:28.765+0000 7fe24281a640 -1 osd.5 125 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-10T14:23:00.870952+0000 front 2026-03-10T14:23:00.870927+0000 (oldest deadline 2026-03-10T14:23:24.370686+0000) 2026-03-10T14:23:30.063 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:29 vm04.local ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5[105826]: 2026-03-10T14:23:29.800+0000 7fe24281a640 -1 osd.5 125 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-10T14:23:00.870952+0000 front 2026-03-10T14:23:00.870927+0000 (oldest deadline 2026-03-10T14:23:24.370686+0000) 2026-03-10T14:23:30.911 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:30 vm04.local podman[122672]: 2026-03-10 
14:23:30.652338276 +0000 UTC m=+5.052822075 container died 6c7573f5f3fabd30cf294937d83ad4c9cebbc6c2b9d4740d891e2a13356758ee (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T14:23:30.911 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:30 vm04.local podman[122672]: 2026-03-10 14:23:30.675946878 +0000 UTC m=+5.076430668 container remove 6c7573f5f3fabd30cf294937d83ad4c9cebbc6c2b9d4740d891e2a13356758ee (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS) 2026-03-10T14:23:30.911 
INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:30 vm04.local bash[122672]: ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5 2026-03-10T14:23:30.911 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:30 vm04.local podman[122737]: 2026-03-10 14:23:30.820043045 +0000 UTC m=+0.016154110 container create 1c0293898f15152e8e816a019e238156e8ae96b9406da5e0bcc25eace4a6c279 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T14:23:30.911 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:30 vm04.local podman[122737]: 2026-03-10 14:23:30.85721082 +0000 UTC m=+0.053321895 container init 1c0293898f15152e8e816a019e238156e8ae96b9406da5e0bcc25eace4a6c279 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_REF=squid, 
org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T14:23:30.911 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:30 vm04.local podman[122737]: 2026-03-10 14:23:30.860311044 +0000 UTC m=+0.056422109 container start 1c0293898f15152e8e816a019e238156e8ae96b9406da5e0bcc25eace4a6c279 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-10T14:23:30.911 INFO:journalctl@ceph.osd.5.vm04.stdout:Mar 10 14:23:30 vm04.local podman[122737]: 2026-03-10 14:23:30.861375476 +0000 UTC m=+0.057486541 container attach 1c0293898f15152e8e816a019e238156e8ae96b9406da5e0bcc25eace4a6c279 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-b81bf660-1c89-11f1-b612-27d302cdb124-osd-5-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, 
FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_REF=squid) 2026-03-10T14:23:31.021 DEBUG:teuthology.orchestra.run.vm04:> sudo pkill -f 'journalctl -f -n 0 -u ceph-b81bf660-1c89-11f1-b612-27d302cdb124@osd.5.service' 2026-03-10T14:23:31.053 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T14:23:31.053 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-10T14:23:31.053 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid b81bf660-1c89-11f1-b612-27d302cdb124 --force --keep-logs 2026-03-10T14:23:31.149 INFO:teuthology.orchestra.run.vm03.stdout:Deleting cluster with fsid: b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:23:32.732 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm03.stderr:ceph-fuse[91444]: fuse finished with error 0 and tester_r 0 2026-03-10T14:23:43.638 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid b81bf660-1c89-11f1-b612-27d302cdb124 --force --keep-logs 2026-03-10T14:23:43.732 INFO:teuthology.orchestra.run.vm04.stdout:Deleting cluster with fsid: b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:23:49.444 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T14:23:49.477 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T14:23:49.504 INFO:tasks.cephadm:Archiving crash dumps... 
2026-03-10T14:23:49.505 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059/remote/vm03/crash 2026-03-10T14:23:49.505 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/crash -- . 2026-03-10T14:23:49.674 INFO:teuthology.orchestra.run.vm03.stderr:tar: /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/crash: Cannot open: No such file or directory 2026-03-10T14:23:49.674 INFO:teuthology.orchestra.run.vm03.stderr:tar: Error is not recoverable: exiting now 2026-03-10T14:23:49.676 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059/remote/vm04/crash 2026-03-10T14:23:49.676 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/crash -- . 2026-03-10T14:23:49.703 INFO:teuthology.orchestra.run.vm04.stderr:tar: /var/lib/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/crash: Cannot open: No such file or directory 2026-03-10T14:23:49.703 INFO:teuthology.orchestra.run.vm04.stderr:tar: Error is not recoverable: exiting now 2026-03-10T14:23:49.704 INFO:tasks.cephadm:Checking cluster log for badness... 
2026-03-10T14:23:49.704 DEBUG:teuthology.orchestra.run.vm03:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-10T14:23:49.763 INFO:tasks.cephadm:Compressing logs... 
2026-03-10T14:23:49.763 DEBUG:teuthology.orchestra.run.vm03:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T14:23:49.764 DEBUG:teuthology.orchestra.run.vm04:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T14:23:49.788 INFO:teuthology.orchestra.run.vm04.stderr:find: gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T14:23:49.788 INFO:teuthology.orchestra.run.vm04.stderr:‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T14:23:49.788 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-volume.log 2026-03-10T14:23:49.789 INFO:teuthology.orchestra.run.vm03.stderr:find: gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T14:23:49.789 INFO:teuthology.orchestra.run.vm03.stderr:‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T14:23:49.790 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-volume.log: 92.6% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T14:23:49.790 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-client.ceph-exporter.vm04.log 2026-03-10T14:23:49.790 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mgr.vm04.ywwcto.log 2026-03-10T14:23:49.791 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mon.vm03.log 2026-03-10T14:23:49.792 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.log 2026-03-10T14:23:49.793 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mon.vm03.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.audit.log 2026-03-10T14:23:49.793 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mgr.vm03.rwbbep.log 2026-03-10T14:23:49.794 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-client.ceph-exporter.vm04.log: 94.1% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-client.ceph-exporter.vm04.log.gz 2026-03-10T14:23:49.802 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.audit.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.cephadm.log 2026-03-10T14:23:49.802 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mgr.vm03.rwbbep.log: 91.4% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.audit.log.gz 2026-03-10T14:23:49.802 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mgr.vm04.ywwcto.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mon.vm04.log 2026-03-10T14:23:49.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.audit.log 2026-03-10T14:23:49.802 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mon.vm04.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.log 2026-03-10T14:23:49.802 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.audit.log: 94.0% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-volume.log.gz 2026-03-10T14:23:49.802 
INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.cephadm.log 2026-03-10T14:23:49.802 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.3.log 2026-03-10T14:23:49.803 INFO:teuthology.orchestra.run.vm04.stderr: 88.6% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.log.gz 2026-03-10T14:23:49.804 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.cephadm.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.4.log 2026-03-10T14:23:49.806 INFO:teuthology.orchestra.run.vm04.stderr: 91.5% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.audit.log.gz 2026-03-10T14:23:49.807 INFO:teuthology.orchestra.run.vm04.stderr: 85.2% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.cephadm.log.gz 2026-03-10T14:23:49.809 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.3.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.5.log 2026-03-10T14:23:49.812 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.4.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm04.sslxuq.log 2026-03-10T14:23:49.814 INFO:teuthology.orchestra.run.vm03.stderr: 88.6% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.log.gz 2026-03-10T14:23:49.814 INFO:teuthology.orchestra.run.vm03.stderr: 91.8% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T14:23:49.817 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-volume.log 2026-03-10T14:23:49.818 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.cephadm.log: 85.3% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph.cephadm.log.gz 2026-03-10T14:23:49.819 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-client.ceph-exporter.vm03.log 2026-03-10T14:23:49.820 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.0.log 2026-03-10T14:23:49.821 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-client.ceph-exporter.vm03.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.1.log 2026-03-10T14:23:49.821 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.5.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm04.puavjd.log 2026-03-10T14:23:49.825 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm04.sslxuq.log: 89.2% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mgr.vm04.ywwcto.log.gz 2026-03-10T14:23:49.830 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.0.log: 92.7% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-client.ceph-exporter.vm03.log.gz 2026-03-10T14:23:49.834 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.1.log 2026-03-10T14:23:49.834 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.2.log 2026-03-10T14:23:49.835 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.1.log: 94.1% -- replaced 
with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-volume.log.gz 2026-03-10T14:23:49.836 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm03.aqaspa.log 2026-03-10T14:23:49.837 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm03.itwezo.log 2026-03-10T14:23:49.856 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm03.aqaspa.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.0.log 2026-03-10T14:23:50.418 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm03.itwezo.log: /var/log/ceph/ceph-client.0.log: 89.4% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mgr.vm03.rwbbep.log.gz 2026-03-10T14:23:50.429 INFO:teuthology.orchestra.run.vm04.stderr:/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm04.puavjd.log: /var/log/ceph/ceph-client.1.log: 92.2% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mon.vm04.log.gz 2026-03-10T14:23:51.629 INFO:teuthology.orchestra.run.vm03.stderr: 90.4% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mon.vm03.log.gz 2026-03-10T14:23:59.524 INFO:teuthology.orchestra.run.vm03.stderr: 94.9% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm03.itwezo.log.gz 2026-03-10T14:23:59.722 INFO:teuthology.orchestra.run.vm04.stderr: 93.5% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.4.log.gz 2026-03-10T14:24:01.989 INFO:teuthology.orchestra.run.vm03.stderr: 93.6% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.2.log.gz 2026-03-10T14:24:02.207 INFO:teuthology.orchestra.run.vm03.stderr: 93.1% -- replaced with 
/var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.0.log.gz 2026-03-10T14:24:03.598 INFO:teuthology.orchestra.run.vm04.stderr: 94.1% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.5.log.gz 2026-03-10T14:24:03.691 INFO:teuthology.orchestra.run.vm04.stderr: 93.6% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.3.log.gz 2026-03-10T14:24:04.170 INFO:teuthology.orchestra.run.vm03.stderr: 93.1% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-osd.1.log.gz 2026-03-10T14:24:04.595 INFO:teuthology.orchestra.run.vm04.stderr: 95.1% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm04.puavjd.log.gz 2026-03-10T14:24:08.237 INFO:teuthology.orchestra.run.vm03.stderr:gzip: /var/log/ceph/ceph-client.0.log: file size changed while zipping 2026-03-10T14:24:08.373 INFO:teuthology.orchestra.run.vm03.stderr: 93.6% -- replaced with /var/log/ceph/ceph-client.0.log.gz 2026-03-10T14:24:08.404 INFO:teuthology.orchestra.run.vm04.stderr:gzip: /var/log/ceph/ceph-client.1.log: file size changed while zipping 2026-03-10T14:24:08.404 INFO:teuthology.orchestra.run.vm04.stderr: 93.5% -- replaced with /var/log/ceph/ceph-client.1.log.gz 2026-03-10T14:24:31.706 INFO:teuthology.orchestra.run.vm03.stderr: 93.6% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm03.aqaspa.log.gz 2026-03-10T14:24:31.708 INFO:teuthology.orchestra.run.vm03.stderr: 2026-03-10T14:24:31.708 INFO:teuthology.orchestra.run.vm03.stderr:real 0m41.929s 2026-03-10T14:24:31.708 INFO:teuthology.orchestra.run.vm03.stderr:user 0m55.034s 2026-03-10T14:24:31.708 INFO:teuthology.orchestra.run.vm03.stderr:sys 0m3.920s 2026-03-10T14:24:38.416 INFO:teuthology.orchestra.run.vm04.stderr: 93.3% -- replaced with /var/log/ceph/b81bf660-1c89-11f1-b612-27d302cdb124/ceph-mds.cephfs.vm04.sslxuq.log.gz 2026-03-10T14:24:38.418 INFO:teuthology.orchestra.run.vm04.stderr: 2026-03-10T14:24:38.418 
INFO:teuthology.orchestra.run.vm04.stderr:real 0m48.640s 2026-03-10T14:24:38.418 INFO:teuthology.orchestra.run.vm04.stderr:user 1m1.670s 2026-03-10T14:24:38.418 INFO:teuthology.orchestra.run.vm04.stderr:sys 0m4.161s 2026-03-10T14:24:38.418 INFO:tasks.cephadm:Archiving logs... 2026-03-10T14:24:38.418 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059/remote/vm03/log 2026-03-10T14:24:38.418 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-10T14:24:41.752 DEBUG:teuthology.misc:Transferring archived files from vm04:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059/remote/vm04/log 2026-03-10T14:24:41.752 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-10T14:24:44.872 INFO:tasks.cephadm:Removing cluster... 2026-03-10T14:24:44.873 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid b81bf660-1c89-11f1-b612-27d302cdb124 --force 2026-03-10T14:24:44.976 INFO:teuthology.orchestra.run.vm03.stdout:Deleting cluster with fsid: b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:24:45.305 DEBUG:teuthology.orchestra.run.vm04:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid b81bf660-1c89-11f1-b612-27d302cdb124 --force 2026-03-10T14:24:45.415 INFO:teuthology.orchestra.run.vm04.stdout:Deleting cluster with fsid: b81bf660-1c89-11f1-b612-27d302cdb124 2026-03-10T14:24:45.749 INFO:tasks.cephadm:Removing cephadm ... 
2026-03-10T14:24:45.749 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-10T14:24:45.764 DEBUG:teuthology.orchestra.run.vm04:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-10T14:24:45.779 INFO:tasks.cephadm:Teardown complete 2026-03-10T14:24:45.779 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-10T14:24:45.782 ERROR:teuthology.contextutil:Saw exception from nested tasks Traceback (most recent call last): File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested yield vars File "/home/teuthos/teuthology/teuthology/task/install/__init__.py", line 644, in task yield File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks suppress = manager.__exit__(*exc_info) File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__ next(self.gen) File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task mount.umount_wait() File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait run.wait([self.fuse_daemon], timeout) File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait check_time() File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__ raise MaxWhileTries(error_msg) teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds 2026-03-10T14:24:45.782 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 
2026-03-10T14:24:45.782 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-10T14:24:45.805 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-10T14:24:45.854 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 2026-03-10T14:24:45.854 DEBUG:teuthology.orchestra.run.vm03:> 2026-03-10T14:24:45.854 DEBUG:teuthology.orchestra.run.vm03:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-10T14:24:45.854 DEBUG:teuthology.orchestra.run.vm03:> sudo yum -y remove $d || true 2026-03-10T14:24:45.854 DEBUG:teuthology.orchestra.run.vm03:> done 2026-03-10T14:24:45.861 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 
2026-03-10T14:24:45.861 DEBUG:teuthology.orchestra.run.vm04:> 2026-03-10T14:24:45.861 DEBUG:teuthology.orchestra.run.vm04:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-10T14:24:45.861 DEBUG:teuthology.orchestra.run.vm04:> sudo yum -y remove $d || true 2026-03-10T14:24:45.861 DEBUG:teuthology.orchestra.run.vm04:> done 2026-03-10T14:24:46.141 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 @ceph 31 M 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout:Remove 2 Packages 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:46.142 
INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 31 M 2026-03-10T14:24:46.142 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-10T14:24:46.147 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-10T14:24:46.147 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-10T14:24:46.163 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-10T14:24:46.163 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 @ceph 31 M 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies: 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:46.179 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 31 M 2026-03-10T14:24:46.179 
INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T14:24:46.184 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T14:24:46.184 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T14:24:46.197 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-10T14:24:46.200 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-10T14:24:46.200 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T14:24:46.223 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T14:24:46.223 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T14:24:46.223 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T14:24:46.223 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-10T14:24:46.223 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-10T14:24:46.223 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:46.225 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T14:24:46.234 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T14:24:46.235 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T14:24:46.249 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T14:24:46.261 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T14:24:46.261 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T14:24:46.261 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T14:24:46.261 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-10T14:24:46.261 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-10T14:24:46.261 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:46.262 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T14:24:46.273 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T14:24:46.287 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T14:24:46.317 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T14:24:46.317 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T14:24:46.364 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T14:24:46.364 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2 2026-03-10T14:24:46.375 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T14:24:46.375 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:46.375 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-10T14:24:46.375 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-10T14:24:46.375 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:46.375 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-10T14:24:46.421 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T14:24:46.421 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:46.421 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-10T14:24:46.421 INFO:teuthology.orchestra.run.vm04.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-10T14:24:46.421 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:46.421 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout:Removing:
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test x86_64 2:18.2.0-0.el9 @ceph 164 M
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies:
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-10T14:24:46.597 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:46.598 INFO:teuthology.orchestra.run.vm03.stdout:Remove 4 Packages
2026-03-10T14:24:46.598 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:46.598 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 166 M
2026-03-10T14:24:46.598 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-10T14:24:46.600 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-10T14:24:46.600 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-10T14:24:46.632 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-10T14:24:46.632 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-10T14:24:46.642 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test x86_64 2:18.2.0-0.el9 @ceph 164 M
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:Remove 4 Packages
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 166 M
2026-03-10T14:24:46.643 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T14:24:46.646 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T14:24:46.646 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T14:24:46.673 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T14:24:46.673 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-10T14:24:46.690 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-10T14:24:46.698 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-test-2:18.2.0-0.el9.x86_64 1/4
2026-03-10T14:24:46.701 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-10T14:24:46.705 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-10T14:24:46.724 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-10T14:24:46.725 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T14:24:46.731 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-test-2:18.2.0-0.el9.x86_64 1/4
2026-03-10T14:24:46.734 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-10T14:24:46.737 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-10T14:24:46.754 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T14:24:46.806 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T14:24:46.807 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 1/4
2026-03-10T14:24:46.807 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-10T14:24:46.807 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-10T14:24:46.823 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T14:24:46.823 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 1/4
2026-03-10T14:24:46.823 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-10T14:24:46.823 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-10T14:24:46.860 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-10T14:24:46.860 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:46.860 INFO:teuthology.orchestra.run.vm03.stdout:Removed:
2026-03-10T14:24:46.860 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test-2:18.2.0-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-10T14:24:46.860 INFO:teuthology.orchestra.run.vm03.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T14:24:46.860 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:46.860 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:24:46.879 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-10T14:24:46.879 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:46.879 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-10T14:24:46.879 INFO:teuthology.orchestra.run.vm04.stdout: ceph-test-2:18.2.0-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-10T14:24:46.879 INFO:teuthology.orchestra.run.vm04.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T14:24:46.879 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:46.879 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:47.091 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout:Removing:
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: ceph x86_64 2:18.2.0-0.el9 @ceph 0
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies:
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds x86_64 2:18.2.0-0.el9 @ceph 6.4 M
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon x86_64 2:18.2.0-0.el9 @ceph 20 M
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd x86_64 2:18.2.0-0.el9 @ceph 61 M
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.092 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-10T14:24:47.093 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:47.093 INFO:teuthology.orchestra.run.vm03.stdout:Remove 8 Packages
2026-03-10T14:24:47.093 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.093 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 89 M
2026-03-10T14:24:47.093 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-10T14:24:47.095 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-10T14:24:47.096 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-10T14:24:47.103 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: ceph x86_64 2:18.2.0-0.el9 @ceph 0
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mds x86_64 2:18.2.0-0.el9 @ceph 6.4 M
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon x86_64 2:18.2.0-0.el9 @ceph 20 M
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: ceph-osd x86_64 2:18.2.0-0.el9 @ceph 61 M
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:Remove 8 Packages
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 89 M
2026-03-10T14:24:47.104 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T14:24:47.107 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T14:24:47.107 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T14:24:47.120 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-10T14:24:47.120 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-10T14:24:47.132 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T14:24:47.132 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-10T14:24:47.160 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-10T14:24:47.161 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-10T14:24:47.175 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-10T14:24:47.183 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T14:24:47.183 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:47.183 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T14:24:47.183 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-10T14:24:47.183 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-10T14:24:47.183 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.184 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-10T14:24:47.185 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T14:24:47.194 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T14:24:47.207 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T14:24:47.207 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-10T14:24:47.207 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.209 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T14:24:47.210 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T14:24:47.210 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:47.210 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T14:24:47.210 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-10T14:24:47.210 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-10T14:24:47.210 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.214 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T14:24:47.222 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T14:24:47.226 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T14:24:47.229 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-10T14:24:47.232 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T14:24:47.233 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T14:24:47.235 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T14:24:47.235 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-10T14:24:47.235 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.236 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T14:24:47.254 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T14:24:47.257 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-10T14:24:47.259 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T14:24:47.259 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T14:24:47.259 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:47.260 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T14:24:47.260 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-10T14:24:47.260 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-10T14:24:47.260 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.260 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T14:24:47.261 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T14:24:47.268 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T14:24:47.283 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T14:24:47.284 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:47.284 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T14:24:47.284 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-10T14:24:47.284 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-10T14:24:47.284 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.284 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T14:24:47.286 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T14:24:47.286 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:47.286 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T14:24:47.286 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-10T14:24:47.286 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-10T14:24:47.286 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.287 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T14:24:47.290 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8
2026-03-10T14:24:47.314 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T14:24:47.314 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:47.315 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T14:24:47.315 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-10T14:24:47.315 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-10T14:24:47.315 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.315 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T14:24:47.376 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T14:24:47.376 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-10T14:24:47.376 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T14:24:47.376 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 3/8
2026-03-10T14:24:47.377 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 4/8
2026-03-10T14:24:47.377 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T14:24:47.377 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T14:24:47.377 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-10T14:24:47.405 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8
2026-03-10T14:24:47.405 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-10T14:24:47.405 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T14:24:47.405 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 3/8
2026-03-10T14:24:47.405 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 4/8
2026-03-10T14:24:47.405 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T14:24:47.405 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T14:24:47.405 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout:Removed:
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout: ceph-2:18.2.0-0.el9.x86_64 ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.442 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout: ceph-2:18.2.0-0.el9.x86_64 ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.457 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:47.654 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:47.658 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout:Removing:
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base x86_64 2:18.2.0-0.el9 @ceph 22 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages:
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 @ceph 399 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 @ceph 4.5 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 654 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 @ceph-noarch 7.1 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 @ceph-noarch 66 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 @ceph-noarch 567 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies:
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common x86_64 2:18.2.0-0.el9 @ceph 70 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 @ceph-noarch 319 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 @ceph-noarch 1.4 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 @ceph-noarch 40 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 @ceph 138 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 @ceph 426 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 @ceph 1.5 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 @ceph 570 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T14:24:47.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout:Remove 84 Packages
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 433 M
2026-03-10T14:24:47.660 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-10T14:24:47.680 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base x86_64 2:18.2.0-0.el9 @ceph 22 M
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 @ceph 399 k
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 @ceph 4.5 M
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 654 k
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 @ceph-noarch 7.1 M
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 @ceph-noarch 66 M
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 @ceph-noarch 567 k
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-10T14:24:47.690 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common x86_64 2:18.2.0-0.el9 @ceph 70 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 @ceph-noarch 319 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 @ceph-noarch 1.4 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 @ceph-noarch 40 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 @ceph 138 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 @ceph 426 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 @ceph 1.5 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 @ceph 570 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T14:24:47.691 INFO:teuthology.orchestra.run.vm04.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout:Remove 84 Packages
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 433 M
2026-03-10T14:24:47.692 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T14:24:47.712 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T14:24:47.712 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T14:24:47.805 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-10T14:24:47.806 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-10T14:24:47.848 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T14:24:47.848 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-10T14:24:47.959 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-10T14:24:47.959 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-10T14:24:47.969 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-10T14:24:47.990 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T14:24:47.990 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:47.990 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T14:24:47.990 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-10T14:24:47.991 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-10T14:24:47.991 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:47.991 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T14:24:47.992 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-10T14:24:47.992 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-10T14:24:48.003 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-10T14:24:48.006 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T14:24:48.015 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 3/84
2026-03-10T14:24:48.015 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-10T14:24:48.022 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T14:24:48.022 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:48.022 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T14:24:48.022 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-10T14:24:48.022 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-10T14:24:48.022 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:48.023 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T14:24:48.039 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T14:24:48.049 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 3/84
2026-03-10T14:24:48.050 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-10T14:24:48.076 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-10T14:24:48.086 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-10T14:24:48.092 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-10T14:24:48.092 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-10T14:24:48.106 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-10T14:24:48.113 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-10T14:24:48.117 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-10T14:24:48.117 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-10T14:24:48.119 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-10T14:24:48.125 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-10T14:24:48.127 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-10T14:24:48.129 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-10T14:24:48.132 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-10T14:24:48.132 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-10T14:24:48.141 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-10T14:24:48.145 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-10T14:24:48.153 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-10T14:24:48.155 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-10T14:24:48.158 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-10T14:24:48.161 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-10T14:24:48.162 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-10T14:24:48.167 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-10T14:24:48.172 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-10T14:24:48.175 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-10T14:24:48.253 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-10T14:24:48.255 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-10T14:24:48.275 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-10T14:24:48.282 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-10T14:24:48.293 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-10T14:24:48.296 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-10T14:24:48.300 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-10T14:24:48.303 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-10T14:24:48.306 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-10T14:24:48.315 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-10T14:24:48.323 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-10T14:24:48.323 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-10T14:24:48.330 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-10T14:24:48.338 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-10T14:24:48.347 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-10T14:24:48.349 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-10T14:24:48.359 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-10T14:24:48.367 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-10T14:24:48.367 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-10T14:24:48.376 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-10T14:24:48.437 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-10T14:24:48.472 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-10T14:24:48.478 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-10T14:24:48.481 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-10T14:24:48.506 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-10T14:24:48.531 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-10T14:24:48.554 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-10T14:24:48.558 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-10T14:24:48.561 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-10T14:24:48.564 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-10T14:24:48.566 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-10T14:24:48.567 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-10T14:24:48.569 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-10T14:24:48.574 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-10T14:24:48.580 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-10T14:24:48.584 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-10T14:24:48.585 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-10T14:24:48.589 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-10T14:24:48.591 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T14:24:48.592 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-10T14:24:48.595 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-10T14:24:48.596 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-10T14:24:48.599 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-10T14:24:48.602 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-10T14:24:48.606 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-10T14:24:48.622 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-10T14:24:48.632 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T14:24:48.637 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-10T14:24:48.648 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-10T14:24:48.661 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-10T14:24:48.663 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-10T14:24:48.666 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-10T14:24:48.669 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-10T14:24:48.671 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-10T14:24:48.688 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-10T14:24:48.696 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T14:24:48.696 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:48.696 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T14:24:48.696 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-10T14:24:48.696 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-10T14:24:48.696 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:48.696 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T14:24:48.701 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-10T14:24:48.705 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T14:24:48.705 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-10T14:24:48.709 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-10T14:24:48.713 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-10T14:24:48.716 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-10T14:24:48.729 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T14:24:48.729 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:48.729 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T14:24:48.729 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:48.729 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T14:24:48.737 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T14:24:48.737 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:48.737 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T14:24:48.738 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-10T14:24:48.738 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-10T14:24:48.738 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:48.738 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T14:24:48.739 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T14:24:48.741 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-10T14:24:48.743 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-10T14:24:48.746 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-10T14:24:48.748 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-10T14:24:48.749 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T14:24:48.750 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-10T14:24:48.753 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-10T14:24:48.756 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-10T14:24:48.759 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-10T14:24:48.767 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84
2026-03-10T14:24:48.768 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T14:24:48.768 INFO:teuthology.orchestra.run.vm04.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T14:24:48.768 INFO:teuthology.orchestra.run.vm04.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T14:24:48.768 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:48.769 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T14:24:48.772 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84
2026-03-10T14:24:48.774 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84
2026-03-10T14:24:48.777 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84
2026-03-10T14:24:48.779 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T14:24:48.780 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84
2026-03-10T14:24:48.781 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-10T14:24:48.784 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-10T14:24:48.786 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84
2026-03-10T14:24:48.789 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-10T14:24:48.791 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84
2026-03-10T14:24:48.793 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-10T14:24:48.796 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84
2026-03-10T14:24:48.797 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-10T14:24:48.800 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-10T14:24:48.801 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84
2026-03-10T14:24:48.804 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84 2026-03-10T14:24:48.807 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84 2026-03-10T14:24:48.808 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84 2026-03-10T14:24:48.811 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84 2026-03-10T14:24:48.814 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84 2026-03-10T14:24:48.818 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84 2026-03-10T14:24:48.818 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84 2026-03-10T14:24:48.825 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84 2026-03-10T14:24:48.826 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84 2026-03-10T14:24:48.827 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84 2026-03-10T14:24:48.831 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84 2026-03-10T14:24:48.832 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84 2026-03-10T14:24:48.834 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84 2026-03-10T14:24:48.836 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84 2026-03-10T14:24:48.838 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84 2026-03-10T14:24:48.840 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 71/84 2026-03-10T14:24:48.840 
INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84 2026-03-10T14:24:48.846 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84 2026-03-10T14:24:48.846 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 72/84 2026-03-10T14:24:48.850 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84 2026-03-10T14:24:48.851 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84 2026-03-10T14:24:48.856 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84 2026-03-10T14:24:48.863 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84 2026-03-10T14:24:48.869 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84 2026-03-10T14:24:48.872 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T14:24:48.872 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 
2026-03-10T14:24:48.872 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:48.872 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84 2026-03-10T14:24:48.877 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84 2026-03-10T14:24:48.879 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T14:24:48.886 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84 2026-03-10T14:24:48.893 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84 2026-03-10T14:24:48.897 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84 2026-03-10T14:24:48.899 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T14:24:48.899 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-10T14:24:48.900 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84 2026-03-10T14:24:48.903 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 71/84 2026-03-10T14:24:48.910 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 72/84 2026-03-10T14:24:48.915 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84 2026-03-10T14:24:48.935 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T14:24:48.935 INFO:teuthology.orchestra.run.vm04.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 
2026-03-10T14:24:48.935 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:48.942 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T14:24:48.963 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84 2026-03-10T14:24:48.963 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /sys 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /proc 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /mnt 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /var/tmp 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /home 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /root 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout:skipping the directory /tmp 2026-03-10T14:24:55.172 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 75/84 2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /sys 2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /proc 2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /mnt 2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /var/tmp 2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /home 2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /root 
2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /tmp 2026-03-10T14:24:55.175 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:55.183 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-10T14:24:55.198 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-10T14:24:55.211 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-10T14:24:55.215 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ceph-common-2:18.2.0-0.el9.x86_64 77/84 2026-03-10T14:24:55.218 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-10T14:24:55.221 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-10T14:24:55.221 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-10T14:24:55.227 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 76/84 2026-03-10T14:24:55.230 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ceph-common-2:18.2.0-0.el9.x86_64 77/84 2026-03-10T14:24:55.233 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-10T14:24:55.235 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-10T14:24:55.235 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-10T14:24:55.235 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-10T14:24:55.239 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-10T14:24:55.242 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-10T14:24:55.246 
INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-10T14:24:55.246 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T14:24:55.250 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 80/84 2026-03-10T14:24:55.252 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-10T14:24:55.254 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-10T14:24:55.257 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-10T14:24:55.257 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 1/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 3/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 4/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 5/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 6/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 7/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 8/84 2026-03-10T14:24:55.353 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 9/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 10/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 11/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 12/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-10T14:24:55.353 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 17/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 21/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : 
python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 30/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 
41/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-10T14:24:55.356 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 
2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-10T14:24:55.357 
INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-10T14:24:55.357 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-10T14:24:55.358 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 1/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 3/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 4/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 5/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: 
Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 6/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 7/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 8/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 9/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 10/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 11/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 12/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-10T14:24:55.359 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 17/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 21/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 
2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 30/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-10T14:24:55.361 
INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-10T14:24:55.361 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-10T14:24:55.362 
INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying 
: python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-10T14:24:55.362 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.436 
INFO:teuthology.orchestra.run.vm04.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.436 
INFO:teuthology.orchestra.run.vm04.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T14:24:55.436 INFO:teuthology.orchestra.run.vm04.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T14:24:55.437 
INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-oauthlib-3.1.1-5.el9.noarch 
2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T14:24:55.437 
INFO:teuthology.orchestra.run.vm04.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:55.437 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.458 
INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T14:24:55.458 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: 
python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T14:24:55.459 
INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T14:24:55.459 
INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:55.459 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-10T14:24:55.678 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:24:55.678 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:55.678 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size 2026-03-10T14:24:55.678 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:55.678 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T14:24:55.678 INFO:teuthology.orchestra.run.vm04.stdout: cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 200 k 2026-03-10T14:24:55.678 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:55.679 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T14:24:55.679 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:55.679 INFO:teuthology.orchestra.run.vm04.stdout:Remove 1 Package 2026-03-10T14:24:55.679 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:55.679 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 200 k 2026-03-10T14:24:55.679 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T14:24:55.680 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T14:24:55.680 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T14:24:55.682 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 2026-03-10T14:24:55.682 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout: cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 200 k 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:Remove 1 Package 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 200 k 2026-03-10T14:24:55.693 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-10T14:24:55.695 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-10T14:24:55.695 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-10T14:24:55.697 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-10T14:24:55.697 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-10T14:24:55.700 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T14:24:55.700 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T14:24:55.715 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-10T14:24:55.716 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T14:24:55.828 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T14:24:55.834 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T14:24:55.872 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T14:24:55.873 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:55.873 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T14:24:55.873 INFO:teuthology.orchestra.run.vm04.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.873 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:55.873 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:55.878 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 1/1 2026-03-10T14:24:55.878 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:55.878 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-10T14:24:55.878 INFO:teuthology.orchestra.run.vm03.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-10T14:24:55.878 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:55.878 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:56.077 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-immutable-object-cache 2026-03-10T14:24:56.077 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 
2026-03-10T14:24:56.079 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:24:56.080 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T14:24:56.080 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:56.097 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-immutable-object-cache 2026-03-10T14:24:56.097 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-10T14:24:56.100 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:56.101 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-10T14:24:56.101 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:56.264 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr 2026-03-10T14:24:56.264 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-10T14:24:56.267 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:56.267 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-10T14:24:56.267 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:56.272 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr 2026-03-10T14:24:56.272 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T14:24:56.275 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:24:56.276 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T14:24:56.276 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:56.445 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-dashboard 2026-03-10T14:24:56.445 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-10T14:24:56.448 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:56.448 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 
2026-03-10T14:24:56.448 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:56.449 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-dashboard 2026-03-10T14:24:56.449 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T14:24:56.452 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:24:56.452 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T14:24:56.452 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:56.618 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-10T14:24:56.618 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-10T14:24:56.620 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-10T14:24:56.620 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T14:24:56.621 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:56.621 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-10T14:24:56.621 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:56.623 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:24:56.624 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T14:24:56.624 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:56.805 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-rook 2026-03-10T14:24:56.806 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T14:24:56.807 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-rook 2026-03-10T14:24:56.807 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-10T14:24:56.809 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-10T14:24:56.810 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T14:24:56.810 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:56.810 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:56.810 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-10T14:24:56.810 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:56.995 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-cephadm 2026-03-10T14:24:56.995 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-10T14:24:56.998 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:56.998 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-10T14:24:56.998 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:56.999 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: ceph-mgr-cephadm 2026-03-10T14:24:57.000 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T14:24:57.002 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:24:57.003 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T14:24:57.003 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:57.192 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 @ceph 2.4 M 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout:Remove 1 Package 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 2.4 M 2026-03-10T14:24:57.193 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-10T14:24:57.195 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-10T14:24:57.195 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-10T14:24:57.202 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout:Removing: 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 @ceph 2.4 M 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================ 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout:Remove 1 Package 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 2.4 M 2026-03-10T14:24:57.203 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check 2026-03-10T14:24:57.205 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded. 2026-03-10T14:24:57.205 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test 2026-03-10T14:24:57.209 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-10T14:24:57.209 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-10T14:24:57.221 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded. 
2026-03-10T14:24:57.221 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction 2026-03-10T14:24:57.239 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-10T14:24:57.251 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1 2026-03-10T14:24:57.253 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T14:24:57.271 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T14:24:57.330 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T14:24:57.344 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T14:24:57.469 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T14:24:57.469 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:57.469 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-10T14:24:57.469 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:57.469 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:57.469 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:57.470 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1 2026-03-10T14:24:57.470 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:57.470 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T14:24:57.470 INFO:teuthology.orchestra.run.vm04.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:57.470 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:57.470 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:57.680 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel x86_64 2:18.2.0-0.el9 @ceph 456 k 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages: 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 @ceph 137 k 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:Remove 2 Packages 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 593 k 2026-03-10T14:24:57.681 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-10T14:24:57.683 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-10T14:24:57.683 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-10T14:24:57.686 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout: Package Architecture Version Repository Size
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout: librados-devel x86_64 2:18.2.0-0.el9 @ceph 456 k
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 @ceph 137 k
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:Remove 2 Packages
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 593 k
2026-03-10T14:24:57.687 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T14:24:57.689 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T14:24:57.689 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T14:24:57.694 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-10T14:24:57.694 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-10T14:24:57.700 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T14:24:57.700 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-10T14:24:57.722 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-10T14:24:57.724 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T14:24:57.727 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-10T14:24:57.729 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T14:24:57.737 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T14:24:57.742 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T14:24:57.798 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T14:24:57.798 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T14:24:57.802 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T14:24:57.803 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T14:24:57.846 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T14:24:57.846 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:57.846 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-10T14:24:57.846 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 librados-devel-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:57.846 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:57.846 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:57.847 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T14:24:57.847 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:57.847 INFO:teuthology.orchestra.run.vm03.stdout:Removed:
2026-03-10T14:24:57.847 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 librados-devel-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:57.847 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:57.847 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:24:58.056 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:Removing:
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 @ceph 1.8 M
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages:
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 @ceph 480 k
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies:
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 @ceph 186 k
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:Remove 3 Packages
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 2.5 M
2026-03-10T14:24:58.057 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-10T14:24:58.059 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-10T14:24:58.059 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-10T14:24:58.062 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:58.063 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:58.063 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-10T14:24:58.063 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 @ceph 1.8 M
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 @ceph 480 k
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 @ceph 186 k
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:Remove 3 Packages
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 2.5 M
2026-03-10T14:24:58.064 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T14:24:58.066 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T14:24:58.066 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T14:24:58.072 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-10T14:24:58.072 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-10T14:24:58.078 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T14:24:58.079 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-10T14:24:58.101 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-10T14:24:58.104 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cephfs-2:18.2.0-0.el9.x86_64 1/3
2026-03-10T14:24:58.105 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-10T14:24:58.105 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T14:24:58.107 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-10T14:24:58.110 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-cephfs-2:18.2.0-0.el9.x86_64 1/3
2026-03-10T14:24:58.111 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-10T14:24:58.111 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T14:24:58.175 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T14:24:58.175 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 1/3
2026-03-10T14:24:58.175 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-10T14:24:58.179 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T14:24:58.179 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 1/3
2026-03-10T14:24:58.179 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-10T14:24:58.216 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T14:24:58.216 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:58.216 INFO:teuthology.orchestra.run.vm03.stdout:Removed:
2026-03-10T14:24:58.216 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:58.216 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:58.217 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:58.217 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:58.217 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:24:58.226 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T14:24:58.226 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:58.226 INFO:teuthology.orchestra.run.vm04.stdout:Removed:
2026-03-10T14:24:58.226 INFO:teuthology.orchestra.run.vm04.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:58.226 INFO:teuthology.orchestra.run.vm04.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:58.226 INFO:teuthology.orchestra.run.vm04.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-10T14:24:58.226 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:58.226 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:58.384 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: libcephfs-devel
2026-03-10T14:24:58.384 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-10T14:24:58.387 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:58.387 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-10T14:24:58.387 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:24:58.401 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: libcephfs-devel
2026-03-10T14:24:58.402 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T14:24:58.404 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:58.405 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T14:24:58.405 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:58.584 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout:Removing:
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: librados2 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages:
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw x86_64 2:18.2.0-0.el9 @ceph 269 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 @ceph 230 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 @ceph 490 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies:
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: librbd1 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: librgw2 x86_64 2:18.2.0-0.el9 @ceph 15 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T14:24:58.585 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout:Remove 21 Packages
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 74 M
2026-03-10T14:24:58.586 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-10T14:24:58.590 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-10T14:24:58.590 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-10T14:24:58.610 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: Package Arch Version Repository Size
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout:Removing:
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: librados2 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout:Removing dependent packages:
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw x86_64 2:18.2.0-0.el9 @ceph 269 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 @ceph 230 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 @ceph 490 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout:Removing unused dependencies:
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: librbd1 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: librgw2 x86_64 2:18.2.0-0.el9 @ceph 15 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T14:24:58.612 INFO:teuthology.orchestra.run.vm04.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T14:24:58.613 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:58.613 INFO:teuthology.orchestra.run.vm04.stdout:Transaction Summary
2026-03-10T14:24:58.613 INFO:teuthology.orchestra.run.vm04.stdout:================================================================================
2026-03-10T14:24:58.613 INFO:teuthology.orchestra.run.vm04.stdout:Remove 21 Packages
2026-03-10T14:24:58.613 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:58.613 INFO:teuthology.orchestra.run.vm04.stdout:Freed space: 74 M
2026-03-10T14:24:58.613 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction check
2026-03-10T14:24:58.614 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-10T14:24:58.614 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-10T14:24:58.617 INFO:teuthology.orchestra.run.vm04.stdout:Transaction check succeeded.
2026-03-10T14:24:58.617 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction test
2026-03-10T14:24:58.643 INFO:teuthology.orchestra.run.vm04.stdout:Transaction test succeeded.
2026-03-10T14:24:58.643 INFO:teuthology.orchestra.run.vm04.stdout:Running transaction
2026-03-10T14:24:58.658 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-10T14:24:58.661 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-nbd-2:18.2.0-0.el9.x86_64 1/21
2026-03-10T14:24:58.664 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-fuse-2:18.2.0-0.el9.x86_64 2/21
2026-03-10T14:24:58.666 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rgw-2:18.2.0-0.el9.x86_64 3/21
2026-03-10T14:24:58.666 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T14:24:58.681 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T14:24:58.683 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T14:24:58.685 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rbd-2:18.2.0-0.el9.x86_64 6/21
2026-03-10T14:24:58.687 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rados-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T14:24:58.687 INFO:teuthology.orchestra.run.vm04.stdout: Preparing : 1/1
2026-03-10T14:24:58.689 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T14:24:58.689 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T14:24:58.690 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-nbd-2:18.2.0-0.el9.x86_64 1/21
2026-03-10T14:24:58.693 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : rbd-fuse-2:18.2.0-0.el9.x86_64 2/21
2026-03-10T14:24:58.695 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rgw-2:18.2.0-0.el9.x86_64 3/21
2026-03-10T14:24:58.695 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T14:24:58.706 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T14:24:58.706 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T14:24:58.707 INFO:teuthology.orchestra.run.vm03.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T14:24:58.707 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:58.710 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T14:24:58.713 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T14:24:58.715 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rbd-2:18.2.0-0.el9.x86_64 6/21
2026-03-10T14:24:58.717 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : python3-rados-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T14:24:58.719 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T14:24:58.720 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T14:24:58.721 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T14:24:58.723 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T14:24:58.726 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T14:24:58.728 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T14:24:58.730 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T14:24:58.734 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T14:24:58.735 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T14:24:58.735 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T14:24:58.735 INFO:teuthology.orchestra.run.vm04.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T14:24:58.736 INFO:teuthology.orchestra.run.vm04.stdout:
2026-03-10T14:24:58.738 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T14:24:58.741 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T14:24:58.744 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T14:24:58.746 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T14:24:58.748 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T14:24:58.750 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T14:24:58.753 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T14:24:58.755 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T14:24:58.757 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T14:24:58.759 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T14:24:58.763 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T14:24:58.763 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T14:24:58.766 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T14:24:58.769 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T14:24:58.772 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T14:24:58.774 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T14:24:58.776 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T14:24:58.790 INFO:teuthology.orchestra.run.vm04.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T14:24:58.831 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T14:24:58.831 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T14:24:58.831 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T14:24:58.831 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T14:24:58.831 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 8/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 14/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 15/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 16/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 18/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 19/21
2026-03-10T14:24:58.832 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T14:24:58.853 INFO:teuthology.orchestra.run.vm04.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T14:24:58.853 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 8/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 14/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 15/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 16/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 18/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 19/21
2026-03-10T14:24:58.854 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout:Removed:
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: librados2-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: librbd1-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: librgw2-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: re2-1:20211101-20.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-10T14:24:58.885 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout:Removed: 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: librados2-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: librbd1-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.905 INFO:teuthology.orchestra.run.vm04.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: librgw2-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: python3-rados-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: python3-rbd-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: python3-rgw-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-10T14:24:58.906 
INFO:teuthology.orchestra.run.vm04.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: re2-1:20211101-20.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout: 2026-03-10T14:24:58.906 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:59.094 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: librbd1 2026-03-10T14:24:59.095 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-10T14:24:59.098 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:59.099 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-10T14:24:59.099 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-10T14:24:59.135 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: librbd1 2026-03-10T14:24:59.135 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal. 2026-03-10T14:24:59.139 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved. 2026-03-10T14:24:59.139 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do. 2026-03-10T14:24:59.140 INFO:teuthology.orchestra.run.vm04.stdout:Complete! 2026-03-10T14:24:59.292 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rados 2026-03-10T14:24:59.292 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-10T14:24:59.296 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-10T14:24:59.296 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-10T14:24:59.296 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
2026-03-10T14:24:59.352 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rados
2026-03-10T14:24:59.352 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T14:24:59.355 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:59.356 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T14:24:59.356 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:59.477 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rgw
2026-03-10T14:24:59.478 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-10T14:24:59.481 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:59.481 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-10T14:24:59.482 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:24:59.534 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rgw
2026-03-10T14:24:59.534 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T14:24:59.538 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:59.538 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T14:24:59.538 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:59.667 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-cephfs
2026-03-10T14:24:59.667 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-10T14:24:59.670 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:59.671 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-10T14:24:59.671 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:24:59.720 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-cephfs
2026-03-10T14:24:59.721 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T14:24:59.724 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:59.725 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T14:24:59.725 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:24:59.864 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rbd
2026-03-10T14:24:59.864 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-10T14:24:59.868 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:24:59.868 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-10T14:24:59.868 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:24:59.905 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: python3-rbd
2026-03-10T14:24:59.905 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T14:24:59.909 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:24:59.909 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T14:24:59.909 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:25:00.056 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-fuse
2026-03-10T14:25:00.057 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-10T14:25:00.060 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:25:00.060 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-10T14:25:00.061 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:25:00.087 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-fuse
2026-03-10T14:25:00.087 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T14:25:00.091 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:25:00.091 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T14:25:00.091 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:25:00.255 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-mirror
2026-03-10T14:25:00.255 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-10T14:25:00.259 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:25:00.260 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-10T14:25:00.260 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:25:00.271 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-mirror
2026-03-10T14:25:00.271 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T14:25:00.274 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:25:00.275 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T14:25:00.275 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:25:00.447 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-nbd
2026-03-10T14:25:00.447 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-10T14:25:00.451 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-10T14:25:00.451 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-10T14:25:00.451 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-10T14:25:00.462 INFO:teuthology.orchestra.run.vm04.stdout:No match for argument: rbd-nbd
2026-03-10T14:25:00.462 INFO:teuthology.orchestra.run.vm04.stderr:No packages marked for removal.
2026-03-10T14:25:00.465 INFO:teuthology.orchestra.run.vm04.stdout:Dependencies resolved.
2026-03-10T14:25:00.466 INFO:teuthology.orchestra.run.vm04.stdout:Nothing to do.
2026-03-10T14:25:00.466 INFO:teuthology.orchestra.run.vm04.stdout:Complete!
2026-03-10T14:25:00.480 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean all
2026-03-10T14:25:00.492 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean all
2026-03-10T14:25:00.605 INFO:teuthology.orchestra.run.vm03.stdout:56 files removed
2026-03-10T14:25:00.620 INFO:teuthology.orchestra.run.vm04.stdout:56 files removed
2026-03-10T14:25:00.632 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T14:25:00.635 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T14:25:00.658 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean expire-cache
2026-03-10T14:25:00.659 DEBUG:teuthology.orchestra.run.vm04:> sudo yum clean expire-cache
2026-03-10T14:25:00.818 INFO:teuthology.orchestra.run.vm04.stdout:Cache was expired
2026-03-10T14:25:00.818 INFO:teuthology.orchestra.run.vm04.stdout:0 files removed
2026-03-10T14:25:00.821 INFO:teuthology.orchestra.run.vm03.stdout:Cache was expired
2026-03-10T14:25:00.821 INFO:teuthology.orchestra.run.vm03.stdout:0 files removed
2026-03-10T14:25:00.842 DEBUG:teuthology.parallel:result is None
2026-03-10T14:25:00.844 DEBUG:teuthology.parallel:result is None
2026-03-10T14:25:00.845 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm03.local
2026-03-10T14:25:00.845 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm04.local
2026-03-10T14:25:00.845 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T14:25:00.845 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T14:25:00.872 DEBUG:teuthology.orchestra.run.vm03:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-10T14:25:00.874 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-10T14:25:00.943 DEBUG:teuthology.parallel:result is None
2026-03-10T14:25:00.944 DEBUG:teuthology.parallel:result is None
2026-03-10T14:25:00.945 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-10T14:25:00.947 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-10T14:25:00.948 DEBUG:teuthology.orchestra.run.vm03:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T14:25:00.985 DEBUG:teuthology.orchestra.run.vm04:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T14:25:00.997 INFO:teuthology.orchestra.run.vm03.stderr:bash: line 1: ntpq: command not found
2026-03-10T14:25:01.002 INFO:teuthology.orchestra.run.vm04.stderr:bash: line 1: ntpq: command not found
2026-03-10T14:25:01.044 INFO:teuthology.orchestra.run.vm04.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm04.stdout:===============================================================================
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm04.stdout:^- where-you.at 2 7 377 56 +1864us[+1864us] +/- 42ms
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm04.stdout:^* 79.133.44.143 1 7 377 82 +1944us[+1945us] +/- 10ms
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm04.stdout:^+ ntp1.lwlcom.net 1 7 377 84 -1363us[-1362us] +/- 15ms
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm04.stdout:^+ ntp2.lwlcom.net 1 7 377 83 -1351us[-1351us] +/- 15ms
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm03.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm03.stdout:===============================================================================
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm03.stdout:^- gw-001.oit.one 2 7 377 18 -468us[ -468us] +/- 51ms
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm03.stdout:^* 79.133.44.143 1 7 377 81 +1953us[+1961us] +/- 10ms
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm03.stdout:^+ ntp1.lwlcom.net 1 7 377 85 -1350us[-1343us] +/- 16ms
2026-03-10T14:25:01.045 INFO:teuthology.orchestra.run.vm03.stdout:^+ ntp2.lwlcom.net 1 7 377 84 -1349us[-1341us] +/- 15ms
2026-03-10T14:25:01.045 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-10T14:25:01.048 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-10T14:25:01.048 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-10T14:25:01.051 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-10T14:25:01.054 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-10T14:25:01.057 INFO:teuthology.task.internal:Duration was 1570.714498 seconds
2026-03-10T14:25:01.057 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-10T14:25:01.059 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-10T14:25:01.060 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T14:25:01.088 DEBUG:teuthology.orchestra.run.vm04:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T14:25:01.127 INFO:teuthology.orchestra.run.vm03.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T14:25:01.132 INFO:teuthology.orchestra.run.vm04.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T14:25:01.384 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-10T14:25:01.384 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm03.local
2026-03-10T14:25:01.384 DEBUG:teuthology.orchestra.run.vm03:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T14:25:01.453 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm04.local
2026-03-10T14:25:01.453 DEBUG:teuthology.orchestra.run.vm04:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T14:25:01.484 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-10T14:25:01.484 DEBUG:teuthology.orchestra.run.vm03:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T14:25:01.496 DEBUG:teuthology.orchestra.run.vm04:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T14:25:02.257 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-10T14:25:02.257 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T14:25:02.259 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T14:25:02.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T14:25:02.282 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T14:25:02.282 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T14:25:02.283 INFO:teuthology.orchestra.run.vm04.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T14:25:02.283 INFO:teuthology.orchestra.run.vm04.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: /home/ubuntu/cephtest/archive/syslog/journalctl.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-10T14:25:02.284 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T14:25:02.284 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T14:25:02.285 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T14:25:02.285 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T14:25:02.286 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: /home/ubuntu/cephtest/archive/syslog/journalctl.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-10T14:25:02.415 INFO:teuthology.orchestra.run.vm04.stderr: 97.7% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T14:25:02.466 INFO:teuthology.orchestra.run.vm03.stderr: 96.8% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T14:25:02.468 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-10T14:25:02.472 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-10T14:25:02.472 DEBUG:teuthology.orchestra.run.vm03:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T14:25:02.538 DEBUG:teuthology.orchestra.run.vm04:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T14:25:02.566 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-10T14:25:02.569 DEBUG:teuthology.orchestra.run.vm03:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T14:25:02.580 DEBUG:teuthology.orchestra.run.vm04:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T14:25:02.606 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = core
2026-03-10T14:25:02.634 INFO:teuthology.orchestra.run.vm04.stdout:kernel.core_pattern = core
2026-03-10T14:25:02.649 DEBUG:teuthology.orchestra.run.vm03:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T14:25:02.680 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T14:25:02.680 DEBUG:teuthology.orchestra.run.vm04:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T14:25:02.706 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T14:25:02.706 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-10T14:25:02.709 INFO:teuthology.task.internal:Transferring archived files...
2026-03-10T14:25:02.710 DEBUG:teuthology.misc:Transferring archived files from vm03:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059/remote/vm03
2026-03-10T14:25:02.710 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T14:25:02.759 DEBUG:teuthology.misc:Transferring archived files from vm04:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1059/remote/vm04
2026-03-10T14:25:02.759 DEBUG:teuthology.orchestra.run.vm04:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T14:25:02.796 INFO:teuthology.task.internal:Removing archive directory...
2026-03-10T14:25:02.796 DEBUG:teuthology.orchestra.run.vm03:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T14:25:02.798 DEBUG:teuthology.orchestra.run.vm04:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T14:25:02.856 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-10T14:25:02.860 INFO:teuthology.task.internal:Not uploading archives.
2026-03-10T14:25:02.860 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-10T14:25:02.863 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-10T14:25:02.864 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T14:25:02.866 DEBUG:teuthology.orchestra.run.vm04:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T14:25:02.883 INFO:teuthology.orchestra.run.vm03.stdout: 8532146 0 drwxr-xr-x 3 ubuntu ubuntu 19 Mar 10 14:25 /home/ubuntu/cephtest
2026-03-10T14:25:02.883 INFO:teuthology.orchestra.run.vm03.stdout: 16904579 0 d--------- 2 ubuntu ubuntu 6 Mar 10 14:05 /home/ubuntu/cephtest/mnt.0
2026-03-10T14:25:02.884 INFO:teuthology.orchestra.run.vm03.stderr:find: ‘/home/ubuntu/cephtest/mnt.0’: Permission denied
2026-03-10T14:25:02.884 INFO:teuthology.orchestra.run.vm03.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-10T14:25:02.898 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T14:25:02.899 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 48, in base
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm03 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-10T14:25:02.899 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-10T14:25:02.902 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T14:25:02.903 INFO:teuthology.run:Summary data:
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.0} 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/yes 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
duration: 1570.7144978046417
failure_reason: reached maximum tries (50) after waiting for 300 seconds
flavor: default
owner: kyr
status: fail
success: false
2026-03-10T14:25:02.903 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T14:25:02.930 INFO:teuthology.run:FAIL